Science.gov

Sample records for fusion simulation project

  1. Fusion Simulation Project Workshop Report

    NASA Astrophysics Data System (ADS)

    Kritz, Arnold; Keyes, David

    2009-03-01

    The mission of the Fusion Simulation Project is to develop a predictive capability for the integrated modeling of magnetically confined plasmas. This FSP report adds to the previous activities that defined an approach to integrated modeling in magnetic fusion. These previous activities included a Fusion Energy Sciences Advisory Committee panel that was charged to study integrated simulation in 2002. The report of that panel [Journal of Fusion Energy 20, 135 (2001)] recommended the prompt initiation of a Fusion Simulation Project. In 2003, the Office of Fusion Energy Sciences formed a steering committee that developed a project vision, roadmap, and governance concepts [Journal of Fusion Energy 23, 1 (2004)]. The current FSP planning effort involved 46 physicists, applied mathematicians and computer scientists, from 21 institutions, formed into four panels and a coordinating committee. These panels were constituted to consider: Status of Physics Components, Required Computational and Applied Mathematics Tools, Integration and Management of Code Components, and Project Structure and Management. The ideas, reported here, are the products of these panels, working together over several months and culminating in a 3-day workshop in May 2007.

  2. AI/Simulation Fusion Project at Lawrence Livermore National Laboratory

    SciTech Connect

    Erickson, S.A.

    1984-04-25

    This presentation first discusses the motivation for the AI/Simulation Fusion project. After briefly reviewing what expert systems are, what object-oriented languages are, and some observed features of typical combat simulations, it explains why combining artificial intelligence and combat simulation makes sense. It then describes the first demonstration goal for this fusion project.

  3. Scientific and computational challenges of the fusion simulation project (FSP)

    NASA Astrophysics Data System (ADS)

    Tang, W. M.

    2008-07-01

    This paper highlights the scientific and computational challenges facing the Fusion Simulation Project (FSP). The primary objective is to develop advanced software designed to use leadership-class computers for carrying out multiscale physics simulations to provide information vital to delivering a realistic integrated fusion simulation model with unprecedented physics fidelity. This multiphysics capability will be unprecedented in that in the current FES applications domain, the largest-scale codes are used to carry out first-principles simulations of mostly individual phenomena in realistic 3D geometry while the integrated models are much smaller-scale, lower-dimensionality codes with significant empirical elements used for modeling and designing experiments. The FSP is expected to be the most up-to-date embodiment of the theoretical and experimental understanding of magnetically confined thermonuclear plasmas and to provide a living framework for the simulation of such plasmas as the associated physics understanding continues to advance over the next several decades. Substantive progress on answering the outstanding scientific questions in the field will drive the FSP toward its ultimate goal of developing a reliable ability to predict the behavior of plasma discharges in toroidal magnetic fusion devices on all relevant time and space scales. From a computational perspective, the fusion energy science application goal to produce high-fidelity, whole-device modeling capabilities will demand computing resources in the petascale range and beyond, together with the associated multicore algorithmic formulation needed to address burning plasma issues relevant to ITER — a multibillion dollar collaborative device involving seven international partners representing over half the world's population. Even more powerful exascale platforms will be needed to meet the future challenges of designing a demonstration fusion reactor (DEMO). Analogous to other major applied

  4. Fusion Simulation Project. Workshop Sponsored by the U.S. Department of Energy, Rockville, MD, May 16-18, 2007

    SciTech Connect

    Kritz, A.; Keyes, D.

    2007-05-18

    The mission of the Fusion Simulation Project is to develop a predictive capability for the integrated modeling of magnetically confined plasmas. This FSP report adds to the previous activities that defined an approach to integrated modeling in magnetic fusion. These previous activities included a Fusion Energy Sciences Advisory Committee panel that was charged to study integrated simulation in 2002. The report of that panel [Journal of Fusion Energy 20, 135 (2001)] recommended the prompt initiation of a Fusion Simulation Project. In 2003, the Office of Fusion Energy Sciences formed a steering committee that developed a project vision, roadmap, and governance concepts [Journal of Fusion Energy 23, 1 (2004)]. The current FSP planning effort involved forty-six physicists, applied mathematicians and computer scientists, from twenty-one institutions, formed into four panels and a coordinating committee. These panels were constituted to consider: Status of Physics Components, Required Computational and Applied Mathematics Tools, Integration and Management of Code Components, and Project Structure and Management. The ideas, reported here, are the products of these panels, working together over several months and culminating in a three-day workshop in May 2007.

  5. Fusion Simulation Project. Workshop sponsored by the U.S. Department of Energy Rockville, MD, May 16-18, 2007

    SciTech Connect

    2007-05-16

    The mission of the Fusion Simulation Project is to develop a predictive capability for the integrated modeling of magnetically confined plasmas. This FSP report adds to the previous activities that defined an approach to integrated modeling in magnetic fusion. These previous activities included a Fusion Energy Sciences Advisory Committee panel that was charged to study integrated simulation in 2002. The report of that panel [Journal of Fusion Energy 20, 135 (2001)] recommended the prompt initiation of a Fusion Simulation Project. In 2003, the Office of Fusion Energy Sciences formed a steering committee that developed a project vision, roadmap, and governance concepts [Journal of Fusion Energy 23, 1 (2004)]. The current FSP planning effort involved forty-six physicists, applied mathematicians and computer scientists, from twenty-one institutions, formed into four panels and a coordinating committee. These panels were constituted to consider: Status of Physics Components, Required Computational and Applied Mathematics Tools, Integration and Management of Code Components, and Project Structure and Management. The ideas, reported here, are the products of these panels, working together over several months and culminating in a three-day workshop in May 2007.

  6. Lessons Learned from ASCI applied to the Fusion Simulation Project (FSP)

    NASA Astrophysics Data System (ADS)

    Post, Douglass

    2003-10-01

    The magnetic fusion program has proposed a $20M-per-year project to develop a computational predictive capability for magnetic fusion experiments. The DOE NNSA launched a program in 1996, the Accelerated Strategic Computing Initiative (ASCI), to achieve the same goal for nuclear weapons, allowing certification of the stockpile without testing. We present a "lessons learned" analysis of the $3B, 7-year ASCI program with the goal of improving the FSP to maximize the likelihood of success. The major lessons from ASCI are: 1. Build on your institution's successful history; 2. Teams are the key element; 3. Sound software project management is essential; 4. Requirements, schedule and resources must be consistent; 5. Practices, not processes, are important; 6. Minimize and mitigate risks; 7. Minimize the computer science research aspect and maximize the physics elements; and 8. Verification and validation are essential. We map this experience and these recommendations onto the FSP.

  7. SKIDS data fusion project

    NASA Astrophysics Data System (ADS)

    Greenway, Phil

    1992-04-01

    The European Community's strategic research initiative in information technology (ESPRIT) has been in place for nearly five years. An early example of the pan-European collaborative projects being conducted under this initiative is 'SKIDS': Signal and Knowledge Integration with Decisional Control for Multisensory Systems. This four-year project, which is approaching completion, aims to build a real-time multisensor perception machine. This machine will be capable of performing data fusion, interpretation, situation assessment, and resource allocation tasks, under the constraints of both time and resource availability, and in the presence of uncertain data. Of the many possible applications, the surveillance and monitoring of a semi-automated 'factory environment' has been chosen as a challenging and representative test scenario. This paper presents an overview of the goals and objectives of the project, the makeup of the consortium and the roles of the members within it, and the main technical achievements to date. In particular, the following are discussed: relevant application domains, and the generic requirements that can be inferred from them; sensor configuration, including choice, placement, etc.; control paradigms, including the possible trade-offs between centralized, hierarchical, and decentralized approaches; the corresponding hardware architectural choices, including the need for parallel processing; and the appropriate software architecture and infrastructure required to support the chosen task-oriented approach. Specific attention is paid to the functional decomposition of the system and how the requirements for control impact the organization of the identified interpretation tasks. Future work and outstanding problems are considered in some concluding remarks. Owing to limited space, this paper is descriptive rather than explanatory.

  8. Advanced fusion concepts: project summaries

    SciTech Connect

    1980-12-01

    This report contains descriptions of the activities of all the projects supported by the Advanced Fusion Concepts Branch of the Office of Fusion Energy, US Department of Energy. These descriptions are project summaries of each of the individual projects, and contain the following: title, principal investigators, funding levels, purpose, approach, progress, plans, milestones, graduate students, graduates, other professional staff, and recent publications. Information is given for each of the following programs: (1) reverse-field pinch, (2) compact toroid, (3) alternate fuel/multipoles, (4) stellarator/torsatron, (5) linear magnetic fusion, (6) liners, and (7) Tormac. (MOW)

  9. Fusion Simulation Program

    SciTech Connect

    Project Staff

    2012-02-29

    Under this project, General Atomics (GA) was tasked to develop the experimental validation plans for two high priority ISAs, Boundary and Pedestal and Whole Device Modeling in collaboration with the theory, simulation and experimental communities. The following sections have been incorporated into the final FSP Program Plan (www.pppl.gov/fsp), which was delivered to the US Department of Energy (DOE). Additional deliverables by GA include guidance for validation, development of metrics to evaluate success and procedures for collaboration with experiments. These are also part of the final report.

  10. Fusion Plasma Theory project summaries

    SciTech Connect

    Not Available

    1993-10-01

    This Project Summary book is a published compilation consisting of short descriptions of each project supported by the Fusion Plasma Theory and Computing Group of the Advanced Physics and Technology Division of the Department of Energy, Office of Fusion Energy. The summaries contained in this volume were written by the individual contractors with minimal editing by the Office of Fusion Energy. Previous summaries were published in February of 1982 and December of 1987. The Plasma Theory program is responsible for the development of concepts and models that describe and predict the behavior of a magnetically confined plasma. Emphasis is given to the modelling and understanding of the processes controlling transport of energy and particles in a toroidal plasma, and to supporting the design of the International Thermonuclear Experimental Reactor (ITER). A tokamak transport initiative was begun in 1989 to improve understanding of how energy and particles are lost from the plasma by mechanisms that transport them across field lines. The Plasma Theory program has actively participated in this initiative. Recently, increased attention has been given to issues of importance to the proposed Tokamak Physics Experiment (TPX). Particular attention has been paid to containment and thermalization of fast alpha particles produced in a burning fusion plasma, as well as to control of sawteeth, current drive, impurity control, and design of improved auxiliary heating. In addition, general models of plasma behavior are developed from physics features common to different confinement geometries. This work uses both analytical and numerical techniques. The Fusion Theory program supports research projects at US government laboratories, universities and industrial contractors. Its support of theoretical work at universities contributes to the Office of Fusion Energy mission of training scientific manpower for the US Fusion Energy Program.

  11. Integrated simulation and modeling capability for alternate magnetic fusion concepts

    SciTech Connect

    Cohen, B. I.; Hooper, E.B.; Jarboe, T. R.; LoDestro, L. L.; Pearlstein, L. D.; Prager, S. C.; Sarff, J. S.

    1998-11-03

    This document summarizes a strategic study addressing the development of a comprehensive modeling and simulation capability for magnetic fusion experiments, with particular emphasis on devices that are alternatives to the mainline tokamak device. A code development project in this area supports two defined strategic thrust areas in the Magnetic Fusion Energy Program: (1) comprehensive simulation and modeling of magnetic fusion experiments and (2) development, operation, and modeling of magnetic fusion alternate-concept experiments.

  12. Fusion Simulation Program Definition. Final report

    SciTech Connect

    Cary, John R.

    2012-09-05

    We have completed our contributions to the Fusion Simulation Program Definition Project. Our contributions were in the overall planning with concentration in the definition of the area of Software Integration and Support. We contributed to the planning of multiple meetings, and we contributed to multiple planning documents.

  13. Simulation of Fusion Plasmas

    ScienceCinema

    Holland, Chris [UC San Diego, San Diego, California, United States]

    2016-07-12

    The upcoming ITER experiment (www.iter.org) represents the next major milestone in realizing the promise of using nuclear fusion as a commercial energy source, by moving into the “burning plasma” regime where the dominant heat source is the internal fusion reactions. As part of its support for the ITER mission, the US fusion community is actively developing validated predictive models of the behavior of magnetically confined plasmas. In this talk, I will describe how the plasma community is using the latest high performance computing facilities to develop and refine our models of the nonlinear, multiscale plasma dynamics, and how recent advances in experimental diagnostics are allowing us to directly test and validate these models at an unprecedented level.

  14. Magnetic fusion and project ITER

    SciTech Connect

    Park, H.K.

    1992-09-01

    It has already been demonstrated that our economies and international relationships are affected by energy crises. For the continuing prosperity of the human race, a new and viable energy source must be developed within the next century. It is evident that the cost will be high and that achieving this goal will require a long-term commitment and a high degree of technological and scientific knowledge. Energy from controlled nuclear fusion is safe, competitive, and environmentally attractive, but it has not yet been fully mastered. Magnetic fusion is one of the most difficult technological challenges. In modern magnetic fusion devices, temperatures significantly higher than those of the sun have been achieved routinely, and the successful generation of tens of millions of watts as a result of scientific break-even is expected from the deuterium and tritium experiment within the next few years. For a practical future fusion reactor, we need to develop reactor-relevant materials and technologies. The international project called the International Thermonuclear Experimental Reactor (ITER) will fulfill this need, and the success of this project will provide the most attractive long-term energy source for mankind.

  15. SECAD-- a Schema-based Environment for Configuring, Analyzing and Documenting Integrated Fusion Simulations. Final report

    SciTech Connect

    Shasharina, Svetlana

    2012-05-23

    SECAD is a project that developed a GUI for running integrated fusion simulations as implemented in FACETS and SWIM SciDAC projects. Using the GUI users can submit simulations locally and remotely and visualize the simulation results.

  16. Project Icarus: Nuclear Fusion Propulsion Concept Comparison

    NASA Astrophysics Data System (ADS)

    Stanic, M.

    Project Icarus will use nuclear fusion as the primary propulsion, since achieving breakeven is expected within the next decade. Fusion technology therefore provides confidence in further development and fairly high technological maturity by the time the Icarus mission would be plausible. Currently there are numerous (over two dozen) different fusion approaches being developed simultaneously around the world, and it is difficult to predict which of the concepts is going to be the most successful. This study tried to estimate the current technological maturity and possible technological extrapolation of the fusion approaches for which appropriate data could be found. Figures of merit that were assessed include: current technological state, mass and volume estimates, possible gain values, main advantages and disadvantages of the concept, and an attempt to extrapolate the current technological state over the next decade or two. The analysis suggests that Magnetic Confinement Fusion (MCF) concepts are not likely to deliver sufficient performance due to the size, mass, gain, and large technological barriers of the concept. However, ICF and PJMIF did show potential for delivering the necessary performance, assuming appropriate technological advances. This paper is a submission of the Project Icarus Study Group.

  17. Component Framework for Coupled Integrated Fusion Plasma Simulation

    SciTech Connect

    Elwasif, Wael R; Bernholdt, David E; Berry, Lee A; Batchelor, Donald B

    2007-01-01

    Successful simulation of the complex physics that affect magnetically confined fusion plasmas remains an important milestone towards the development of viable fusion energy. Major advances in the underlying physics formulations, mathematical modeling, and computational tools and techniques are needed to enable a complete fusion simulation on the emerging class of large-scale capability parallel computers that are coming on-line in the next few years. Several pilot projects are currently being undertaken to explore different (partial) code integration and coupling problems, and possible solutions that may guide the larger integration endeavor. In this paper, we present the design and implementation details of one such project, a component-based approach to couple existing codes to model the interaction between high-power radio frequency (RF) electromagnetic waves and magnetohydrodynamic (MHD) aspects of the burning plasma. The framework and component design utilize a light-coupling approach based on a high-level view of the constituent codes that facilitates rapid incorporation of new components into the integrated simulation framework. The work illustrates the viability of the light-coupling approach for better understanding physics and stand-alone computer code dependencies and interactions, as a precursor to a more tightly coupled integrated simulation environment.
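
    The light-coupling approach can be pictured with a deliberately simplified sketch: components expose a narrow init/step interface and exchange data only through a shared plasma-state object, advanced sequentially each coupling step. The names below (PlasmaState, RFComponent, MHDComponent, run) are illustrative assumptions, not the framework code described in the paper.

      # Minimal sketch of a light component-coupling loop (hypothetical names,
      # not the actual RF/MHD framework described above).
      from dataclasses import dataclass, field

      @dataclass
      class PlasmaState:
          """Shared state passed between components each coupling step."""
          time: float = 0.0
          profiles: dict = field(default_factory=dict)   # e.g. {"T_e": [...], "j_RF": [...]}

      class Component:
          def init(self, state: PlasmaState): ...
          def step(self, state: PlasmaState, dt: float): ...

      class RFComponent(Component):
          def step(self, state, dt):
              # Stand-in for an RF wave solve: deposit a driven-current profile.
              state.profiles["j_RF"] = [0.1] * 10

      class MHDComponent(Component):
          def step(self, state, dt):
              # Stand-in for an MHD advance that consumes the RF-driven current.
              j_rf = state.profiles.get("j_RF", [0.0] * 10)
              state.profiles["q"] = [1.0 + dt * j for j in j_rf]

      def run(components, t_end, dt):
          state = PlasmaState()
          for c in components:
              c.init(state)
          while state.time < t_end:
              for c in components:          # loose (sequential) coupling per step
                  c.step(state, dt)
              state.time += dt
          return state

      final = run([RFComponent(), MHDComponent()], t_end=1.0, dt=0.1)
      print(final.profiles["q"])

    The point of the design is that each physics code only needs a thin wrapper implementing the narrow interface, so new components can be swapped in without changing the driver loop.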

  18. FASTER project - data fusion for trafficability assessment

    NASA Astrophysics Data System (ADS)

    Skocki, K.; Nevatia, Y.

    2013-09-01

    Martian surface missions since the Sojourner mission have typically used a robotic rover platform to carry the science instrumentation. This concept, successfully demonstrated by the twin MER rovers, is nevertheless risky when patches of low-trafficability soil go unrecognized. The idea of soil traversability assessment is the basis for the FASTER project activities. This article briefly presents topics of special interest for safe path finding and decision making by planetary rovers. The data fusion aspect of this process is briefly analyzed.

  19. Development of our laser fusion integration simulation

    NASA Astrophysics Data System (ADS)

    Li, Jinghong; Zhai, Chuanlei; Li, Shuanggui; Li, Xin; Zheng, Wudi; Yong, Heng; Zeng, Qinghong; Hang, Xudeng; Qi, Jin; Yang, Rong; Cheng, Juan; Song, Peng; Gu, Peijun; Zhang, Aiqing; An, Hengbin; Xu, Xiaowen; Guo, Hong; Cao, Xiaolin; Mo, Zeyao; Pei, Wenbing; Jiang, Song; Zhu, Shao-ping

    2013-11-01

    In the target design of the Inertial Confinement Fusion (ICF) program, it is common practice to apply radiation hydrodynamics codes to study the key physical processes in ICF, such as hohlraum physics, radiation drive symmetry, and capsule implosion physics in the radiation-drive approach to ICF. Recently, much effort has been devoted to developing our 2D integrated simulation capability for laser fusion with a variety of optional physical models and numerical methods. In order to effectively integrate the existing codes and to facilitate the development of new codes, we are developing an object-oriented structured-mesh parallel code-supporting infrastructure called JASMIN. Based on the two-dimensional three-temperature hohlraum physics code LARED-H and the two-dimensional multi-group radiative transfer code LARED-R, we have developed a new-generation two-dimensional laser fusion code under the JASMIN infrastructure, which enables us to simulate the whole process of laser fusion from the laser beams' entrance into the hohlraum to the end of the implosion. In this paper, we give a brief description of our new-generation two-dimensional laser fusion code, named LARED-Integration, especially its physical models, and present some hohlraum simulation results.
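
    For orientation, the "three-temperature" description mentioned above typically refers to coupled electron, ion, and radiation temperature equations of the schematic form below. The specific coefficients and closures used in LARED-H are not given in the abstract, so this is only a generic textbook sketch:

      c_{v,e}\,\partial_t T_e = \nabla\cdot(\kappa_e \nabla T_e) + \omega_{ei}\,(T_i - T_e) + \omega_{er}\,(T_r - T_e) + S_{\mathrm{laser}},
      c_{v,i}\,\partial_t T_i = \nabla\cdot(\kappa_i \nabla T_i) + \omega_{ei}\,(T_e - T_i),
      c_{v,r}\,\partial_t T_r = \nabla\cdot(\kappa_r \nabla T_r) + \omega_{er}\,(T_e - T_r),

    where the \omega coefficients are the electron-ion and electron-radiation energy-exchange rates and S_laser is the deposited laser power.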

  20. World power energetics. Fusion reactors. ITER project

    NASA Astrophysics Data System (ADS)

    Velikhov, E. P.

    1996-10-01

    The prospects of various energy sources have to be evaluated on the basis of economic, energy, and political factors, and ecological consequences. The gradual replacement of energy technologies based on the burning of fossil fuels by new 'clean' ones not yielding greenhouse gases is called for so as to conserve the atmosphere at least in its present state. From this point of view, one of the most promising energy technologies is controlled fusion. Today, we are in the stage of transition from proof-of-principle plasma physics experiments to practical realization of this concept. The place of future fusion power reactors in the global system is being discussed widely. In 1985, the Government Agreement on the design of the International Thermonuclear Experimental Reactor (ITER) was signed by Russia, Japan, the European Community, and the United States of America. That was the starting point of this enormous project; and now we are in the second phase, i.e. the Engineering Design Activities, to be completed by 1998. The focal point for the design is the Joint Central Team, with about 200 scientists and engineers from Russia, Japan, the European Community, and the USA working jointly. The national Home Teams provide strong support for the design and for research and development programs on the basis of equal contributions to the Project. One of the key problems to be solved concerns fusion reactor materials, including the creation of a complete database on appropriate materials irradiated up to a neutron fluence of 10²³ n·cm⁻², the development of new alloys, and relevant engineering technologies.

  1. Scientific and Computational Challenges of the Fusion Simulation Program (FSP)

    SciTech Connect

    William M. Tang

    2011-02-09

    This paper highlights the scientific and computational challenges facing the Fusion Simulation Program (FSP), a major national initiative in the United States whose primary objective is to enable scientific discovery of important new plasma phenomena with associated understanding that emerges only upon integration. This requires developing a predictive integrated simulation capability for magnetically-confined fusion plasmas that is properly validated against experiments in regimes relevant for producing practical fusion energy. It is expected to provide a suite of advanced modeling tools for reliably predicting fusion device behavior with comprehensive and targeted science-based simulations of nonlinearly-coupled phenomena in the core plasma, edge plasma, and wall region on time and space scales required for fusion energy production. As such, it will strive to embody the most current theoretical and experimental understanding of magnetic fusion plasmas and to provide a living framework for the simulation of such plasmas as the associated physics understanding continues to advance over the next several decades. Substantive progress on answering the outstanding scientific questions in the field will drive the FSP toward its ultimate goal of developing the ability to predict the behavior of plasma discharges in toroidal magnetic fusion devices with high physics fidelity on all relevant time and space scales. From a computational perspective, this will demand computing resources in the petascale range and beyond together with the associated multi-core algorithmic formulation needed to address burning plasma issues relevant to ITER - a multibillion dollar collaborative experiment involving seven international partners representing over half the world's population. Even more powerful exascale platforms will be needed to meet the future challenges of designing a demonstration fusion reactor (DEMO). Analogous to other major applied physics modeling projects (e

  2. Simulation of MTF experiments at General Fusion

    NASA Astrophysics Data System (ADS)

    Reynolds, Meritt; Froese, Aaron; Barsky, Sandra; Devietien, Peter; Toth, Gabor; Brennan, Dylan; Hooper, Bick

    2016-10-01

    General Fusion (GF) aims to develop a magnetized target fusion (MTF) power plant based on compression of magnetically-confined plasma by liquid metal. GF is testing this compression concept by collapsing solid aluminum liners onto spheromak or tokamak plasmas. To simulate the evolution of the compressing plasma in these experiments, we integrated a moving-mesh method into a finite-volume MHD code (VAC). The single-fluid model includes temperature-dependent resistivity and anisotropic heat transport. The trajectory of the liner is based on experiments and LS-DYNA simulations. During compression the geometry remains axially symmetric, but the MHD simulation is fully 3D to capture ideal and resistive plasma instabilities. We compare simulation to experiment through the primary diagnostic of Mirnov probes embedded in the inner coaxial surface against which the magnetic flux and plasma are compressed by the imploding liner. The MHD simulation reproduces the appearance of n=1 mode activity observed in experiments performed in negative D-shape geometry (MRT and PROSPECTOR machines). The same code predicts more favorable compression in spherical tokamak geometry, having positive D-shape (SPECTOR machine).
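
    The single-fluid model mentioned above is not spelled out in the abstract; for reference, a generic resistive MHD system with temperature-dependent (Spitzer-like) resistivity and anisotropic heat conduction, of the general kind such simulations solve, can be written as

      \partial_t \rho + \nabla\cdot(\rho\mathbf{v}) = 0,
      \rho\,(\partial_t \mathbf{v} + \mathbf{v}\cdot\nabla\mathbf{v}) = \mathbf{J}\times\mathbf{B} - \nabla p,
      \partial_t \mathbf{B} = \nabla\times(\mathbf{v}\times\mathbf{B}) - \nabla\times\big(\eta(T)\,\mathbf{J}\big),
      \partial_t p + \mathbf{v}\cdot\nabla p + \gamma p\,\nabla\cdot\mathbf{v} = (\gamma-1)\big[\nabla\cdot(\kappa_\parallel \nabla_\parallel T + \kappa_\perp \nabla_\perp T) + \eta J^2\big],

    with \mathbf{J} = \nabla\times\mathbf{B}/\mu_0, \eta(T) \propto T^{-3/2}, and \kappa_\parallel \gg \kappa_\perp along and across the magnetic field. The exact closures and boundary treatment in the VAC-based code are assumptions here, not taken from the abstract.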

  3. Gyrokinetic Simulation of TAE in Fusion plasmas

    NASA Astrophysics Data System (ADS)

    Wang, Zhixuan

    Linear gyrokinetic simulation of fusion plasmas finds a radial localization of the toroidal Alfvén eigenmodes (TAE) due to the non-perturbative energetic particles (EP) contribution. The EP-driven TAE has a radial mode width much smaller than that predicted by the magnetohydrodynamic (MHD) theory. The TAE radial position stays around the strongest EP pressure gradients when the EP profile evolves. The non-perturbative EP contribution is also the main cause for the breaking of the radial symmetry of the ballooning mode structure and for the dependence of the TAE frequency on the toroidal mode number. These phenomena are beyond the picture of the conventional MHD theory. Linear gyrokinetic simulation of the electron cyclotron heating (ECH) experiments on DIII-D successfully recover the TAE and RSAE. The EP profile, rather than the electron temperature, is found to be the key factor determining whether TAE or RSAE is the dominant mode in the system in our simulation. Investigation on the nonlinear gyrokinetic simulation model reveals a missing nonlinear term which has important contributions to the zonal magnetic fields. A new fluid-electron hybrid model is proposed to keep this nonlinear term in the lowest order fluid part. Nonlinear simulation of TAE using DIII-D parameters confirms the importance of this new term for the zonal magnetic fields. It is also found that zonal structures dominated by zonal electric fields are forced driven at about twice the linear growth rate of TAE in the linear phase. The zonal flows then limit the nonlinear saturation level by tearing the eigenmode structures apart. In the nonlinear phase of the simulation, the major frequency in the system chirps down by about 30% and stays there.

  4. Purdue Contribution of Fusion Simulation Program

    SciTech Connect

    Jeffrey Brooks

    2011-09-30

    The overall science goal of the FSP is to develop predictive simulation capability for magnetically confined fusion plasmas at an unprecedented level of integration and fidelity. This will directly support and enable effective U.S. participation in research related to the International Thermonuclear Experimental Reactor (ITER) and the overall mission of delivering practical fusion energy. The FSP will address a rich set of scientific issues together with experimental programs, producing validated integrated physics results. This is very well aligned with the mission of the ITER Organization to coordinate with its members the integrated modeling and control of fusion plasmas, including benchmarking and validation activities. [1]. Initial FSP research will focus on two critical areas: 1) the plasma edge and 2) whole device modeling including disruption avoidance. The first of these problems involves the narrow plasma boundary layer and its complex interactions with the plasma core and the surrounding material wall. The second requires development of a computationally tractable, but comprehensive model that describes all equilibrium and dynamic processes at a sufficient level of detail to provide useful prediction of the temporal evolution of fusion plasma experiments. The initial driver for the whole device model (WDM) will be prediction and avoidance of discharge-terminating disruptions, especially at high performance, which are a critical impediment to successful operation of machines like ITER. If disruptions cannot be avoided, their associated dynamics and effects will be addressed in the next phase of the FSP. The FSP plan targets the needed modeling capabilities by developing Integrated Science Applications (ISAs) specific to their needs. The Pedestal-Boundary model will include boundary magnetic topology, cross-field transport of multi-species plasmas, parallel plasma transport, neutral transport, atomic physics and interactions with the plasma wall

  5. Quality assurance in the Antares laser fusion construction project

    SciTech Connect

    Reichelt, W.H.

    1984-01-01

    The Antares CO2 laser facility came on line in November 1983 as an experimental physics facility; it is the world's largest CO2 laser fusion system. Antares is a major component of the Department of Energy's Inertial Confinement Fusion Program. Antares is a one-of-a-kind laser system that is used in an experimental environment. Given limited project funds and tight schedules, the quality assurance program was tailored to achieve project goals without imposing oppressive constraints. The discussion will review the Antares quality assurance program and the utility of its various portions to the completion of the project.

  6. Fusion Cross Sections of Astrophysics Interest Within the STELLA Project

    NASA Astrophysics Data System (ADS)

    Courtin, Sandrine; Fruet, Guillaume; Jenkins, David G.; Heine, Marcel; Montanari, Daniele; Morris, Luke G.; Lotay, Gavin; Regan, Patrick H.; Kirsebom, Oliver S.; Della Negra, Serge; Hammache, Faïrouz; de Sereville, Nicolas; Bastin, Beyhan; de Oliveira, François; Randisi, Giacomo; Stodel, Christelle; Beck, Christian; Haas, Florent

    Low energy fusion between light heavy-ions is a key feature of the evolution of massive stars. In systems of astrophysical interest, the process may be strongly affected by molecular configurations of the compound nucleus, leading to resonant S factors. In particular, the 12C+12C fusion reaction has been the object of numerous experimental investigations. The STELLA project has been developed to extend these investigations to lower energies towards the Gamow window.
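
    The "resonant S factors" mentioned above refer to the standard astrophysical S-factor parametrization of the fusion cross section, in which the steep Coulomb-penetration factor is divided out (a textbook definition, not specific to this paper):

      \sigma(E) = \frac{S(E)}{E}\,\exp(-2\pi\eta), \qquad \eta = \frac{Z_1 Z_2 e^2}{\hbar v} = \alpha Z_1 Z_2 \sqrt{\frac{\mu c^2}{2E}},

    so that resonances associated with molecular configurations of the compound nucleus appear as structure in an otherwise slowly varying S(E), rather than in the steeply falling cross section itself.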

  7. Submodeling Simulations in Fusion Welds: Part II

    NASA Astrophysics Data System (ADS)

    Bonifaz, E. A.

    2013-11-01

    In Part I, three-dimensional transient non-linear submodeling heat transfer simulations were performed to study the thermal histories and thermal cycles that occur during the welding process at the macro, meso and micro scales. In the present work, the corresponding non-uniform temperature changes were imposed as load conditions on structural calculations to study the evolution of localized plastic strains and residual stresses at these sub-level scales. To reach this goal, a three-dimensional finite element elastic-plastic model (ABAQUS code) was developed. The submodeling technique, proposed for coupling phase-field (and/or digital microstructure) codes with finite element codes, was used to mesh a local part of the model with a refined mesh based on interpolation of the solution from an initial, relatively coarse, global macro model. The meso submodel is the global model for the subsequent micro submodel. The strategy used to calculate temperatures, strains and residual stresses at the macro, meso and micro scale levels is flexible enough to be applied to any number of levels. The objective of this research was to initiate the development of microstructural models to identify fusion welding process parameters for preserving the single-crystal nature of gas turbine blades during repair procedures. The multi-scale submodeling approach can be used to capture weld pool features at the macro-meso scale level, and micro residual stress and secondary dendrite arm spacing features at the micro scale level.
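
    The driving step in such a submodeling pass, interpolating the coarse global solution onto the boundary of a refined local mesh, can be illustrated with a one-dimensional toy example (hypothetical values; this is not the ABAQUS macro/meso/micro workflow itself):

      # 1-D sketch of the submodeling idea: interpolate a coarse "global" field
      # onto a refined local mesh that covers only the region of interest.
      import numpy as np

      # Coarse global solution, e.g. a temperature field T(x) near a weld at x = 0.5.
      x_global = np.linspace(0.0, 1.0, 11)                 # 11 coarse nodes
      T_global = 300.0 + 1200.0 * np.exp(-((x_global - 0.5) / 0.1) ** 2)

      # Refined submodel mesh covering the weld region only.
      x_sub = np.linspace(0.4, 0.6, 101)                   # 101 fine nodes

      # Driving condition for the submodel: the interpolated global solution.
      T_sub_bc = np.interp(x_sub, x_global, T_global)

      # The submodel would now be re-solved on x_sub with T_sub_bc imposed on its
      # boundary (and, for transient runs, at each time step); here we only show
      # the interpolation step that links the two scales.
      print("submodel boundary values:", T_sub_bc[0], T_sub_bc[-1])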

  8. Simulation of National Intelligence Process with Fusion

    DTIC Science & Technology

    2008-03-01

    modelled as zero mean Gaussian noise. The state estimate provided by a Kalman filter is statistically optimal in that it minimizes the mean squared error...the fusion methods above contain logical and mathematical algorithms based on either continuous or discrete quantifiable data, so to use these methods...method for capturing statistics about the performance of different architectures, it fails to capture the synergy of intelligence or information fusion
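
    The excerpt above notes that a Kalman-filter state estimate is statistically optimal (minimum mean-squared error) for linear models with zero-mean Gaussian noise. A textbook predict/update cycle, shown here as a generic illustration rather than the report's own simulation code, looks like:

      # Textbook linear Kalman filter predict/update step (generic illustration).
      import numpy as np

      def kalman_step(x, P, z, F, H, Q, R):
          """One predict/update cycle.
          x, P : prior state estimate and covariance
          z    : new measurement
          F, H : state-transition and measurement matrices
          Q, R : process and measurement noise covariances (zero-mean Gaussian)
          """
          # Predict
          x_pred = F @ x
          P_pred = F @ P @ F.T + Q
          # Update
          S = H @ P_pred @ H.T + R                 # innovation covariance
          K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
          x_new = x_pred + K @ (z - H @ x_pred)
          P_new = (np.eye(len(x)) - K @ H) @ P_pred
          return x_new, P_new

      # Example: 1-D constant-velocity track observed through noisy position data.
      dt = 1.0
      F = np.array([[1.0, dt], [0.0, 1.0]])
      H = np.array([[1.0, 0.0]])
      Q = 0.01 * np.eye(2)
      R = np.array([[0.5]])
      x, P = np.zeros(2), np.eye(2)
      for z in [1.1, 2.0, 2.9, 4.2]:
          x, P = kalman_step(x, P, np.array([z]), F, H, Q, R)
      print("estimated position, velocity:", x)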

  9. Particle Simulations of Knudsen Layer Effects on DT Fusion

    NASA Astrophysics Data System (ADS)

    Cohen, Bruce; Dimits, Andris; Zimmerman, George; Wilks, Scott

    2014-10-01

    Kinetic effects have been shown to degrade fusion reactivities near an absorbing bounding surface in some circumstances, the so-called Knudsen layer (KL) effect. There is renewed interest in the KL effect in the context of inertial fusion. We report particle simulations (1D Cartesian in space, 3D in velocity) of the transport of deuterium and tritium (DT) plasma in a system with a partially absorbing boundary and including Coulomb collisions and the effects of non-Maxwellian velocity distribution functions on fusion reactivity. Ion-ion Coulomb collisions are implemented with a pairwise scheme that conserves number, momentum, and energy. The influences of the albedo and temperature of the boundary, ion slowing on electrons, ambipolar electric fields, fusion alphas, and a Cu minority species are studied. Reductions in fusion reactivity are quantified. For DT at 9 keV, the Gamow peak in the fusion reactivity is at 29 keV; but the KL decrements in the ion tail from Maxwellian are observed to occur at higher energies so that the Maxwellian-averaged formula for the fusion reactivity using the space-time local temperatures and densities gives a good fit to the kinetic fusion rate. Kinetic effects are nevertheless important in determining end losses, velocity tail decrements and anisotropy, and ion axial plasma profiles for density, kinetic energy, fluxes, and flows. Work performed for the USDOE under contract DE-AC52-07NA27344 at Lawrence Livermore Nat. Lab.
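
    The quoted Gamow-peak location can be checked against the standard expression for the most effective energy in a Maxwellian-averaged reactivity (a textbook relation, not taken from the paper itself). Writing the Gamow energy and peak energy as

      E_G = 2\pi^2 \alpha^2 Z_1^2 Z_2^2\, \mu c^2, \qquad E_0 = \left(\frac{E_G\,(k_B T)^2}{4}\right)^{1/3},

    for DT (Z_1 = Z_2 = 1, \mu c^2 \approx 1.125\ \mathrm{GeV}) one finds E_G \approx 1.18\ \mathrm{MeV}, so at k_B T = 9\ \mathrm{keV}, E_0 \approx (1180 \times 81/4)^{1/3}\ \mathrm{keV} \approx 29\ \mathrm{keV}, consistent with the value cited above.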

  10. Spherically symmetric simulation of plasma liner driven magnetoinertial fusion

    SciTech Connect

    Samulyak, Roman; Parks, Paul; Wu Lingling

    2010-09-15

    Spherically symmetric simulations of the implosion of plasma liners and compression of plasma targets in the plasma-jet-driven magnetoinertial fusion concept have been performed using the method of front tracking. The cases of single deuterium and xenon liners and double-layer deuterium-xenon liners compressing various deuterium-tritium targets have been investigated, optimized for maximum fusion energy gains, and compared with theoretical predictions and the scaling laws of [P. Parks, Phys. Plasmas 15, 062506 (2008)]. In agreement with the theory, the fusion gain was significantly below unity for deuterium-tritium targets compressed by Mach 60 deuterium liners. The optimal setup for a given chamber size contained a target with an initial radius of 20 cm compressed by a 10 cm thick, Mach 60 xenon liner, achieving a fusion energy gain of 10 with 10 GJ fusion yield. Simulations also showed that composite deuterium-xenon liners reduce the energy gain due to lower target compression rates. The effect of heating of targets by alpha particles on the fusion energy gain has also been investigated.

  11. Secondary fusion coupled deuteron/triton transport simulation and thermal-to-fusion neutron convertor measurement

    SciTech Connect

    Wang, G. B.; Wang, K.; Liu, H. G.; Li, R. D.

    2013-07-01

    A Monte Carlo tool, RSMC (Reaction Sequence Monte Carlo), was developed to simulate the coupled deuteron/triton transport and reaction problem. The 'forced particle production' variance reduction technique was used to improve the simulation speed, which makes the secondary products play a major role. The mono-energetic 14 MeV fusion neutron source was employed as a validation. Then the thermal-to-fusion neutron convertor was studied with our tool. Moreover, an in-core conversion efficiency measurement experiment was performed with ⁶LiD and ⁶LiH converters. Threshold activation foils were used to indicate the fast and fusion neutron flux. In addition, two other pivotal parameters were calculated theoretically. Finally, the conversion efficiency of ⁶LiD is obtained as 1.97×10⁻⁴, which matches well with the theoretical result. (authors)
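
    The thermal-to-fusion conversion relies on a well-known two-step reaction chain (standard nuclear data, not specific to this paper): a thermal neutron breeds a triton in ⁶Li, and that triton then fuses with deuterium in the converter to emit a 14.1 MeV fusion neutron:

      n_{\mathrm{th}} + {}^{6}\mathrm{Li} \rightarrow {}^{4}\mathrm{He} + T + 4.78\ \mathrm{MeV},
      T + D \rightarrow {}^{4}\mathrm{He}\,(3.5\ \mathrm{MeV}) + n\,(14.1\ \mathrm{MeV}).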

  12. Projective simulation for artificial intelligence

    PubMed Central

    Briegel, Hans J.; De las Cuevas, Gemma

    2012-01-01

    We propose a model of a learning agent whose interaction with the environment is governed by a simulation-based projection, which allows the agent to project itself into future situations before it takes real action. Projective simulation is based on a random walk through a network of clips, which are elementary patches of episodic memory. The network of clips changes dynamically, both due to new perceptual input and due to certain compositional principles of the simulation process. During simulation, the clips are screened for specific features which trigger factual action of the agent. The scheme is different from other, computational, notions of simulation, and it provides a new element in an embodied cognitive science approach to intelligent action and learning. Our model provides a natural route for generalization to quantum-mechanical operation and connects the fields of reinforcement learning and quantum computation. PMID:22590690
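
    A toy version of the clip-network random walk described above can make the scheme concrete. The sketch below is a deliberately minimal two-percept, two-action agent with reinforced hopping weights; names such as PSAgent, the reward rule, and the task are illustrative assumptions, not the authors' implementation:

      # Toy projective-simulation-style agent: a random walk over a small clip
      # network whose hopping weights are reinforced when the action is rewarded.
      import random

      class PSAgent:
          def __init__(self, percepts, actions, damping=0.0):
              self.actions = list(actions)
              # h-values (hopping weights) from each percept clip to each action clip.
              self.h = {p: {a: 1.0 for a in actions} for p in percepts}
              self.damping = damping

          def act(self, percept):
              # One-step random walk: hop from the percept clip to an action clip
              # with probability proportional to the h-value of the edge.
              weights = self.h[percept]
              total = sum(weights.values())
              r, acc = random.uniform(0.0, total), 0.0
              for a, w in weights.items():
                  acc += w
                  if r <= acc:
                      return a
              return self.actions[-1]

          def learn(self, percept, action, reward):
              # Optionally damp all edges toward 1, then reinforce the traversed edge.
              for a in self.actions:
                  self.h[percept][a] -= self.damping * (self.h[percept][a] - 1.0)
              self.h[percept][action] += reward

      # Toy task: the correct action simply copies the percept.
      agent = PSAgent(percepts=["left", "right"], actions=["left", "right"])
      for _ in range(500):
          p = random.choice(["left", "right"])
          a = agent.act(p)
          agent.learn(p, a, reward=1.0 if a == p else 0.0)
      print({p: agent.h[p] for p in ("left", "right")})

    After training, the h-values of the "correct" edges dominate, so the random walk almost always terminates in the rewarded action clip.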

  13. Projective simulation for artificial intelligence

    NASA Astrophysics Data System (ADS)

    Briegel, Hans J.; de Las Cuevas, Gemma

    2012-05-01

    We propose a model of a learning agent whose interaction with the environment is governed by a simulation-based projection, which allows the agent to project itself into future situations before it takes real action. Projective simulation is based on a random walk through a network of clips, which are elementary patches of episodic memory. The network of clips changes dynamically, both due to new perceptual input and due to certain compositional principles of the simulation process. During simulation, the clips are screened for specific features which trigger factual action of the agent. The scheme is different from other, computational, notions of simulation, and it provides a new element in an embodied cognitive science approach to intelligent action and learning. Our model provides a natural route for generalization to quantum-mechanical operation and connects the fields of reinforcement learning and quantum computation.

  14. Projective simulation for artificial intelligence.

    PubMed

    Briegel, Hans J; De las Cuevas, Gemma

    2012-01-01

    We propose a model of a learning agent whose interaction with the environment is governed by a simulation-based projection, which allows the agent to project itself into future situations before it takes real action. Projective simulation is based on a random walk through a network of clips, which are elementary patches of episodic memory. The network of clips changes dynamically, both due to new perceptual input and due to certain compositional principles of the simulation process. During simulation, the clips are screened for specific features which trigger factual action of the agent. The scheme is different from other, computational, notions of simulation, and it provides a new element in an embodied cognitive science approach to intelligent action and learning. Our model provides a natural route for generalization to quantum-mechanical operation and connects the fields of reinforcement learning and quantum computation.

  15. Project Simulation for Business Communications

    ERIC Educational Resources Information Center

    Franjoine, Dorothy

    1978-01-01

    The article describes a ten-day simulation project in the correspondence unit of business communications. The students learn to recognize that the previous weeks of letter writing and punctuation practice are required duties in their future careers. At the end of the project the students are shown how they were rated on daily activities in their…

  16. Human Sensing Fusion Project for Safety and Health Society

    NASA Astrophysics Data System (ADS)

    Maenaka, Kazusuke

    This paper introduces the objectives and status of the “Human sensing fusion project” in the Exploratory Research for Advanced Technology (ERATO) scheme run by the Japan Science and Technology Agency (JST). This project was started in December 2007, and the laboratory, with 11 members, opened in April 2008. The aim of this project is to realize a human activity-monitoring device with many kinds of sensors in an extremely small package, so that the device can be pasted or patched onto the human body, and to establish the algorithm for understanding the human condition, including both physical and mental states, from the obtained data. This system can be used toward the prevention of accidents and the maintenance of health. The actual research has just begun, and preparations for the project are well under way.

  17. Atomic Data and Modelling for Fusion: the ADAS Project

    NASA Astrophysics Data System (ADS)

    Summers, H. P.; O'Mullane, M. G.

    2011-05-01

    The paper is an update on the Atomic Data and Analysis Structure, ADAS, since ICAM-DATA06 and a forward look to its evolution in the next five years. ADAS is an international project supporting principally magnetic confinement fusion research. It has participant laboratories throughout the world, including ITER and all its partner countries. In parallel with ADAS, the ADAS-EU Project provides enhanced support for fusion research at Associated Laboratories and Universities in Europe and ITER. OPEN-ADAS, sponsored jointly by the ADAS Project and IAEA, is the mechanism for open access to principal ADAS atomic data classes and facilitating software for their use. EXTENDED-ADAS comprises a variety of special, integrated application software, beyond the purely atomic bounds of ADAS, tuned closely to specific diagnostic analyses and plasma models. The current scientific content and scope of these various ADAS and ADAS related activities are briefly reviewed. These span a number of themes including heavy element spectroscopy and models, charge exchange spectroscopy, beam emission spectroscopy and special features which provide a broad baseline of atomic modelling and support. Emphasis will be placed on `lifting the fundamental data baseline'—a principal ADAS task for the next few years. This will include discussion of ADAS and ADAS-EU coordinated and shared activities and some of the methods being exploited.

  18. Atomic Data and Modelling for Fusion: the ADAS Project

    SciTech Connect

    Summers, H. P.; O'Mullane, M. G.

    2011-05-11

    The paper is an update on the Atomic Data and Analysis Structure, ADAS, since ICAM-DATA06 and a forward look to its evolution in the next five years. ADAS is an international project supporting principally magnetic confinement fusion research. It has participant laboratories throughout the world, including ITER and all its partner countries. In parallel with ADAS, the ADAS-EU Project provides enhanced support for fusion research at Associated Laboratories and Universities in Europe and ITER. OPEN-ADAS, sponsored jointly by the ADAS Project and IAEA, is the mechanism for open access to principal ADAS atomic data classes and facilitating software for their use. EXTENDED-ADAS comprises a variety of special, integrated application software, beyond the purely atomic bounds of ADAS, tuned closely to specific diagnostic analyses and plasma models. The current scientific content and scope of these various ADAS and ADAS related activities are briefly reviewed. These span a number of themes including heavy element spectroscopy and models, charge exchange spectroscopy, beam emission spectroscopy and special features which provide a broad baseline of atomic modelling and support. Emphasis will be placed on 'lifting the fundamental data baseline'--a principal ADAS task for the next few years. This will include discussion of ADAS and ADAS-EU coordinated and shared activities and some of the methods being exploited.

  19. Simulated disparity and peripheral blur interact during binocular fusion.

    PubMed

    Maiello, Guido; Chessa, Manuela; Solari, Fabio; Bex, Peter J

    2014-07-17

    We have developed a low-cost, practical gaze-contingent display in which natural images are presented to the observer with dioptric blur and stereoscopic disparity that are dependent on the three-dimensional structure of natural scenes. Our system simulates a distribution of retinal blur and depth similar to that experienced in real-world viewing conditions by emmetropic observers. We implemented the system using light-field photographs taken with a plenoptic camera which supports digital refocusing anywhere in the images. We coupled this capability with an eye-tracking system and stereoscopic rendering. With this display, we examine how the time course of binocular fusion depends on depth cues from blur and stereoscopic disparity in naturalistic images. Our results show that disparity and peripheral blur interact to modify eye-movement behavior and facilitate binocular fusion, and the greatest benefit was gained by observers who struggled most to achieve fusion. Even though plenoptic images do not replicate an individual’s aberrations, the results demonstrate that a naturalistic distribution of depth-dependent blur may improve 3-D virtual reality, and that interruptions of this pattern (e.g., with intraocular lenses) which flatten the distribution of retinal blur may adversely affect binocular fusion. © 2014 ARVO.

  20. Simulated disparity and peripheral blur interact during binocular fusion

    PubMed Central

    Maiello, Guido; Chessa, Manuela; Solari, Fabio; Bex, Peter J

    2014-01-01

    We have developed a low-cost, practical gaze-contingent display in which natural images are presented to the observer with dioptric blur and stereoscopic disparity that are dependent on the three-dimensional structure of natural scenes. Our system simulates a distribution of retinal blur and depth similar to that experienced in real-world viewing conditions by emmetropic observers. We implemented the system using light-field photographs taken with a plenoptic camera which supports digital refocusing anywhere in the images. We coupled this capability with an eye-tracking system and stereoscopic rendering. With this display, we examine how the time course of binocular fusion depends on depth cues from blur and stereoscopic disparity in naturalistic images. Our results show that disparity and peripheral blur interact to modify eye-movement behavior and facilitate binocular fusion, and the greatest benefit was gained by observers who struggled most to achieve fusion. Even though plenoptic images do not replicate an individual's aberrations, the results demonstrate that a naturalistic distribution of depth-dependent blur may improve 3-D virtual reality, and that interruptions of this pattern (e.g., with intraocular lenses) which flatten the distribution of retinal blur may adversely affect binocular fusion. PMID:25034260

  1. KULL: LLNL's ASCI Inertial Confinement Fusion Simulation Code

    SciTech Connect

    Rathkopf, J. A.; Miller, D. S.; Owen, J. M.; Zike, M. R.; Eltgroth, P. G.; Madsen, N. K.; McCandless, K. P.; Nowak, P. F.; Nemanic, M. K.; Gentile, N. A.; Stuart, L. M.; Keen, N. D.; Palmer, T. S.

    2000-01-10

    KULL is a three dimensional, time dependent radiation hydrodynamics simulation code under development at Lawrence Livermore National Laboratory. A part of the U.S. Department of Energy's Accelerated Strategic Computing Initiative (ASCI), KULL's purpose is to simulate the physical processes in Inertial Confinement Fusion (ICF) targets. The National Ignition Facility, where ICF experiments will be conducted, and ASCI are part of the experimental and computational components of DOE's Stockpile Stewardship Program. This paper provides an overview of ASCI and describes KULL, its hydrodynamic simulation capability and its three methods of simulating radiative transfer. Particular emphasis is given to the parallelization techniques essential to obtain the performance required of the Stockpile Stewardship Program and to exploit the massively parallel processor machines that ASCI is procuring.

  2. Simulating Intense Ion Beams for Inertial Fusion Energy

    SciTech Connect

    Friedman, A

    2001-02-20

    The Heavy Ion Fusion (HIF) program's goal is the development of the body of knowledge needed for Inertial Fusion Energy (IFE) to realize its promise. The intense ion beams that will drive HIF targets are nonneutral plasmas and exhibit collective, nonlinear dynamics which must be understood using the kinetic models of plasma physics. This beam physics is both rich and subtle: a wide range in spatial and temporal scales is involved, and effects associated with both instabilities and non-ideal processes must be understood. Ion beams have a ''long memory'', and initialization of a beam at mid-system with an idealized particle distribution introduces uncertainties; thus, it will be crucial to develop, and to extensively use, an integrated and detailed ''source-to-target'' HIF beam simulation capability. We begin with an overview of major issues.

  3. Simulating Intense Ion Beams for Inertial Fusion Energy

    SciTech Connect

    Friedman, A.

    2001-02-20

    The Heavy Ion Fusion (HIF) program's goal is the development of the body of knowledge needed for Inertial Fusion Energy (IFE) to realize its promise. The intense ion beams that will drive HIF targets are nonneutral plasmas and exhibit collective, nonlinear dynamics which must be understood using the kinetic models of plasma physics. This beam physics is both rich and subtle: a wide range in spatial and temporal scales is involved, and effects associated with both instabilities and non-ideal processes must be understood. Ion beams have a ''long memory,'' and initialization of a beam at mid-system with an idealized particle distribution introduces uncertainties; thus, it will be crucial to develop, and to extensively use, an integrated and detailed ''source-to-target'' HIF beam simulation capability. We begin with an overview of major issues.

  4. Simulation of polyethylene glycol and calcium-mediated membrane fusion

    SciTech Connect

    Pannuzzo, Martina; De Jong, Djurre H.; Marrink, Siewert J.; Raudino, Antonio

    2014-03-28

    We report on the mechanism of membrane fusion mediated by polyethylene glycol (PEG) and Ca²⁺ by means of a coarse-grained molecular dynamics simulation approach. Our data provide a detailed view on the role of cations and polymer in modulating the interaction between negatively charged apposed membranes. The PEG chains cause a reduction of the inter-lamellar distance and cause an increase in concentration of divalent cations. When thermally driven fluctuations bring the membranes into close contact, a switch from cis to trans Ca²⁺-lipid complexes stabilizes a focal contact acting as a nucleation site for further expansion of the adhesion region. Flipping of lipid tails induces subsequent stalk formation. Together, our results provide a molecular explanation for the synergistic effect of Ca²⁺ and PEG on membrane fusion.

  5. Terascale simulations for heavy ion inertial fusion energy

    SciTech Connect

    Friedman, A; Cohen, R H; Grote, D P; Sharp, W M; Celata, C M; Lee, E P; Vay, J-L; Davidson, R C; Kaganovich, I; Lee, W W; Qin, H; Welch, D R; Haber, I; Kishek, R A

    2000-06-08

    The intense ion beams in a heavy ion Inertial Fusion Energy (IFE) driver and fusion chamber are non-neutral plasmas whose dynamics are largely dominated by space charge. We propose to develop a ''source-to-target'' Heavy Ion Fusion (HIF) beam simulation capability: a description of the kinetic behavior of this complex, nonlinear system which is both integrated and detailed. We will apply this new capability to further our understanding of key scientific issues in the physics of ion beams for IFE. The simulations will entail self-consistent field descriptions that require interprocessor communication, but are scalable and will run efficiently on terascale architectures. This new capability will be based on the integration of three types of simulations, each requiring terascale computing: (1) simulations of acceleration and confinement of the space-charge-dominated ion beams through the driver (accelerator, pulse compression line, and final focusing system) which accurately describe their dynamics, including emittance growth (phase-space dilution) effects; these are particle-in-cell (PIC) models; (2) electromagnetic (EM) and magnetoinductive (Darwin) simulations which describe the beam and the fusion chamber environment, including multibeam, neutralization, stripping, beam and plasma ionization processes, and return current effects; and (3) highly detailed simulations (δf, multispecies PIC, continuum Vlasov), which can examine electron effects and collective modes in the driver and chamber, and can study halo generation with excellent statistics, to ensure that these effects do not disrupt the focusability of the beams. The code development will involve: (i) adaptation of existing codes to run efficiently on multi-SMP computers that use a hybrid of shared and distributed memory; (ii) development of new and improved numerical algorithms, e.g., averaging techniques that will afford larger timesteps; and (iii) incorporation of improved physics models (e.g., for self
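
    As a concrete (and deliberately tiny) illustration of the particle-in-cell class of models listed under (1) above, the sketch below is a textbook 1-D electrostatic PIC loop with cloud-in-cell deposition, an FFT Poisson solve, and a leapfrog-style push. It is a generic teaching example in normalized units with arbitrary parameters, not the HIF driver codes themselves:

      # Minimal 1-D electrostatic particle-in-cell sketch: periodic box, CIC
      # deposition, FFT field solve, velocity/position push (normalized units).
      import numpy as np

      np.random.seed(0)
      L, ng, npart, dt, steps = 4 * np.pi, 64, 20000, 0.1, 200
      dx = L / ng
      qm = -1.0                      # charge-to-mass ratio of simulation electrons
      q = -L / npart                 # macro-particle charge so mean |density| = 1

      # Two counter-streaming beams with a small thermal spread.
      x = np.random.uniform(0.0, L, npart)
      v = np.where(np.arange(npart) % 2 == 0, 1.0, -1.0) + 0.01 * np.random.randn(npart)

      k = 2 * np.pi * np.fft.fftfreq(ng, d=dx)
      k[0] = 1.0                     # avoid division by zero; mode 0 is zeroed below

      for _ in range(steps):
          # Cloud-in-cell charge deposition onto the grid.
          g = x / dx
          i0 = np.floor(g).astype(int) % ng
          w1 = g - np.floor(g)
          rho = np.bincount(i0, weights=q * (1 - w1) / dx, minlength=ng) \
              + np.bincount((i0 + 1) % ng, weights=q * w1 / dx, minlength=ng)
          rho -= rho.mean()          # neutralizing ion background

          # Poisson solve in Fourier space: phi_k = rho_k / k^2, E = -dphi/dx.
          phi_k = np.fft.fft(rho) / k**2
          phi_k[0] = 0.0
          E_grid = np.real(np.fft.ifft(-1j * k * phi_k))

          # Gather field to particles (CIC) and push velocities, then positions.
          E_part = (1 - w1) * E_grid[i0] + w1 * E_grid[(i0 + 1) % ng]
          v += qm * E_part * dt
          x = (x + v * dt) % L

      print("final field energy:", 0.5 * np.sum(E_grid**2) * dx)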

  6. The Progress of Research Project for Magnetized Target Fusion in China

    NASA Astrophysics Data System (ADS)

    Yang, Xian-Jun

    2015-11-01

    The fusion of magnetized plasma, called Magnetized Target Fusion (MTF), is currently an active research area. It may significantly reduce the cost and size of fusion systems. Great progress has been achieved in past decades around the world. Five years ago, China initiated an MTF project and has made progress as follows: 1. Verifying the feasibility of MTF ignition by means of first-principles and MHD simulation; 2. Generating a magnetic field over 1400 Tesla, which can suppress heat conduction from charged particles, deposit the energy of alpha particles to promote the ignition process, and produce the stable magnetized plasma needed for the ignition target; 3. The imploding facility FP-1 can deliver several megajoules of energy to a solid liner of about ten grams within a microsecond-range rise time, while a simulation tool has been developed for design and analysis of the process; 4. The FRC target can be generated by the ``YG 1 facility'' while supporting simulation tools have been developed. Over the next five years, the above theoretical work and the MTF experiments may be integrated and stepped up into a national project, in which our team is expected to play a leading role and to achieve further progress in China. Supported by the National Natural Science Foundation of China under Grant No 11175028.

  7. Bohunice Simulator Data Collection Project

    SciTech Connect

    Cillik, Ivan; Prochaska, Jan

    2002-07-01

    The paper describes the approach to, and results of, the human reliability data analysis performed as a part of the Bohunice Simulator Data Collection Project (BSDCP), which was performed by VUJE Trnava, Inc. with funding support from the U.S. DOE, National Nuclear Security Administration. The goal of the project was to create a methodology for simulator data collection and analysis to support activities in probabilistic safety assessment (PSA) and human reliability assessment for the Jaslovske Bohunice nuclear power plant consisting of two sets of twin units: two VVER 440/V-230 (V1) and two VVER 440/V-213 (V2) reactors. During the project, training of V-2 control room crews was performed at the VUJE-Trnava simulator. The simulator training and the data collection were done in parallel. The main goal of BSDCP was to collect suitable data on human errors under simulated conditions requiring the use of symptom-based emergency operating procedures (SBEOPs). The subjects of the data collection were scenario progress time data, operator errors, and real-time technological parameters. The paper contains three main parts. The first part presents preparatory work and semi-automatic computer-based methods used to collect data and to check technological parameters in order to find hidden errors of operators, to be able to retrace the course of each scenario for purposes of further analysis, and to document the whole training process. The first part also gives an overview of collected data scope, human error taxonomy, and state classifications for SBEOP instructions coding. The second part describes analytical work undertaken to describe the time distributions necessary for execution of various kinds of instructions performed by operators according to the classification for coding of SBEOP instructions. It also presents the methods used for determination of probability distributions for different operator errors. Results from the data evaluation are presented in the last part of the paper. An overview of

  8. Multisource report-level simulator for fusion research

    NASA Astrophysics Data System (ADS)

    Carlotto, Mark J.; Kadar, Ivan

    2003-08-01

    The Multi-source Report-level Simulator (MRS) is a tool developed by Veridian Systems as part of its Model-adaptive Multi-source Track Fusion (MMTF) effort under DARPA's DTT program. MRS generates simulated multisensor contact reports for GMTI, HUMINT, IMINT, SIGINT, UGS, and video. It contains a spatial editor for creating ground tracks along which vehicles move over the terrain. Vehicles can start, stop, speed up, or slow down. The spatial editor is also used to define the locations of fixed sensors such as UGS and HUMINT observers on the ground, and flight paths of GMTI, IMINT, SIGINT, and video sensors in the air. Observation models characterize each sensor at the report level in terms of their operating characteristics (revisit rate, resolution, etc.), measurement errors, and detection/classification performance (i.e., Pd, Nfa, Pcc, and Pid). Contact reports are linked to ground truth data to facilitate the testing of track/fusion algorithms and the validation of associated performance models.
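    As a brief aside on what a report-level observation model like the one described above typically involves, here is a minimal Python sketch: each target is detected with probability Pd, detections carry Gaussian measurement error, and false alarms follow a Poisson count over the field of view. All names and parameter values are illustrative assumptions, not taken from MRS.

```python
import numpy as np

rng = np.random.default_rng(0)

def report_level_sensor(true_pos, pd=0.9, sigma=25.0, lam_fa=0.5, fov=5000.0):
    """Generate contact reports for one sensor revisit (illustrative model only).

    true_pos : (N, 2) array of true target positions [m]
    pd       : probability of detection per target
    sigma    : 1-sigma position measurement error [m]
    lam_fa   : expected number of false alarms per scan
    fov      : half-width of the square field of view [m]
    """
    reports = []
    # Detections: each true target is reported with probability pd,
    # perturbed by Gaussian measurement noise.
    for pos in true_pos:
        if rng.random() < pd:
            reports.append(pos + rng.normal(0.0, sigma, size=2))
    # False alarms: Poisson-distributed count, uniform over the field of view.
    for _ in range(rng.poisson(lam_fa)):
        reports.append(rng.uniform(-fov, fov, size=2))
    return np.array(reports)

# Example: two vehicles observed by a notional GMTI-like sensor.
print(report_level_sensor(np.array([[100.0, 200.0], [1500.0, -300.0]])))
```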

  9. Physics Basis and Simulation of Burning Plasma Physics for the Fusion Ignition Research Experiment (FIRE)

    SciTech Connect

    C.E. Kessel; D. Meade; S.C. Jardin

    2002-01-18

    The FIRE [Fusion Ignition Research Experiment] design for a burning plasma experiment is described in terms of its physics basis and engineering features. Systems analysis indicates that the device has a wide operating space to accomplish its mission, both for the ELMing H-mode reference and the high bootstrap current/high beta advanced tokamak regimes. Simulations with 1.5D transport codes reported here both confirm and constrain the systems projections. Experimental and theoretical results are used to establish the basis for successful burning plasma experiments in FIRE.

  10. Multi-Scale Fusion of Information for Uncertainty Quantification and Management in Large-Scale Simulations

    DTIC Science & Technology

    2015-12-02

    AFRL-AFOSR-VA-TR-2015-0391 (MURI-09). Multi-Scale Fusion of Information for Uncertainty Quantification and Management in Large-Scale Simulations. AFOSR grant FA9550-09-1-0613 (final report). G. E. Karniadakis, J. S. Hesthaven, and B. Rozovsky, Brown University.

  11. Computer modeling and simulation in inertial confinement fusion

    SciTech Connect

    McCrory, R.L.; Verdon, C.P.

    1989-03-01

    The complex hydrodynamic and transport processes associated with the implosion of an inertial confinement fusion (ICF) pellet place considerable demands on numerical simulation programs. Processes associated with implosion can usually be described using relatively simple models, but their complex interplay requires that programs model most of the relevant physical phenomena accurately. Most hydrodynamic codes used in ICF incorporate a one-fluid, two-temperature model. Electrons and ions are assumed to flow as one fluid (no charge separation). Due to the relatively weak coupling between the ions and electrons, each species is treated separately in terms of its temperature. In this paper we describe some of the major components associated with an ICF hydrodynamics simulation code. To serve as an example we draw heavily on a two-dimensional Lagrangian hydrodynamic code (ORCHID) written at the University of Rochester's Laboratory for Laser Energetics. 46 refs., 19 figs., 1 tab.
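    To make the one-fluid, two-temperature idea concrete, the sketch below performs the electron-ion temperature relaxation step that such codes typically include alongside the hydrodynamics. The relaxation time, time step, and initial temperatures are illustrative placeholders and are not taken from ORCHID.

```python
import numpy as np

def relax_two_temperature(Te, Ti, tau_ei, dt):
    """One step of electron-ion temperature equilibration.

    Solves dTe/dt = (Ti - Te)/tau_ei and dTi/dt = (Te - Ti)/tau_ei
    (equal heat capacities assumed for simplicity); the sum Te + Ti is conserved
    and the difference decays exponentially, so the step can be done exactly.
    """
    mean = 0.5 * (Te + Ti)
    diff = 0.5 * (Te - Ti) * np.exp(-2.0 * dt / tau_ei)
    return mean + diff, mean - diff

# Illustrative numbers only: Te = 3 keV, Ti = 1 keV, tau_ei = 10 ps, dt = 1 ps.
Te, Ti = 3.0, 1.0
for _ in range(20):
    Te, Ti = relax_two_temperature(Te, Ti, tau_ei=10.0, dt=1.0)
print(f"Te = {Te:.3f} keV, Ti = {Ti:.3f} keV")  # both approach 2 keV
```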

  12. Computer modeling and simulation in inertial confinement fusion

    NASA Astrophysics Data System (ADS)

    McCrory, R. L.; Verdon, C. P.

    1989-03-01

    The complex hydrodynamic and transport processes associated with the implosion of an inertial confinement fusion (ICF) pellet place considerable demands on numerical simulation programs. Processes associated with implosion can usually be described using relatively simple models, but their complex interplay requires that programs model most of the relevant physical phenomena accurately. Most hydrodynamic codes used in ICF incorporate a one-fluid, two-temperature model. Electrons and ions are assumed to flow as one fluid (no charge separation). Due to the relatively weak coupling between the ions and electrons, each species is treated separately in terms of its temperature. In this paper we describe some of the major components associated with an ICF hydrodynamics simulation code. To serve as an example we draw heavily on a two-dimensional Lagrangian hydrodynamic code (ORCHID) written at the University of Rochester's Laboratory for Laser Energetics.

  13. Report of the Fusion Energy Sciences Advisory Committee. Panel on Integrated Simulation and Optimization of Magnetic Fusion Systems

    SciTech Connect

    Dahlburg, Jill; Corones, James; Batchelor, Donald; Bramley, Randall; Greenwald, Martin; Jardin, Stephen; Krasheninnikov, Sergei; Laub, Alan; Leboeuf, Jean-Noel; Lindl, John; Lokke, William; Rosenbluth, Marshall; Ross, David; Schnack, Dalton

    2002-11-01

    Fusion is potentially an inexhaustible energy source whose exploitation requires a basic understanding of high-temperature plasmas. The development of a science-based predictive capability for fusion-relevant plasmas is a challenge central to fusion energy science, in which numerical modeling has played a vital role for more than four decades. A combination of the very wide range in temporal and spatial scales, extreme anisotropy, the importance of geometric detail, and the requirement of causality which makes it impossible to parallelize over time, makes this problem one of the most challenging in computational physics. Sophisticated computational models are under development for many individual features of magnetically confined plasmas and increases in the scope and reliability of feasible simulations have been enabled by increased scientific understanding and improvements in computer technology. However, full predictive modeling of fusion plasmas will require qualitative improvements and innovations to enable cross coupling of a wider variety of physical processes and to allow solution over a larger range of space and time scales. The exponential growth of computer speed, coupled with the high cost of large-scale experimental facilities, makes an integrated fusion simulation initiative a timely and cost-effective opportunity. Worldwide progress in laboratory fusion experiments provides the basis for a recent FESAC recommendation to proceed with a burning plasma experiment (see FESAC Review of Burning Plasma Physics Report, September 2001). Such an experiment, at the frontier of the physics of complex systems, would be a huge step in establishing the potential of magnetic fusion energy to contribute to the world’s energy security. An integrated simulation capability would dramatically enhance the utilization of such a facility and lead to optimization of toroidal fusion plasmas in general. This science-based predictive capability, which was cited in the FESAC

  14. The Mars Gravity Simulation Project

    NASA Technical Reports Server (NTRS)

    Korienek, Gene

    1998-01-01

    Human beings who make abrupt transitions between one gravitational environment and another undergo severe disruptions of their visual perception and visual-motor coordination, frequently accompanied by "space sickness." Clearly, such immediate effects of exposure to a novel gravitational condition have significant implications for human performance. For example, when astronauts first arrive in Earth orbit their attempts to move about in the spacecraft and to perform their duties are uncoordinated, inaccurate, and inefficient. Other inter-gravitational transitions for which these difficulties can be expected include going from the 0 g of the spacecraft to the .16 g of the Moon, from 0 g to the .38 g of Mars, and from 0 g back to the 1.0 g of Earth. However, after astronauts have actively interacted with their new gravitational environment for several days, these problems tend to disappear, evidence that some sort of adaptive process has taken place. It would be advantageous, therefore, if there were some way to minimize or perhaps even to eliminate this potentially hazardous adaptive transition period by allowing astronauts to adapt to the altered gravitational conditions before actually entering them. Simultaneous adaptation to both the altered and the normal gravitational environment can result from repeatedly adapting to one and readapting to the other, a phenomenon known as dual adaptation. The objective of the Mars Gravity Simulator (MGS) Project is to construct a simulation of the visual and bodily effects of altered gravity. This perceptual-motor simulation is created through the use of: 1) differential body pressure to produce simulated hypo-gravity and 2) treadmill-controlled virtual reality to create a corresponding visual effect. It is expected that this combination will produce sensory motor perturbations in the subjects. Both the immediate and adaptive behavioral (postural and ambulatory) responses to these sensory perturbations will be assessed.

  15. The Mars Gravity Simulation Project

    NASA Astrophysics Data System (ADS)

    Korienek, Gene

    1998-10-01

    Human beings who make abrupt transitions between one gravitational environment and another undergo severe disruptions of their visual perception and visual-motor coordination, frequently accompanied by "space sickness." Clearly, such immediate effects of exposure to a novel gravitational condition have significant implications for human performance. For example, when astronauts first arrive in Earth orbit their attempts to move about in the spacecraft and to perform their duties are uncoordinated, inaccurate, and inefficient. Other inter-gravitational transitions for which these difficulties can be expected include going from the 0 g of the spacecraft to the .16 g of the Moon, from 0 g to the .38 g of Mars, and from 0 g back to the 1.0 g of Earth. However, after astronauts have actively interacted with their new gravitational environment for several days, these problems tend to disappear, evidence that some sort of adaptive process has taken place. It would be advantageous, therefore, if there were some way to minimize or perhaps even to eliminate this potentially hazardous adaptive transition period by allowing astronauts to adapt to the altered gravitational conditions before actually entering them. Simultaneous adaptation to both the altered and the normal gravitational environment can result from repeatedly adapting to one and readapting to the other, a phenomenon known as dual adaptation. The objective of the Mars Gravity Simulator (MGS) Project is to construct a simulation of the visual and bodily effects of altered gravity. This perceptual-motor simulation is created through the use of: 1) differential body pressure to produce simulated hypo-gravity and 2) treadmill-controlled virtual reality to create a corresponding visual effect. It is expected that this combination will produce sensory motor perturbations in the subjects. Both the immediate and adaptive behavioral (postural and ambulatory) responses to these sensory perturbations will be assessed.

  16. The Mars Gravity Simulation Project

    NASA Technical Reports Server (NTRS)

    Korienek, Gene

    1998-01-01

    Human beings who make abrupt transitions between one gravitational environment and another undergo severe disruptions of their visual perception and visual-motor coordination, frequently accompanied by "space sickness." Clearly, such immediate effects of exposure to a novel gravitational condition have significant implications for human performance. For example, when astronauts first arrive in Earth orbit their attempts to move about in the spacecraft and to perform their duties are uncoordinated, inaccurate, and inefficient. Other inter-gravitational transitions for which these difficulties can be expected include going from the 0 g of the spacecraft to the .16 g of the Moon, from 0 g to the .38 g of Mars, and from 0 g back to the 1.0 g of Earth. However, after astronauts have actively interacted with their new gravitational environment for several days, these problems tend to disappear, evidence that some sort of adaptive process has taken place. It would be advantageous, therefore, if there were some way to minimize or perhaps even to eliminate this potentially hazardous adaptive transition period by allowing astronauts to adapt to the altered gravitational conditions before actually entering them. Simultaneous adaptation to both the altered and the normal gravitational environment can result from repeatedly adapting to one and readapting to the other, a phenomenon known as dual adaptation. The objective of the Mars Gravity Simulator (MGS) Project is to construct a simulation of the visual and bodily effects of altered gravity. This perceptual-motor simulation is created through the use of: 1) differential body pressure to produce simulated hypo-gravity and 2) treadmill-controlled virtual reality to create a corresponding visual effect. It is expected that this combination will produce sensory motor perturbations in the subjects. Both the immediate and adaptive behavioral (postural and ambulatory) responses to these sensory perturbations will be assessed.

  17. SIMULATION OF INTENSE BEAMS FOR HEAVY ION FUSION

    SciTech Connect

    Friedman, A

    2004-06-10

    Computer simulations of intense ion beams play a key role in the Heavy Ion Fusion research program. Along with analytic theory, they are used to develop future experiments, guide ongoing experiments, and aid in the analysis and interpretation of experimental results. They also afford access to regimes not yet accessible in the experimental program. The U.S. Heavy Ion Fusion Virtual National Laboratory and its collaborators have developed state-of-the-art computational tools, related both to codes used for stationary plasmas and to codes used for traditional accelerator applications, but necessarily differing from each in important respects. These tools model beams in varying levels of detail and at widely varying computational cost. They include moment models (envelope equations and fluid descriptions), particle-in-cell methods (electrostatic and electromagnetic), nonlinear-perturbative descriptions (''{delta}f''), and continuum Vlasov methods. Increasingly, it is becoming clear that it is necessary to simulate not just the beams themselves, but also the environment in which they exist, be it an intentionally-created plasma or an unwanted cloud of electrons and gas. In this paper, examples of the application of simulation tools to intense ion beam physics are presented, including support of present-day experiments, fundamental beam physics studies, and the development of future experiments. Throughout, new computational models are described and their utility explained. These include Mesh Refinement (and its dynamic variant, Adaptive Mesh Refinement); improved electron cloud and gas models, and an electron advance scheme that allows use of larger time steps; and moving-mesh and adaptive-mesh Vlasov methods.
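    As an illustration of the moment-model end of the hierarchy mentioned above, the sketch below integrates the familiar KV envelope equation for a round beam in a smooth (continuous) focusing channel. The focusing strength, perveance, and emittance values are arbitrary placeholders chosen only to show the structure of the calculation; they do not describe any HIF experiment.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameters (not from any specific machine):
k0 = 2.0       # smooth-focusing wavenumber [1/m]
K = 1.0e-3     # dimensionless generalized perveance
eps = 1.0e-5   # unnormalized edge emittance [m-rad]

def envelope_rhs(s, y):
    """KV envelope equation for a round beam: a'' = -k0^2 a + K/a + eps^2/a^3."""
    a, ap = y
    return [ap, -k0**2 * a + K / a + eps**2 / a**3]

# Matched radius solves k0^2 a = K/a + eps^2/a^3; start slightly mismatched.
a_matched = np.sqrt((K + np.sqrt(K**2 + 4.0 * k0**2 * eps**2)) / (2.0 * k0**2))
sol = solve_ivp(envelope_rhs, (0.0, 10.0), [1.1 * a_matched, 0.0], max_step=0.01)
print(f"matched radius ~ {a_matched*1e3:.2f} mm; envelope oscillates between "
      f"{sol.y[0].min()*1e3:.2f} and {sol.y[0].max()*1e3:.2f} mm")
```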

  18. Membrane Fusion Involved in Neurotransmission: Glimpse from Electron Microscope and Molecular Simulation.

    PubMed

    Yang, Zhiwei; Gou, Lu; Chen, Shuyu; Li, Na; Zhang, Shengli; Zhang, Lei

    2017-01-01

    Membrane fusion is one of the most fundamental physiological processes in eukaryotes, triggering the mixing of lipids and contents as well as neurotransmission. However, the architectural features of the neurotransmitter release machinery and the underlying mechanism of synaptic membrane fusion have not been extensively studied. This review article expounds the neuronal membrane fusion processes, discusses the fundamental steps in all fusion reactions (membrane aggregation, membrane association, lipid rearrangement, and lipid and content mixing) and the probable mechanism coupling them to the delivery of neurotransmitters. Subsequently, this work summarizes the research on the fusion process in synaptic transmission, using electron microscopy (EM) and molecular simulation approaches. Finally, we propose the future outlook for more exciting applications of membrane fusion involved in synaptic transmission, with the aid of stochastic optical reconstruction microscopy (STORM), cryo-electron microscopy (cryo-EM), and molecular simulations.

  19. Inertial Fusion Energy Studies on an Earth Simulator-Class Computer

    SciTech Connect

    Friedman, A; Stephens, R

    2002-08-13

    The U.S. is developing fusion energy based on inertial confinement of the burning fusion fuel, as a complement to the magnetic confinement approach. DOE's Inertial Fusion Energy (IFE) program within the Office of Fusion Energy Sciences (OFES) is coordinated with, and gains leverage from, the much larger Inertial Confinement Fusion program of the National Nuclear Security Administration (NNSA). Advanced plasma and particle beam simulations play a major role in the IFE effort, and the program is well poised to benefit from an Earth Simulator-class resource. Progress in all key physics areas of IFE, including heavy-ion ''drivers'' which impart the energy to the fusion fuel, the targets for both ion- and laser-driven approaches, and an advanced concept known as fast ignition, would be dramatically accelerated by an Earth Simulator-class resource.

  20. Deuterium-Tritium Simulations of the Enhanced Reversed Shear Mode in the Tokamak Fusion Test Reactor

    SciTech Connect

    Mikkelsen, D.R.; Manickam, J.; Scott, S.D.; Zarnstorff

    1997-04-01

    The potential performance, in deuterium-tritium plasmas, of a new enhanced confinement regime with reversed magnetic shear (ERS mode) is assessed. The equilibrium conditions for an ERS mode plasma are estimated by solving the plasma transport equations using the thermal and particle diffusivities measured in a short duration ERS mode discharge in the Tokamak Fusion Test Reactor [F. M. Levinton, et al., Phys. Rev. Letters, 75, 4417, (1995)]. The plasma performance depends strongly on Zeff and neutral beam penetration to the core. The steady state projections typically have a central electron density of {approx}2.5x10{sup 20} m{sup -3} and nearly equal central electron and ion temperatures of {approx}10 keV. In time dependent simulations the peak fusion power, {approx} 25 MW, is twice the steady state level. Peak performance occurs during the density rise when the central ion temperature is close to the optimal value of {approx} 15 keV. The simulated pressure profiles can be stable to ideal MHD instabilities with toroidal mode number n = 1, 2, 3, 4 and {infinity} for {beta}{sub norm} up to 2.5; the simulations have {beta}{sub norm} {le} 2.1. The enhanced reversed shear mode may thus provide an opportunity to conduct alpha physics experiments in conditions similar to those proposed for advanced tokamak reactors.

  1. Lipid Tail Protrusion in Simulations Predicts Fusogenic Activity of Influenza Fusion Peptide Mutants and Conformational Models

    PubMed Central

    Larsson, Per; Kasson, Peter M.

    2013-01-01

    Fusion peptides from influenza hemagglutinin act on membranes to promote membrane fusion, but the mechanism by which they do so remains unknown. Recent theoretical work has suggested that contact of protruding lipid tails may be an important feature of the transition state for membrane fusion. If this is so, then influenza fusion peptides would be expected to promote tail protrusion in proportion to the ability of the corresponding full-length hemagglutinin to drive lipid mixing in fusion assays. We have performed molecular dynamics simulations of influenza fusion peptides in lipid bilayers, comparing the X-31 influenza strain against a series of N-terminal mutants. As hypothesized, the probability of lipid tail protrusion correlates well with the lipid mixing rate induced by each mutant. This supports the conclusion that tail protrusion is important to the transition state for fusion. Furthermore, it suggests that tail protrusion can be used to examine how fusion peptides might interact with membranes to promote fusion. Previous models for native influenza fusion peptide structure in membranes include a kinked helix, a straight helix, and a helical hairpin. Our simulations visit each of these conformations. Thus, the free energy differences between each are likely low enough that specifics of the membrane environment and peptide construct may be sufficient to modulate the equilibrium between them. However, the kinked helix promotes lipid tail protrusion in our simulations much more strongly than the other two structures. We therefore predict that the kinked helix is the most fusogenic of these three conformations. PMID:23505359

  2. Lipid tail protrusion in simulations predicts fusogenic activity of influenza fusion peptide mutants and conformational models.

    PubMed

    Larsson, Per; Kasson, Peter M

    2013-01-01

    Fusion peptides from influenza hemagglutinin act on membranes to promote membrane fusion, but the mechanism by which they do so remains unknown. Recent theoretical work has suggested that contact of protruding lipid tails may be an important feature of the transition state for membrane fusion. If this is so, then influenza fusion peptides would be expected to promote tail protrusion in proportion to the ability of the corresponding full-length hemagglutinin to drive lipid mixing in fusion assays. We have performed molecular dynamics simulations of influenza fusion peptides in lipid bilayers, comparing the X-31 influenza strain against a series of N-terminal mutants. As hypothesized, the probability of lipid tail protrusion correlates well with the lipid mixing rate induced by each mutant. This supports the conclusion that tail protrusion is important to the transition state for fusion. Furthermore, it suggests that tail protrusion can be used to examine how fusion peptides might interact with membranes to promote fusion. Previous models for native influenza fusion peptide structure in membranes include a kinked helix, a straight helix, and a helical hairpin. Our simulations visit each of these conformations. Thus, the free energy differences between each are likely low enough that specifics of the membrane environment and peptide construct may be sufficient to modulate the equilibrium between them. However, the kinked helix promotes lipid tail protrusion in our simulations much more strongly than the other two structures. We therefore predict that the kinked helix is the most fusogenic of these three conformations.

  3. Evaluation of performance of select fusion experiments and projected reactors

    NASA Technical Reports Server (NTRS)

    Miley, G. H.

    1978-01-01

    The performance of NASA Lewis fusion experiments (SUMMA and Bumpy Torus) is compared with other experiments and that necessary for a power reactor. Key parameters cited are gain (fusion power/input power) and the time average fusion power, both of which may be more significant for real fusion reactors than the commonly used Lawson parameter. The NASA devices are over 10 orders of magnitude below the required powerplant values in both gain and time average power. The best experiments elsewhere are also as much as 4 to 5 orders of magnitude too low. However, the NASA experiments compare favorably with other alternate approaches that have received less funding than the mainline experiments. The steady-state character and efficiency of plasma heating are strong advantages of the NASA approach. The problem, though, is to move ahead to experiments of sufficient size to advance in gain and average power parameters.

  4. Projection-Based linear constrained estimation and fusion over long-haul links

    SciTech Connect

    Rao, Nageswara S

    2016-01-01

    We study estimation and fusion with linear dynamics in long-haul sensor networks, wherein a number of sensors are remotely deployed over a large geographical area for performing tasks such as target tracking, and a remote fusion center serves to combine the information provided by these sensors in order to improve the overall tracking accuracy. In reality, the motion of a dynamic target might be subject to certain constraints, for instance, those defined by a road network. We explore how the accuracy of projection-based constrained estimation and fusion methods is affected by information loss over the long-haul links. We use an example to compare the tracking errors under various implementations of centralized and distributed projection-based estimation and fusion methods and demonstrate the effectiveness of using projection-based methods in these settings.
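    For readers unfamiliar with the projection step referred to above, the sketch below shows the standard covariance-weighted projection of an unconstrained estimate onto a linear constraint D x = d (for example, a target confined to a road). The numerical values are illustrative only and do not correspond to the paper's example.

```python
import numpy as np

def project_onto_constraint(x, P, D, d):
    """Project estimate (x, P) onto the linear constraint D x = d.

    Minimum-variance (covariance-weighted) projection:
        x_c = x - P D^T (D P D^T)^{-1} (D x - d)
        P_c = P - P D^T (D P D^T)^{-1} D P
    """
    G = P @ D.T @ np.linalg.inv(D @ P @ D.T)   # gain of the projection
    x_c = x - G @ (D @ x - d)
    P_c = P - G @ D @ P
    return x_c, P_c

# Illustrative example: 2-D position estimate constrained to the road y = x.
x = np.array([3.0, 1.0])
P = np.diag([4.0, 1.0])
D = np.array([[1.0, -1.0]])                    # constraint: x - y = 0
d = np.array([0.0])
x_c, P_c = project_onto_constraint(x, P, D, d)
print(x_c)                                     # [1.4 1.4], i.e. on the line y = x
```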

  5. Mesh refinement for particle-in-cell plasma simulations: Applications to - and benefits for - heavy ion fusion

    SciTech Connect

    Vay, J.L.; Colella, P.; McCorquodale, P.; Van Straalen, B.; Friedman, A.; Grote, D.P.

    2002-05-24

    The numerical simulation of the driving beams in a heavy ion fusion power plant is a challenging task, and simulation of the power plant as a whole, or even of the driver, is not yet possible. Despite the rapid progress in computer power, past and anticipated, one must consider the use of the most advanced numerical techniques if the goal is to be reached expeditiously. One of the difficulties of these simulations resides in the disparity of scales, in time and in space, which must be resolved. When these disparities occur in distinct zones of the simulation region, a method which has proven to be effective in other areas (e.g., fluid dynamics simulations) is the mesh refinement technique. They discuss the challenges posed by the implementation of this technique in plasma simulations (due to the presence of particles and electromagnetic waves). They present the prospects for and projected benefits of its application to heavy ion fusion, in particular to the simulation of the ion source and the final beam propagation in the chamber. A collaboration project is under way at LBNL between the Applied Numerical Algorithms Group (ANAG) and the HIF group to couple the Adaptive Mesh Refinement (AMR) library CHOMBO developed by the ANAG group to the Particle-In-Cell accelerator code (WARP) developed by the HIF-VNL. They describe their progress and present their initial findings.

  6. Simulation of a fusion gamma reaction history diagnostic for the Shenguang-III facility

    NASA Astrophysics Data System (ADS)

    Song, Zifeng; Chen, Jiabin; Xu, Tao; Liu, Zhongjie; Zhan, Xiayu; Tang, Qi

    2017-05-01

    Fusion gamma rays offer an advantage for measuring the fusion reaction history in deuterium-tritium (DT) fuel implosion experiments. A gas Cherenkov detector is available to measure DT fusion gamma rays in a high-background environment. A simulation is carried out with Geant4 to evaluate the conversion efficiency and the time response of this Cherenkov detector. The background gamma rays are roughly estimated based on ENDF/B-VII.0 data, and the signal-to-noise ratio (SNR) is evaluated based on the simulated energy response curve. The simulation result and the SNR analysis are helpful for constructing the Cherenkov detector at the Shenguang-III facility.

  7. Fusion

    NASA Astrophysics Data System (ADS)

    Herman, Robin

    1990-10-01

    The book abounds with fascinating anecdotes about fusion's rocky path: the spurious claim by Argentine dictator Juan Peron in 1951 that his country had built a working fusion reactor, the rush by the United States to drop secrecy and publicize its fusion work as a propaganda offensive after the Russian success with Sputnik; the fortune Penthouse magazine publisher Bob Guccione sank into an unconventional fusion device, the skepticism that met an assertion by two University of Utah chemists in 1989 that they had created "cold fusion" in a bottle. Aimed at a general audience, the book describes the scientific basis of controlled fusion--the fusing of atomic nuclei, under conditions hotter than the sun, to release energy. Using personal recollections of scientists involved, it traces the history of this little-known international race that began during the Cold War in secret laboratories in the United States, Great Britain and the Soviet Union, and evolved into an astonishingly open collaboration between East and West.

  8. Networking Industry and Academia: Evidence from FUSION Projects in Ireland

    ERIC Educational Resources Information Center

    Stephens, Simon; Onofrei, George

    2009-01-01

    Graduate development programmes such as FUSION continue to be seen by policy makers, higher education institutions and small and medium-sized enterprises (SMEs) as primary means of strengthening higher education-business links and in turn improving the match between graduate output and the needs of industry. This paper provides evidence from case…

  9. Networking Industry and Academia: Evidence from FUSION Projects in Ireland

    ERIC Educational Resources Information Center

    Stephens, Simon; Onofrei, George

    2009-01-01

    Graduate development programmes such as FUSION continue to be seen by policy makers, higher education institutions and small and medium-sized enterprises (SMEs) as primary means of strengthening higher education-business links and in turn improving the match between graduate output and the needs of industry. This paper provides evidence from case…

  10. Projection-Based Linear Constrained Estimation and Fusion over Long-Haul Links

    SciTech Connect

    Rao, Nageswara S

    2016-01-01

    In this work, we study estimation and fusion with linear dynamics in long-haul sensor networks, wherein a number of sensors are remotely deployed over a large geographical area for performing tasks such as target tracking, and a remote fusion center serves to combine the information provided by these sensors in order to improve the overall tracking accuracy. In reality, the motion of a dynamic target might be subject to certain constraints, for instance, those defined by a road network. We explore how the accuracy of projection-based constrained estimation and fusion methods is affected by information loss over the long-haul links. We use a tracking example to compare the tracking errors under various implementations of centralized and distributed projection-based estimation and fusion methods.

  11. Simulating weld-fusion boundary microstructures in aluminum alloys

    NASA Astrophysics Data System (ADS)

    Kostrivas, Anastasios D.; Lippold, John C.

    2004-02-01

    A fundamental study of weld-fusion boundary microstructure evolution in aluminum alloys was conducted in an effort to understand equiaxed grain zone formation and fusion boundary nucleation and growth phenomena. In addition to commercial aluminum alloys, experimental Mg-bearing alloys with Zr and Sc additions were studied along with the widely used Cu- and Li-containing alloy 2195-T8. This article describes work conducted to clarify the interrelation among composition, base metal substrate, and temperature as they relate to nucleation and growth phenomena at the fusion boundary.

  12. Psychology on Computers: Simulations, Experiments and Projects.

    ERIC Educational Resources Information Center

    Belcher, Duane M.; Smith, Stephen D.

    PSYCOM is a unique mixed media package which combines high interest projects on the computer with a written text of expository material. It goes beyond most computer-assisted instruction which emphasizes drill and practice and testing of knowledge. A project might consist of a simulation or an actual experiment, or it might be a demonstration, a…

  13. Psychology on Computers: Simulations, Experiments and Projects.

    ERIC Educational Resources Information Center

    Belcher, Duane M.; Smith, Stephen D.

    PSYCOM is a unique mixed media package which combines high interest projects on the computer with a written text of expository material. It goes beyond most computer-assisted instruction which emphasizes drill and practice and testing of knowledge. A project might consist of a simulation or an actual experiment, or it might be a demonstration, a…

  14. Internet and web projects for fusion plasma science and education. Final technical report

    SciTech Connect

    Eastman, Timothy E.

    1999-08-30

    The plasma web site at http://www.plasmas.org provides comprehensive coverage of all plasma science and technology with site links worldwide. Prepared to serve the general public, students, educators, researchers, and decision-makers, the site covers basic plasma physics, fusion energy, magnetic confinement fusion, high energy density physics including ICF, space physics and astrophysics, pulsed-power, lighting, waste treatment, plasma technology, plasma theory, simulations and modeling.

  15. Phase space structures in gyrokinetic simulations of fusion plasma turbulence

    NASA Astrophysics Data System (ADS)

    Ghendrih, Philippe; Norscini, Claudia; Cartier-Michaud, Thomas; Dif-Pradalier, Guilhem; Abiteboul, Jérémie; Dong, Yue; Garbet, Xavier; Gürcan, Ozgür; Hennequin, Pascale; Grandgirard, Virginie; Latu, Guillaume; Morel, Pierre; Sarazin, Yanick; Storelli, Alexandre; Vermare, Laure

    2014-10-01

    Gyrokinetic simulations of fusion plasmas give extensive information in 5D on turbulence and transport. This paper highlights a few of these challenging physics issues in global, flux-driven simulations using experimental inputs from Tore Supra shot TS45511. The electrostatic gyrokinetic code GYSELA is used for these simulations. The 3D structure of avalanches indicates that these structures propagate radially at localised toroidal angles and then expand along the field line at sound speed to form the filaments. Analysing the poloidal mode structure of the potential fluctuations (at a given toroidal location), one finds that the low modes m = 0 and m = 1 exhibit a global structure; the magnitude of the m = 0 mode is much larger than that of the m = 1 mode. The shear layers of the corrugation structures are thus found to be dominated by the m = 0 contribution, which is comparable to that of the zonal flows. This global mode seems to localise the m = 2 mode but has little effect on the localisation of the higher mode numbers. However, when analysing the pulsation of the latter modes one finds that all modes exhibit a similar phase velocity, comparable to the local zonal flow velocity. The consequent dispersion-like relation between the mode pulsation and the mode numbers provides a means to measure the zonal flow. Temperature fluctuations and the turbulent heat flux are localised between the corrugation structures. Temperature fluctuations are found to exhibit two scales, small fluctuations that are localised by the corrugation shear layers, and appear to bounce back and forth radially, and large fluctuations, also readily observed on the flux, which are associated with the disruption of the corrugations. The radial ballistic velocity of both avalanche events is of the order of 0.5ρ∗c0 where ρ∗ = ρ0/a, a being the tokamak minor radius and ρ0 being the characteristic Larmor radius, ρ0 = c0/Ω0. c0 is the reference ion thermal velocity and Ω0 = qiB0/mi the reference

  16. Image Fusion Software in the Clearpem-Sonic Project

    NASA Astrophysics Data System (ADS)

    Pizzichemi, M.; di Vara, N.; Cucciati, G.; Ghezzi, A.; Paganoni, M.; Farina, F.; Frisch, B.; Bugalho, R.

    2012-08-01

    ClearPEM-Sonic is a mammography scanner that combines Positron Emission Tomography with 3D ultrasound echographic and elastographic imaging. It has been developed to improve early stage detection of breast cancer by combining metabolic and anatomical information. The PET system has been developed by the Crystal Clear Collaboration, while the 3D ultrasound probe has been provided by SuperSonic Imagine. In this framework, the visualization and fusion software is an essential tool for the radiologists in the diagnostic process. This contribution discusses the design choices, the issues faced during the implementation, and the commissioning of the software tools developed for ClearPEM-Sonic.

  17. Probing the mechanism of fusion in a two-dimensional computer simulation.

    PubMed Central

    Chanturiya, Alexandr; Scaria, Puthurapamil; Kuksenok, Oleksandr; Woodle, Martin C

    2002-01-01

    A two-dimensional (2D) model of lipid bilayers was developed and used to investigate a possible role of membrane lateral tension in membrane fusion. We found that an increase of lateral tension in contacting monolayers of 2D analogs of liposomes and planar membranes could cause not only hemifusion, but also complete fusion when internal pressure is introduced in the model. With a certain set of model parameters it was possible to induce hemifusion-like structural changes by a tension increase in only one of the two contacting bilayers. The effect of lysolipids was modeled as an insertion of a small number of extra molecules into the cis or trans side of the interacting bilayers at different stages of simulation. It was found that cis insertion arrests fusion and trans insertion has no inhibitory effect on fusion. The possibility of protein participation in tension-driven fusion was tested in simulation, with one of two model liposomes containing a number of structures capable of reducing the area occupied by them in the outer monolayer. It was found that condensation of these structures was sufficient to produce membrane reorganization similar to that observed in simulations with "protein-free" bilayers. These data support the hypothesis that changes in membrane lateral tension may be responsible for fusion in both model phospholipid membranes and in biological protein-mediated fusion. PMID:12023230

  18. Humanoid Flight Metabolic Simulator Project

    NASA Technical Reports Server (NTRS)

    Ross, Stuart

    2015-01-01

    NASA's Evolvable Mars Campaign (EMC) has identified several areas of technology that will require significant improvements in terms of performance, capacity, and efficiency, in order to make a manned mission to Mars possible. These include crew vehicle Environmental Control and Life Support System (ECLSS), EVA suit Portable Life Support System (PLSS) and Information Systems, autonomous environmental monitoring, radiation exposure monitoring and protection, and vehicle thermal control systems (TCS). (MADMACS) in a Suit can be configured to simulate human metabolism, consuming crew resources (oxygen) in the process. In addition to providing support for testing Life Support on unmanned flights, MADMACS will also support testing of suit thermal controls, and monitor radiation exposure, body zone temperatures, moisture, and loads.

  19. Edge Simulation Laboratory Project Report

    SciTech Connect

    Cohen, R. H.; Dorf, M.; Dorr, M.; Rognlien, T. D.

    2011-02-25

    In 2010 The Edge Simulation Laboratory (ESL) embarked upon the plan laid out in the renewal proposal submitted in December 2009. This proposal called for initially parallel efforts addressing the physics of the closed-flux-surface pedestal region, using existing computational tools (GYRO, BOUT++) and analytic modeling, and physics of the scrape-off layer via development of the new edge gyrokinetic code COGENT. Progress in the former area is described in a series of monthly progress reports prepared by General Atomics; these are attached as a set of appendices (describing work done in the month prior to the indicated date). Progress in the latter area, as well as associated theoretical development, is described.

  20. Size limitations for microwave cavity to simulate heating of blanket material in fusion reactor

    SciTech Connect

    Wolf, D.

    1987-01-01

    The power profile in the blanket material of a nuclear fusion reactor can be simulated by using microwaves at 200 MHz. Using these microwaves, ceramic breeder materials can be thermally tested to determine their acceptability as blanket materials without entering a nuclear fusion environment. A resonating cavity design is employed which can achieve uniform cross sectional heating in the plane transverse to the neutron flux. As the sample size increases in height and width, higher order modes, above the dominant mode, are propagated and destroy the approximation to the heating produced in a fusion reactor. The limits at which these modes develop are determined in the paper.
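    A rough sketch of the kind of mode-counting estimate described above: resonant frequencies of an ideal rectangular cavity, f_mnp = (c/2) sqrt((m/a)^2 + (n/b)^2 + (p/d)^2), and how the number of modes at or below the operating frequency grows with the transverse dimensions. The 200 MHz figure follows the abstract; the rectangular geometry, the sample dimensions, and the simplified mode-admissibility rule (degeneracies ignored) are assumptions made only for illustration.

```python
import numpy as np

C = 299792458.0  # speed of light [m/s]

def resonant_modes(a, b, d, f_max, max_index=10):
    """List index triples with f_mnp <= f_max for an ideal a x b x d cavity [m].

    Uses f_mnp = (c/2) * sqrt((m/a)^2 + (n/b)^2 + (p/d)^2); a triple is kept
    if at least two indices are nonzero (rough TE/TM admissibility rule).
    """
    modes = []
    for m in range(max_index + 1):
        for n in range(max_index + 1):
            for p in range(max_index + 1):
                if (m > 0) + (n > 0) + (p > 0) < 2:
                    continue
                f = 0.5 * C * np.sqrt((m / a) ** 2 + (n / b) ** 2 + (p / d) ** 2)
                if f <= f_max:
                    modes.append(((m, n, p), f / 1e6))
    return sorted(modes, key=lambda t: t[1])

# As the transverse dimensions grow, more modes crowd in at or below 200 MHz.
for width in (0.8, 1.2, 1.6):                  # illustrative sample widths [m]
    count = len(resonant_modes(width, width, 2.0, 200e6))
    print(f"{width:.1f} m x {width:.1f} m x 2.0 m cavity: {count} modes <= 200 MHz")
```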

  1. Programmable AC power supply for simulating power transient expected in fusion reactor

    SciTech Connect

    Halimi, B.; Suh, K. Y.

    2012-07-01

    This paper focuses on the control engineering of a programmable AC power source which has the capability to simulate power transients expected in a fusion reactor. To generate the programmable power source, an AC-AC power electronics converter is adopted to control the power of a set of heaters to represent the transient phenomena of heat exchangers or heat sources of a fusion reactor. The International Thermonuclear Experimental Reactor (ITER) plasma operation scenario is used as the basic reference for producing this transient power source. (authors)
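    One common way such a programmable source can shape heater power is phase-angle control of a resistive load, for which the delivered power fraction is P/Pmax = 1 - alpha/pi + sin(2 alpha)/(2 pi) at firing angle alpha. The sketch below inverts that relation to follow a made-up ramp/flat-top reference; the paper's actual converter topology and the ITER-based reference waveform may well differ, so treat this purely as an illustration.

```python
import numpy as np
from scipy.optimize import brentq

def power_fraction(alpha):
    """Fraction of full power into a resistive load at firing angle alpha [rad]."""
    return 1.0 - alpha / np.pi + np.sin(2.0 * alpha) / (2.0 * np.pi)

def firing_angle(p_frac):
    """Numerically invert power_fraction to get the firing angle for a setpoint."""
    p = min(max(p_frac, 1e-6), 1.0 - 1e-6)
    return brentq(lambda a: power_fraction(a) - p, 0.0, np.pi)

# Made-up reference transient: ramp up, flat top, ramp down (fractions of Pmax).
t = np.linspace(0.0, 10.0, 11)
p_ref = np.clip(np.minimum(t / 3.0, (10.0 - t) / 3.0), 0.0, 1.0)
for ti, pi in zip(t, p_ref):
    print(f"t = {ti:4.1f} s  P/Pmax = {pi:.2f}  "
          f"alpha = {np.degrees(firing_angle(pi)):6.1f} deg")
```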

  2. Gamma Efficiency Simulations towards Coincidence Measurements for Fusion Cross Sections

    NASA Astrophysics Data System (ADS)

    Heine, M.; Courtin, S.; Fruet, G.; Jenkins, D. G.; Montanari, D.; Morris, L.; Regan, P. H.; Rudigier, M.; Symochko, D.

    2016-10-01

    With the experimental station STELLA (STELlar LAboratory) we will measure fusion cross sections of astrophysical relevance, making use of the coincident detection of charged particles and gamma rays for background reduction. For the measurement of gamma rays from the de-excitation of fusion products, a compact array of 36 UK FATIMA LaBr3 detectors is designed based on efficiency studies with Geant4. The photopeak efficiency in the region of interest is comparable to that of other gamma detection systems used in this field. The features of the internal decay of 138La are used in a background study to obtain an online calibration of the gamma detectors. Background data are fit to the Monte Carlo model of the self-activity, assuming a simple exponential behavior of the external background. Accuracy in the region of interest is of the order of a few keV in this first study.
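    The structure of the background fit described above can be illustrated with a small Python sketch: a measured spectrum is modeled as a scaled Monte Carlo template of the 138La self-activity plus a crude exponential for the external background. The template, the synthetic data, and all parameter values below are placeholders; only the fit structure mirrors the abstract.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)
energy = np.linspace(0.0, 2000.0, 400)                     # keV bins

# Synthetic stand-in for a Geant4 self-activity template of 138La
# (two broad features near its characteristic gamma energies).
template = (np.exp(-0.5 * ((energy - 789.0) / 30.0) ** 2)
            + 0.7 * np.exp(-0.5 * ((energy - 1436.0) / 30.0) ** 2))

def model(E, a, b, lam):
    """Scaled self-activity template plus exponential external background."""
    return a * np.interp(E, energy, template) + b * np.exp(-lam * E)

# Fake "measured" spectrum: known parameters plus Poisson-like noise.
truth = model(energy, 500.0, 200.0, 2.0e-3)
data = rng.poisson(np.maximum(truth, 1.0)).astype(float)

popt, pcov = curve_fit(model, energy, data, p0=[300.0, 100.0, 1.0e-3])
print("fitted (template scale, background amplitude, decay constant):", popt)
```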

  3. Simulation of RF-fields in a fusion device

    SciTech Connect

    De Witte, Dieter; Bogaert, Ignace; De Zutter, Daniel; Van Oost, Guido; Van Eester, Dirk

    2009-11-26

    In this paper the problem of scattering off a fusion plasma is approached from the point of view of integral equations. Using the volume equivalence principle an integral equation is derived which describes the electromagnetic fields in a plasma. The equation is discretized with MoM using conforming basis functions. This reduces the problem to solving a dense matrix equation. This can be done iteratively. Each iteration can be sped up using FFTs.
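    The solver pattern sketched in that abstract, a dense system applied matrix-free inside a Krylov iteration with FFT-accelerated products, can be illustrated in a few lines. The 1-D translation-invariant kernel below is a made-up stand-in for the actual MoM matrix; only the Toeplitz-times-vector-via-FFT idea carries over.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

n = 512

# Made-up translation-invariant kernel: the dense matrix would be
# A[i, j] = delta_ij + k(|i - j|), a symmetric Toeplitz matrix.
k = 0.05 / (1.0 + np.abs(np.arange(-(n - 1), n)) ** 1.5)

# Circulant embedding of the Toeplitz matrix (length 2n), so that A @ v can be
# evaluated as a circular convolution via FFTs instead of a dense product.
pad = 2 * n
first_col = np.r_[k[n - 1:], np.zeros(1), k[:n - 1]]
K_hat = np.fft.rfft(first_col, pad)

def matvec(v):
    """Apply A v = v + (Toeplitz kernel) * v using FFT-based convolution."""
    conv = np.fft.irfft(np.fft.rfft(v, pad) * K_hat, pad)[:n]
    return v + conv

A = LinearOperator((n, n), matvec=matvec, dtype=float)
b = np.ones(n)
sol, info = gmres(A, b)
print("GMRES converged:", info == 0, "| residual:", np.linalg.norm(matvec(sol) - b))
```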

  4. Graduate Training: Evidence from FUSION Projects in Ireland

    ERIC Educational Resources Information Center

    Hegarty, Cecilia; Johnston, Janet

    2008-01-01

    Purpose: This paper aims to explore graduate training through SME-based project work. The views and behaviours of graduates are examined along with the perceptions of the SMEs and academic partner institutions charged with training graduates. Design/methodology/approach: The data are largely qualitative and derived from the experiences of…

  5. Graduate Training: Evidence from FUSION Projects in Ireland

    ERIC Educational Resources Information Center

    Hegarty, Cecilia; Johnston, Janet

    2008-01-01

    Purpose: This paper aims to explore graduate training through SME-based project work. The views and behaviours of graduates are examined along with the perceptions of the SMEs and academic partner institutions charged with training graduates. Design/methodology/approach: The data are largely qualitative and derived from the experiences of…

  6. One-dimensional particle simulations of Knudsen-layer effects on D-T fusion

    SciTech Connect

    Cohen, Bruce I.; Dimits, Andris M.; Zimmerman, George B.; Wilks, Scott C.

    2014-12-15

    Particle simulations are used to solve the fully nonlinear, collisional kinetic equation describing the interaction of a high-temperature, high-density, deuterium-tritium plasma with absorbing boundaries and a plasma source, and to assess the influence of kinetic effects on fusion reaction rates. Both hydrodynamic and kinetic effects influence the end losses, and the simulations show departures of the ion velocity distributions from Maxwellian due to the reduction of the population of the highest energy ions (Knudsen-layer effects). The particle simulations show that the interplay between sources, plasma dynamics, and end losses results in temperature anisotropy, plasma cooling, and concomitant reductions in the fusion reaction rates. However, for the model problems and parameters considered, particle simulations show that Knudsen-layer modifications do not significantly affect the velocity distribution function for velocities most important in determining the fusion reaction rates, i.e., the thermal fusion reaction rates using the local densities and bulk temperatures give good estimates of the kinetic fusion reaction rates.
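    To illustrate the kind of sensitivity being tested, the sketch below compares the D-T reactivity integral for a full Maxwellian against one whose high-energy tail is removed above a cutoff, using a simplified Gamow-form cross-section. The cross-section form, the Gamow constant, the temperature, and the cutoff are illustrative assumptions, not the models or parameters used in the paper.

```python
import numpy as np
from scipy.integrate import quad

T_keV = 3.0            # bulk ion temperature [keV] (illustrative)
E_cut = 8.0 * T_keV    # crude stand-in for a Knudsen-layer energy cutoff [keV]
B_G = 34.4             # approximate Gamow constant for D-T [sqrt(keV)]

def sigma(E):
    """Simplified Gamow-form D-T cross-section ~ (S/E) exp(-B_G/sqrt(E));
    the astrophysical factor S cancels because only a ratio is computed."""
    return np.exp(-B_G / np.sqrt(E)) / E

def integrand(E):
    # <sigma v> is proportional to the integral of sigma(E) E exp(-E/T) dE
    # over relative energy E for a Maxwellian plasma.
    return sigma(E) * E * np.exp(-E / T_keV)

full, _ = quad(integrand, 1e-3, np.inf)       # lower limit just above zero
truncated, _ = quad(integrand, 1e-3, E_cut)
print(f"reactivity with the tail removed above {E_cut:.0f} keV: "
      f"{truncated / full:.1%} of the full Maxwellian value")
```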

  7. Overview of Theory and Simulations in the Heavy Ion Fusion Science Virtual National Laboratory

    SciTech Connect

    Friedman, A

    2006-07-03

    The Heavy Ion Fusion Science Virtual National Laboratory (HIFS-VNL) is a collaboration of Lawrence Berkeley National Laboratory, Lawrence Livermore National Laboratory, and Princeton Plasma Physics Laboratory. These laboratories, in cooperation with researchers at other institutions, are carrying out a coordinated effort to apply intense ion beams as drivers for studies of the physics of matter at extreme conditions, and ultimately for inertial fusion energy. Progress on this endeavor depends upon coordinated application of experiments, theory, and simulations. This paper describes the state of the art, with an emphasis on the coordination of modeling and experiment; developments in the simulation tools, and in the methods that underlie them, are also treated.

  8. Overview of Theory and Simulations in the Heavy Ion Fusion ScienceVirtual National Laboratory

    SciTech Connect

    Friedman, Alex

    2006-07-09

    The Heavy Ion Fusion Science Virtual National Laboratory (HIFS-VNL) is a collaboration of Lawrence Berkeley National Laboratory, Lawrence Livermore National Laboratory, and Princeton Plasma Physics Laboratory. These laboratories, in cooperation with researchers at other institutions, are carrying out a coordinated effort to apply intense ion beams as drivers for studies of the physics of matter at extreme conditions, and ultimately for inertial fusion energy. Progress on this endeavor depends upon coordinated application of experiments, theory, and simulations. This paper describes the state of the art, with an emphasis on the coordination of modeling and experiment; developments in the simulation tools, and in the methods that underlie them, are also treated.

  9. Synaptic fusion pore structure and AMPA receptor activation according to Brownian simulation of glutamate diffusion.

    PubMed

    Ventriglia, Francesco; Maio, Vito Di

    2003-03-01

    The rising phase of fast, AMPA-mediated Excitatory Post Synaptic Currents (EPSCs) has a primary role in the computational ability of neurons. The structure and radial expansion velocity of the fusion pore between the vesicle and the presynaptic membrane could be important factors in determining the time course of the EPSC. We have used a Brownian simulation model for glutamate neurotransmitter diffusion to test two hypotheses on the fusion pore structure, namely, the proteinaceous pore and the purely lipidic pore. Three more hypotheses on the radial expansion velocity were also tested. The rising phases of the EPSC, computed under various conditions, were compared with experimental data from the literature. Our present results show that a proteinaceous fusion pore should produce a more marked foot at the beginning of the rising phase of the EPSC. They also confirm the hypothesis that the structure of the fusion pore and its radial expansion velocity play significant roles in shaping the fast EPSC time course.
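    A minimal sketch of the type of Brownian release model described above: point-like glutamate molecules random-walk inside a spherical vesicle and are counted as released once they cross the wall inside a pore whose radius expands at constant velocity. The geometry, diffusion coefficient, pore kinetics, and reflection rule are illustrative assumptions, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate_release(n_mol=1000, R=20.0, D=300.0, pore_v=2.0, dt=0.005, t_max=20.0):
    """Brownian escape from a spherical vesicle through an expanding pore.

    Units: nm and microseconds. R = vesicle radius, D = diffusion coefficient
    [nm^2/us], pore_v = radial expansion velocity of the pore opening [nm/us].
    The pore is a cap around the +z pole; a molecule is released when it crosses
    the wall within the current pore opening, otherwise it is reflected.
    """
    pos = rng.normal(size=(n_mol, 3))
    pos *= (R * rng.random(n_mol) ** (1.0 / 3.0) / np.linalg.norm(pos, axis=1))[:, None]
    released = np.full(n_mol, np.inf)
    step = np.sqrt(2.0 * D * dt)
    for t in np.arange(dt, t_max, dt):
        pore_r = min(pore_v * t, R)                       # pore radius grows linearly
        active = np.isinf(released)
        pos[active] += rng.normal(scale=step, size=(active.sum(), 3))
        r = np.linalg.norm(pos[active], axis=1)
        rho = np.linalg.norm(pos[active][:, :2], axis=1)  # distance from pore axis
        outside = r > R
        escaped = outside & (rho < pore_r) & (pos[active][:, 2] > 0.0)
        idx = np.flatnonzero(active)
        released[idx[escaped]] = t
        # Molecules hitting the wall outside the pore are reflected back inside.
        bounce = outside & ~escaped
        pos[idx[bounce]] *= (2.0 * R / r[bounce] - 1.0)[:, None]
    return released

t_rel = simulate_release()
done = np.isfinite(t_rel)
print(f"released {done.mean():.0%} of molecules within 20 us; "
      f"median release time {np.median(t_rel[done]):.2f} us")
```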

  10. Numerical simulation on a new cylindrical target for Z-pinch driven inertial confinement fusion

    NASA Astrophysics Data System (ADS)

    Chu, Y. Y.; Wang, Z.; Qi, J. M.; Wu, F. Y.; Li, Z. H.

    2017-06-01

    A new indirectly driven cylindrical target is proposed for Z-pinch inertial confinement fusion, and the target implosion dynamics is simulated with a combination of the mass-point model and the radiation hydrodynamic model. Driven by a current waveform with a peak value of 60 MA and a 10-90% rise time of 180 ns, a shell kinetic energy of 5 MJ cm-1 can be obtained when the 60 mg cm-1 liner with initial radius 5 cm is imploded to a radius of 5 mm. The simulated kinetic energy is loaded to compress the multi-layer cylindrical target, and 24.6 MJ of fusion energy can be released according to the radiation hydrodynamic simulation. The power balance relationship is analyzed for the fusion fuel, and the fuel is ignited in the volume-ignition style. The target here can avoid the problem of coupling between the cylindrical Z-pinch and spherical fusion capsule, and can make use of a dynamic hohlraum to weaken the influence of Z-pinch instability on the fuel compression. The implosion dynamics of the cylindrical fusion target is easy to diagnose from the axial direction, which makes it suitable to be investigated in future experiments.
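    To show what the "mass-point" half of such a model looks like, here is a 0-D sketch of a thin cylindrical liner accelerated inward by the magnetic pressure of an axial current, m dv/dt = -mu0 I(t)^2 / (4 pi r) per unit length, integrated until the 5 mm radius quoted above. The drive is approximated by a simple linear ramp to 60 MA, so the numbers are only an order-of-magnitude check, not a reproduction of the paper's waveform or result.

```python
import numpy as np
from scipy.integrate import solve_ivp

MU0 = 4e-7 * np.pi
m_line = 60e-6 / 1e-2          # liner mass per unit length: 60 mg/cm -> kg/m
r0, r_stop = 5e-2, 5e-3        # initial and final radii from the abstract [m]
I_peak, t_ramp = 60e6, 225e-9  # 180 ns 10-90% rise ~ 225 ns full linear ramp (assumed)

def current(t):
    """Simplified drive: linear ramp to 60 MA, then a flat top."""
    return I_peak * min(t / t_ramp, 1.0)

def rhs(t, y):
    r, v = y
    # Thin-shell (mass-point) model: m dv/dt = -mu0 I^2 / (4 pi r) per unit length.
    return [v, -MU0 * current(t) ** 2 / (4.0 * np.pi * r * m_line)]

hit = lambda t, y: y[0] - r_stop
hit.terminal, hit.direction = True, -1
sol = solve_ivp(rhs, (0.0, 1e-6), [r0, 0.0], events=hit, max_step=1e-9)
v_final = sol.y[1, -1]
print(f"r = 5 mm reached at t = {sol.t[-1]*1e9:.0f} ns; "
      f"kinetic energy ~ {0.5 * m_line * v_final**2 * 1e-8:.1f} MJ/cm")
```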

  11. HIFSA: Heavy-Ion Fusion Systems Assessment Project: Volume 1, Executive summary

    SciTech Connect

    Dudziak, D.J.; Herrmannsfeldt, W.B.; Saylor, W.W.

    1987-12-01

    The Heavy-Ion Fusion Systems Assessment (HIFSA) was conducted with the specific objective of evaluating the prospects of using induction-linac heavy-ion accelerators to generate economical electrical power from Inertial Confinement Fusion (ICF). Cost/performance models of the major fusion power plant systems were used to identify promising areas in parameter space. Resulting cost-of-electricity projections for a plant size of 1 GWe are comparable to those from other fusion system studies, some of which were for much larger power plants. These favorable projections hold over an unusually large domain of parameter space but depend especially on making large cost savings for the accelerator by using higher charge-to-mass ratio ions than assumed previously. The feasibility of realizing such savings has been shown by (1) experiments demonstrating transport stability better than anticipated for space-charge-dominated beams, and (2) theoretical predictions that the final transport and pulse compression in reactor-chamber environments will be sufficiently resistant to streaming instabilities to allow successful propagation of neutralized beams to the target. Results of the HIFSA study already have had a significant impact on the heavy-ion induction accelerator R and D program, especially in selection of the charge-state objectives. Also, the study should enhance the credibility of induction linacs as ICF drivers.

  12. Phasor Simulator for Operator Training Project

    SciTech Connect

    Dyer, Jim

    2016-09-14

    Synchrophasor systems are being deployed in power systems throughout the North American Power Grid and there are plans to integrate this technology and its associated tools into Independent System Operator (ISO)/utility control room operations. A pre-requisite to using synchrophasor technologies in control rooms is for operators to obtain training and understand how to use this technology in real-time situations. The Phasor Simulator for Operator Training (PSOT) project objective was to develop, deploy and demonstrate a pre-commercial training simulator for operators on the use of this technology and to promote acceptance of the technology in utility and ISO/Regional Transmission Owner (RTO) control centers.

  13. Fusion research at General Atomics. Annual report, October 1, 1992--September 30, 1993. General Atomics Project 3469

    SciTech Connect

    Not Available

    1994-05-01

    This report is the General Atomics Fusion Physics annual report for the period October 1992 through September 1993. It highlights major activities and projects in four areas: fusion technology development overview; reactor design studies; RF technology; plasma-facing components. A listing of publications for the period is also enclosed.

  14. Simulations of the performance of the Fusion-FEM, for an increased e-beam emittance

    SciTech Connect

    Tulupov, A.V.; Urbanus, W.H.; Caplan, M.

    1995-12-31

    The original design of the Fusion-FEM, which is under construction at the FOM-Institute for Plasma Physics, was based on an electron beam emittance of 50 π mm mrad. Recent measurements of the emittance of the beam emitted by the electron gun showed that the actual emittance is 80 π mm mrad. This results in a 2.5 times lower beam current density inside the undulator, which in turn changes the linear gain, the start-up time, the saturation level and the frequency spectrum. The main goal of the FEM project is to demonstrate a stable microwave output power of at least 1 MW. The decrease in electron beam current density has to be compensated by variations of the other FEM parameters, such as the reflection (feedback) coefficient of the microwave cavity and the length of the drift gap between the two sections of the step-tapered undulator. All basic dependencies of the linear and nonlinear gain, and of the output power, on the main FEM parameters have been simulated numerically with the CRMFEL code. Regimes of stable operation of the FEM with the increased emittance have been found; these regimes exist thanks to the flexibility of the original FEM design.
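
    The quoted factor of 2.5 is consistent with the current density scaling as the inverse square of the emittance, which would hold, for example, if the beam radius in the undulator grows in proportion to the emittance at fixed divergence; this scaling is an interpretive assumption, not stated in the abstract. A one-line check:

      # Assumed scaling for illustration: current density ~ 1/emittance^2.
      eps_design, eps_measured = 50.0, 80.0      # pi mm mrad
      print((eps_measured / eps_design) ** 2)    # ~2.56, i.e. the quoted 2.5x reduction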

  15. Agent-Based Simulations for Project Management

    NASA Technical Reports Server (NTRS)

    White, J. Chris; Sholtes, Robert M.

    2011-01-01

    Currently, the most common approach used in project planning tools is the Critical Path Method (CPM). While this method was a great improvement over the basic Gantt chart technique being used at the time, it now suffers from three primary flaws: (1) task duration is an input, (2) productivity impacts are not considered, and (3) management corrective actions are not included. Today, computers have exceptional computational power to handle complex simulations of task execution and project management activities (e.g., dynamically changing the number of resources assigned to a task when it is behind schedule). Through research under a Department of Defense contract, the author and the ViaSim team have developed a project simulation tool that enables more realistic cost and schedule estimates by using a resource-based model that literally turns the current duration-based CPM approach "on its head." The approach represents a fundamental paradigm shift in estimating projects, managing schedules, and reducing risk through innovative predictive techniques.
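
    A toy illustration of the resource-based idea described above, in which task duration is computed from work content, staffing and productivity rather than being entered directly; the names and numbers are hypothetical and are not taken from the ViaSim tool.

      # Toy resource-based task model: duration is an output, not an input.
      from dataclasses import dataclass

      @dataclass
      class Task:
          name: str
          work: float          # person-days of work content
          resources: float     # people assigned
          productivity: float  # effective output per person-day

          def duration(self) -> float:
              """Duration in days emerges from work, staffing and productivity."""
              return self.work / (self.resources * self.productivity)

      design = Task("design", work=60.0, resources=3.0, productivity=0.8)
      print(f"{design.name}: {design.duration():.1f} days")        # 25.0 days

      # A management corrective action: add a person when behind schedule.
      design.resources += 1.0
      print(f"after adding staff: {design.duration():.1f} days")   # 18.8 days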

  16. Simulation of Intense Beams and Targets for Heavy-Ion-Fusion Science (HEDLP / Inertial Fusion Energy)

    SciTech Connect

    Friedman, Alex; Barnard, John J.; Cohen, Ron H.; Dorf, Mikhail; Eder, David; Grote, Dave P.; Lund, Steve M.; Sharp, William M.; Henestroza, Enrique; Lee, Ed P.; Vay, Jean-Luc; Davidson, Ron C.; Kaganovich, Igor D.; Qin, Hong; Startsev, Ed; Fagnan, Kirsten; Koniges, Alice; Bertozzi, Andrea

    2010-08-26

    Our principal goals, and activities in support of those goals, over the next five years are as follows: (1) Optimize the properties of the NDCX-II beam for each class of target experiments; achieve quantitative agreement with measurements; develop improved machine configurations and operating points. To accomplish these goals, we plan to use Warp to simulate NDCX-II from source to target, in full kinetic detail, including first-principles modeling of beam neutralization by plasma. The output from an ensemble of Warp runs (representing shot-to-shot variations) will be used as input to target simulations using ALE-AMR on NERSC, and other codes. (2) Develop enhanced versions of NDCX-II (the machine is designed to be extensible and reconfigurable), and carry out studies to define a next-step ion beam facility. To accomplish these goals, much of the work will involve iterative optimization employing Warp runs that assume ideal beam neutralization downstream of the accelerator. (3) Carry out detailed target simulations in the Warm Dense Matter regime using the ALE-AMR code, including surface tension effects, liquid-vapor coexistence, and accurate models of both the driving beam and the target geometry. For this we will need to make multiple runs (to capture shot-to-shot variations), and to both develop and employ synthetic diagnostics (to enable comparison with experiments). The new science that will be revealed is the physics of the transition from the liquid to vapor state of a volumetrically superheated material, wherein droplets are formed, and wherein phase transitions, surface tension and hydrodynamics all play significant roles in the dynamics. These simulations will enable calculations of equation of state and other material properties, and will also be of interest for their illumination of the science of droplet formation.

  17. Data assimilation and data fusion in a regional simulation

    NASA Astrophysics Data System (ADS)

    Hoareau, N.; Umbert, M.; Turiel, A.; Ballabrera, J.; Portabella, M.

    2012-04-01

    An Ensemble Kalman filter [Ballabrera-Poy et al., 2009] has been used to assimilate Sea Surface Temperature (SST) and Argo data into a regional configuration of the NEMO-OPA ocean model. Our validation of the data assimilation experiments includes a comparison against a random ensemble of Argo profilers previously set aside (cross-validation), where the usual metrics are estimated from the differences between our data assimilation fields and the Argo data (root mean square, mean value, standard deviation). We have also developed another metric based on the multifractal structure of the flow, comparing the histograms of singularity exponents of the observations with those of the background and analysis fields. While the first approach directly measures the point-by-point difference between the model data and the independent in-situ observations, the second method focuses on the geophysical coherence of dynamical structures, as it gives information about multi-point spatial correlations. In the second part of this work we have analysed the relative advantages and drawbacks of data assimilation (here based on the Ensemble Kalman filter) and data fusion (here based on the Multifractal Microcanonical Formalism, see Pottier et al., 2008) when applied to the production of quality remote sensing products of ocean observation. We have thus used both methods for the generation of SMOS Level 4 products of Sea Surface Salinity; the resulting maps have been validated with our metrics and analyzed on a global and regional basis.
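
    For readers unfamiliar with the assimilation step referred to above, the sketch below shows a textbook stochastic (perturbed-observation) ensemble Kalman filter analysis, with the background covariance estimated from ensemble anomalies; it is a generic illustration, not the specific implementation of Ballabrera-Poy et al. (2009).

      # Textbook stochastic (perturbed-observation) EnKF analysis step.
      import numpy as np

      def enkf_analysis(X, y, H, R, rng):
          """X: (n_state, n_ens) background ensemble; y: (n_obs,) observations;
          H: (n_obs, n_state) observation operator; R: (n_obs, n_obs) obs error cov."""
          n_ens = X.shape[1]
          A = X - X.mean(axis=1, keepdims=True)        # ensemble anomalies
          HA = H @ A
          # Kalman gain K = Pf H^T (H Pf H^T + R)^-1 with Pf = A A^T / (n_ens - 1)
          K = (A @ HA.T) @ np.linalg.inv(HA @ HA.T + (n_ens - 1) * R)
          Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, n_ens).T
          return X + K @ (Y - H @ X)                   # analysis ensemble

      rng = np.random.default_rng(0)
      X = rng.normal(size=(10, 50))                    # toy 10-variable, 50-member ensemble
      H = np.eye(3, 10)                                # observe the first 3 variables
      Xa = enkf_analysis(X, rng.normal(size=3), H, 0.1 * np.eye(3), rng)
      print(Xa.shape)                                  # (10, 50)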

  18. Sensitivity of mix in Inertial Confinement Fusion simulations to diffusion processes

    NASA Astrophysics Data System (ADS)

    Melvin, Jeremy; Cheng, Baolian; Rana, Verinder; Lim, Hyunkyung; Glimm, James; Sharp, David H.

    2015-11-01

    We explore two themes related to the simulation of mix within an Inertial Confinement Fusion (ICF) implosion, the role of diffusion (viscosity, mass diffusion and thermal conduction) processes and the impact of front tracking on the growth of the hydrodynamic instabilities. Using the University of Chicago HEDP code FLASH, we study the sensitivity of post-shot simulations of a NIC cryogenic shot to the diffusion models and front tracking of the material interfaces. Results of 1D and 2D simulations are compared to experimental quantities and an analysis of the current state of fully integrated ICF simulations is presented.

  19. Surface Roughness Instability Simulations of Inertial Confinement Fusion Implosions

    NASA Astrophysics Data System (ADS)

    McGlinchey, Kristopher; Niasse, Nicolas; Chittenden, Jeremy

    2016-10-01

    Understanding hydrodynamic instabilities seeded by the inherent roughness of a capsule's surface is critical in quantifying an implosion's performance. Combined with instabilities on the ice-gas interface during the deceleration phase, their growth can lead to inhomogeneity in the shell's areal density. Recent work carried out at the National Ignition Facility (NIF) on surface-roughness Rayleigh-Taylor Instability (RTI) growth rates shows larger amplitudes in experiment than in simulation, even with a deliberately roughened surface. We report on simulations of ICF experiments occurring at NIF using the Chimera code developed at Imperial College. Chimera is a fully explicit, Eulerian 3D multi-group radiation-hydrodynamics code utilising P1/3 automatic flux limiting radiation transport with opacity data from a non-LTE atomic model also developed at Imperial College. One-dimensional simulations are briefly presented to show that proper shock timing and stagnation properties have been achieved, as are 2D harmonic perturbation simulations used to benchmark growth rates. Surface-roughness implosions (initialised from metrology data) were then simulated for shot N120321, a low-foot implosion with large surface perturbations, and shot N130927, a high-foot implosion. Synthetic radiographs of these implosions were constructed at low convergence ratio (3-4) for comparison to experiment and at higher convergence to investigate what will be observable by new diagnostics in development at NIF.

  20. The UPSCALE project: a large simulation campaign

    NASA Astrophysics Data System (ADS)

    Mizielinski, Matthew; Roberts, Malcolm; Vidale, Pier Luigi; Schiemann, Reinhard; Demory, Marie-Estelle; Strachan, Jane

    2014-05-01

    The development of a traceable hierarchy of HadGEM3 global climate models, based upon the Met Office Unified Model, at resolutions from 135 km to 25 km, now allows the impact of resolution on the mean state, variability and extremes of climate to be studied in a robust fashion. In 2011 we successfully obtained a single-year grant of 144 million core hours of supercomputing time from the PRACE organization to run ensembles of 27-year atmosphere-only (HadGEM3-A GA3.0) climate simulations at 25 km resolution, as used in present global weather forecasting, on HERMIT at HLRS. Through 2012 the UPSCALE project (UK on PRACE: weather-resolving Simulations of Climate for globAL Environmental risk) ran over 650 years of simulation at resolutions of 25 km (N512), 60 km (N216) and 135 km (N96) to look at the value of high-resolution climate models in the study of both the present climate and a potential future climate scenario based on RCP8.5. Over 400 TB of data were produced using HERMIT, with additional simulations run on HECToR (UK supercomputer) and MONSooN (Met Office NERC Supercomputing Node). The data generated were transferred to the JASMIN super-data cluster, hosted by STFC CEDA in the UK, where analysis facilities are allowing rapid scientific exploitation of the data set. Many groups across the UK and Europe are already taking advantage of these facilities and we welcome approaches from other interested scientists. This presentation will briefly cover the following points: purpose and requirements of the UPSCALE project and facilities used; technical implementation and hurdles (model porting and optimisation, automation, numerical failures, data transfer); ensemble specification; and current analysis projects and access to the data set. A full description of UPSCALE and the data set generated has been submitted to Geoscientific Model Development, with overview information available from http://proj.badc.rl.ac.uk/upscale .

  1. Radiation-MHD Simulations of Plasma-Jet-Driven Magneto-Inertial Fusion Gain Using USim

    NASA Astrophysics Data System (ADS)

    Stoltz, Peter; Beckwith, Kristian; Kundrapu, Mahdusudhan; Hsu, Scott; Langendorf, Samuel

    2016-10-01

    One goal of the modeling effort for the PLX-α project is to identify plasma-jet-driven magneto-inertial fusion (PJMIF) configurations with potential net fusion-energy gain. We use USim, a tool for modeling high-energy-density plasmas with multi-fluid models coupled to electromagnetics via fully implicit iterative solvers and finite-volume discretizations on unstructured meshes. We include physical viscosity and advanced-EOS modeling capability, and are investigating the effects of different radiation (including flux-limited diffusion) and alpha-transport models. We compare 2D and 1D gain calculations for various liner geometries, parameters, and plasma species, and consider the effects of liner non-uniformities on fusion-gain degradation. Supported by the ARPA-E ALPHA Program.

  2. The European Fusion Research and Development Programme and the ITER Project

    NASA Astrophysics Data System (ADS)

    Green, B. J.

    2006-07-01

    The EURATOM fusion research and development programme is a well integrated and coordinated programme. It has the objective of "developing the technology for a safe, sustainable, environmentally responsible and economically viable energy source." The programme is focussed on the magnetic confinement approach and supports 23 Associations which involve research entities (many with experimental and technology facilities), each having a bilateral contractual relationship with the European Commission. The paper will describe fusion reactions and present their potential advantages as an energy source. Further, it will describe the EURATOM programme and how it is organised and implemented. The success of the European programme and that of other national programmes have provided the basis for the international ITER Project, which is the next logical step in the development of fusion energy. The paper will describe ITER, its aims, its design, and the supporting manufacture of prototype components. The European contribution to ITER, the exploitation of the Joint European Torus (JET), and the long-term reactor technology R&D are carried out under the multilateral European Fusion Development Agreement (EFDA).

  3. Simulating Halos with the Caterpillar Project

    NASA Astrophysics Data System (ADS)

    Kohler, Susanna

    2016-04-01

    The Caterpillar Project is a beautiful series of high-resolution cosmological simulations. The goal of this project is to examine the evolution of dark-matter halos like the Milky Way's, to learn about how galaxies like ours formed. This immense computational project is still in progress, but the Caterpillar team is already providing a look at some of its first results. Lessons from Dark-Matter Halos: Why simulate the dark-matter halos of galaxies? Observationally, the formation history of our galaxy is encoded in galactic fossil-record clues, like the tidal debris from disrupted satellite galaxies in the outer reaches of our galaxy, or chemical abundance patterns throughout our galactic disk and stellar halo. But to interpret this information in a way that lets us learn about our galaxy's history, we need to first test galaxy formation and evolution scenarios via cosmological simulations. Then we can compare the end result of these simulations to what we observe today. [Figure: comparison of two mass resolutions, 1.5 x 10^7 solar masses per particle versus 3 x 10^4 solar masses per particle; Griffen et al. 2016.] A Computational Challenge: Due to how computationally expensive such simulations are, previous N-body simulations of the growth of Milky-Way-like halos have consisted of only one or a few halos each. But in order to establish a statistical understanding of how galaxy halos form, and to find out whether the Milky Way's halo is typical or unusual, it is necessary to simulate a larger number of halos. In addition, in order to accurately follow the formation and evolution of substructure within the dark-matter halos, these simulations must be able to resolve the smallest dwarf galaxies, which are around a million solar masses. This requires an extremely high mass resolution, which adds to the computational expense of the simulation. First Outcomes: These are the challenges faced by

  4. [Accuracy of morphological simulation for orthognathic surgery. Assessment of a 3D image fusion software].

    PubMed

    Terzic, A; Schouman, T; Scolozzi, P

    2013-08-06

    The CT/CBCT data allow for 3D reconstruction of the skeletal and untextured soft-tissue volume. 3D stereophotogrammetry technology has strongly improved the quality of facial soft-tissue surface texture. The combination of these two technologies allows for an accurate and complete reconstruction. The 3D virtual head may be used for orthognathic surgical planning, virtual surgery, and morphological simulation obtained with software dedicated to the fusion of 3D photogrammetric and radiological images. The imaging material includes a multi-slice CT scan or broad-field CBCT scan, and a 3D photogrammetric camera. The operative image-processing protocol includes the following steps: 1) pre- and postoperative CT/CBCT scan and 3D photogrammetric image acquisition; 2) 3D image segmentation and fusion of the untextured CT/CBCT skin with the preoperative textured facial soft-tissue surface of the 3D photogrammetric scan; 3) image fusion of the pre- and postoperative CT/CBCT data sets, virtual osteotomies, and 3D photogrammetric soft-tissue virtual simulation; 4) fusion of the virtual simulated 3D photogrammetric and real postoperative images, and assessment of accuracy using a color-coded scale to measure the differences between the two surfaces. Copyright © 2013. Published by Elsevier Masson SAS.

  5. Particle Simulations of Ion Rings for Magnetic Fusion.

    NASA Astrophysics Data System (ADS)

    Lyster, Peter Michael

    1987-09-01

    This thesis contains a numerical study of the dynamics of axis-encircling charged particles in ion rings and layers. Part of this work deals with the coalescence of ion rings to form field-reversed rings, which may be useful for Compact Torus magnetic fusion reactors. The coalescence of weak ion rings with Compact Toroids is also investigated. This is important because a component of energetic particles may help to maintain the flux or stabilize these configurations against a number of macroscopic magnetohydrodynamic instabilities. Several different types of particle codes are used. RINGA and CIDER are two-and-one-half-dimensional codes in cylindrical axisymmetric geometry. For the RINGA code, a simple Ohm's law is used to model a resistive background plasma. For CIDER, the massless electron momentum equation is used to model a conductive background plasma. In a resistive plasma, ring coalescence can be achieved if the initial relative translational velocity is not excessive, and if the plasma conductivity is chosen to maximize the dissipation of ring energy. A theoretical and computational study is made of a mechanism by which ring translational energy is transferred to Alfven waves in a conductive plasma. A new collective phenomenon is discussed, whereby the merging of rings is improved if they have stronger initial self fields. A study is made of the coalescence of strong field-reversed ion rings in highly conductive plasmas, in which it is found that magnetic field line reconnection is an important process. Finally, a study of the magnetic compression of ion layers in conductive plasmas is presented. BAGSHAW, a one-dimensional particle code which treats the background plasma in the two-fluid approximation, was developed for this purpose. Compression on a timescale which is comparable with the Alfven transit time may create considerable transients in the system. In a one-dimensional system, the plasma return current does not cancel the increase in the

  6. Gyrokinetic Simulation of Energetic Particles Turbulence and Transport in Fusion Plasmas

    NASA Astrophysics Data System (ADS)

    Zhang, Wenlu; Lin, Zhihong; Holod, Ihor; Xiao, Yong; Bierwage, Andreas; Spong, Donald; Chu, Ming

    2009-05-01

    The confinement of energetic particles (EP) is a critical issue for the International Thermonuclear Experimental Reactor (ITER), since ignition relies on self-heating by the fusion products. Shear Alfven wave excitations by EP in toroidal systems, for example the Toroidal Alfven Eigenmode (TAE) and the Energetic Particle Mode (EPM), have been investigated as primary candidates for fluctuation-induced transport of EP in fusion plasmas. In this work, TAE excitations by energetic particles are investigated in large-scale first-principles simulations of fusion plasmas using the global gyrokinetic toroidal code (GTC) [Lin, Science 1998]. Comprehensive linear benchmarking results are reported between GTC, GYRO, the fluid code TAEFL, and the magnetohydrodynamic-gyrokinetic hybrid code HMGC.

  7. StabilimaxNZ® versus simulated fusion: evaluation of adjacent-level effects

    PubMed Central

    Henderson, Gweneth; James, Yue; Timm, Jens Peter

    2007-01-01

    The rationale behind motion-preservation devices is to eliminate the accelerated adjacent-level effects (ALE) associated with spinal fusion. We evaluated multidirectional flexibilities and ALEs of StabilimaxNZ® and simulated fusion applied to a decompressed spine. StabilimaxNZ® was applied at L4–L5 after creating a decompression (laminectomy of L4 plus bilateral medial facetectomy at L4–L5). Multidirectional Flexibility and Hybrid tests were performed on six fresh cadaveric human specimens (T12–S1). Decompression increased average flexion–extension rotation to 124.0% of the intact. StabilimaxNZ® and simulated fusion decreased the motion to 62.4 and 23.8% of intact, respectively. In lateral bending, the corresponding increase was 121.6% and the decreases were 57.5 and 11.9%. In torsion, the corresponding increase was 132.7%, and the decrease was 36.3% for fusion and none for StabilimaxNZ®. ALE was defined as the percentage increase over the intact. The ALE at L3–4 was 15.3% for StabilimaxNZ® versus 33.4% for fusion, while at L5–S1 the ALEs were 5.0% vs. 11.3%, respectively. In lateral bending, the corresponding ALE values were 3.0% vs. 19.1%, and 11.3% vs. 35.8%, respectively. In torsion, the corresponding values were 3.7% vs. 20.6%, and 4.0% vs. 33.5%, respectively. In conclusion, this in vitro study using Flexibility and Hybrid test methods showed that StabilimaxNZ® stabilized the decompressed spinal level effectively in the sagittal and frontal planes, while allowing a good portion of the normal rotation, and it did not produce significant ALEs compared with fusion. However, it did not stabilize the decompressed specimen in torsion. PMID:17924151

  8. Molecular Dynamics Simulations of Folding and Insertion of the Ebola Virus Fusion Peptide into a Membrane Bilayer

    DTIC Science & Technology

    2008-07-01

    constitute the family Filoviridae. The most pathogenic strain (Zaire) of Ebola virus causes a severe form of hemorrhagic fever in humans and nonhuman ... in the membrane-fusion process. Very recently, an NMR structure was reported of a 16-residue Zaire Ebola virus fusion peptide (Ebo-16) of GP2 [1] ...

  9. [Simulated Total Wrist Fusion and its Influence on Hand Grip Function].

    PubMed

    Gülke, J; Schöll, H; Kapapa, T; Geyer, T; Mentzel, M; Wachter, N J

    2016-08-01

    Wrist fusion is still a common treatment for patients with advanced stage arthritis. Since patients are often intimidated by the functional limitations, we intended to evaluate the influence of the lack of wrist motion in different positions on the dynamic grip function and the grip strength of the hand. We simulated wrist fusion in 20° extension and 20° flexion and evaluated the following grip types: fist closure, 2 different power grips, pinch grip and precision grip. A TUB sensor glove was used, which allowed us to dynamically record the range of motion (ROM) of the finger joints as well as grip strength. Nineteen healthy subjects participated and all types of grips were performed using a standardised protocol with and without simulated wrist fusion. Lack of wrist motion in 20° extension had no relevant effect on the fingers' ROM, grip speed or strength. Simulated fusion in 20° flexion also had no influence on ROM or grip speed, rejecting our hypothesis that a tenodesis effect of the extensors in flexion would decrease ROM in the finger joints and grip speed. However, we were able to show a significant decrease of grip strength in flexion compared with extension or healthy wrists. The decrease averaged between 23 and 42% of healthy values, depending on the grip type. There was no change in strength distribution among the fingers. We didn't find any impact of lack of wrist motion on finger movement during forceful hand grip at normal speed. However, a significant loss of grip strength in flexed position of the wrist joint should be considered in patients with an indication for bilateral wrist fusion. © Georg Thieme Verlag KG Stuttgart · New York.

  10. SciDAC Fusiongrid Project--A National Collaboratory to Advance the Science of High Temperature Plasma Physics for Magnetic Fusion

    SciTech Connect

    SCHISSEL, D.P.; ABLA, G.; BURRUSS, J.R.; FEIBUSH, E.; FREDIAN, T.W.; GOODE, M.M.; GREENWALD, M.J.; KEAHEY, K.; LEGGETT, T.; LI, K.; McCUNE, D.C.; PAPKA, M.E.; RANDERSON, L.; SANDERSON, A.; STILLERMAN, J.; THOMPSON, M.R.; URAM, T.; WALLACE, G.

    2006-08-31

    This report summarizes the work of the National Fusion Collaboratory (NFC) Project funded by the United States Department of Energy (DOE) under the Scientific Discovery through Advanced Computing Program (SciDAC) to develop a persistent infrastructure to enable scientific collaboration for magnetic fusion research. A five year project that was initiated in 2001, it built on the past collaborative work performed within the U.S. fusion community and added the component of computer science research done with the USDOE Office of Science, Office of Advanced Scientific Computer Research. The project was a collaboration itself uniting fusion scientists from General Atomics, MIT, and PPPL and computer scientists from ANL, LBNL, Princeton University, and the University of Utah to form a coordinated team. The group leveraged existing computer science technology where possible and extended or created new capabilities where required. Developing a reliable energy system that is economically and environmentally sustainable is the long-term goal of Fusion Energy Science (FES) research. In the U.S., FES experimental research is centered at three large facilities with a replacement value of over $1B. As these experiments have increased in size and complexity, there has been a concurrent growth in the number and importance of collaborations among large groups at the experimental sites and smaller groups located nationwide. Teaming with the experimental community is a theoretical and simulation community whose efforts range from applied analysis of experimental data to fundamental theory (e.g., realistic nonlinear 3D plasma models) that run on massively parallel computers. Looking toward the future, the large-scale experiments needed for FES research are staffed by correspondingly large, globally dispersed teams. The fusion program will be increasingly oriented toward the International Thermonuclear Experimental Reactor (ITER) where even now, a decade before operation begins, a large

  11. Apparatus and method for simulating material damage from a fusion reactor

    DOEpatents

    Smith, Dale L.; Greenwood, Lawrence R.; Loomis, Benny A.

    1989-03-07

    An apparatus and method for simulating a fusion environment on a first wall or blanket structure. A material test specimen is contained in a capsule made of a material having a low hydrogen solubility and permeability. The capsule is partially filled with a lithium solution, such that the test specimen is encapsulated by the lithium. The capsule is irradiated by a fast fission neutron source.

  12. Three-dimensional simulations of the implosion of inertial confinement fusion targets

    SciTech Connect

    Town, R.P.J.; Bell, A.R.

    1991-09-30

    The viability of inertial confinement fusion depends crucially on implosion symmetry. A spherical three-dimensional hydrocode called PLATO has been developed to model the growth in asymmetries during an implosion. Results are presented in the deceleration phase which show indistinguishable linear growth rates, but greater nonlinear growth of the Rayleigh-Taylor instability than is found in two-dimensional cylindrical simulations. The three-dimensional enhancement of the nonlinear growth is much smaller than that found by Sakagami and Nishihara.

  13. Study of Plasma Liner Driven Magnetized Target Fusion Via Advanced Simulations

    SciTech Connect

    Samulyak, Roman V.; Parks, Paul

    2013-08-31

    The feasibility of plasma-liner-driven Magnetized Target Fusion (MTF) will be assessed via terascale numerical simulations. In the MTF concept, a plasma liner, formed by the merging of a number (60 or more) of radial, highly supersonic plasma jets, implodes onto the target, which takes the form of two compact plasma toroids, and compresses it to fusion ignition conditions. By avoiding major difficulties associated with both traditional laser-driven inertial confinement fusion and solid-liner-driven MTF, the plasma-liner-driven MTF potentially provides a low-cost and fast R&D path towards the demonstration of practical fusion energy. High-fidelity numerical simulations of the full nonlinear models associated with plasma-liner MTF, using state-of-the-art numerical algorithms and terascale computing, are necessary in order to resolve uncertainties and provide guidance for future experiments. At Stony Brook University, we have developed unique computational capabilities that ideally suit the MTF problem. The FronTier code, developed in collaboration with BNL and LANL under DOE funding (including SciDAC) for the simulation of 3D multi-material hydro and MHD flows, has been benchmarked and used for fundamental and engineering problems in energy science applications. We have performed 3D simulations of converging supersonic plasma jets, their merger and the formation of the plasma liner, and a study of the corresponding oblique-shock problem. We have studied the implosion of the plasma liner onto the magnetized plasma target by resolving Rayleigh-Taylor instabilities in 2D and 3D along with other relevant physics, and estimated the thermodynamic conditions of the target at the moment of maximum compression and the hydrodynamic efficiency of the method.

  14. Apparatus and method for simulating material damage from a fusion reactor

    DOEpatents

    Smith, Dale L.; Greenwood, Lawrence R.; Loomis, Benny A.

    1989-01-01

    An apparatus and method for simulating a fusion environment on a first wall or blanket structure. A material test specimen is contained in a capsule made of a material having a low hydrogen solubility and permeability. The capsule is partially filled with a lithium solution, such that the test specimen is encapsulated by the lithium. The capsule is irradiated by a fast fission neutron source.

  15. Apparatus and method for simulating material damage from a fusion reactor

    DOEpatents

    Smith, D.L.; Greenwood, L.R.; Loomis, B.A.

    1988-05-20

    This paper discusses an apparatus and method for simulating a fusion environment on a first wall or blanket structure. A material test specimen is contained in a capsule made of a material having a low hydrogen solubility and permeability. The capsule is partially filled with a lithium solution, such that the test specimen is encapsulated by the lithium. The capsule is irradiated by a fast fission neutron source.

  16. Analyzing large data sets from XGC1 magnetic fusion simulations using apache spark

    SciTech Connect

    Churchill, R. Michael

    2016-11-21

    Apache Spark is explored as a tool for analyzing large data sets from the magnetic fusion simulation code XGC1. Implementation details of Apache Spark on the NERSC Edison supercomputer are discussed, including binary file reading and parameter setup. Here, an unsupervised machine learning algorithm, k-means clustering, is applied to XGC1 particle distribution function data, showing that highly turbulent spatial regions do not have common coherent structures, but rather broad, ring-like structures in velocity space.
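
    A minimal sketch of the kind of workflow described above, assuming the particle distribution data have already been decoded from the XGC1 binary output into numeric feature rows; the file path and column names are hypothetical placeholders, and Spark's built-in k-means (pyspark.ml) stands in for whatever clustering variant was actually used.

      # Sketch: k-means clustering of pre-decoded particle distribution features
      # with Apache Spark. Paths and column names are hypothetical placeholders.
      from pyspark.sql import SparkSession
      from pyspark.ml.feature import VectorAssembler
      from pyspark.ml.clustering import KMeans

      spark = SparkSession.builder.appName("xgc1-kmeans").getOrCreate()

      # Assume the binary XGC1 output was converted to Parquet beforehand.
      df = spark.read.parquet("/path/to/xgc1_f_data.parquet")    # placeholder path

      assembler = VectorAssembler(
          inputCols=["v_parallel", "v_perp", "weight"],          # hypothetical columns
          outputCol="features")
      features = assembler.transform(df)

      model = KMeans(k=8, seed=1, featuresCol="features").fit(features)
      for i, center in enumerate(model.clusterCenters()):
          print(i, center)

      spark.stop()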

  17. Lipid droplets fusion in adipocyte differentiated 3T3-L1 cells: A Monte Carlo simulation

    SciTech Connect

    Boschi, Federico; Rizzatti, Vanni; Zamboni, Mauro; Sbarbati, Andrea

    2014-02-15

    Several widespread human diseases, such as obesity, type 2 diabetes, hepatic steatosis, atherosclerosis and other metabolic pathologies, are related to the excessive accumulation of lipids in cells. Lipids accumulate in spherical cellular inclusions called lipid droplets (LDs), whose sizes in adipocytes range from a fraction of a micrometer to about one hundred micrometers. It has been suggested that LDs can grow in size through a fusion process by which a larger LD is obtained with spherical shape and volume equal to the sum of those of its progenitors. In this study, the size distribution of two populations of LDs was analyzed in immature and mature (5-days differentiated) 3T3-L1 adipocytes (first and second populations, respectively) after Oil Red O staining. A Monte Carlo simulation of interaction between LDs has been developed in order to quantify the size distribution and the number of fusion events needed to obtain the second population's size distribution starting from the first one. Four models are presented here, based on different kinds of interaction: a surface-weighted interaction (R2 Model), a volume-weighted interaction (R3 Model), a random interaction (Random Model) and an interaction related to the place where the LDs are born (Nearest Model). The last two models mimic quite well the behavior found in the experimental data. This work represents a first step in developing numerical simulations of the LD growth process. Due to the complex phenomena involving LDs (absorption, growth through additional neutral lipid deposition in existing droplets, de novo formation and catabolism), the study focuses on the fusion process. The results suggest that, to obtain the observed size distribution, a number of fusion events comparable with the number of LDs themselves is needed. Moreover, the MC approach proves to be a powerful tool for investigating the LD growth process. Highlights: • We evaluated the role of the fusion process in the synthesis of the lipid droplets. • We compared the
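
    A toy version of the Monte Carlo fusion models compared above: droplets are merged pairwise under volume conservation (r_new = (r1^3 + r2^3)^(1/3)), with the pair-selection probability weighted by r^2 (surface-weighted, "R2"), by r^3 (volume-weighted, "R3") or uniformly ("Random"). The weights, initial distribution and number of events are illustrative assumptions; the spatially resolved "Nearest" model is not reproduced.

      # Toy Monte Carlo of lipid-droplet fusion with volume-conserving merging.
      import numpy as np

      def simulate_fusion(radii, n_events, mode="R2", seed=0):
          rng = np.random.default_rng(seed)
          radii = list(radii)
          for _ in range(n_events):
              if len(radii) < 2:
                  break
              r = np.asarray(radii)
              w = r ** 2 if mode == "R2" else r ** 3 if mode == "R3" else np.ones_like(r)
              i, j = rng.choice(len(radii), size=2, replace=False, p=w / w.sum())
              r_new = (radii[i] ** 3 + radii[j] ** 3) ** (1.0 / 3.0)  # volume conserved
              for k in sorted((int(i), int(j)), reverse=True):
                  radii.pop(k)
              radii.append(r_new)
          return np.asarray(radii)

      initial = np.random.default_rng(1).lognormal(mean=0.0, sigma=0.4, size=2000)
      final = simulate_fusion(initial, n_events=1500, mode="R2")
      print(len(final), float(final.mean()), float(final.max()))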

  18. High-level multifunction radar simulation for studying the performance of multisensor data fusion systems

    NASA Astrophysics Data System (ADS)

    Huizing, Albert G.; Bosse, Eloi

    1998-07-01

    This paper presents the basic requirements for a simulation of the main capabilities of a shipborne MultiFunction Radar (MFR) that can be used in conjunction with other sensor simulations in scenarios for studying Multi-Sensor Data Fusion (MSDF) systems. This simulation is being used to support an ongoing joint effort (Canada - The Netherlands) in the development of MSDF testbeds. This joint effort is referred to as Joint-FACET (Fusion Algorithms & Concepts Exploration Testbed), a highly modular and flexible series of applications that is capable of processing both real and synthetic input data. The question raised here is how realistic the sensor simulations should be for the MSDF performance assessment to be trusted. A partial answer is that, at a minimum, the dominant perturbing effects on sensor detection (true or false) must be sufficiently represented. Following this philosophy, the MFR model presented here takes into account the sensor's design parameters and external environmental effects such as clutter, propagation and jamming. Previous radar simulations capture most of these dominant effects. In this paper the emphasis is on an MFR scheduler, which is the key element that needs to be added to the previous simulations to represent the MFR's capability to search and track a large number of targets and at the same time support a large number of (semi-active) surface-to-air missiles (SAM) for the engagement of multiple hostile targets.

  19. ION BEAM HEATED TARGET SIMULATIONS FOR WARM DENSE MATTER PHYSICS AND INERTIAL FUSION ENERGY

    SciTech Connect

    Barnard, J.J.; Armijo, J.; Bailey, D.S.; Friedman, A.; Bieniosek, F.M.; Henestroza, E.; Kaganovich, I.; Leung, P.T.; Logan, B.G.; Marinak, M.M.; More, R.M.; Ng, S.F.; Penn, G.E.; Perkins, L.J.; Veitzer, S.; Wurtele, J.S.; Yu, S.S.; Zylstra, A.B.

    2008-08-01

    Hydrodynamic simulations have been carried out using the multi-physics radiation hydrodynamics code HYDRA and the simplified one-dimensional hydrodynamics code DISH. We simulate possible targets for a near-term experiment at LBNL (the Neutralized Drift Compression Experiment, NDCX) and possible later experiments on a proposed facility (NDCX-II) for studies of warm dense matter and inertial fusion energy related beam-target coupling. Simulations of various target materials (including solids and foams) are presented. Experimental configurations include single pulse planar metallic solid and foam foils. Concepts for double-pulsed and ramped-energy pulses on cryogenic targets and foams have been simulated for exploring direct drive beam target coupling, and concepts and simulations for collapsing cylindrical and spherical bubbles to enhance temperature and pressure for warm dense matter studies are described.

  20. Ion Beam Heated Target Simulations for Warm Dense Matter Physics and Inertial Fusion Energy

    SciTech Connect

    Barnard, J J; Armijo, J; Bailey, D S; Friedman, A; Bieniosek, F M; Henestroza, E; Kaganovich, I; Leung, P T; Logan, B G; Marinak, M M; More, R M; Ng, S F; Penn, G E; Perkins, L J; Veitzer, S; Wurtele, J S; Yu, S S; Zylstra, A B

    2008-08-12

    Hydrodynamic simulations have been carried out using the multi-physics radiation hydrodynamics code HYDRA and the simplified one-dimensional hydrodynamics code DISH. We simulate possible targets for a near-term experiment at LBNL (the Neutralized Drift Compression Experiment, NDCX) and possible later experiments on a proposed facility (NDCX-II) for studies of warm dense matter and inertial fusion energy related beam-target coupling. Simulations of various target materials (including solids and foams) are presented. Experimental configurations include single pulse planar metallic solid and foam foils. Concepts for double-pulsed and ramped-energy pulses on cryogenic targets and foams have been simulated for exploring direct drive beam target coupling, and concepts and simulations for collapsing cylindrical and spherical bubbles to enhance temperature and pressure for warm dense matter studies are described.

  1. An interprojection sensor fusion approach to estimate blocked projection signal in synchronized moving grid-based CBCT system

    PubMed Central

    Zhang, Hong; Ren, Lei; Kong, Vic; Giles, William; Zhang, You; Jin, Jian-Yue

    2016-01-01

    Purpose: A preobject grid can reduce and correct scatter in cone beam computed tomography (CBCT). However, half of the signal in each projection is blocked by the grid. A synchronized moving grid (SMOG) has been proposed to acquire two complementary projections at each gantry position and merge them into one complete projection. That approach, however, suffers from increased scanning time and the technical difficulty of accurately merging the two projections per gantry angle. Herein, the authors present a new SMOG approach which acquires a single projection per gantry angle, with complementary grid patterns for any two adjacent projections, and uses an interprojection sensor fusion (IPSF) technique to estimate the blocked signal in each projection. The method may have the additional benefit of reduced imaging dose due to the grid blocking half of the incident radiation. Methods: The IPSF considers multiple paired observations from two adjacent gantry angles as approximations of the blocked signal and uses a weighted least square regression of these observations to finally determine the blocked signal. The method was first tested with a simulated SMOG on a head phantom. The signal to noise ratio (SNR), which represents the difference of the recovered CBCT image from the original image without the SMOG, was used to evaluate the ability of the IPSF to recover the missing signal. The IPSF approach was then tested using a Catphan phantom on a prototype SMOG assembly installed in a bench-top CBCT system. Results: In the simulated SMOG experiment, the SNRs were increased from 15.1 and 12.7 dB to 35.6 and 28.9 dB compared with a conventional interpolation method (inpainting method) for a projection and the reconstructed 3D image, respectively, suggesting that IPSF successfully recovered most of the blocked signal. In the prototype SMOG experiment, the authors successfully reconstructed a CBCT image using the IPSF-SMOG approach. The detailed geometric features in the
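
    In its simplest form, the weighted least-squares step described above reduces to a weighted mean of the candidate observations taken from the two adjacent gantry angles. The sketch below uses inverse-distance weights as an illustrative stand-in for the authors' actual weighting scheme.

      # Simplified IPSF-style estimate of a blocked pixel from paired observations
      # at adjacent gantry angles; inverse-distance weighting is an assumption.
      import numpy as np

      def estimate_blocked(observations, distances, eps=1e-6):
          """observations: candidate values for the blocked pixel from adjacent-angle
          projections; distances: separation of each observation from the blocked
          location (e.g., in detector pixels)."""
          y = np.asarray(observations, dtype=float)
          w = 1.0 / (np.asarray(distances, dtype=float) + eps)
          # Weighted least squares for a constant model reduces to a weighted mean.
          return float(np.sum(w * y) / np.sum(w))

      print(estimate_blocked([102.0, 98.0, 110.0], [1.0, 1.0, 3.0]))  # nearer values dominate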

  2. An interprojection sensor fusion approach to estimate blocked projection signal in synchronized moving grid-based CBCT system

    SciTech Connect

    Zhang, Hong; Kong, Vic; Ren, Lei; Giles, William; Zhang, You; Jin, Jian-Yue

    2016-01-15

    Purpose: A preobject grid can reduce and correct scatter in cone beam computed tomography (CBCT). However, half of the signal in each projection is blocked by the grid. A synchronized moving grid (SMOG) has been proposed to acquire two complementary projections at each gantry position and merge them into one complete projection. That approach, however, suffers from increased scanning time and the technical difficulty of accurately merging the two projections per gantry angle. Herein, the authors present a new SMOG approach which acquires a single projection per gantry angle, with complementary grid patterns for any two adjacent projections, and uses an interprojection sensor fusion (IPSF) technique to estimate the blocked signal in each projection. The method may have the additional benefit of reduced imaging dose due to the grid blocking half of the incident radiation. Methods: The IPSF considers multiple paired observations from two adjacent gantry angles as approximations of the blocked signal and uses a weighted least square regression of these observations to finally determine the blocked signal. The method was first tested with a simulated SMOG on a head phantom. The signal to noise ratio (SNR), which represents the difference of the recovered CBCT image from the original image without the SMOG, was used to evaluate the ability of the IPSF to recover the missing signal. The IPSF approach was then tested using a Catphan phantom on a prototype SMOG assembly installed in a bench-top CBCT system. Results: In the simulated SMOG experiment, the SNRs were increased from 15.1 and 12.7 dB to 35.6 and 28.9 dB compared with a conventional interpolation method (inpainting method) for a projection and the reconstructed 3D image, respectively, suggesting that IPSF successfully recovered most of the blocked signal. In the prototype SMOG experiment, the authors successfully reconstructed a CBCT image using the IPSF-SMOG approach. The detailed geometric features in the

  3. Transition from Beam-Target to Thermonuclear Fusion in High-Current Deuterium Z-Pinch Simulations

    NASA Astrophysics Data System (ADS)

    Offermann, Dustin; Welch, Dale; Rose, Dave; Thoma, Carsten; Clark, Robert; Mostrom, Chris; Schmidt, Andrea; Link, Anthony

    2016-10-01

    Fusion yields from dense, Z-pinch plasmas are known to scale with the drive current, which is favorable for many potential applications. Decades of experimental studies, however, show an unexplained drop in yield for currents above a few mega-ampere (MA). In this work, simulations of DD Z-Pinch plasmas have been performed in 1D and 2D for a constant pinch time and initial radius using the code LSP, and observations of a shift in scaling are presented. The results show that yields below 3 MA are enhanced relative to pure thermonuclear scaling by beamlike particles accelerated in the Rayleigh-Taylor induced electric fields, while yields above 3 MA are reduced because of energy lost by the instability and the inability of the beamlike ions to enter the pinch region. This research was developed with funding from the Defense Advanced Research Projects Agency (DARPA).

  4. Online Simulation of Radiation Track Structure Project

    NASA Technical Reports Server (NTRS)

    Plante, Ianik

    2015-01-01

    Space radiation comprises protons, helium ions, and high-charge, high-energy (HZE) particles. High-energy particles are a concern for human space flight because there are no known options for shielding astronauts from them. When these ions interact with matter, they damage molecules and create radiolytic species. The pattern of energy deposition and the positions of the radiolytic species, called the radiation track structure, is highly dependent on the charge and energy of the ion. The radiolytic species damage biological molecules, which may lead to several long-term health effects such as cancer. Because of the importance of heavy ions, the radiation community is very interested in the interaction of HZE particles with DNA, notably with regard to the track structure. A desktop program named RITRACKS was developed to simulate radiation track structure. The goal of this project is to create a web interface to allow registered internal users to use RITRACKS remotely.

  5. Simulations of mixing in Inertial Confinement Fusion with front tracking and sub-grid scale models

    NASA Astrophysics Data System (ADS)

    Rana, Verinder; Lim, Hyunkyung; Melvin, Jeremy; Cheng, Baolian; Glimm, James; Sharp, David

    2015-11-01

    We present two related results. The first discusses the Richtmyer-Meshkov instability (RMI) and Rayleigh-Taylor instability (RTI) and their evolution in Inertial Confinement Fusion simulations. We show the evolution of the RMI into the late-time RTI under transport effects and front tracking. The sub-grid scale models help capture the interaction of turbulence with diffusive processes. The second result assesses the effects of concentration on the physics model and examines the mixing properties in the low-Reynolds-number hot spot. We discuss the effect of concentration on the Schmidt number. The simulation results are produced using the University of Chicago code FLASH and Stony Brook University's front tracking algorithm.

  6. Neutral Buoyancy Simulator - EASE Project (NB32)

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Once the United States' space program had progressed from Earth's orbit into outer space, the prospect of building and maintaining a permanent presence in space was realized. To accomplish this feat, NASA launched a temporary workstation, Skylab, to discover the effects of low gravity and weightlessness on the human body, and also to develop tools and equipment that would be needed in the future to build and maintain a more permanent space station. The structures, techniques, and work schedules had to be carefully designed to fit this unique construction site. The components had to be lightweight for transport into orbit, yet durable. The station also had to be made with removable parts for easy servicing and repairs by astronauts. All of the tools necessary for service and repairs had to be designed for easy manipulation by a suited astronaut. Construction methods had to be efficient due to the limited time the astronauts could remain outside their controlled environment. Given all the specific needs of this project, an environment had to be developed on Earth that could simulate a low-gravity atmosphere. A Neutral Buoyancy Simulator (NBS) was constructed by NASA's Marshall Space Flight Center (MSFC) in 1968. Since then, NASA scientists have used this facility to understand how humans work best in low gravity and also to provide information about the different kinds of structures that can be built. Pictured is a Massachusetts Institute of Technology (MIT) student working in a spacesuit on the Experimental Assembly of Structures in Extravehicular Activity (EASE) project, which was developed as a joint effort between MSFC and MIT. The EASE experiment required that crew members assemble small components to form larger components, working from the payload bay of the space shuttle. The MIT student in this photo is assembling two six-beam tetrahedrons.

  7. The GEM Detector projective alignment simulation system

    SciTech Connect

    Wuest, C.R.; Belser, F.C.; Holdener, F.R.; Roeben, M.D.; Paradiso, J.A.; Mitselmakher, G.; Ostapchuk, A.; Pier-Amory, J.

    1993-07-09

    Precision position knowledge (< 25 microns RMS) of the GEM Detector muon system at the Superconducting Super Collider Laboratory (SSCL) is an important physics requirement necessary to minimize sagitta error in detecting and tracking high energy muons that are deflected by the magnetic field within the GEM Detector. To validate the concept of the sagitta correction function determined by projective alignment of the muon detectors (Cathode Strip Chambers or CSCs), the basis of the proposed GEM alignment scheme, a facility, called the "Alignment Test Stand" (ATS), is being constructed. This system simulates the environment that the CSCs and chamber alignment systems are expected to experience in the GEM Detector, albeit without the 0.8 T magnetic field and radiation environment. The ATS experimental program will allow systematic study and characterization of the projective alignment approach, as well as general mechanical engineering of muon chamber mounting concepts, positioning systems and study of the mechanical behavior of the proposed 6 layer CSCs. The ATS will consist of a stable local coordinate system in which mock-ups of muon chambers (i.e., non-working mechanical analogs, representing the three superlayers of a selected barrel and endcap alignment tower) are implemented, together with a sufficient number of alignment monitors to overdetermine the sagitta correction function, providing a self-consistency check. This paper describes the approach to be used for the alignment of the GEM muon system, the design of the ATS, and the experiments to be conducted using the ATS.

  8. A Simulation Environment for Benchmarking Sensor Fusion-Based Pose Estimators

    PubMed Central

    Ligorio, Gabriele; Sabatini, Angelo Maria

    2015-01-01

    In-depth analysis and performance evaluation of sensor fusion-based estimators may be critical when performed using real-world sensor data. For this reason, simulation is widely recognized as one of the most powerful tools for algorithm benchmarking. In this paper, we present a simulation framework suitable for assessing the performance of sensor fusion-based pose estimators. The systems used for implementing the framework were magnetic/inertial measurement units (MIMUs) and a camera, although the addition of further sensing modalities is straightforward. Typical nuisance factors were also included for each sensor. The proposed simulation environment was validated using real-life sensor data employed for motion tracking. The largest mismatch between real and simulated sensors was about 5% of the measured quantity (for the camera simulation), whereas the lowest correlation was found for an axis of the gyroscope (0.90). In addition, a real benchmarking example of an extended Kalman filter for pose estimation from MIMU and camera data is presented. PMID:26703603
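
    A minimal example of the kind of sensor model with nuisance factors that such a framework provides: a simulated triaxial gyroscope built from a true angular rate plus a constant bias and white noise. The noise magnitudes are arbitrary illustrations, not the values identified in the paper.

      # Toy gyroscope simulation with typical nuisance factors (bias + white noise).
      import numpy as np

      def simulate_gyro(true_rate, bias=(0.01, -0.02, 0.005), noise_std=0.003, seed=0):
          """true_rate: (N, 3) true angular rate [rad/s]; returns simulated readings."""
          rng = np.random.default_rng(seed)
          return (true_rate
                  + np.asarray(bias)                                # constant bias per axis
                  + rng.normal(0.0, noise_std, true_rate.shape))    # white measurement noise

      t = np.arange(0.0, 10.0, 0.01)                                # 100 Hz for 10 s
      truth = np.column_stack([0.5 * np.sin(2 * np.pi * 0.2 * t),
                               np.zeros_like(t),
                               0.1 * np.ones_like(t)])
      print(simulate_gyro(truth).shape)                             # (1000, 3)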

  9. Characterizing the Conformational Landscape of Flavivirus Fusion Peptides via Simulation and Experiment.

    PubMed

    Marzinek, Jan K; Lakshminarayanan, Rajamani; Goh, Eunice; Huber, Roland G; Panzade, Sadhana; Verma, Chandra; Bond, Peter J

    2016-01-20

    Conformational changes in the envelope proteins of flaviviruses help to expose the highly conserved fusion peptide (FP), a region which is critical to membrane fusion and host cell infection, and which represents a significant target for antiviral drugs and antibodies. In principle, extended timescale atomic-resolution simulations may be used to characterize the dynamics of such peptides. However, the resultant accuracy is critically dependent upon both the underlying force field and sufficient conformational sampling. In the present study, we report a comprehensive comparison of three simulation methods and four force fields comprising a total of more than 40 μs of sampling. Additionally, we describe the conformational landscape of the FP fold across all flavivirus family members. All investigated methods sampled conformations close to available X-ray structures, but exhibited differently populated ensembles. The best force field / sampling combination was sufficiently accurate to predict that the solvated peptide fold is less ordered than in the crystallographic state, which was subsequently confirmed via circular dichroism and spectrofluorometric measurements. Finally, the conformational landscape of a mutant incapable of membrane fusion was significantly shallower than wild-type variants, suggesting that dynamics should be considered when therapeutically targeting FP epitopes.

  10. Characterizing the Conformational Landscape of Flavivirus Fusion Peptides via Simulation and Experiment

    PubMed Central

    Marzinek, Jan K.; Lakshminarayanan, Rajamani; Goh, Eunice; Huber, Roland G.; Panzade, Sadhana; Verma, Chandra; Bond, Peter J.

    2016-01-01

    Conformational changes in the envelope proteins of flaviviruses help to expose the highly conserved fusion peptide (FP), a region which is critical to membrane fusion and host cell infection, and which represents a significant target for antiviral drugs and antibodies. In principle, extended timescale atomic-resolution simulations may be used to characterize the dynamics of such peptides. However, the resultant accuracy is critically dependent upon both the underlying force field and sufficient conformational sampling. In the present study, we report a comprehensive comparison of three simulation methods and four force fields comprising a total of more than 40 μs of sampling. Additionally, we describe the conformational landscape of the FP fold across all flavivirus family members. All investigated methods sampled conformations close to available X-ray structures, but exhibited differently populated ensembles. The best force field / sampling combination was sufficiently accurate to predict that the solvated peptide fold is less ordered than in the crystallographic state, which was subsequently confirmed via circular dichroism and spectrofluorometric measurements. Finally, the conformational landscape of a mutant incapable of membrane fusion was significantly shallower than wild-type variants, suggesting that dynamics should be considered when therapeutically targeting FP epitopes. PMID:26785994

  11. Characterizing the Conformational Landscape of Flavivirus Fusion Peptides via Simulation and Experiment

    NASA Astrophysics Data System (ADS)

    Marzinek, Jan K.; Lakshminarayanan, Rajamani; Goh, Eunice; Huber, Roland G.; Panzade, Sadhana; Verma, Chandra; Bond, Peter J.

    2016-01-01

    Conformational changes in the envelope proteins of flaviviruses help to expose the highly conserved fusion peptide (FP), a region which is critical to membrane fusion and host cell infection, and which represents a significant target for antiviral drugs and antibodies. In principle, extended timescale atomic-resolution simulations may be used to characterize the dynamics of such peptides. However, the resultant accuracy is critically dependent upon both the underlying force field and sufficient conformational sampling. In the present study, we report a comprehensive comparison of three simulation methods and four force fields comprising a total of more than 40 μs of sampling. Additionally, we describe the conformational landscape of the FP fold across all flavivirus family members. All investigated methods sampled conformations close to available X-ray structures, but exhibited differently populated ensembles. The best force field / sampling combination was sufficiently accurate to predict that the solvated peptide fold is less ordered than in the crystallographic state, which was subsequently confirmed via circular dichroism and spectrofluorometric measurements. Finally, the conformational landscape of a mutant incapable of membrane fusion was significantly shallower than wild-type variants, suggesting that dynamics should be considered when therapeutically targeting FP epitopes.

  12. Quasi-spherical direct drive fusion simulations for the Z machine and future accelerators.

    SciTech Connect

    VanDevender, J. Pace; McDaniel, Dillon Heirman; Roderick, Norman Frederick; Nash, Thomas J.

    2007-11-01

    We explored the potential of Quasi-Spherical Direct Drive (QSDD) to reduce the cost and risk of a future fusion driver for Inertial Confinement Fusion (ICF) and to produce megajoule thermonuclear yield on the renovated Z Machine with a pulse-shortening Magnetically Insulated Current Amplifier (MICA). Analytic relationships for constant implosion velocity and constant pusher stability have been derived and show that the required current scales as the implosion time. Therefore, a MICA is necessary to drive QSDD capsules with hot-spot ignition on Z. We have optimized the LASNEX parameters for QSDD with realistic walls and mitigated many of the risks. Although the mix-degraded 1D yield is computed to be ~30 MJ on Z, unmitigated wall expansion under the >100 gigabar pressure just before burn prevents ignition in the 2D simulations. A squeezer system of adjacent implosions may mitigate the wall expansion and permit the plasma to burn.

  13. Adaptive multifocus image fusion using block compressed sensing with smoothed projected Landweber integration in the wavelet domain.

    PubMed

    V S, Unni; Mishra, Deepak; Subrahmanyam, G R K S

    2016-12-01

    The need for image fusion in current image processing systems is increasing mainly due to the increased number and variety of image acquisition techniques. Image fusion is the process of combining substantial information from several sensors using mathematical techniques in order to create a single composite image that will be more comprehensive and thus more useful for a human operator or other computer vision tasks. This paper presents a new approach to multifocus image fusion based on sparse signal representation. Block-based compressive sensing, integrated with a projection-driven compressive sensing (CS) recovery that encourages sparsity in the wavelet domain, is used to obtain the focused image from a set of out-of-focus images. Compression is achieved during the image acquisition process using a block compressive sensing method. An adaptive thresholding technique within the smoothed projected Landweber recovery process reconstructs high-resolution focused images from low-dimensional CS measurements of out-of-focus images. The discrete wavelet transform and the dual-tree complex wavelet transform are used as the sparsifying bases for the proposed fusion. The main finding is that sparsification enables a better selection of the fusion coefficients and hence better fusion. A Laplacian mixture model is fitted in the wavelet domain, and estimation of the probability density function (pdf) parameters by expectation maximization leads to the proper selection of the coefficients of the fused image. Compared with the same fusion scheme without the projected Landweber (PL) step and with other existing CS-based fusion approaches, the proposed method achieves better fusion with fewer samples.
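
    The core recovery idea above can be illustrated with a minimal sketch: a projected Landweber (gradient) iteration interleaved with a soft-thresholding step that promotes sparsity. This toy version works on a generic random sensing matrix and thresholds in the identity basis rather than block-wise in a wavelet or dual-tree complex wavelet basis, so it captures only the skeleton of the SPL pipeline; all names and parameter values below are illustrative assumptions, not the paper's settings.

      import numpy as np

      def soft_threshold(x, lam):
          # Soft-thresholding: the proximal operator of the l1 norm.
          return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

      def projected_landweber(A, y, lam=0.02, n_iter=500):
          # Landweber (gradient) update on ||y - A x||^2 followed by a
          # sparsity-promoting thresholding step (ISTA-style iteration).
          step = 1.0 / np.linalg.norm(A, 2) ** 2
          x = np.zeros(A.shape[1])
          for _ in range(n_iter):
              x = x + step * A.T @ (y - A @ x)
              x = soft_threshold(x, lam * step)
          return x

      # Tiny synthetic demo: recover a sparse vector from random projections.
      rng = np.random.default_rng(0)
      n, m, k = 256, 96, 8
      x_true = np.zeros(n)
      x_true[rng.choice(n, size=k, replace=False)] = rng.normal(size=k)
      A = rng.normal(size=(m, n)) / np.sqrt(m)
      y = A @ x_true
      x_hat = projected_landweber(A, y)
      print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))

    In the paper's setting the thresholding step would instead operate block-wise on wavelet coefficients, with the threshold adapted as the iteration proceeds.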

  14. Simulation of Neural Firing Dynamics: A Student Project.

    ERIC Educational Resources Information Center

    Kletsky, E. J.

    This paper describes a student project in digital simulation techniques that is part of a graduate systems analysis course entitled Biosimulation. The students chose different simulation techniques to solve a problem related to the neuron model. (MLH)

  15. Detector Simulations for the COREA Project

    NASA Astrophysics Data System (ADS)

    Lee, Sungwon; Kang, Hyesung

    2006-12-01

    The COREA (COsmic ray Research and Education Array in Korea) project aims to build a ground array of particle detectors distributed over the Korean Peninsula, through collaborations of high school students, educators, and university researchers, in order to study the origin of ultra high energy cosmic rays. The COREA array will consist of about 2000 detector stations covering an area of several hundred km2 in its final configuration and will detect electrons and muons in extensive air showers triggered by high energy particles. During the initial phase the COREA array will start with a small number of detector stations in Seoul area schools. In this paper, we have studied by Monte Carlo simulations how to select detector sites for optimal detection efficiency for proton-triggered air showers. We considered several model clusters with up to 30 detector stations and calculated the effective number of air-shower events that can be detected per year for each cluster. The greatest detection efficiency is achieved when the mean distance between detector stations of a cluster is comparable to the effective radius of the air shower of a given proton energy. We find the detection efficiency of a cluster with randomly selected detector sites is comparable to that of clusters with uniform detector spacing. We also considered a hybrid cluster with 60 detector stations that combines a small cluster with Δl ≈ 100 m and a large cluster with Δl ≈ 1 km. We suggest that it can be an ideal configuration for the initial-phase study of the COREA project, since it can measure cosmic rays over a wide energy range, i.e., 10^16 eV ≤ E ≤ 10^19 eV, with a reasonable detection rate.
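
    The spacing-versus-efficiency trade-off described above lends itself to a simple numerical illustration. The sketch below is a toy Monte Carlo, not the study's actual simulation: it assumes a square grid of stations, that a station triggers whenever a shower core lands within a fixed effective radius of it, and that an event is detected when at least three stations trigger. The grid size, effective radius, and trigger multiplicity are all made-up assumptions.

      import numpy as np

      rng = np.random.default_rng(1)

      def detected_fraction(spacing_m, r_eff_m=150.0, n_showers=20000, n_side=6):
          # Fraction of showers "detected" by an n_side x n_side square grid of
          # stations with the given spacing, assuming a station triggers when the
          # shower core lands within r_eff_m of it and an event counts as detected
          # when at least 3 stations trigger.
          xs = np.arange(n_side) * spacing_m
          stations = np.array([(x, y) for x in xs for y in xs])        # (N, 2)
          span = (n_side - 1) * spacing_m
          cores = rng.uniform(0.0, span, size=(n_showers, 2))          # random shower cores
          dist = np.linalg.norm(cores[:, None, :] - stations[None, :, :], axis=2)
          n_triggered = (dist < r_eff_m).sum(axis=1)
          return float((n_triggered >= 3).mean())

      for spacing in (100.0, 200.0, 500.0):
          print(f"station spacing {spacing:5.0f} m -> detected fraction {detected_fraction(spacing):.2f}")

    Varying the spacing relative to the effective radius reproduces the qualitative behavior noted above: the detection efficiency collapses once the station spacing greatly exceeds the shower's effective radius.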

  16. Improving Project Management with Simulation and Completion Distribution Functions

    NASA Technical Reports Server (NTRS)

    Cates, Grant R.

    2004-01-01

    Despite the critical importance of project completion timeliness, management practices in place today remain inadequate for addressing the persistent problem of project completion tardiness. A major culprit in late projects is uncertainty, which most, if not all, projects are inherently subject to. This uncertainty resides in the estimates for activity durations, the occurrence of unplanned and unforeseen events, and the availability of critical resources. In response to this problem, this research developed a comprehensive simulation based methodology for conducting quantitative project completion time risk analysis. It is called the Project Assessment by Simulation Technique (PAST). This new tool enables project stakeholders to visualize uncertainty or risk, i.e. the likelihood of their project completing late and the magnitude of the lateness, by providing them with a completion time distribution function of their projects. Discrete event simulation is used within PAST to determine the completion distribution function for the project of interest. The simulation is populated with both deterministic and stochastic elements. The deterministic inputs include planned project activities, precedence requirements, and resource requirements. The stochastic inputs include activity duration growth distributions, probabilities for events that can impact the project, and other dynamic constraints that may be placed upon project activities and milestones. These stochastic inputs are based upon past data from similar projects. The time for an entity to complete the simulation network, subject to both the deterministic and stochastic factors, represents the time to complete the project. Repeating the simulation hundreds or thousands of times allows one to create the project completion distribution function. The Project Assessment by Simulation Technique was demonstrated to be effective for the on-going NASA project to assemble the International Space Station. Approximately $500
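
    The kind of calculation PAST performs can be sketched in a few lines: sample stochastic activity durations, roll them through the precedence network, and accumulate the resulting completion times into a distribution. The activity network, triangular duration ranges, and quantiles below are hypothetical placeholders, not data from the NASA application, and the sketch omits PAST's resource constraints and discrete risk events.

      import numpy as np

      rng = np.random.default_rng(2)

      # Hypothetical activity network (listed in topological order):
      # name -> (predecessors, (min, most likely, max) duration in days).
      activities = {
          "A": ([], (5, 7, 12)),
          "B": (["A"], (3, 5, 10)),
          "C": (["A"], (8, 10, 20)),
          "D": (["B", "C"], (2, 4, 9)),
      }

      def sample_completion_time():
          # One Monte Carlo realization: sample a triangular duration for each
          # activity and roll the schedule forward through the precedence network.
          finish = {}
          for name, (preds, (lo, mode, hi)) in activities.items():
              start = max((finish[p] for p in preds), default=0.0)
              finish[name] = start + rng.triangular(lo, mode, hi)
          return max(finish.values())

      samples = np.array([sample_completion_time() for _ in range(10000)])
      for q in (0.5, 0.8, 0.95):
          print(f"P{int(q * 100)} completion time: {np.quantile(samples, q):5.1f} days")

    Repeating the rollout many times yields the completion distribution function from which lateness probabilities and schedule quantiles can be read off.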

  17. Simulations of longitudinal beam dynamics of space-charge dominated beams for heavy ion fusion

    SciTech Connect

    Miller, Debra Ann Callahan

    1994-12-01

    The longitudinal instability has potentially disastrous effects on the ion beams used for heavy ion driven inertial confinement fusion. This instability is a "resistive wall" instability with the impedance coming from the induction modules in the accelerator used as a driver. This instability can greatly amplify perturbations launched from the beam head and can prevent focusing of the beam onto the small spot necessary for fusion. This instability has been studied using the WARPrz particle-in-cell code. WARPrz is a 2 1/2 dimensional electrostatic axisymmetric code. This code includes a model for the impedance of the induction modules. Simulations with resistances similar to that expected in a driver show moderate amounts of growth from the instability as a perturbation travels from beam head to tail, as predicted by cold beam fluid theory. The perturbation reflects off the beam tail and decays as it travels toward the beam head. Nonlinear effects cause the perturbation to steepen during reflection. Including the capacitive component of the module impedance has a partially stabilizing effect on the longitudinal instability. This reduction in the growth rate is seen in both cold beam fluid theory and in simulations with WARPrz. Instability growth rates for warm beams measured from WARPrz are lower than cold beam fluid theory predicts. Longitudinal thermal spread cannot account for this decrease in the growth rate. A mechanism for coupling the transverse thermal spread to decay of the longitudinal waves is presented. The longitudinal instability is no longer a threat to the heavy ion fusion program. The simulations in this thesis have shown that the growth rate for this instability will not be as large as earlier calculations predicted.

  18. Displacement damage parameters for fusion breeder blanket materials based on BCA computer simulations

    NASA Astrophysics Data System (ADS)

    Leichtle, Dieter

    2002-12-01

    Based on the MARLOWE code, a refined binary collision approximation (BCA) simulation model has been developed which is particularly suited for light mass and polyatomic ionic solids in a fusion environment. Main features of the model are described, including appropriate extensions of the kinematical procedure and the ion-solid interactions. Defect yields from the simulated collision cascades are used for deriving displacement cross sections in Be, Li2O, Li2SiO3, Li2SiO4 and Li2TiO3. Comparisons with standard results show that there is an energy dependence which is strongly correlated with the spectrum of primary knock-on atoms. In particular, for lithium ceramics the contribution of damage induced by secondary helium and tritium is remarkable even in a fast neutron flux. The total displacements per atom in a fusion demonstration reactor blanket obtained by means of BCA simulation results are in general lower than NRT values by about 30% for the lithium breeder materials, but higher by around 90% for beryllium. These differences can be attributed to differences in the binding properties and crystalline structure of the respective material, which also influence the defect composition.
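
    The "NRT values" used as the reference above come from the standard Norgett-Robinson-Torrens estimate of the number of stable displacements produced per primary knock-on atom. A minimal sketch of that reference formula follows; the 40 eV displacement threshold is a generic placeholder, not a value taken from the paper.

      def nrt_displacements(damage_energy_ev, e_d_ev=40.0):
          # Standard NRT estimate of the number of stable displacements produced by
          # a primary knock-on atom with the given damage energy (eV), for a
          # displacement threshold e_d_ev (eV).
          if damage_energy_ev < e_d_ev:
              return 0.0
          if damage_energy_ev < 2.5 * e_d_ev:
              return 1.0
          return 0.8 * damage_energy_ev / (2.0 * e_d_ev)

      print(nrt_displacements(10_000.0))  # ~100 displacements for a 10 keV damage energy

    The paper's BCA cascades replace this analytic count with explicitly simulated defect yields, which is what produces the 30% and 90% differences quoted above.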

  19. Adjoint Monte Carlo simulation of fusion product activation probe experiment in ASDEX Upgrade tokamak

    NASA Astrophysics Data System (ADS)

    Äkäslompolo, S.; Bonheure, G.; Tardini, G.; Kurki-Suonio, T.; The ASDEX Upgrade Team

    2015-10-01

    The activation probe is a robust tool to measure the flux of fusion products from a magnetically confined plasma. A carefully chosen solid sample is exposed to the flux, and the impinging ions transmute the material, making it radioactive. Ultra-low level gamma-ray spectroscopy is used post mortem to measure the activity and, thus, the number of fusion products. This contribution presents the numerical analysis of the first measurement in the ASDEX Upgrade tokamak, which was also the first such experiment to measure a single discharge. The ASCOT suite of codes was used to perform adjoint/reverse Monte Carlo calculations of the fusion products. The analysis facilitates, for the first time, a comparison of numerical and experimental values for an absolutely calibrated flux. The results agree to within a factor of about two, which can be considered quite good given that not all features of the plasma can be accounted for in the simulations. An alternative to the present probe orientation was also studied. The results suggest that a better-optimized orientation could measure the flux from a significantly larger part of the plasma. A shorter version of this contribution is due to be published in PoS at: 1st EPS conference on Plasma Diagnostics

  20. Final Report for Project "Framework Application for Core-Edge Transport Simulations (FACETS)"

    SciTech Connect

    Estep, Donald

    2014-01-17

    This is the final report for the Colorado State University component of the FACETS project. FACETS was focused on the development of a multiphysics, parallel framework application that could provide the capability to enable whole-device fusion reactor modeling and, in the process, the development of the modeling infrastructure and computational understanding needed for ITER. It was intended that FACETS be highly flexible, through the use of modern computational methods, including component technology and object-oriented design, to facilitate switching from one model to another for a given aspect of the physics, and to make it possible to use simplified models for rapid turnaround or high-fidelity models that take advantage of the largest supercomputer hardware. FACETS was designed in a heterogeneous parallel context, where different parts of the application can take advantage of parallelism based on task farming, domain decomposition, and/or pipelining as needed and applicable. As with all fusion simulations, an integral part of the FACETS project was the treatment of the coupling of different physical processes at different scales interacting closely. A primary example for the FACETS project is the coupling of existing core and edge simulations, with the transport and wall interactions described by reduced models. However, core and edge simulations themselves involve significant coupling of different processes with large scale differences. Numerical treatment of coupling is impacted by a number of factors including scale differences, the form of information transferred between processes, the implementation of solvers for different codes, and high performance computing concerns. Operator decomposition, in which the individual processes are computed separately using appropriate simulation codes and the component simulations are then linked and synchronized at regular points in space and time, is the de facto approach to high performance simulation of multiphysics
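
    The operator-decomposition idea described above, in which each component is advanced independently over a coupling interval and boundary data are exchanged at synchronization points, can be sketched with two toy models standing in for a core code and an edge code. The relaxation rates, coupling coefficients, and time step below are arbitrary illustrative choices with no relation to the actual FACETS components.

      def advance_core(t_core, t_edge_boundary, dt, tau=0.5, t_source=10.0):
          # Toy "core" component: relaxes toward a source-driven temperature but is
          # pulled by the edge boundary value (stand-in for a core transport code).
          return t_core + dt * ((t_source - t_core) / tau + 0.2 * (t_edge_boundary - t_core))

      def advance_edge(t_edge, t_core_boundary, dt, tau=0.1, t_wall=1.0):
          # Toy "edge" component: relaxes toward the wall temperature but is heated
          # by the core boundary value (stand-in for an edge/SOL code).
          return t_edge + dt * ((t_wall - t_edge) / tau + 0.5 * (t_core_boundary - t_edge))

      # Operator decomposition: advance each component independently over a coupling
      # interval, exchanging boundary data only at the synchronization points.
      dt, t_end = 0.01, 5.0
      t_core, t_edge = 5.0, 1.0
      for _ in range(int(t_end / dt)):
          t_core_next = advance_core(t_core, t_edge, dt)
          t_edge_next = advance_edge(t_edge, t_core, dt)  # uses the lagged core value
          t_core, t_edge = t_core_next, t_edge_next
      print(f"coupled quasi-steady values: core {t_core:.2f}, edge {t_edge:.2f}")

    Tightening the coupling interval, or iterating the exchange within each step (implicit coupling), are the usual refinements when the explicit, lagged exchange shown here limits stability or accuracy.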

  1. Multi-step formation of a hemifusion diaphragm for vesicle fusion revealed by all-atom molecular dynamics simulations.

    PubMed

    Tsai, Hui-Hsu Gavin; Chang, Che-Ming; Lee, Jian-Bin

    2014-06-01

    Membrane fusion is essential for intracellular trafficking and virus infection, but the molecular mechanisms underlying the fusion process remain poorly understood. In this study, we employed all-atom molecular dynamics simulations to investigate the membrane fusion mechanism using vesicle models which were pre-bound by inter-vesicle Ca(2+)-lipid clusters to approximate Ca(2+)-catalyzed fusion. Our results show that the formation of the hemifusion diaphragm for vesicle fusion is a multi-step event. This result contrasts with the assumptions made in most continuum models. The neighboring hemifused states are separated by an energy barrier on the energy landscape. The hemifusion diaphragm is much thinner than the planar lipid bilayers. The thinning of the hemifusion diaphragm during its formation results in the opening of a fusion pore for vesicle fusion. This work provides new insights into the formation of the hemifusion diaphragm and thus increases understanding of the molecular mechanism of membrane fusion. This article is part of a Special Issue entitled: Membrane Structure and Function: Relevance in the Cell's Physiology, Pathology and Therapy. Copyright © 2014 Elsevier B.V. All rights reserved.

  2. Developing an anterior cervical diskectomy and fusion simulator for neurosurgical resident training.

    PubMed

    Ray, Wilson Z; Ganju, Aruna; Harrop, James S; Hoh, Daniel J

    2013-10-01

    Surgical simulators are useful in many surgical disciplines to augment residency training. Duty hour restrictions and increasing emphasis on patient safety and attending oversight have changed neurosurgical education from the traditional apprenticeship model. The Congress of Neurological Surgeons Simulation Committee has been developing neurosurgical simulators for the purpose of enhancing resident education and assessing proficiency. The objective of this study was to review the initial experience with an anterior cervical diskectomy and fusion (ACDF) simulator. The first ACDF training module was implemented at the 2012 Congress of Neurological Surgeons Annual Meeting. The 90-minute curriculum included a written pretest, didactics, a practical pretest on the simulator, hands-on training, a written posttest, a practical posttest, and postcourse feedback. Didactic material covered clinical indications for ACDF, comparison with other cervical procedures, surgical anatomy and approach, principles of arthrodesis and spinal fixation, and complication management. Written pretests and posttests were administered to assess baseline knowledge and evidence of improvement after the module. Qualitative evaluation of individual performance on the practical (simulator) portion was included. Three neurosurgery residents, 2 senior medical students, and 1 attending neurosurgeon participated in the course. The pretest scores averaged 9.2 (range, 6-13). Posttest scores improved to an average of 11.0 (range, 9-13; P = .03). Initial experience with the ACDF simulator suggests that it may represent a meaningful training module for residents. Simulation will be an important training modality for residents to practice surgical technique and for teachers to assess competency. Further development of an ACDF simulator and didactic curriculum will require additional verification of simulator validity and reliability.

  3. Computational Plasma Physics at the Bleeding Edge: Simulating Kinetic Turbulence Dynamics in Fusion Energy Sciences

    NASA Astrophysics Data System (ADS)

    Tang, William

    2013-04-01

    Advanced computing is generally recognized to be an increasingly vital tool for accelerating progress in scientific research in the 21st Century. The imperative is to translate the combination of the rapid advances in super-computing power together with the emergence of effective new algorithms and computational methodologies to help enable corresponding increases in the physics fidelity and the performance of the scientific codes used to model complex physical systems. If properly validated against experimental measurements and verified with mathematical tests and computational benchmarks, these codes can provide more reliable predictive capability for the behavior of complex systems, including fusion energy relevant high temperature plasmas. The magnetic fusion energy research community has made excellent progress in developing advanced codes for which computer run-time and problem size scale very well with the number of processors on massively parallel supercomputers. A good example is the effective usage of the full power of modern leadership class computational platforms from the terascale to the petascale and beyond to produce nonlinear particle-in-cell simulations which have accelerated progress in understanding the nature of plasma turbulence in magnetically-confined high temperature plasmas. Illustrative results provide great encouragement for being able to include increasingly realistic dynamics in extreme-scale computing campaigns to enable predictive simulations with unprecedented physics fidelity. Some illustrative examples will be presented of the algorithmic progress from the magnetic fusion energy sciences area in dealing with low memory per core extreme scale computing challenges for the current top 3 supercomputers worldwide. These include advanced CPU systems (such as the IBM-Blue-Gene-Q system and the Fujitsu K Machine) as well as the GPU-CPU hybrid system (Titan).

  4. Simulation of the target creation through FRC merging for a magneto-inertial fusion concept

    NASA Astrophysics Data System (ADS)

    Li, Chenguang; Yang, Xianjun

    2017-04-01

    A two-dimensional magnetohydrodynamics model has been used to simulate the target creation process in a magneto-inertial fusion concept named the Magnetized Plasma Fusion Reactor (MPFR) [C. Li and X. Yang, Phys. Plasmas 23, 102702 (2016)], where the target plasma created through field-reversed configuration (FRC) merging is compressed by an imploding liner driven by a pulsed-power driver. In the scheme, two initial FRCs are translated into the region where merging occurs, producing the target plasma ready for compression. The simulations cover the three stages of the target creation process: formation, translation, and merging. The factors affecting the achieved target are analyzed numerically. The magnetic field gradient produced by the conical coils is found to determine how fast the FRC is accelerated to peak velocity and how quickly the collision merging occurs. Moreover, it is demonstrated that FRC merging can be realized by real coils with gaps, showing nearly identical performance, and that the optimized target obtained by FRC merging has larger internal energy and retained flux, which is more suitable for the MPFR concept.

  5. Progress in theory and simulation of ion cyclotron emission from magnetic confinement fusion plasmas

    NASA Astrophysics Data System (ADS)

    Dendy, Richard; Chapman, Ben; Chapman, Sandra; Cook, James; Reman, Bernard; McClements, Ken; Carbajal, Leopoldo

    2016-10-01

    Suprathermal ion cyclotron emission (ICE) is detected from all large tokamak and stellarator plasmas. Its frequency spectrum has narrow peaks at sequential cyclotron harmonics of the energetic ion population (fusion-born or neutral beam-injected) at the outer edge of the plasma. ICE was the first collective radiative instability driven by confined fusion-born ions observed in deuterium-tritium plasmas in JET and TFTR, and the magnetoacoustic cyclotron instability is the most likely emission mechanism. Contemporary ICE measurements are taken at very high sampling rates from the LHD stellarator and from the conventional aspect ratio KSTAR tokamak. A correspondingly advanced modelling capability for the ICE emission mechanism has been developed using 1D3V PIC and hybrid-PIC codes, supplemented by analytical theory. These kinetic codes simulate the self-consistent full orbit dynamics of energetic and thermal ions, together with the electric and magnetic fields and the electrons. We report recent progress in theory and simulation that addresses: the scaling of ICE intensity with energetic particle density; the transition between super-Alfvénic and sub-Alfvénic regimes for the collectively radiating particles; and the rapid time evolution that is seen for some ICE measurements. This work was supported in part by the RCUK Energy Programme [Grant Number EP/I501045] and by Euratom.

  6. Investigation on reduced thermal models for simulating infrared images in fusion devices

    NASA Astrophysics Data System (ADS)

    Gerardin, J.; Aumeunier, M.-H.; Firdaouss, M.; Gardarein, J.-L.; Rigollet, F.

    2016-09-01

    In fusion facilities, the in-vessel wall receives heat flux densities up to 20 MW/m2. The monitoring of in-vessel components is usually ensured by infra-red (IR) thermography, but with all-metallic walls, disturbance phenomena such as reflections may lead to inaccurate temperature estimates, potentially endangering machine safety. A full predictive photonic simulation is then used to accurately assess the IR measurements. This paper investigates some reduced thermal models (semi-infinite wall, thermal quadrupole) to predict the surface temperature from the particle loads on components for a given plasma scenario. The results are compared with a reference 3D Finite Element Method (Ansys Mechanical) and used as input for simulating IR images. The performances of the reduced thermal models are analysed by comparing the resulting IR images.
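
    The simplest of the reduced models mentioned above, the semi-infinite wall, has a closed-form surface temperature under a constant absorbed heat flux, which makes for a compact illustration. The thermal constants in the sketch below are generic graphite-like placeholders rather than values from the paper.

      import numpy as np

      def surface_temperature(q_w_m2, t_s, k=100.0, rho=1800.0, cp=1800.0, t0=300.0):
          # Surface temperature of a semi-infinite wall under a constant absorbed
          # heat flux q (W/m^2): T_s(t) = T0 + 2 q sqrt(t) / sqrt(pi k rho cp).
          effusivity = np.sqrt(k * rho * cp)
          return t0 + 2.0 * q_w_m2 * np.sqrt(t_s) / (np.sqrt(np.pi) * effusivity)

      # Example: 10 MW/m^2 absorbed for 1 s.
      print(f"surface temperature after 1 s: {surface_temperature(10e6, 1.0):.0f} K")

    The thermal-quadrupole model generalizes this to finite-thickness and multilayer walls by working in the Laplace domain, at a still negligible cost compared with a full 3D finite-element calculation.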

  7. Verification of particle simulation of radio frequency waves in fusion plasmas

    SciTech Connect

    Kuley, Animesh; Lin, Z.; Wang, Z. X.; Wessel, F.

    2013-10-15

    Radio frequency (RF) waves can provide heating, current and flow drive, as well as instability control for steady state operations of fusion experiments. A particle simulation model has been developed in this work to provide a first-principles tool for studying the RF nonlinear interactions with plasmas. In this model, ions are considered as fully kinetic particles using the Vlasov equation and electrons are treated as guiding centers using the drift kinetic equation. This model has been implemented in a global gyrokinetic toroidal code using real electron-to-ion mass ratio. To verify the model, linear simulations of ion plasma oscillation, ion Bernstein wave, and lower hybrid wave are carried out in cylindrical geometry and found to agree well with analytic predictions.

  8. Using Geostatistical Data Fusion Techniques and MODIS Data to Upscale Simulated Wheat Yield

    NASA Astrophysics Data System (ADS)

    Castrignano, A.; Buttafuoco, G.; Matese, A.; Toscano, P.

    2014-12-01

    Population growth increases food demand. Assessing food demand and predicting the actual supply for a given location are critical components of strategic food security planning at the regional scale. Crop yield can be simulated using crop models because it is site-specific and determined by weather, management, length of the growing season and soil properties. Crop models require reliable location-specific data that are not generally available. Obtaining these data at a large number of locations is time-consuming, costly and sometimes simply not feasible. An upscaling method to extend coverage of sparse estimates of crop yield to an appropriate extrapolation domain is therefore required. This work aims to investigate the applicability of a geostatistical data fusion approach for merging remote sensing data with the predictions of a simulation model of wheat growth and production using ground-based data. The study area is the Capitanata plain (4000 km2) located in the Apulia Region, mostly cropped with durum wheat. The MODIS EVI/NDVI data products for the Capitanata plain were downloaded from the Land Processes Distributed Active Archive Center (LPDAAC) for the whole crop cycle of durum wheat. Phenological development, biomass growth and grain quantity of durum wheat were simulated by the Delphi system, based on a crop simulation model linked to a database including soil properties, agronomical and meteorological data. Multicollocated cokriging was used to integrate secondary exhaustive information (multi-spectral MODIS data) with the primary variable (sparsely distributed biomass/yield model predictions of durum wheat). The model estimates were strongly spatially correlated with the radiance data (red and NIR bands), and the data fusion approach proved to be suitable and flexible for integrating data of different types and supports.

  9. FATRAS - the ATLAS Fast Track Simulation project

    NASA Astrophysics Data System (ADS)

    Mechnich, Jörg; ATLAS Collaboration

    2011-12-01

    The Monte Carlo simulation of the detector response is an integral component of any analysis performed with data from the LHC experiments. As these simulated data sets must be both large and precise, their production is a CPU-intensive task. ATLAS has developed full and fast detector simulation techniques to achieve this goal within the computing limits of the collaboration. At the current early stages of data-taking, it is necessary to reprocess the Monte Carlo event samples continuously, while integrating adaptations to the simulation modules in order to improve the agreement with data taken by means of the detector itself. FATRAS is a fast track simulation engine which produces a Monte Carlo simulation based on modules and the geometry of the standard ATLAS track reconstruction algorithm. It can be combined with a fast parametrized-response simulation of the calorimeters. This approach shows a high level of agreement with the full simulation, while achieving a relative timing gain of two orders of magnitude. FATRAS was designed to provide a fast feedback cycle for tuning the MC simulation with real data: this includes the material distribution inside the detector, the integration of misalignment and current conditions, as well as calibration at the hit level. We present the updated and calibrated version of FATRAS based on the first LHC data. Extensive comparisons of the fast track simulation with the full simulation and data at 900 GeV are shown.

  10. Detailed high-resolution three-dimensional simulations of OMEGA separated reactants inertial confinement fusion experiments

    NASA Astrophysics Data System (ADS)

    Haines, Brian M.; Grim, Gary P.; Fincke, James R.; Shah, Rahul C.; Forrest, Chad J.; Silverstein, Kevin; Marshall, Frederic J.; Boswell, Melissa; Fowler, Malcolm M.; Gore, Robert A.; Hayes-Sterbenz, Anna C.; Jungman, Gerard; Klein, Andreas; Rundberg, Robert S.; Steinkamp, Michael J.; Wilhelmy, Jerry B.

    2016-07-01

    We present results from the comparison of high-resolution three-dimensional (3D) simulations with data from the implosions of inertial confinement fusion capsules with separated reactants performed on the OMEGA laser facility. Each capsule, referred to as a "CD Mixcap," is filled with tritium and has a polystyrene (CH) shell with a deuterated polystyrene (CD) layer whose burial depth is varied. In these implosions, fusion reactions between deuterium and tritium ions can occur only in the presence of atomic mix between the gas fill and shell material. The simulations feature accurate models for all known experimental asymmetries and do not employ any adjustable parameters to improve agreement with experimental data. Simulations are performed with the RAGE radiation-hydrodynamics code using an Implicit Large Eddy Simulation (ILES) strategy for the hydrodynamics. We obtain good agreement with the experimental data, including the DT/TT neutron yield ratios used to diagnose mix, for all burial depths of the deuterated shell layer. Additionally, simulations demonstrate good agreement with converged simulations employing explicit models for plasma diffusion and viscosity, suggesting that the implicit sub-grid model used in ILES is sufficient to model these processes in these experiments. In our simulations, mixing is driven by short-wavelength asymmetries and longer-wavelength features are responsible for developing flows that transport mixed material towards the center of the hot spot. Mix material transported by this process is responsible for most of the mix (DT) yield even for the capsule with a CD layer adjacent to the tritium fuel. Consistent with our previous results, mix does not play a significant role in TT neutron yield degradation; instead, this is dominated by the displacement of fuel from the center of the implosion due to the development of turbulent instabilities seeded by long-wavelength asymmetries. Through these processes, the long

  11. Detailed high-resolution three-dimensional simulations of OMEGA separated reactants inertial confinement fusion experiments

    SciTech Connect

    Haines, Brian M.; Grim, Gary P.; Fincke, James R.; Shah, Rahul C.; Forrest, Chad J.; Silverstein, Kevin; Marshall, Frederic J.; Boswell, Melissa; Fowler, Malcolm M.; Gore, Robert A.; Hayes-Sterbenz, Anna C.; Jungman, Gerard; Klein, Andreas; Rundberg, Robert S.; Steinkamp, Michael J.; Wilhelmy, Jerry B.

    2016-07-01

    We present results from the comparison of high-resolution three-dimensional (3D) simulations with data from the implosions of inertial confinement fusion capsules with separated reactants performed on the OMEGA laser facility. Each capsule, referred to as a “CD Mixcap,” is filled with tritium and has a polystyrene (CH) shell with a deuterated polystyrene (CD) layer whose burial depth is varied. In these implosions, fusion reactions between deuterium and tritium ions can occur only in the presence of atomic mix between the gas fill and shell material. The simulations feature accurate models for all known experimental asymmetries and do not employ any adjustable parameters to improve agreement with experimental data. Simulations are performed with the RAGE radiation-hydrodynamics code using an Implicit Large Eddy Simulation (ILES) strategy for the hydrodynamics. We obtain good agreement with the experimental data, including the DT/TT neutron yield ratios used to diagnose mix, for all burial depths of the deuterated shell layer. Additionally, simulations demonstrate good agreement with converged simulations employing explicit models for plasma diffusion and viscosity, suggesting that the implicit sub-grid model used in ILES is sufficient to model these processes in these experiments. In our simulations, mixing is driven by short-wavelength asymmetries and longer-wavelength features are responsible for developing flows that transport mixed material towards the center of the hot spot. Mix material transported by this process is responsible for most of the mix (DT) yield even for the capsule with a CD layer adjacent to the tritium fuel. Consistent with our previous results, mix does not play a significant role in TT neutron yield degradation; instead, this is dominated by the displacement of fuel from the center of the implosion due to the development of turbulent instabilities seeded by long-wavelength asymmetries. Through these processes, the long

  12. Detailed high-resolution three-dimensional simulations of OMEGA separated reactants inertial confinement fusion experiments

    DOE PAGES

    Haines, Brian Michael; Grim, Gary P.; Fincke, James R.; ...

    2016-07-29

    Here, we present results from the comparison of high-resolution three-dimensional (3D) simulations with data from the implosions of inertial confinement fusion capsules with separated reactants performed on the OMEGA laser facility. Each capsule, referred to as a “CD Mixcap,” is filled with tritium and has a polystyrene (CH) shell with a deuterated polystyrene (CD) layer whose burial depth is varied. In these implosions, fusion reactions between deuterium and tritium ions can occur only in the presence of atomic mix between the gas fill and shell material. The simulations feature accurate models for all known experimental asymmetries and do not employ any adjustable parameters to improve agreement with experimental data. Simulations are performed with the RAGE radiation-hydrodynamics code using an Implicit Large Eddy Simulation (ILES) strategy for the hydrodynamics. We obtain good agreement with the experimental data, including the DT/TT neutron yield ratios used to diagnose mix, for all burial depths of the deuterated shell layer. Additionally, simulations demonstrate good agreement with converged simulations employing explicit models for plasma diffusion and viscosity, suggesting that the implicit sub-grid model used in ILES is sufficient to model these processes in these experiments. In our simulations, mixing is driven by short-wavelength asymmetries and longer-wavelength features are responsible for developing flows that transport mixed material towards the center of the hot spot. Mix material transported by this process is responsible for most of the mix (DT) yield even for the capsule with a CD layer adjacent to the tritium fuel. Consistent with our previous results, mix does not play a significant role in TT neutron yield degradation; instead, this is dominated by the displacement of fuel from the center of the implosion due to the development of turbulent instabilities seeded by long-wavelength asymmetries. Through these processes, the long

  13. Detailed high-resolution three-dimensional simulations of OMEGA separated reactants inertial confinement fusion experiments

    SciTech Connect

    Haines, Brian Michael; Grim, Gary P.; Fincke, James R.; Shah, Rahul C.; Forrest, Chad J.; Silverstein, Kevin; Marshall, Frederic J.; Boswell, Melissa; Fowler, Malcolm M.; Gore, Robert A.; Hayes-Sterbenz, Anna Catherine; Jungman, Gerard; Klein, Andreas; Rundberg, Robert S.; Steinkamp, Michael James; Wilhelmy, Jerry B.

    2016-07-29

    Here, we present results from the comparison of high-resolution three-dimensional (3D) simulations with data from the implosions of inertial confinement fusion capsules with separated reactants performed on the OMEGA laser facility. Each capsule, referred to as a “CD Mixcap,” is filled with tritium and has a polystyrene (CH) shell with a deuterated polystyrene (CD) layer whose burial depth is varied. In these implosions, fusion reactions between deuterium and tritium ions can occur only in the presence of atomic mix between the gas fill and shell material. The simulations feature accurate models for all known experimental asymmetries and do not employ any adjustable parameters to improve agreement with experimental data. Simulations are performed with the RAGE radiation-hydrodynamics code using an Implicit Large Eddy Simulation (ILES) strategy for the hydrodynamics. We obtain good agreement with the experimental data, including the DT/TT neutron yield ratios used to diagnose mix, for all burial depths of the deuterated shell layer. Additionally, simulations demonstrate good agreement with converged simulations employing explicit models for plasma diffusion and viscosity, suggesting that the implicit sub-grid model used in ILES is sufficient to model these processes in these experiments. In our simulations, mixing is driven by short-wavelength asymmetries and longer-wavelength features are responsible for developing flows that transport mixed material towards the center of the hot spot. Mix material transported by this process is responsible for most of the mix (DT) yield even for the capsule with a CD layer adjacent to the tritium fuel. Consistent with our previous results, mix does not play a significant role in TT neutron yield degradation; instead, this is dominated by the displacement of fuel from the center of the implosion due to the development of turbulent instabilities seeded by long-wavelength asymmetries. Through these processes, the long

  14. Detailed high-resolution three-dimensional simulations of OMEGA separated reactants inertial confinement fusion experiments

    SciTech Connect

    Haines, Brian Michael; Grim, Gary P.; Fincke, James R.; Shah, Rahul C.; Forrest, Chad J.; Silverstein, Kevin; Marshall, Frederic J.; Boswell, Melissa; Fowler, Malcolm M.; Gore, Robert A.; Hayes-Sterbenz, Anna Catherine; Jungman, Gerard; Klein, Andreas; Rundberg, Robert S.; Steinkamp, Michael James; Wilhelmy, Jerry B.

    2016-07-29

    Here, we present results from the comparison of high-resolution three-dimensional (3D) simulations with data from the implosions of inertial confinement fusion capsules with separated reactants performed on the OMEGA laser facility. Each capsule, referred to as a “CD Mixcap,” is filled with tritium and has a polystyrene (CH) shell with a deuterated polystyrene (CD) layer whose burial depth is varied. In these implosions, fusion reactions between deuterium and tritium ions can occur only in the presence of atomic mix between the gas fill and shell material. The simulations feature accurate models for all known experimental asymmetries and do not employ any adjustable parameters to improve agreement with experimental data. Simulations are performed with the RAGE radiation-hydrodynamics code using an Implicit Large Eddy Simulation (ILES) strategy for the hydrodynamics. We obtain good agreement with the experimental data, including the DT/TT neutron yield ratios used to diagnose mix, for all burial depths of the deuterated shell layer. Additionally, simulations demonstrate good agreement with converged simulations employing explicit models for plasma diffusion and viscosity, suggesting that the implicit sub-grid model used in ILES is sufficient to model these processes in these experiments. In our simulations, mixing is driven by short-wavelength asymmetries and longer-wavelength features are responsible for developing flows that transport mixed material towards the center of the hot spot. Mix material transported by this process is responsible for most of the mix (DT) yield even for the capsule with a CD layer adjacent to the tritium fuel. Consistent with our previous results, mix does not play a significant role in TT neutron yield degradation; instead, this is dominated by the displacement of fuel from the center of the implosion due to the development of turbulent instabilities seeded by long-wavelength asymmetries. Through these processes, the long

  15. Detailed high-resolution three-dimensional simulations of OMEGA separated reactants inertial confinement fusion experiments

    SciTech Connect

    Haines, Brian M.; Fincke, James R.; Shah, Rahul C.; Boswell, Melissa; Fowler, Malcolm M.; Gore, Robert A.; Hayes-Sterbenz, Anna C.; Jungman, Gerard; Klein, Andreas; Rundberg, Robert S.; Steinkamp, Michael J.; Wilhelmy, Jerry B.; Grim, Gary P.; Forrest, Chad J.; Silverstein, Kevin; Marshall, Frederic J.

    2016-07-15

    We present results from the comparison of high-resolution three-dimensional (3D) simulations with data from the implosions of inertial confinement fusion capsules with separated reactants performed on the OMEGA laser facility. Each capsule, referred to as a “CD Mixcap,” is filled with tritium and has a polystyrene (CH) shell with a deuterated polystyrene (CD) layer whose burial depth is varied. In these implosions, fusion reactions between deuterium and tritium ions can occur only in the presence of atomic mix between the gas fill and shell material. The simulations feature accurate models for all known experimental asymmetries and do not employ any adjustable parameters to improve agreement with experimental data. Simulations are performed with the RAGE radiation-hydrodynamics code using an Implicit Large Eddy Simulation (ILES) strategy for the hydrodynamics. We obtain good agreement with the experimental data, including the DT/TT neutron yield ratios used to diagnose mix, for all burial depths of the deuterated shell layer. Additionally, simulations demonstrate good agreement with converged simulations employing explicit models for plasma diffusion and viscosity, suggesting that the implicit sub-grid model used in ILES is sufficient to model these processes in these experiments. In our simulations, mixing is driven by short-wavelength asymmetries and longer-wavelength features are responsible for developing flows that transport mixed material towards the center of the hot spot. Mix material transported by this process is responsible for most of the mix (DT) yield even for the capsule with a CD layer adjacent to the tritium fuel. Consistent with our previous results, mix does not play a significant role in TT neutron yield degradation; instead, this is dominated by the displacement of fuel from the center of the implosion due to the development of turbulent instabilities seeded by long-wavelength asymmetries. Through these processes, the long

  16. Fusion core start-up, ignition and burn simulations of reversed-field pinch (RFP) reactors

    SciTech Connect

    Chu, Yuh-Yi

    1988-01-01

    A transient reactor simulation model is developed to investigate and simulate the start-up, ignition and burn of a reversed-field pinch reactor. The simulation is based upon a spatially averaged plasma balance model with field profiles obtained from MHD quasi-equilibrium analysis. Alpha particle heating is estimated from Fokker-Planck calculations. The instantaneous plasma current is derived from a self-consistent circuit analysis for plasma/coil/eddy current interactions. The simulation code is applied to the TITAN RFP reactor design, which features a compact, high-power-density reversed-field pinch fusion system. A contour analysis is performed using the steady-state global plasma balance. The results are presented with contours of constant plasma current. A saddle point is identified in the contour plot, which determines the minimum value of plasma current required to achieve ignition. An optimized start-up path to ignition and burn can be obtained by passing through the saddle point. The simulation code is used to study and optimize the start-up scenario. In the simulations of the TITAN RFP reactor, the OH-driven superconducting EF coils are found to deviate from the required equilibrium values as the induced plasma current increases. This results in the modification of the superconducting EF coils and the addition of a set of EF trim coils. The design of the EF coil system is performed with the simulation code subject to the optimization of trim-coil power and current. In addition, the trim-coil design is subject to the constraints of the vertical-field stability index and maintenance access. A power crowbar is also needed to prevent the superconducting EF coils from generating excessive vertical field. A set of basic results from the simulation of the TITAN RFP reactor yields a picture of RFP plasma operation in a reactor. Investigations of eddy currents are also presented. 145 refs., 37 figs., 2 tabs.

  17. Theoretical and simulation research of hydrodynamic instabilities in inertial-confinement fusion implosions

    NASA Astrophysics Data System (ADS)

    Wang, LiFeng; Ye, WenHua; He, XianTu; Wu, JunFeng; Fan, ZhengFeng; Xue, Chuang; Guo, HongYu; Miao, WenYong; Yuan, YongTeng; Dong, JiaQin; Jia, Guo; Zhang, Jing; Li, YingJun; Liu, Jie; Wang, Min; Ding, YongKun; Zhang, WeiYan

    2017-05-01

    Inertial fusion energy (IFE) has been considered a promising, nearly inexhaustible source of sustainable carbon-free power for the world's energy future. It has long been recognized that the control of hydrodynamic instabilities is of critical importance for ignition and high gain in the inertial-confinement fusion (ICF) hot-spot ignition scheme. In this mini-review, we summarize the progress of theoretical and simulation research of hydrodynamic instabilities in the ICF central hot-spot implosion in our group over the past decade. In order to obtain sufficient understanding of the growth of hydrodynamic instabilities in ICF, we first decompose the problem into different stages according to the implosion physics processes. The decomposed essential physics processes that are associated with ICF implosions, such as the Rayleigh-Taylor instability (RTI), the Richtmyer-Meshkov instability (RMI), the Kelvin-Helmholtz instability (KHI), convergent geometry effects, as well as perturbation feed-through, are reviewed. Analytical models in planar, cylindrical, and spherical geometries have been established to study different physical aspects, including density-gradient, interface-coupling, geometry, and convergent effects. The influence of ablation in the presence of preheating on the RTI has been extensively studied by numerical simulations. The KHI considering the ablation effect has been discussed in detail for the first time. A series of single-mode ablative RTI experiments has been performed on the Shenguang-II laser facility. The theoretical and simulation research provides physical insight into the linear and weakly nonlinear growth, and the nonlinear evolution, of the hydrodynamic instabilities in ICF implosions, which has directly supported the research of ICF ignition target design. The ICF hot-spot ignition implosion design that uses several controlling features, based on our current understanding of hydrodynamic instabilities, to address shell implosion stability, has
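
    As a concrete anchor for the ablative RTI discussion above, the linear growth rate is often summarized by a Takabe-like fitting formula that combines the classical growth with density-gradient and ablative stabilization. The form and coefficients below are the commonly quoted generic ones, not values derived in this mini-review, and the example inputs are arbitrary ICF-scale numbers.

      import numpy as np

      def ablative_rti_growth_rate(k, g, l_rho, v_a, alpha=0.9, beta=3.0):
          # Commonly quoted fitting form for the linear ablative Rayleigh-Taylor
          # growth rate: gamma = alpha * sqrt(k g / (1 + k L)) - beta * k * v_a,
          # where k is the perturbation wavenumber, g the acceleration, l_rho the
          # density-gradient scale length and v_a the ablation velocity.  The
          # coefficients vary in the literature.
          return alpha * np.sqrt(k * g / (1.0 + k * l_rho)) - beta * k * v_a

      # Example: 20 um wavelength, g = 1e14 m/s^2, L = 1 um, ablation velocity 3 km/s.
      k = 2.0 * np.pi / 20e-6
      print(f"growth rate ~ {ablative_rti_growth_rate(k, 1e14, 1e-6, 3e3):.2e} 1/s")

    The two subtracted terms make the competing roles discussed above explicit: a finite density-gradient scale length and a larger ablation velocity both reduce the growth, and sufficiently short wavelengths are stabilized entirely.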

  18. Three-dimensional particle simulation of heavy-ion fusion beams*

    NASA Astrophysics Data System (ADS)

    Friedman, Alex; Grote, David P.; Haber, Irving

    1992-07-01

    The beams in a heavy-ion-beam-driven inertial fusion (HIF) accelerator are collisionless, nonneutral plasmas, confined by applied magnetic and electric fields. These space-charge-dominated beams must be focused onto small (few mm) spots at the fusion target, and so preservation of a small emittance is crucial. The nonlinear beam self-fields can lead to emittance growth, and so a self-consistent field description is needed. To this end, a multidimensional particle simulation code, warp [Friedman et al., Part. Accel. 37-38, 131 (1992)], has been developed and is being used to study the transport of HIF beams. The code's three-dimensional (3-D) package combines features of an accelerator code and a particle-in-cell plasma simulation. Novel techniques allow it to follow beams through many accelerator elements over long distances and around bends. This paper first outlines the algorithms employed in warp. A number of applications and corresponding results are then presented. These applications include studies of: beam drift-compression in a misaligned lattice of quadrupole focusing magnets; beam equilibria, and the approach to equilibrium; and the MBE-4 experiment [AIP Conference Proceedings 152 (AIP, New York, 1986), p. 145] recently concluded at Lawrence Berkeley Laboratory (LBL). Finally, 3-D simulations of bent-beam dynamics relevant to the planned Induction Linac Systems Experiments (ILSE) [Fessenden, Nucl. Instrum. Methods Plasma Res. A 278, 13 (1989)] at LBL are described. Axially cold beams are observed to exhibit little or no root-mean-square emittance growth at midpulse in transiting a (sharp) bend. Axially hot beams, in contrast, do exhibit some emittance growth.

  19. In-Service Design & Performance Prediction of Advanced Fusion Material Systems by Computational Modeling and Simulation

    SciTech Connect

    G. R. Odette; G. E. Lucas

    2005-11-15

    This final report on "In-Service Design & Performance Prediction of Advanced Fusion Material Systems by Computational Modeling and Simulation" (DE-FG03-01ER54632) consists of a series of summaries of work that has been published, or presented at meetings, or both. It briefly describes results on the following topics: 1) A Transport and Fate Model for Helium and Helium Management; 2) Atomistic Studies of Point Defect Energetics, Dynamics and Interactions; 3) Multiscale Modeling of Fracture consisting of: 3a) A Micromechanical Model of the Master Curve (MC) Universal Fracture Toughness-Temperature Curve Relation, KJc(T - To), 3b) An Embrittlement DTo Prediction Model for the Irradiation Hardening Dominated Regime, 3c) Non-hardening Irradiation Assisted Thermal and Helium Embrittlement of 8Cr Tempered Martensitic Steels: Compilation and Analysis of Existing Data, 3d) A Model for the KJc(T) of a High Strength NFA MA957, 3e) Cracked Body Size and Geometry Effects of Measured and Effective Fracture Toughness-Model Based MC and To Evaluations of F82H and Eurofer 97, 3-f) Size and Geometry Effects on the Effective Toughness of Cracked Fusion Structures; 4) Modeling the Multiscale Mechanics of Flow Localization-Ductility Loss in Irradiation Damaged BCC Alloys; and 5) A Universal Relation Between Indentation Hardness and True Stress-Strain Constitutive Behavior. Further details can be found in the cited references or presentations that generally can be accessed on the internet, or provided upon request to the authors. Finally, it is noted that this effort was integrated with our base program in fusion materials, also funded by the DOE OFES.

  20. Project SAFE: Simulating Alternative Futures in Education.

    ERIC Educational Resources Information Center

    Debenham, Jerry Dean

    Simulating Alternative Futures in Education (SAFE) is a simulation game dealing with the future of education from 1975 to 2024 and beyond. It is computerized on an APL direct-interaction system and can be played at any location over telephone lines. It takes approximately 1.8 hours of computer time to play, with 5 to 9 hours of preparation, and…

  1. Angular radiation temperature simulation for time-dependent capsule drive prediction in inertial confinement fusion

    SciTech Connect

    Jing, Longfei; Yang, Dong; Li, Hang; Zhang, Lu; Lin, Zhiwei; Li, Liling; Kuang, Longyu; Jiang, Shaoen; Ding, Yongkun; Huang, Yunbao

    2015-02-15

    The x-ray drive on a capsule in an inertial confinement fusion setup is crucial for ignition. Unfortunately, a direct measurement has not been possible so far. We propose an angular radiation temperature simulation to predict the time-dependent drive on the capsule. A simple model, based on the view-factor method for the simulation of the radiation temperature, is presented and compared with the experimental data obtained using the OMEGA laser facility and the simulation results acquired with VISRAD code. We found a good agreement between the time-dependent measurements and the simulation results obtained using this model. The validated model was then used to analyze the experimental results from the Shenguang-III prototype laser facility. More specifically, the variations of the peak radiation temperatures at different view angles with the albedo of the hohlraum, the motion of the laser spots, the closure of the laser entrance holes, and the deviation of the laser power were investigated. Furthermore, the time-dependent radiation temperature at different orientations and the drive history on the capsule were calculated. The results indicate that the radiation temperature from “U20W112” (named according to the diagnostic hole ID on the target chamber) can be used to approximately predict the drive temperature on the capsule. In addition, the influence of the capsule on the peak radiation temperature is also presented.
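
    The view-factor model underlying the simulation above can be reduced to a one-line estimate: the radiation temperature seen from a given orientation is the view-factor-weighted average of the fourth powers of the visible surface temperatures. The surface breakdown and temperatures in the sketch below (laser spots, re-emitting wall, cold laser entrance holes) are made-up illustrative numbers, not the hohlraum conditions analyzed in the paper.

      import numpy as np

      def effective_radiation_temperature(view_factors, temperatures_ev):
          # View-factor estimate of the radiation temperature seen from one
          # orientation: T_eff = (sum_i F_i T_i^4)^(1/4), view factors normalized.
          f = np.asarray(view_factors, dtype=float)
          t = np.asarray(temperatures_ev, dtype=float)
          f = f / f.sum()
          return float((f @ t**4) ** 0.25)

      # Made-up surface breakdown: laser spots at 300 eV, re-emitting wall at
      # 200 eV, laser entrance holes effectively cold.
      print(f"{effective_radiation_temperature([0.15, 0.75, 0.10], [300.0, 200.0, 5.0]):.0f} eV")

    Because different diagnostic view angles weight the spots, wall, and holes differently, the same kind of sum explains why the peak temperatures measured at different orientations differ, as analyzed in the paper.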

  2. Three-Dimensional Simulations of the Deceleration Phase of Inertial Fusion Implosions

    NASA Astrophysics Data System (ADS)

    Woo, K. M.; Betti, R.; Bose, A.; Epstein, R.; Delettrez, J. A.; Anderson, K. S.; Yan, R.; Chang, P.-Y.; Jonathan, D.; Charissis, M.

    2015-11-01

    The three-dimensional radiation-hydrodynamics code DEC3D has been developed to model the deceleration phase of direct-drive inertial confinement fusion implosions. The code uses the approximate Riemann solver on a moving mesh to achieve high resolution near discontinuities. The domain decomposition parallelization strategy is implemented to maintain high computation efficiency for the 3-D calculation through message passing interface. The implicit thermal diffusion is solved by the parallel successive-over-relaxation iteration. Results from 3-D simulations of low-mode Rayleigh-Taylor instability are presented and compared with 2-D results. A systematic comparison of yields, pressures, temperatures, and areal densities between 2-D and 3-D is carried out to determine the additional degradation in target performance caused by the three-dimensionality of the nonuniformities. This material is based upon work supported by the Department of Energy National Nuclear Security Administration under Award Number DE-NA0001944 and DE-FC02-04ER54789 (Fusion Science Center).
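
    The abstract notes that the implicit thermal diffusion in DEC3D is solved by parallel successive over-relaxation. A serial, two-dimensional sketch of that idea is shown below: one backward-Euler diffusion step solved by SOR sweeps with fixed boundary values. The grid size, diffusivity, and relaxation factor are arbitrary placeholders, and the parallel, 3D, moving-mesh aspects of the actual solver are not represented.

      import numpy as np

      def implicit_diffusion_step_sor(t_old, dt, dx, kappa=1.0, omega=1.5, n_sweeps=200):
          # One backward-Euler step of the heat equation on a 2D grid with fixed
          # boundary values, solved by successive over-relaxation (SOR) sweeps.
          r = kappa * dt / dx**2
          t_new = t_old.copy()
          for _ in range(n_sweeps):
              for i in range(1, t_old.shape[0] - 1):
                  for j in range(1, t_old.shape[1] - 1):
                      gauss_seidel = (t_old[i, j] + r * (t_new[i + 1, j] + t_new[i - 1, j]
                                                         + t_new[i, j + 1] + t_new[i, j - 1])) / (1.0 + 4.0 * r)
                      t_new[i, j] = (1.0 - omega) * t_new[i, j] + omega * gauss_seidel
          return t_new

      t0 = np.zeros((32, 32))
      t0[12:20, 12:20] = 1.0                    # hot patch
      t1 = implicit_diffusion_step_sor(t0, dt=0.1, dx=1.0)
      print(f"peak temperature after one implicit step: {t1.max():.3f}")

    In the parallel setting the same sweeps are typically organized as red-black orderings over domain-decomposed blocks so that neighboring updates can proceed concurrently.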

  3. Application of proton boron fusion reaction to radiation therapy: A Monte Carlo simulation study

    SciTech Connect

    Yoon, Do-Kun; Jung, Joo-Young; Suh, Tae Suk

    2014-12-01

    Three alpha particles are emitted from the point of reaction between a proton and boron. The alpha particles are effective in inducing the death of a tumor cell. After boron is accumulated in the tumor region, protons delivered from outside the body can react with the boron there. The boron causes an increase of the proton's maximum dose level, and only the tumor cells receive this additional, more critical damage. In addition, a prompt gamma ray is emitted from the proton boron reaction point. Here, the effectiveness of proton boron fusion therapy is verified using Monte Carlo simulations. We found that a dramatic increase, by more than half, of the proton's maximum dose level was induced by the boron in the tumor region. This increase occurred only when the proton's maximum dose point was located within the boron uptake region. In addition, the 719 keV prompt gamma ray peak produced by the proton boron fusion reaction was positively detected. This therapy method offers advantages such as the application of the Bragg peak to the therapy, accurate targeting of the tumor, improved therapy effects, and monitoring of the therapy region during treatment.

  4. Neural-network accelerated fusion simulation with self-consistent core-pedestal coupling

    NASA Astrophysics Data System (ADS)

    Meneghini, O.; Candy, J.; Snyder, P. B.; Staebler, G.; Belli, E.

    2016-10-01

    Practical fusion Whole Device Modeling (WDM) simulations require predictions that are fast yet account for the sensitivity of fusion performance to the boundary constraint imposed by the pedestal structure of H-mode plasmas, a sensitivity that arises from the stiffness of the core transport models. This poster presents the development of a set of neural-network (NN) models for the pedestal structure (as predicted by the EPED model) and for the neoclassical and turbulent transport fluxes (as predicted by the NEO and TGLF codes, respectively), and their self-consistent coupling within the TGYRO transport code. The results are benchmarked against those obtained via the coupling scheme described in [Meneghini PoP 2016]. By substituting the most demanding codes with their NN-accelerated versions, the solution can be found at a fraction of the computational cost of the original coupling scheme, thereby combining the accuracy of a high-fidelity model with the fast turnaround time of a reduced model. Work supported by U.S. DOE DE-FC02-04ER54698 and DE-FG02-95ER54309.
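
    The workflow described above, training a cheap surrogate offline and substituting it for the expensive model inside the profile-matching iteration, can be illustrated with a deliberately tiny stand-in. In the sketch below a polynomial fit plays the role of the neural network, a made-up analytic function plays the role of the expensive flux code, and the "transport solve" is a one-cell flux-matching bisection; none of these correspond to the actual EPED, NEO, TGLF, or TGYRO components.

      import numpy as np

      def expensive_flux(grad_t):
          # Stand-in for a costly turbulent-flux evaluation (e.g. one call to a
          # high-fidelity transport code): a stiff, nonlinear flux vs. gradient.
          return 0.05 * max(grad_t - 1.0, 0.0) ** 2.5

      # "Train" a cheap surrogate offline on sampled inputs.  A polynomial fit
      # stands in for the neural network; the workflow, not the architecture,
      # is the point of this sketch.
      x_train = np.linspace(0.0, 6.0, 400)
      y_train = [expensive_flux(v) for v in x_train]
      surrogate = np.polynomial.Polynomial.fit(x_train, y_train, deg=8)

      def matched_gradient(flux_model, target_flux=0.3, lo=0.0, hi=6.0, tol=1e-6):
          # One-cell caricature of the flux-matching iteration in a transport
          # solver: find the gradient whose modeled flux equals the target flux.
          while hi - lo > tol:
              mid = 0.5 * (lo + hi)
              if flux_model(mid) < target_flux:
                  lo = mid
              else:
                  hi = mid
          return 0.5 * (lo + hi)

      g_full = matched_gradient(expensive_flux)
      g_fast = matched_gradient(lambda g: float(surrogate(g)))
      print(f"matched gradient: full model {g_full:.3f}, surrogate {g_fast:.3f}")

    The payoff mirrors the one claimed above: the matched solution is nearly unchanged, but each surrogate evaluation costs a polynomial evaluation rather than a full code run.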

  5. Application of proton boron fusion reaction to radiation therapy: A Monte Carlo simulation study

    NASA Astrophysics Data System (ADS)

    Yoon, Do-Kun; Jung, Joo-Young; Suh, Tae Suk

    2014-12-01

    Three alpha particles are emitted from the point of reaction between a proton and a boron nucleus. The alpha particles are effective in inducing tumor cell death. After boron has accumulated in the tumor region, a proton beam delivered from outside the body can react with the boron in the tumor. The boron increases the proton's maximum dose level so that only the tumor cells are damaged more critically. In addition, a prompt gamma ray is emitted from the proton-boron reaction point. Here, the effectiveness of proton boron fusion therapy is verified using Monte Carlo simulations. We found that the boron in the tumor region increased the proton's maximum dose level by more than half. This increase occurred only when the proton's maximum dose point was located within the boron uptake region. In addition, the 719 keV prompt gamma ray peak produced by the proton boron fusion reaction was clearly detected. This therapy method offers advantages such as the application of the Bragg peak to therapy, accurate tumor targeting, improved therapeutic effect, and monitoring of the treated region during treatment.

  6. Multimode guidance project low frequency ECM simulator: Hardware description

    NASA Astrophysics Data System (ADS)

    Kaye, H. M.

    1982-10-01

    The Multimode Guidance (MMG) Project, part of the Army/Navy Area Defense SAM Technology Prototyping Program, was established to conduct a feasibility demonstration of multimode guidance concepts. Prototype guidance units for advanced, long-range missiles are being built and tested under MMG Project sponsorship. The Johns Hopkins University Applied Physics Laboratory has been designated as the Government Agent for countermeasures for this project. In support of this effort, a family of computer-controlled ECM simulators is being developed for validation of contractors' multimode guidance prototype designs. The design of the Low Frequency ECM Simulator is documented in two volumes: this report, Volume A, describes the hardware design of the simulator; Volume B describes the software design. This computer-controlled simulator can simulate up to six surveillance-frequency jammers in B through F bands and will be used to evaluate the performance of home-on-jamming guidance modes in multiple-jammer environments.

  7. Simulations of alpha parameters in a TFTR DT supershot with high fusion power

    SciTech Connect

    Budny, R.V.; Bell, M.G.; Janos, A.C.

    1995-07-01

    A TFTR supershot with a plasma current of 2.5 MA, neutral beam heating power of 33.7 MW, and a peak DT fusion power of 7.5 MW is studied using the TRANSP plasma analysis code. Simulations of alpha parameters such as the alpha heating, pressure, and distributions in energy and v∥/v are given. The effects of toroidal ripple and mixing of the fast alpha particles during the sawteeth observed after the neutral beam injection phase are modeled. The distributions of alpha particles on the outer midplane are peaked near forward and backward v∥/v. Ripple losses deplete the distributions in the vicinity of v∥/v ≈ -0.4. Sawtooth mixing of fast alpha particles is computed to reduce their central density and broaden their width in energy.

  8. Integrated simulation of magnetic-field-assist fast ignition laser fusion

    NASA Astrophysics Data System (ADS)

    Johzaki, T.; Nagatomo, H.; Sunahara, A.; Sentoku, Y.; Sakagami, H.; Hata, M.; Taguchi, T.; Mima, K.; Kai, Y.; Ajimi, D.; Isoda, T.; Endo, T.; Yogo, A.; Arikawa, Y.; Fujioka, S.; Shiraga, H.; Azechi, H.

    2017-01-01

    To enhance the core heating efficiency in fast ignition laser fusion, the concept of relativistic electron beam guiding by external magnetic fields was evaluated through integrated simulations for FIREX-class targets. For the cone-attached shell target, the core heating performance deteriorates when magnetic fields are applied, since the core is considerably deformed and most of the fast electrons are reflected by the magnetic mirror formed through the implosion. On the other hand, in the case of a cone-attached solid-ball target, the implosion is more stable under a kilotesla-class magnetic field. In addition, a feasible magnetic field configuration is formed through the implosion. As a result, the core heating efficiency doubles with magnetic guiding. The dependence of core heating properties on the heating pulse shot timing was also investigated for the solid-ball target.

  9. Flow design and simulation of a gas compression system for hydrogen fusion energy production

    NASA Astrophysics Data System (ADS)

    Avital, E. J.; Salvatore, E.; Munjiza, A.; Suponitsky, V.; Plant, D.; Laberge, M.

    2017-08-01

    An innovative gas compression system is proposed and computationally investigated to achieve the short response time needed in engineering applications such as hydrogen fusion energy reactors and high-speed hammers. The system consists of a reservoir containing high-pressure gas connected to a straight tube, which in turn is connected to a spherical duct; at the sphere's centre the plasma resides in the case of a fusion reactor. A diaphragm located inside the straight tube separates the reservoir's high-pressure gas from the rest of the plenum. Once the diaphragm is breached, the high-pressure gas enters the plenum to drive pistons located on the inner wall of the spherical duct that eventually compress the plasma. Quasi-1D and axisymmetric flow formulations are used to design and analyse the flow dynamics. A spike is designed for the interface between the straight tube and the spherical duct to provide a smooth geometry transition for the flow. Flow simulations show high supersonic flow hitting the end of the spherical duct, generating a return shock wave that propagates upstream and raises the pressure above the reservoir pressure, as in the hammer wave problem, potentially giving a temporary pressure boost to the pistons. Good agreement is found between the two flow formulations, pointing to the usefulness of the quasi-1D formulation as a rapid solver; nevertheless, a mild time delay occurred in the axisymmetric flow simulation due to moderate two-dimensionality effects. The compression system settles down in a few milliseconds for a spherical duct of 0.8 m diameter using helium gas and a uniform duct cross-section area. Various system geometries are analysed using instantaneous and time-history flow plots.
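
    The wave dynamics set off by breaching the diaphragm can be illustrated in heavily simplified form. The sketch below is a constant-area 1-D finite-volume Euler solver with an HLL approximate Riemann flux and invented initial pressures; it shows a reservoir burst driving a shock down a tube, but it omits the quasi-1D area variation, the spike geometry, the pistons, and the axisymmetric effects treated by the authors' solvers.

    ```python
    import numpy as np

    # 1-D Euler equations with an HLL approximate Riemann flux (finite volume).
    gamma = 1.4

    def primitive(U):
        rho = U[0]
        u = U[1] / rho
        p = (gamma - 1.0) * (U[2] - 0.5 * rho * u**2)
        return rho, u, p

    def flux(U):
        rho, u, p = primitive(U)
        return np.array([rho * u, rho * u**2 + p, u * (U[2] + p)])

    def hll_flux(UL, UR):
        rhoL, uL, pL = primitive(UL)
        rhoR, uR, pR = primitive(UR)
        cL, cR = np.sqrt(gamma * pL / rhoL), np.sqrt(gamma * pR / rhoR)
        sL = np.minimum(uL - cL, uR - cR)          # fastest left-going wave estimate
        sR = np.maximum(uL + cL, uR + cR)          # fastest right-going wave estimate
        FL, FR = flux(UL), flux(UR)
        F_star = (sR * FL - sL * FR + sL * sR * (UR - UL)) / (sR - sL)
        return np.where(sL >= 0.0, FL, np.where(sR <= 0.0, FR, F_star))

    # Diaphragm-burst initial state: high-pressure reservoir gas on the left half.
    nx = 400
    dx = 1.0 / nx
    left = np.arange(nx) < nx // 2
    rho = np.where(left, 10.0, 1.0)
    p = np.where(left, 10.0, 0.1)
    u = np.zeros(nx)
    U = np.array([rho, rho * u, p / (gamma - 1.0) + 0.5 * rho * u**2])

    t, t_end, cfl = 0.0, 0.2, 0.4
    while t < t_end:
        rho_, u_, p_ = primitive(U)
        dt = cfl * dx / np.max(np.abs(u_) + np.sqrt(gamma * p_ / rho_))
        F = hll_flux(U[:, :-1], U[:, 1:])                    # interface fluxes
        U[:, 1:-1] -= dt / dx * (F[:, 1:] - F[:, :-1])       # update interior cells
        t += dt

    rho_, u_, p_ = primitive(U)
    print("pressure behind the transmitted shock:", p_[int(0.7 * nx)])
    ```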

  10. An Overview of the Distributed Space Exploration Simulation (DSES) Project

    NASA Technical Reports Server (NTRS)

    Crues, Edwin Z.; Chung, Victoria I.; Blum, Michael G.; Bowman, James D.

    2007-01-01

    This paper describes the Distributed Space Exploration Simulation (DSES) Project, a research and development collaboration between NASA centers which investigates technologies and processes related to integrated, distributed simulation of complex space systems in support of NASA's Exploration Initiative. In particular, it describes the three major components of DSES: network infrastructure, software infrastructure and simulation development. With regard to network infrastructure, DSES is developing a Distributed Simulation Network for use by all NASA centers. With regard to software, DSES is developing software models, tools and procedures that streamline distributed simulation development and provide an interoperable infrastructure for agency-wide integrated simulation. Finally, with regard to simulation development, DSES is developing an integrated end-to-end simulation capability to support NASA development of new exploration spacecraft and missions. This paper presents the current status and plans for these three areas, including examples of specific simulations.

  11. Overview of the Simulation of Wave Interactions with MHD Project (SWIM)

    NASA Astrophysics Data System (ADS)

    Batchelor, Donald

    2010-11-01

    The SWIM center has the scientific objectives of: improving our understanding of interactions that both RF wave and particle sources have on extended-MHD phenomena, improving our capability for predicting and optimizing the performance of burning plasmas, developing an integrated computational system for treating multi-physics phenomena with the required flexibility and extensibility to serve as a prototype for the Fusion Simulation Project, addressing mathematics issues related to the multi-scale, coupled physics of RF waves and extended MHD, and optimizing the integrated system on high performance computers. Our Center has now built an end-to-end computational system that allows existing physics codes to be able to function together in a parallel environment and connects them to utility software components and data management systems. We have used this framework to couple together state-of-the-art fusion energy codes to produce a unique and world-class simulation capability. A physicist's overview of the Integrated Plasma Simulator (IPS) will be given and applications described. For example the IPS is being employed to support ITER with operational scenario studies.

  12. Classical molecular dynamics simulations of fusion and fragmentation in fullerene-fullerene collisions

    NASA Astrophysics Data System (ADS)

    Verkhovtsev, Alexey; Korol, Andrei V.; Solovyov, Andrey V.

    2017-08-01

    We present the results of classical molecular dynamics simulations of collision-induced fusion and fragmentation of C60 fullerenes, performed by means of the MBN Explorer software package. The simulations provide information on structural differences of the fused compound depending on kinematics of the collision process. The analysis of fragmentation dynamics at different initial conditions shows that the size distributions of produced molecular fragments are peaked for dimers, which is in agreement with a well-established mechanism of C60 fragmentation via preferential C2 emission. Atomic trajectories of the colliding particles are analyzed and different fragmentation patterns are observed and discussed. On the basis of the performed simulations, characteristic time of C2 emission is estimated as a function of collision energy. The results are compared with experimental time-of-flight distributions of molecular fragments and with earlier theoretical studies. Considering the widely explored case study of C60-C60 collisions, we demonstrate broad capabilities of the MBN Explorer software, which can be utilized for studying collisions of a broad variety of nanoscale and biomolecular systems by means of classical molecular dynamics.

  13. Project ITCH: Interactive Digital Simulation in Electrical Engineering Education.

    ERIC Educational Resources Information Center

    Bailey, F. N.; Kain, R. Y.

    A two-stage project is investigating the educational potential of a low-cost time-sharing system used as a simulation tool in Electrical Engineering (EE) education. Phase I involves a pilot study and Phase II a full integration. The system employs interactive computer simulation to teach engineering concepts which are not well handled by…

  14. A simulation study for radiation treatment planning based on the atomic physics of the proton-boron fusion reaction

    NASA Astrophysics Data System (ADS)

    Kim, Sunmi; Yoon, Do-Kun; Shin, Han-Back; Jung, Joo-Young; Kim, Moo-Sub; Kim, Kyeong-Hyeon; Jang, Hong-Seok; Suh, Tae Suk

    2017-03-01

    The purpose of this research is to demonstrate, based on a Monte Carlo simulation code, the procedure of radiation treatment planning for proton-boron fusion therapy (PBFT). A discrete proton beam (60-120 MeV) relevant to the Bragg peak was simulated using the Monte Carlo n-particle extended (MCNPX, Ver. 2.6.0, Los Alamos National Laboratory, Los Alamos, NM, USA) simulation code. After computed tomography (CT) scanning of a virtual water phantom including air cavities, the acquired CT images were converted for use by the simulation source code. We set boron uptake regions (BURs) in the simulated water phantom to achieve the proton-boron fusion reaction. Proton sources irradiated the BURs in the phantom. The acquired dose maps were overlapped with the original CT image of the phantom to analyze the dose-volume histogram (DVH). We successfully confirmed amplification of the proton doses (average: 130%) at the target regions. From the DVH result for each simulation, we acquired a relatively accurate dose map for the treatment. A simulation was conducted to characterize the dose distribution and verify the feasibility of proton-boron fusion therapy (PBFT). We observed the variation in proton range and developed a tumor-targeting technique for treatment that was more accurate and powerful than both conventional proton therapy and boron-neutron capture therapy.
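
    Because the analysis above hinges on dose-volume histograms, the sketch below shows one conventional way to compute a cumulative DVH from a 3-D dose array and a structure mask. The dose field, tumor mask, and units are synthetic placeholders, not data from the MCNPX study.

    ```python
    import numpy as np

    def cumulative_dvh(dose, mask, n_bins=100):
        """Cumulative dose-volume histogram: for each dose level, the fraction of
        the structure's voxels receiving at least that dose."""
        d = dose[mask]                                        # doses inside the structure
        edges = np.linspace(0.0, d.max(), n_bins + 1)
        counts, _ = np.histogram(d, bins=edges)
        volume_at_least = 1.0 - np.cumsum(counts) / d.size    # fraction receiving >= bin edge
        return edges[1:], volume_at_least

    # Synthetic example: a spherical "tumor" inside a 64^3 dose grid.
    grid = np.indices((64, 64, 64))
    r = np.sqrt(((grid - 32) ** 2).sum(axis=0))
    tumor_mask = r < 10
    dose = 60.0 * np.exp(-r / 25.0) + np.random.default_rng(0).normal(0.0, 1.0, r.shape)

    dose_bins, frac_volume = cumulative_dvh(dose, tumor_mask)
    d95 = dose_bins[np.searchsorted(-frac_volume, -0.95)]     # dose covering 95% of the volume
    print(f"D95 of the synthetic tumor: {d95:.1f} (arbitrary dose units)")
    ```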

  15. Three dimensional simulations of space charge dominated heavy ion beams with applications to inertial fusion energy

    SciTech Connect

    Grote, David Peter

    1994-11-01

    Heavy ion fusion requires injection, transport and acceleration of high-current beams. Detailed simulation of such beams requires fully self-consistent space charge fields and three dimensions. WARP3D, developed for this purpose, is a particle-in-cell plasma simulation code optimized to work within the framework of an accelerator's lattice of accelerating, focusing, and bending elements. The code has been used to study several test problems and for simulations and design of experiments. Two applications are drift compression experiments on the MBE-4 facility at LBL and design of the electrostatic quadrupole injector for the proposed ILSE facility. With aggressive drift compression on MBE-4, anomalous emittance growth was observed. Simulations carried out to examine possible causes showed that essentially all the emittance growth is a result of external forces on the beam and not of internal beam space-charge fields. The dominant external forces are the dodecapole component of the focusing fields, the image forces on the surrounding pipe and conductors, and the octopole fields that result from the structure of the quadrupole focusing elements. The goal of the design of the electrostatic quadrupole injector is to produce a beam of as low emittance as possible. The simulations show that the dominant effects that increase the emittance are the nonlinear octopole fields and the energy effect (fields in the axial direction that are off-axis). Injectors were designed that minimize the beam envelope in order to reduce the effect of the nonlinear fields. Alterations to the quadrupole structure that reduce the nonlinear fields further were examined. Comparisons with a scaled experiment showed very good agreement.
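
    For readers unfamiliar with space-charge-dominated transport, a common reduced description (separate from the full WARP3D particle-in-cell treatment) is the Kapchinskij-Vladimirskij envelope model. The sketch below integrates the KV envelope equations in a uniform focusing channel with made-up perveance, emittance and focusing strength; it is only meant to show how the space-charge and emittance terms enter.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Kapchinskij-Vladimirskij (KV) envelope equations for the beam edge radii a(s), b(s)
    # in a uniform focusing channel of strength k0^2:
    #   a'' = -k0^2 a + 2K/(a+b) + eps^2/a^3
    #   b'' = -k0^2 b + 2K/(a+b) + eps^2/b^3
    # K is the generalized perveance (space charge), eps the unnormalized emittance.
    K = 5.0e-4           # perveance (illustrative value)
    eps = 5.0e-5         # emittance [m rad] (illustrative value)
    k0sq = 281.0         # focusing strength [1/m^2], chosen to roughly match a 2 mm beam

    def rhs(s, y):
        a, ap, b, bp = y
        sc = 2.0 * K / (a + b)                    # space-charge defocusing term
        return [ap, -k0sq * a + sc + eps**2 / a**3,
                bp, -k0sq * b + sc + eps**2 / b**3]

    # Start slightly mismatched so the envelope oscillation is visible.
    sol = solve_ivp(rhs, (0.0, 5.0), [2.2e-3, 0.0, 1.8e-3, 0.0],
                    max_step=1e-3, rtol=1e-9, dense_output=True)
    a_of_s = sol.sol(np.linspace(0.0, 5.0, 500))[0]
    print(f"envelope radius oscillates between {a_of_s.min()*1e3:.2f} and {a_of_s.max()*1e3:.2f} mm")
    ```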

  16. Modelling and simulation of new generation powerful gyrotrons for the fusion research

    NASA Astrophysics Data System (ADS)

    Sabchevski, S.; Zhelyazkov, I.

    2007-04-01

    One of the important issues related to cyclotron resonance heating (CRH) and current drive of fusion plasmas in thermonuclear reactors (tokamaks and stellarators) is the development of multi-megawatt-class gyrotrons. There are generally three stages in the implementation of that task, notably (i) elaborating a novel generation of software tools for the physical modelling and simulation of such gyrotrons, (ii) their computer-aided design (CAD) and construction on the basis of the simulation results, and finally (iii) testing the gyrotrons under real experimental conditions. This tutorial paper concerns the first item: the development of software tools. In co-operation with the Institute for Pulsed Power and Microwave Technology at the Forschungszentrum Karlsruhe, Germany, and the Centre de Recherches en Physique des Plasmas at École Polytechnique Fédérale de Lausanne, Switzerland, we work on the conceptual design of these software tools. The basic conclusions are that numerical codes for gyrotron modelling should possess the following essential characteristics: (a) portability, (b) extensibility, (c) orientation toward the solution of practical problems (i.e., computer programs that can be used in the design process), (d) a basis in self-consistent 3D physical models that take into account the departure from axial symmetry, and (e) the ability to simulate time-dependent processes (electrostatic PIC simulation) alongside trajectory analysis (ray tracing simulation). Here, we discuss how various existing numerical codes have to be improved and implemented via advanced programming technologies for state-of-the-art computer systems including clusters, grids, parallel platforms, and supercomputers.

  17. The IDA Advanced Technology Combat Simulation Project

    DTIC Science & Technology

    1990-09-01

    This paper was prepared as part of IDA Project 9000-623 under the IDA Central Research... (the remainder of this record is illegible OCR text).

  18. Final Report for the "Fusion Application for Core-Edge Transport Simulations (FACETS)"

    SciTech Connect

    Cary, John R; Kruger, Scott

    2014-10-02

    The FACETS project over its lifetime developed the first self-consistent core-edge coupling capability, a new solver for modeling core transport in tokamaks, and a new code for modeling wall physics over long time scales, and it significantly improved the capabilities and performance of the legacy components UEDGE, NUBEAM, GLF23, GYRO, and BOUT++. These improved capabilities leveraged the team's expertise in applied mathematics (solvers and algorithms) and computer science (performance improvements and language interoperability). The project pioneered new methods for tackling the complexity of simulating tokamak experiments.

  19. Using a Scientific Process for Curriculum Development and Formative Evaluation: Project FUSION

    ERIC Educational Resources Information Center

    Doabler, Christian; Cary, Mari Strand; Clarke, Benjamin; Fien, Hank; Baker, Scott; Jungjohann, Kathy

    2011-01-01

    Given the vital importance of using a scientific approach for curriculum development, the authors employed a design experiment methodology (Brown, 1992; Shavelson et al., 2003) to develop and evaluate FUSION, a first-grade mathematics intervention intended for students with, or at risk for, mathematics disabilities. FUSION, funded through IES…

  20. A Project Management Approach to Using Simulation for Cost Estimation on Large, Complex Software Development Projects

    NASA Technical Reports Server (NTRS)

    Mizell, Carolyn; Malone, Linda

    2007-01-01

    It is very difficult for project managers to develop accurate cost and schedule estimates for large, complex software development projects. None of the approaches or tools available today can estimate the true cost of software with any high degree of accuracy early in a project. This paper provides an approach that utilizes a software development process simulation model that considers and conveys the level of uncertainty that exists when developing an initial estimate. A NASA project will be analyzed using simulation and data from the Software Engineering Laboratory to show the benefits of such an approach.
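
    The kind of uncertainty-conveying estimate described above can be illustrated with a generic Monte Carlo sketch: each development task is given a low/most-likely/high effort range, and the simulation reports percentiles of total cost rather than a single point value. The task list, distributions and labor rate below are invented for the example and are not the NASA/SEL process model.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Task effort estimates in person-months: (low, most likely, high).
    tasks = {
        "requirements": (3, 5, 10),
        "design":       (6, 9, 18),
        "coding":       (12, 18, 36),
        "testing":      (8, 12, 30),
    }
    labor_rate = 15_000.0   # dollars per person-month (illustrative)
    n_trials = 100_000

    # Sample each task from a triangular distribution and sum across tasks.
    total_effort = sum(
        rng.triangular(lo, mode, hi, size=n_trials) for lo, mode, hi in tasks.values()
    )
    total_cost = total_effort * labor_rate

    for pct in (10, 50, 90):
        print(f"P{pct} cost estimate: ${np.percentile(total_cost, pct):,.0f}")
    ```

    Reporting a P10-P90 spread early in a project conveys the level of uncertainty instead of implying false precision in a single number.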

  1. FY2014 FES (Fusion Energy Sciences) Theory & Simulation Performance Target, Final Report

    SciTech Connect

    Fu, Guoyong; Budny, Robert; Gorelenkov, Nikolai; Poli, Francesca; Chen, Yang; McClenaghan, Joseph; Lin, Zhihong; Spong, Don; Bass, Eric; Waltz, Ron

    2014-10-14

    We report here the work done for the FY14 OFES Theory Performance Target as given below: "Understanding alpha particle confinement in ITER, the world's first burning plasma experiment, is a key priority for the fusion program. In FY 2014, determine linear instability trends and thresholds of energetic particle-driven shear Alfven eigenmodes in ITER for a range of parameters and profiles using a set of complementary simulation models (gyrokinetic, hybrid, and gyrofluid). Carry out initial nonlinear simulations to assess the effects of the unstable modes on energetic particle transport". In the past year (FY14), a systematic study of the alpha-driven Alfven modes in ITER has been carried out jointly by researchers from six institutions using seven codes: the transport simulation code TRANSP (R. Budny and F. Poli, PPPL), three gyrokinetic codes, GEM (Y. Chen, Univ. of Colorado), GTC (J. McClenaghan, Z. Lin, UCI), and GYRO (E. Bass, R. Waltz, UCSD/GA), the hybrid code M3D-K (G.Y. Fu, PPPL), the gyro-fluid code TAEFL (D. Spong, ORNL), and the linear kinetic stability code NOVA-K (N. Gorelenkov, PPPL). A range of ITER parameters and profiles is specified by TRANSP simulations of a hybrid scenario case and a steady-state scenario case. Based on the specified ITER equilibria, linear stability calculations are carried out to determine the stability boundary of alpha-driven high-n TAEs using the five initial value codes (GEM, GTC, GYRO, M3D-K, and TAEFL) and the kinetic stability code (NOVA-K). Both the effects of alpha particles and of beam ions have been considered. Finally, the effects of the unstable modes on energetic particle transport have been explored using GEM and M3D-K.

  2. Uncertainty Quantification For Physical and Numerical Diffusion Models In Inertial Confinement Fusion Simulations

    NASA Astrophysics Data System (ADS)

    Rana, Verinder S.

    This thesis concerns simulations of inertial confinement fusion (ICF). Inertial confinement experiments are carried out at a large-scale facility, the National Ignition Facility. The experiments have failed to reproduce design calculations, so uncertainty quantification of the calculations is an important asset. Uncertainties can be classified as aleatoric or epistemic; this thesis is concerned with aleatoric uncertainty quantification. Among the many uncertain aspects that affect the simulations, we have narrowed our study to a few sources of uncertainty. The first source of uncertainty we present is the amount of pre-heating of the fuel by hot electrons. The second source of uncertainty we consider is the effect of algorithmic and physical transport diffusion on the hot-spot thermodynamics. Physical transport mechanisms play an important role for the entire duration of the ICF implosion, so modeling them correctly is vital. In addition, codes that simulate material mixing introduce numerically (algorithmically) generated transport across the material interfaces, which adds another layer of uncertainty to the solution through the artificially added diffusion. The third source of uncertainty we consider is physical model uncertainty. The fourth source of uncertainty is a single localized surface perturbation (a divot), which creates a perturbation to the solution that can potentially enter the hot spot and diminish the thermonuclear environment; jets of ablator material are hypothesized to enter the hot spot and cool the core, contributing to reaction levels lower than predicted. A plasma transport package, Transport for Inertial Confinement Fusion (TICF), has been implemented in the radiation hydrodynamics code FLASH from the University of Chicago. TICF has thermal, viscous and mass diffusion models that span the entire ICF implosion regime. We introduced a Quantum Molecular Dynamics calibrated thermal conduction model due to Hu for

  3. Modern gyrokinetic particle-in-cell simulation of fusion plasmas on top supercomputers

    DOE PAGES

    Wang, Bei; Ethier, Stephane; Tang, William; ...

    2017-06-29

    The Gyrokinetic Toroidal Code at Princeton (GTC-P) is a highly scalable and portable particle-in-cell (PIC) code. It solves the 5D Vlasov-Poisson equation featuring efficient utilization of modern parallel computer architectures at the petascale and beyond. Motivated by the goal of developing a modern code capable of dealing with the physics challenge of increasing problem size with sufficient resolution, new thread-level optimizations have been introduced as well as a key additional domain decomposition. GTC-P's multiple levels of parallelism, including inter-node 2D domain decomposition and particle decomposition, as well as intra-node shared memory partition and vectorization have enabled pushing the scalability of the PIC method to extreme computational scales. In this paper, we describe the methods developed to build a highly parallelized PIC code across a broad range of supercomputer designs. This particularly includes implementations on heterogeneous systems using NVIDIA GPU accelerators and Intel Xeon Phi (MIC) co-processors and performance comparisons with state-of-the-art homogeneous HPC systems such as Blue Gene/Q. New discovery science capabilities in the magnetic fusion energy application domain are enabled, including investigations of Ion-Temperature-Gradient (ITG) driven turbulence simulations with unprecedented spatial resolution and long temporal duration. Performance studies with realistic fusion experimental parameters are carried out on multiple supercomputing systems spanning a wide range of cache capacities, cache-sharing configurations, memory bandwidth, interconnects and network topologies. These performance comparisons using a realistic discovery-science-capable domain application code provide valuable insights on optimization techniques across one of the broadest sets of current high-end computing platforms worldwide.
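
    Gyrokinetic PIC as implemented in GTC-P involves far more machinery than can be shown here, but the basic particle-in-cell cycle it builds on (scatter charge to a grid, solve for the field, gather the field back to the particles, push the particles) can be sketched in one dimension. The following is a generic 1-D electrostatic PIC with periodic boundaries and invented parameters, not GTC-P's algorithm or its domain-decomposition scheme.

    ```python
    import numpy as np

    # Generic 1-D electrostatic PIC cycle (scatter, solve, gather, push) with
    # periodic boundaries, in normalized units where the plasma frequency is 1.
    ng, L, n_part, dt = 64, 2 * np.pi, 10000, 0.1
    dx = L / ng
    q_over_m = -1.0                 # electron charge-to-mass ratio
    weight = L / n_part             # length per macro-particle, so mean density = 1

    x = np.linspace(0.0, L, n_part, endpoint=False)
    x = (x + 0.01 * np.cos(2 * np.pi * x / L)) % L    # small sinusoidal displacement
    v = np.zeros(n_part)
    k = 2 * np.pi * np.fft.rfftfreq(ng, d=dx)         # angular wavenumbers

    for step in range(200):
        # 1) Scatter: cloud-in-cell (linear) deposition of electron charge.
        g = x / dx
        i0 = np.floor(g).astype(int)
        frac = g - i0
        i0 %= ng
        rho = np.zeros(ng)
        np.add.at(rho, i0, -(1.0 - frac) * weight / dx)
        np.add.at(rho, (i0 + 1) % ng, -frac * weight / dx)
        rho += 1.0                                    # neutralizing ion background

        # 2) Field solve: d2(phi)/dx2 = -rho via FFT, then E = -dphi/dx.
        rho_k = np.fft.rfft(rho)
        phi_k = np.zeros_like(rho_k)
        phi_k[1:] = rho_k[1:] / k[1:] ** 2
        E = np.fft.irfft(-1j * k * phi_k, n=ng)

        # 3) Gather: interpolate the field back to the particle positions.
        E_p = (1.0 - frac) * E[i0] + frac * E[(i0 + 1) % ng]

        # 4) Push: leapfrog-style velocity and position update.
        v += q_over_m * E_p * dt
        x = (x + v * dt) % L

    print("peak electron velocity after 200 steps:", np.abs(v).max())
    ```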

  4. Large-scale molecular dynamics simulations of dense plasmas: The Cimarron Project

    NASA Astrophysics Data System (ADS)

    Graziani, Frank R.; Batista, Victor S.; Benedict, Lorin X.; Castor, John I.; Chen, Hui; Chen, Sophia N.; Fichtl, Chris A.; Glosli, James N.; Grabowski, Paul E.; Graf, Alexander T.; Hau-Riege, Stefan P.; Hazi, Andrew U.; Khairallah, Saad A.; Krauss, Liam; Langdon, A. Bruce; London, Richard A.; Markmann, Andreas; Murillo, Michael S.; Richards, David F.; Scott, Howard A.; Shepherd, Ronnie; Stanton, Liam G.; Streitz, Fred H.; Surh, Michael P.; Weisheit, Jon C.; Whitley, Heather D.

    2012-03-01

    We describe the status of a new time-dependent simulation capability for dense plasmas. The backbone of this multi-institutional effort - the Cimarron Project - is the massively parallel molecular dynamics (MD) code "ddcMD," developed at Lawrence Livermore National Laboratory. The project's focus is material conditions such as exist in inertial confinement fusion experiments, and in many stellar interiors: high temperatures, high densities, significant electromagnetic fields, mixtures of high- and low-Z elements, and non-Maxwellian particle distributions. Of particular importance is our ability to incorporate into this classical MD code key atomic, radiative, and nuclear processes, so that their interacting effects under non-ideal plasma conditions can be investigated. This paper summarizes progress in computational methodology, discusses strengths and weaknesses of quantum statistical potentials as effective interactions for MD, explains the model used for quantum events possibly occurring in a collision, describes two new experimental efforts that play a central role in our validation work, highlights some significant results obtained to date, outlines concepts now being explored to deal more efficiently with the very disparate dynamical timescales that arise in fusion plasmas, and provides a careful comparison of quantum effects on electron trajectories predicted by more elaborate dynamical methods.
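
    As a toy counterpart to the large-scale MD capability described above, the sketch below integrates a few dozen ions interacting through a screened-Coulomb (Yukawa) pair potential with the velocity-Verlet scheme. The particle number, coupling, screening length and units are arbitrary, and none of ddcMD's quantum statistical potentials, atomic processes or parallelism are represented.

    ```python
    import numpy as np

    # Velocity-Verlet MD for ions interacting via a Yukawa (screened Coulomb) potential
    #   U(r) = (Q^2 / r) * exp(-r / lam)
    # in reduced units, with a cubic periodic box and the minimum-image convention.
    rng = np.random.default_rng(0)
    box, Q2, lam, dt, n_steps = 10.0, 1.0, 2.0, 0.005, 1000

    # Start from a 4x4x4 simple cubic lattice (64 particles) with small thermal velocities.
    side = 4
    gpts = (np.arange(side) + 0.5) * (box / side)
    pos = np.array(np.meshgrid(gpts, gpts, gpts, indexing="ij")).reshape(3, -1).T
    n = pos.shape[0]
    vel = rng.normal(0.0, 0.2, size=(n, 3))

    def forces(pos):
        d = pos[:, None, :] - pos[None, :, :]           # pairwise separations
        d -= box * np.round(d / box)                    # minimum-image convention
        r = np.linalg.norm(d, axis=-1)
        np.fill_diagonal(r, np.inf)                     # exclude self-interaction
        # -dU/dr = (Q^2/r^2)(1 + r/lam) exp(-r/lam); force acts along d/r.
        mag = Q2 * np.exp(-r / lam) * (1.0 / r**2 + 1.0 / (lam * r))
        return ((mag / r)[:, :, None] * d).sum(axis=1)  # total force on each particle

    F = forces(pos)
    for step in range(n_steps):
        vel += 0.5 * dt * F                             # first half kick (unit mass)
        pos = (pos + dt * vel) % box                    # drift with periodic wrap
        F = forces(pos)
        vel += 0.5 * dt * F                             # second half kick

    print("kinetic energy per particle:", 0.5 * (vel**2).sum() / n)
    ```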

  5. Fusion studies with low-intensity radioactive ion beams using an active-target time projection chamber

    NASA Astrophysics Data System (ADS)

    Kolata, J. J.; Howard, A. M.; Mittig, W.; Ahn, T.; Bazin, D.; Becchetti, F. D.; Beceiro-Novo, S.; Chajecki, Z.; Febbrarro, M.; Fritsch, A.; Lynch, W. G.; Roberts, A.; Shore, A.; Torres-Isea, R. O.

    2016-09-01

    The total fusion excitation function for 10Be+40Ar has been measured over the center-of-momentum (c.m.) energy range from 12 to 24 MeV using a time-projection chamber (TPC). The main purpose of this experiment, which was carried out in a single run of duration 90 h using a ≈100 particle per second (pps) 10Be beam, was to demonstrate the capability of an active-target TPC to determine fusion excitation functions for extremely weak radioactive ion beams. Cross sections as low as 12 mb were measured with acceptable (50%) statistical accuracy. It also proved to be possible to separate events in which charged particles were emitted from the fusion residue from those in which only neutrons were evaporated. The method permits simultaneous measurement of incomplete fusion, break-up, scattering, and transfer reactions, and therefore fully exploits the opportunities presented by the very exotic beams that will be available from the new generation of radioactive beam facilities.

  6. Proposed best practice for projects that involve modelling and simulation.

    PubMed

    O'Kelly, Michael; Anisimov, Vladimir; Campbell, Chris; Hamilton, Sinéad

    2017-03-01

    Modelling and simulation has been used in many ways when developing new treatments. To be useful and credible, it is generally agreed that modelling and simulation should be undertaken according to some kind of best practice. A number of authors have suggested elements required for best practice in modelling and simulation. Elements that have been suggested include the pre-specification of goals, assumptions, methods, and outputs. However, a project that involves modelling and simulation could be simple or complex and could be of relatively low or high importance to the project. It has been argued that the level of detail and the strictness of pre-specification should be allowed to vary, depending on the complexity and importance of the project. This best practice document does not prescribe how to develop a statistical model. Rather, it describes the elements required for the specification of a project and requires that the practitioner justify in the specification the omission of any of the elements and, in addition, justify the level of detail provided about each element. This document is an initiative of the Special Interest Group for modelling and simulation. The Special Interest Group for modelling and simulation is a body open to members of Statisticians in the Pharmaceutical Industry and the European Federation of Statisticians in the Pharmaceutical Industry. Examples of a very detailed specification and a less detailed specification are included as appendices. Copyright © 2016 John Wiley & Sons, Ltd.

  7. A web-based repository of surgical simulator projects.

    PubMed

    Leskovský, Peter; Harders, Matthias; Székely, Gábor

    2006-01-01

    The use of computer-based surgical simulators for training of prospective surgeons has been a topic of research for more than a decade. As a result, a large number of academic projects have been carried out, and a growing number of commercial products are available on the market. Keeping track of all these endeavors for established groups as well as for newly started projects can be quite arduous. Gathering information on existing methods, already traveled research paths, and problems encountered is a time consuming task. To alleviate this situation, we have established a modifiable online repository of existing projects. It contains detailed information about a large number of simulator projects gathered from web pages, papers and personal communication. The database is modifiable (with password protected sections) and also allows for a simple statistical analysis of the collected data. For further information, the surgical repository web page can be found at www.virtualsurgery.vision.ee.ethz.ch.

  8. Using gaming engines and editors to construct simulations of fusion algorithms for situation management

    NASA Astrophysics Data System (ADS)

    Lewis, Lundy M.; DiStasio, Nolan; Wright, Christopher

    2010-04-01

    In this paper we discuss issues in testing various cognitive fusion algorithms for situation management. We provide a proof-of-principle discussion and demo showing how gaming technologies and platforms could be used to devise and test various fusion algorithms, including input, processing, and output, and we look at how the proof-of-principle could lead to more advanced test beds and methods for high-level fusion in support of situation management. We develop four simple fusion scenarios and one more complex scenario in which a simple rule-based system is scripted to govern the behavior of battlespace entities.

  9. Mechanisms of Plastic and Fracture Instabilities for Alloy Development of Fusion Materials. Final Project Report for period July 15, 1998 - July 14, 2003

    SciTech Connect

    Ghoniem, N. M.

    2003-07-14

    The main objective of this research was to develop new computational tools for the simulation and analysis of plasticity and fracture mechanisms of fusion materials, and to assist in planning and assessment of corresponding radiation experiments.

  10. Perceptually aligning apical frequency regions leads to more binaural fusion of speech in a cochlear implant simulation.

    PubMed

    Staisloff, Hannah E; Lee, Daniel H; Aronoff, Justin M

    2016-07-01

    For bilateral cochlear implant users, the left and right arrays are typically not physically aligned, resulting in a degradation of binaural fusion, which can be detrimental to binaural abilities. Perceptually aligning the two arrays can be accomplished by disabling electrodes in one ear that do not have a perceptually corresponding electrode in the other side. However, disabling electrodes at the edges of the array will cause compression of the input frequency range into a smaller cochlear extent, which may result in reduced spectral resolution. An alternative approach to overcome this mismatch would be to only align one edge of the array. By aligning either only the apical or basal end of the arrays, fewer electrodes would be disabled, potentially causing less reduction in spectral resolution. The goal of this study was to determine the relative effect of aligning either the basal or apical end of the electrode with regards to binaural fusion. A vocoder was used to simulate cochlear implant listening conditions in normal hearing listeners. Speech signals were vocoded such that the two ears were either predominantly aligned at only the basal or apical end of the simulated arrays. The experiment was then repeated with a spectrally inverted vocoder to determine whether the detrimental effects on fusion were related to the spectral-temporal characteristics of the stimuli or the location in the cochlea where the misalignment occurred. In Experiment 1, aligning the basal portion of the simulated arrays led to significantly less binaural fusion than aligning the apical portions of the simulated array. However, when the input was spectrally inverted, aligning the apical portion of the simulated array led to significantly less binaural fusion than aligning the basal portions of the simulated arrays. These results suggest that, for speech, with its predominantly low frequency spectral-temporal modulations, it is more important to perceptually align the apical portion of

  11. ICCS network simulation LDRD project final report summary

    SciTech Connect

    Bryant, B

    1999-01-09

    A critical component of the NIF Integrated Computer Controls System (ICCS) is the local area network (LAN) that enables timely and reliable communication between control applications running on the 600+ computer systems distributed throughout the NIF facility. This project analyzed critical portions of the NIF ICCS network (referred to as "the network" in this report) applying the OPNET Modeler discrete event simulation package to model and simulate network operation and the Network Associates Distributed Sniffer network analyzer to collect actual network performance data in the ICCS Testbed. These tools were selected and procured for use on this project. Simulations and initial network analysis indicate that the network is capable of meeting system requirements. ICCS application software is currently in development, so test software was used to collect performance data. As application software is tested in the Testbed environment, more accurate timing information can be collected which will allow for more accurate large-scale simulations.

  12. M3D project for simulation studies of plasmas

    SciTech Connect

    Park, W.; Belova, E.V.; Fu, G.Y.; Strauss, H.R.; Sugiyama, L.E.

    1998-12-31

    The M3D (Multi-level 3D) project carries out simulation studies of plasmas of various regimes using multiple levels of physics, geometry, and mesh schemes in one code package. This paper and papers by Strauss, Sugiyama, and Belova in this workshop describe the project and present examples of current applications. The currently available physics models of the M3D project are MHD, two-fluid, gyrokinetic hot particle/MHD hybrid, and gyrokinetic particle ion/two-fluid hybrid models. The code can be run with both structured and unstructured meshes.

  13. Modeling and simulation support for ICRF heating of fusion plasmas. Annual report, 1990

    SciTech Connect

    1990-03-15

    Recent experimental, theoretical, and computational results have shown the need for and the usefulness of a combined approach to the design, analysis, and evaluation of ICH antenna configurations. The work at the University of Wisconsin (UW) in particular has shown that much-needed information on the vacuum operation of ICH antennas can be obtained with a modest experimental and computational effort. These model experiments at UW, together with SAIC simulations, have dramatically demonstrated the potential for a positive impact on the ICRF program. Results of the UW-SAIC joint ICRF antenna analysis effort have been presented at several international meetings and numerous meetings in the United States. The PPPL Bay M antenna has been modeled using the ARGUS code; the results of this effort are shown in Appendix C. SAIC has recently begun a collaboration with the ICRF antenna design and analysis group at ORNL. At present there are two separate projects underway. The first concerns simulating and determining the effect of adding slots in the antenna septum and side walls. The second concerns the modeling and simulation of the ORNL folded waveguide (FWG) concept.

  14. Atomistic simulations of deuterium irradiation on iron-based alloys in future fusion reactors

    DOE PAGES

    Safi, E.; Polvi, J.; Lasa, A.; ...

    2016-10-14

    Iron-based alloys are now being considered as plasma-facing materials for the first wall of future fusion reactors. Therefore, the iron (Fe) and carbon (C) erosion will play a key role in predicting the life-time and viability of reactors with steel walls. In this work, the surface erosion and morphology changes due to deuterium (D) irradiation in pure Fe, Fe with 1% C impurity and the cementite, are studied using molecular dynamics (MD) simulations, varying surface temperature and impact energy. The sputtering yields for both Fe and C were found to increase with incoming energy. In iron carbide, C sputtering was preferential to Fe and the deuterium was mainly trapped as D2 in bubbles, while mostly atomic D was present in Fe and Fe–1%C. The sputtering yields obtained from MD were compared to SDTrimSP yields. At lower impact energies, the sputtering mechanism was of both physical and chemical origin, while at higher energies (>100 eV) the physical sputtering dominated.

  15. Simulation of normal and pathological gaits using a fusion knowledge strategy

    PubMed Central

    2013-01-01

    Gait distortion is the first clinical manifestation of many pathological disorders. Traditionally, the gait laboratory has been the only available tool for supporting both diagnosis and prognosis, with the limitation that any clinical interpretation depends completely on the physician's expertise. This work presents a novel human gait model which fuses two important sources of gait information: an estimated Center of Gravity (CoG) trajectory and learned heel paths, thereby allowing both normal and pathological kinematic patterns to be reproduced. The CoG trajectory is approximated with a physical compass-pendulum representation that has been extended by introducing energy-accumulator elements between the pendulum ends, thereby emulating the role of the leg joints and obtaining a complete global gait description. Likewise, heel paths captured from actual data are learned to improve the performance of the physical model, while the most relevant joint trajectories are estimated using a classical inverse kinematic rule. The model is compared with standard gait patterns, obtaining a correlation coefficient of 0.96. Additionally, the model simulates neuromuscular diseases such as Parkinson's disease (phases 2, 3 and 4) and clinical signs such as crouch gait, a case in which the averaged correlation coefficient is 0.92. PMID:23844901

  16. Atomistic simulations of deuterium irradiation on iron-based alloys in future fusion reactors

    SciTech Connect

    Safi, E.; Polvi, J.; Lasa, A.; Nordlund, K.

    2016-10-14

    Iron-based alloys are now being considered as plasma-facing materials for the first wall of future fusion reactors. Therefore, the iron (Fe) and carbon (C) erosion will play a key role in predicting the life-time and viability of reactors with steel walls. In this work, the surface erosion and morphology changes due to deuterium (D) irradiation in pure Fe, Fe with 1% C impurity and the cementite, are studied using molecular dynamics (MD) simulations, varying surface temperature and impact energy. The sputtering yields for both Fe and C were found to increase with incoming energy. In iron carbide, C sputtering was preferential to Fe and the deuterium was mainly trapped as D2 in bubbles, while mostly atomic D was present in Fe and Fe–1%C. The sputtering yields obtained from MD were compared to SDTrimSP yields. At lower impact energies, the sputtering mechanism was of both physical and chemical origin, while at higher energies (>100 eV) the physical sputtering dominated.

  17. Parallel mesh support for particle-in-cell methods in magnetic fusion simulations

    NASA Astrophysics Data System (ADS)

    Yoon, Eisung; Shephard, Mark S.; Seol, E. Seegyoung; Kalyanaraman, Kaushik; Ibanez, Daniel

    2016-10-01

    As supercomputing power continues to increase, Particle-In-Cell (PIC) methods are being widely adopted for transport simulations of magnetic fusion devices. Current implementations place a copy of the entire continuum mesh and its fields used in the PIC calculations on every node. This is in general not a scalable solution, as computational power continues to grow faster than node-level memory. To address this scalability issue, while still maintaining sufficient mesh per node to control costly inter-node communication, new unstructured mesh distribution methods and an associated mesh-based PIC calculation procedure are being developed, building on the parallel unstructured mesh infrastructure (PUMI). Key components to be outlined in the presentation include (i) the mesh distribution strategy, (ii) how the particles are tracked during a push cycle, taking advantage of the unstructured mesh adjacency structures and searches based on those structures, and (iii) how the field solve steps and particle migration are controlled. Performance comparisons to the current approach will also be presented.

  18. Simulations of fusion plasmas by A 3-D, E-M particle code

    NASA Astrophysics Data System (ADS)

    Buneman, O.; Storey, L. R. O.

    1985-03-01

    The work described in the present report arose out of a proposal submitted to the U.S. Department of Energy in August 1983. It was for renewal of the then-existing Contract No. DE-AS03-7GSF0032G, titled Simulations of fusion plasmas by a 3-D, E-M particle code. Under this contract, in previous years, the Principal Investigator and his students had developed a fully electromagnetic particle code that was then - and still remains, to the best of our knowledge - the most advanced code of its kind in existence. This TRI-dimensional STANford code, now called TRISTAN, follows the motion of about 5 x 10^6 particles in a cubical volume divided into 128^3 cells, i.e., each side of the cube is divided into 128 units. TRISTAN is written in ASSEMBLER language for the Cray-1 computer and is carefully optimized. See Buneman et al. (1980) for an account of the ideas that have gone into this code.

  19. Simulation of X-ray Irradiation on Optics and Chamber Wall Materials for Inertial Fusion Energy

    SciTech Connect

    Reyes, S; Latkowski, J F; Abbott, R P; Stein, W

    2003-09-10

    We have used the ABLATOR code to analyze the effect of the x-ray emission from direct-drive targets on the optics and the first wall of a conceptual laser Inertial Fusion Energy (IFE) power plant. For this purpose, the ABLATOR code has been modified to incorporate the predicted x-ray spectrum from a generic direct-drive target. We have also introduced elongation calculations in ABLATOR to predict the thermal stresses in the optic and first-wall materials. These results have been validated with thermal diffusion calculations using the LLNL heat transfer and dynamic structural finite element codes Topaz3d and Dyna3d. One of the most relevant upgrades to the ABLATOR code is the ability to accommodate multi-material simulations. This new feature allows for more realistic modeling of typical IFE optics and first-wall materials, which may consist of a number of different layers. Finally, we have used the XAPPER facility at LLNL to develop our predictive capability and validate the results. The ABLATOR code will be further modified, as necessary, to predict the effects of x-ray irradiation both in the real IFE case and in our experiments on the XAPPER facility.

  20. SIMRAND I- SIMULATION OF RESEARCH AND DEVELOPMENT PROJECTS

    NASA Technical Reports Server (NTRS)

    Miles, R. F.

    1994-01-01

    The Simulation of Research and Development Projects program (SIMRAND) aids in the optimal allocation of R&D resources needed to achieve project goals. SIMRAND models the system subsets or project tasks as various network paths to a final goal. Each path is described in terms of task variables such as cost per hour, cost per unit, availability of resources, etc. Uncertainty is incorporated by treating task variables as probabilistic random variables. SIMRAND calculates the measure of preference for each alternative network. The networks yielding the highest utility function (or certainty equivalence) are then ranked as the optimal network paths. SIMRAND has been used in several economic potential studies at NASA's Jet Propulsion Laboratory involving solar dish power systems and photovoltaic array construction. However, any project having tasks which can be reduced to equations and related by measures of preference can be modeled. SIMRAND analysis consists of three phases: reduction, simulation, and evaluation. In the reduction phase, analytical techniques from probability theory and simulation techniques are used to reduce the complexity of the alternative networks. In the simulation phase, a Monte Carlo simulation is used to derive statistics on the variables of interest for each alternative network path. In the evaluation phase, the simulation statistics are compared and the networks are ranked in preference by a selected decision rule. The user must supply project subsystems in terms of equations based on variables (for example, parallel and series assembly line tasks in terms of number of items, cost factors, time limits, etc). The associated cumulative distribution functions and utility functions for each variable must also be provided (allowable upper and lower limits, group decision factors, etc). SIMRAND is written in Microsoft FORTRAN 77 for batch execution and has been implemented on an IBM PC series computer operating under DOS.
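
    The reduction/simulation/evaluation flow described above can be illustrated with a generic Monte Carlo sketch that compares two alternative task networks by expected utility. The networks, cost distributions and exponential utility function below are invented placeholders, not SIMRAND's actual models or decision rules.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n_trials = 50_000

    def utility(cost, risk_tolerance=500.0):
        """Exponential (risk-averse) utility of a cost outcome: higher is better."""
        return -np.expm1(cost / risk_tolerance)      # = -(exp(cost/R) - 1)

    # Two alternative networks to the same goal. Each task cost is sampled from a
    # triangular distribution (low, most likely, high); series tasks add.
    networks = {
        "A: proven components": [(80, 100, 140), (50, 60, 90), (30, 40, 60)],
        "B: novel component":   [(40, 60, 200), (50, 60, 90), (30, 40, 60)],
    }

    # Simulation phase: sample total cost for each alternative network.
    results = {}
    for name, tasks in networks.items():
        total = sum(rng.triangular(lo, mode, hi, size=n_trials) for lo, mode, hi in tasks)
        results[name] = utility(total).mean()

    # Evaluation phase: rank the alternatives by expected utility.
    for name, eu in sorted(results.items(), key=lambda kv: kv[1], reverse=True):
        print(f"{name}: expected utility = {eu:.4f}")
    ```

    A risk-averse utility penalizes the fat upper tail of the riskier network, so the ranking can differ from a comparison of mean costs alone.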

  1. Fast discontinuous Galerkin lattice-Boltzmann simulations on GPUs via maximal kernel fusion

    NASA Astrophysics Data System (ADS)

    Mazzeo, Marco D.

    2013-03-01

    A GPU implementation of the discontinuous Galerkin lattice-Boltzmann method with square spectral elements, highly optimised for speed and precision of calculations, is presented. An extensive analysis of the numerous variants of the fluid solver reveals that the best performance is obtained by maximising CUDA kernel fusion and by arranging the resulting kernel tasks so as to trigger memory-coherent and scattered loads in a specific manner, albeit at the cost of introducing cross-thread load unbalancing. Surprisingly, any attempt to eliminate this imbalance, to maximise thread occupancy, or to adopt conventional work tiling or distinct custom kernels highly tuned via ad hoc data and computation layouts invariably deteriorates performance. As such, this work sheds light on the possibility of hiding fetch latencies of workloads involving heterogeneous loads in a way that is more effective than what is achieved with frequently suggested techniques. When simulating the lid-driven cavity on a NVIDIA GeForce GTX 480 via a 5-stage 4th-order Runge-Kutta (RK) scheme, the first four digits of the obtained centreline velocity values, or more, converge to those of the state-of-the-art literature data at a simulation speed of 7.0G primitive variable updates per second during the collision stage and 4.4G during each RK step of the advection, employing double-precision arithmetic (DPA) and a computational grid of 64² 4×4-point elements only. The new programming engine leads to about 2× performance w.r.t. the best programming guidelines in the field. The new fluid solver on the above GPU is also 20-30 times faster than a highly optimised version running on a single core of an Intel Xeon X5650 2.66 GHz.

  2. Microscopic dynamics simulations of heavy-ion fusion reactions induced by neutron-rich nuclei

    NASA Astrophysics Data System (ADS)

    Wang, Ning; Ou, Li; Zhang, Yingxun; Li, Zhuxia

    2014-06-01

    The heavy-ion fusion reactions induced by neutron-rich nuclei are investigated with the improved quantum molecular dynamics (ImQMD) model. With careful consideration of the neutron skin thickness of nuclei and the symmetry potential, the stability of nuclei and the fusion excitation functions of the heavy-ion fusion reactions 16O + 76Ge, 16O + 154Sm, 40Ca + 96Zr, and 132Sn + 40Ca are systematically studied. The fusion cross sections of these reactions at energies around the Coulomb barrier can be well reproduced by the ImQMD model. The corresponding slope parameter of the symmetry energy adopted in the calculations is L ≈ 78 MeV and the surface energy coefficient is g_sur = 18 ± 1.5 MeV fm². In addition, it is found that the surface-symmetry term significantly influences the fusion cross sections of neutron-rich fusion systems. For sub-barrier fusion, the dynamical fluctuations in the densities of the reaction partners and the enhanced surface diffuseness on the neck side result in a lowering of the fusion barrier.
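
    For orientation, fusion excitation functions such as those computed with ImQMD are often compared against the simple analytic Wong formula for penetration of a single parabolic barrier. The sketch below evaluates that formula with illustrative barrier parameters, which are not values taken from the paper.

    ```python
    import numpy as np

    def wong_cross_section(E, V_b, R_b, hbar_omega):
        """Wong's formula for the heavy-ion fusion cross section (in mb) at
        center-of-mass energy E (MeV), for a parabolic barrier of height V_b (MeV),
        radius R_b (fm), and curvature hbar_omega (MeV)."""
        sigma_fm2 = (hbar_omega * R_b**2) / (2.0 * E) * np.log1p(
            np.exp(2.0 * np.pi * (E - V_b) / hbar_omega))
        return 10.0 * sigma_fm2        # convert fm^2 to mb (1 fm^2 = 10 mb)

    # Illustrative barrier parameters, loosely typical of a medium-mass system.
    E = np.linspace(55.0, 80.0, 6)     # center-of-mass energies around the barrier (MeV)
    for e, s in zip(E, wong_cross_section(E, V_b=65.0, R_b=10.5, hbar_omega=4.0)):
        print(f"E_cm = {e:5.1f} MeV   sigma_fus ~ {s:8.2f} mb")
    ```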

  3. Exploring International Investment through a Classroom Portfolio Simulation Project

    ERIC Educational Resources Information Center

    Chen, Xiaoying; Yur-Austin, Jasmine

    2013-01-01

    A rapid integration of financial markets has prevailed during the last three decades. Investors are able to diversify investment beyond national markets to mitigate return volatility of a "pure domestic portfolio." This article discusses a simulation project through which students learn the role of international investment by managing…

  4. NASA/Haughton-Mars Project 2006 Lunar Medical Contingency Simulation

    NASA Technical Reports Server (NTRS)

    Scheuring, Richard A.; Jones, J. A.; Lee, P.; Comtois, J. M.; Chappell, S.; Rafiq, A.; Braham, S.

    2007-01-01

    A viewgraph presentation describing NASA's Haughton-Mars Project (HMP) medical requirements and lunar surface operations is shown. The topics include: 1) Mission Purpose/Overview; 2) HMP as a Moon/Mars Analog; 3) Simulation objectives; 4) Discussion; and 5) Forward work.

  5. Random projections in reducing the dimensionality of climate simulation data

    NASA Astrophysics Data System (ADS)

    Seitola, Teija; Mikkola, Visa; Silen, Johan; Järvinen, Heikki

    2014-05-01

    Climate simulation data is often high dimensional, with several variables and thousands of time steps and grid points. High dimensionality presents a problem by making I/O and post-processing expensive and time consuming. It also excludes the use of some analysis methods. Random projection (RP) is a dimensionality reduction method that has previously been applied to high dimensional data sets, for instance in image processing. Here we introduce random projection as a dimensionality reduction method applied to simulated global surface temperature data (the so-called Millennium simulation data of MPI-M) and show how the projected data preserve the essential structure of the original data. We apply principal component analysis (PCA) to the original and the randomly projected lower-dimensional data to analyze how RP preserves structures when the original data are compressed down to 10% or 1% of the original volume. We also demonstrate the application of the RP method on very high dimensional data of the atmospheric temperature in three dimensions. Our experiments show that information is naturally lost in RP, but the main spatial patterns (the principal components) and temporal signatures (spectra of time-dependent coefficients) can still be recovered from the randomly projected low-dimensional subspaces. Our results imply that RP could be used as a pre-processing step before analyzing the structure of large data sets. This might allow investigating the dynamics of truly high dimensional climate data sets of several state variables, time steps and spatial locations.
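
    The claim that random projection preserves the essential structure can be checked with a short experiment: project synthetic data with a few dominant spatial modes down to 1% of its dimension using a Gaussian random matrix, then compare the leading principal component time series before and after. The data generator and dimensions below are invented for illustration and are unrelated to the Millennium simulation.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic data: n_time samples of a d-dimensional field driven by a few modes
    # with well-separated variances, plus noise.
    n_time, d, n_modes = 500, 5000, 5
    modes = rng.normal(size=(n_modes, d))                            # spatial patterns
    amps = rng.normal(size=(n_time, n_modes)) * np.array([10.0, 7.0, 5.0, 3.0, 2.0])
    X = amps @ modes + 0.1 * rng.normal(size=(n_time, d))

    # Random projection: compress the spatial dimension to 1% of the original.
    k = d // 100
    R = rng.normal(size=(d, k)) / np.sqrt(k)                         # Gaussian projection matrix
    X_rp = X @ R

    def leading_pc_scores(Y, n_components=3):
        """Time series of the leading principal components via SVD of centered data."""
        Yc = Y - Y.mean(axis=0)
        U, S, _ = np.linalg.svd(Yc, full_matrices=False)
        return U[:, :n_components] * S[:n_components]

    pc_full = leading_pc_scores(X)
    pc_rp = leading_pc_scores(X_rp)

    # Compare temporal signatures: correlation between corresponding PC time series
    # (absolute value, since the sign of a principal component is arbitrary).
    for i in range(3):
        c = np.corrcoef(pc_full[:, i], pc_rp[:, i])[0, 1]
        print(f"PC{i+1}: |correlation| between full and projected data = {abs(c):.3f}")
    ```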

  6. The SIMRAND methodology - Simulation of Research and Development Projects

    NASA Technical Reports Server (NTRS)

    Miles, R. F., Jr.

    1984-01-01

    In research and development projects, a commonly occurring management decision is concerned with the optimum allocation of resources to achieve the project goals. Because of resource constraints, management has to make a decision regarding the set of proposed systems or tasks which should be undertaken. SIMRAND (Simulation of Research and Development Projects) is a methodology which was developed for aiding management in this decision. Attention is given to a problem description, aspects of model formulation, the reduction phase of the model solution, the simulation phase, and the evaluation phase. The implementation of the considered approach is illustrated with the aid of an example which involves a simplified network of the type used to determine the price of silicon solar cells.

  9. Response to FESAC survey, non-fusion connections to Fusion Energy Sciences. Applications of the FES-supported beam and plasma simulation code, Warp

    SciTech Connect

    Friedman, A.; Grote, D. P.; Vay, J. L.

    2015-05-29

    The Fusion Energy Sciences Advisory Committee’s subcommittee on non-fusion applications (FESAC NFA) is conducting a survey to obtain information from the fusion community about non-fusion work that has resulted from their DOE-funded fusion research. The subcommittee has requested that members of the community describe recent developments connected to the activities of the DOE Office of Fusion Energy Sciences. Two questions in particular were posed by the subcommittee. This document contains the authors’ responses to those questions.

  10. Integrated Simulation Studies of Plasma Performances and Fusion Reactions in the Deuterium Experiment of LHD

    NASA Astrophysics Data System (ADS)

    Murakami, S.; Yamaguchi, H.; Homma, M.; Maeta, S.; Saito, Y.; Fukuyama, A.; Nagaoka, K.; Takahashi, H.; Nakano, H.; Osakabe, M.; Yokoyama, M.; Tanaka, K.; Ida, K.; Yoshinuma, M.; Isobe, M.; Tomita, H.; Ogawa, K.; LHD Exp Group Team

    2016-10-01

    A deuterium experiment project starting in 2017 is planned in LHD, in which deuterium NBI heating beams with more than 30 MW of power are injected into a deuterium plasma. The principal objectives of this project are to clarify the isotope effect on heat and particle transport in the helical plasma and to study energetic-particle confinement in a helical magnetic configuration by measuring triton burn-up neutrons. We study the LHD deuterium experiment plasma by applying the integrated simulation code TASK3D [Murakami, PPCF 2015] and the 5-D drift kinetic equation solver GNET [Murakami, NF 2006]. (i) An ion temperature increment of more than 20% is obtained in the deuterium plasma (nD/(nH+nD) = 0.8) due to the isotope effect, assuming a turbulent transport model based on the H/He plasma experiments of LHD. (ii) The triton burn-up simulation yields the triton slowing-down distribution and shows a strong magnetic-configuration dependence of the triton burn-up ratio in LHD. This work was supported by JSPS KAKENHI Grant Number 26420851.

  11. The GeantV project: Preparing the future of simulation

    SciTech Connect

    Amadio, G.; J. Apostolakis; Bandieramonte, M.; Bhattacharyya, A.; Bianchini, C.; Brun, R.; Canal, Ph; Carminati, F.; Duhem, L.; Elvira, D.; de Fine Licht, J.; Gheata, A.; Iope, R. L.; Lima, G.; Mohanty, A.; Nikitina, T.; Novak, M.; Pokorski, W.; Seghal, R.; Shadura, O.; Vallecorsa, S.; Wenzel, S.

    2015-12-23

    Detector simulation is consuming at least half of the HEP computing cycles, and even so, experiments have to take hard decisions on what to simulate, as their needs greatly surpass the availability of computing resources. New experiments still in the design phase such as FCC, CLIC and ILC as well as upgraded versions of the existing LHC detectors will push further the simulation requirements. Since the increase in computing resources is not likely to keep pace with our needs, it is therefore necessary to explore innovative ways of speeding up simulation in order to sustain the progress of High Energy Physics. The GeantV project aims at developing a high performance detector simulation system integrating fast and full simulation that can be ported on different computing architectures, including CPU accelerators. After more than two years of R&D the project has produced a prototype capable of transporting particles in complex geometries exploiting micro-parallelism, SIMD and multithreading. Portability is obtained via C++ template techniques that allow the development of machine-independent computational kernels. Furthermore, a set of tables derived from Geant4 for cross sections and final states provides a realistic shower development and, having been ported into a Geant4 physics list, can be used as a basis for a direct performance comparison.

  12. The GeantV project: preparing the future of simulation

    NASA Astrophysics Data System (ADS)

    Amadio, G.; Apostolakis, J.; Bandieramonte, M.; Bhattacharyya, A.; Bianchini, C.; Brun, R.; Canal, Ph; Carminati, F.; Duhem, L.; Elvira, D.; de Fine Licht, J.; Gheata, A.; Iope, R. L.; Lima, G.; Mohanty, A.; Nikitina, T.; Novak, M.; Pokorski, W.; Seghal, R.; Shadura, O.; Vallecorsa, S.; Wenzel, S.

    2015-12-01

    Detector simulation is consuming at least half of the HEP computing cycles, and even so, experiments have to take hard decisions on what to simulate, as their needs greatly surpass the availability of computing resources. New experiments still in the design phase such as FCC, CLIC and ILC as well as upgraded versions of the existing LHC detectors will push further the simulation requirements. Since the increase in computing resources is not likely to keep pace with our needs, it is therefore necessary to explore innovative ways of speeding up simulation in order to sustain the progress of High Energy Physics. The GeantV project aims at developing a high performance detector simulation system integrating fast and full simulation that can be ported on different computing architectures, including CPU accelerators. After more than two years of R&D the project has produced a prototype capable of transporting particles in complex geometries exploiting micro-parallelism, SIMD and multithreading. Portability is obtained via C++ template techniques that allow the development of machine-independent computational kernels. A set of tables derived from Geant4 for cross sections and final states provides a realistic shower development and, having been ported into a Geant4 physics list, can be used as a basis for a direct performance comparison.

  13. The GeantV project: Preparing the future of simulation

    DOE PAGES

    Amadio, G.; J. Apostolakis; Bandieramonte, M.; ...

    2015-12-23

    Detector simulation is consuming at least half of the HEP computing cycles, and even so, experiments have to take hard decisions on what to simulate, as their needs greatly surpass the availability of computing resources. New experiments still in the design phase such as FCC, CLIC and ILC as well as upgraded versions of the existing LHC detectors will push further the simulation requirements. Since the increase in computing resources is not likely to keep pace with our needs, it is therefore necessary to explore innovative ways of speeding up simulation in order to sustain the progress of High Energy Physics. The GeantV project aims at developing a high performance detector simulation system integrating fast and full simulation that can be ported on different computing architectures, including CPU accelerators. After more than two years of R&D the project has produced a prototype capable of transporting particles in complex geometries exploiting micro-parallelism, SIMD and multithreading. Portability is obtained via C++ template techniques that allow the development of machine-independent computational kernels. Furthermore, a set of tables derived from Geant4 for cross sections and final states provides a realistic shower development and, having been ported into a Geant4 physics list, can be used as a basis for a direct performance comparison.

  14. Thermal-to-fusion neutron convertor and Monte Carlo coupled simulation of deuteron/triton transport and secondary products generation

    NASA Astrophysics Data System (ADS)

    Wang, Guan-bo; Liu, Han-gang; Wang, Kan; Yang, Xin; Feng, Qi-jie

    2012-09-01

    A thermal-to-fusion neutron convertor is being studied at the China Academy of Engineering Physics (CAEP). Current Monte Carlo codes, such as MCNP and GEANT, are inadequate for this multi-step reaction problem. A Monte Carlo tool, RSMC (Reaction Sequence Monte Carlo), has been developed to simulate the coupled problem, from neutron absorption through charged-particle ionization to secondary neutron generation. A "forced particle production" variance reduction technique has been implemented to markedly improve the calculation speed by ensuring that deuteron/triton-induced secondary products play a major role. Nuclear data are taken from ENDF or TENDL, and stopping powers from SRIM, which better describes low-energy deuteron/triton interactions. As a validation, an accelerator-driven mono-energetic 14 MeV fusion neutron source, which has been studied in depth and includes deuteron transport and secondary neutron generation, is employed. Various parameters, including the fusion neutron angular distribution, the average neutron energy at different emission directions, and the differential and integral energy distributions, are calculated with our tool, with a traditional deterministic method as reference. Finally, we present the RSMC results for the convertor, including the conversion ratio of 1 mm 6LiD under typical thermal neutron (Maxwell spectrum) incidence and the fusion neutron spectrum, which will be used for our experiment.
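
    As a toy illustration of the "forced particle production" idea mentioned above (not the RSMC implementation), the sketch below compares an analog estimator of a rare secondary-production tally with a forced-production estimator in which every history produces the secondary but carries a weight equal to the production probability. All numbers are invented.

```python
# Toy forced-production variance reduction (illustrative only, not RSMC).
# Each source history may produce a rare secondary that contributes a score;
# forcing the production and weighting by its probability keeps the estimate
# unbiased while letting every history contribute to the tally.
import numpy as np

rng = np.random.default_rng(2)
n_hist = 200_000
p_produce = 1e-4                   # hypothetical secondary-production probability

# Analog sampling: most histories score nothing.
produced = rng.random(n_hist) < p_produce
analog = np.zeros(n_hist)
analog[produced] = rng.normal(14.0, 0.5, produced.sum())   # e.g. secondary energy (MeV)

# Forced production: every history scores, weighted by p_produce.
forced = p_produce * rng.normal(14.0, 0.5, n_hist)

for name, tally in (("analog", analog), ("forced", forced)):
    mean = tally.mean()
    rel_err = tally.std(ddof=1) / np.sqrt(n_hist) / mean
    print(f"{name:6s}: mean = {mean:.3e}, relative error of mean = {rel_err:.1%}")
```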

  15. Introduction to SIMRAND: Simulation of research and development project

    NASA Technical Reports Server (NTRS)

    Miles, R. F., Jr.

    1982-01-01

    SIMRAND: SIMulation of Research ANd Development Projects is a methodology developed to aid the engineering and management decision process in the selection of the optimal set of systems or tasks to be funded on a research and development project. A project may have a set of systems or tasks under consideration for which the total cost exceeds the allocated budget. Other factors such as personnel and facilities may also enter as constraints. Thus the project's management must select, from among the complete set of systems or tasks under consideration, a partial set that satisfies all project constraints. The SIMRAND methodology uses the analytical techniques of probability theory, the decision analysis of management science, and computer simulation in the selection of this optimal partial set. The SIMRAND methodology is truly a management tool: it specifies up front the information that must be generated by the engineers, thereby guiding management's direction of the engineering effort, and it ranks the alternatives according to the preferences of the decision makers.

  16. WE-EF-207-04: An Inter-Projection Sensor Fusion (IPSF) Approach to Estimate Missing Projection Signal in Synchronized Moving Grid (SMOG) System

    SciTech Connect

    Zhang, H; Kong, V; Jin, J; Ren, L; Zhang, Y; Giles, W

    2015-06-15

    Purpose: A synchronized moving grid (SMOG) has been proposed to reduce scatter and lag artifacts in cone beam computed tomography (CBCT). However, information is missing in each projection because certain areas are blocked by the grid. A previous solution to this issue was to acquire two complementary projections at each position, which increases scanning time. This study reports our first result using an inter-projection sensor fusion (IPSF) method to estimate the missing projection signal in our prototype SMOG-based CBCT system. Methods: An in-house SMOG assembly with a 1:1 grid with a 3 mm gap has been installed in a CBCT benchtop. The grid moves back and forth with a 3-mm amplitude and a frequency of up to 20 Hz. A control program in LabView synchronizes the grid motion with the platform rotation and x-ray firing so that the grid patterns for any two neighboring projections are complementary. A Catphan was scanned with 360 projections. After scatter correction, the IPSF algorithm was applied to estimate the missing signal for each projection using the information from the two neighboring projections. The Feldkamp-Davis-Kress (FDK) algorithm was applied to reconstruct CBCT images. The CBCTs were compared to those reconstructed using normal projections without the SMOG system. Results: The SMOG-IPSF method may reduce the imaging dose by half because of the radiation blocked by the grid. The method almost completely removed scatter-related artifacts, such as cupping artifacts. The evaluation of line pair patterns in the Catphan suggested that the spatial resolution degradation was minimal. Conclusion: The SMOG-IPSF method is promising for reducing scatter artifacts and improving image quality while reducing radiation dose.
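
    A minimal sketch of the inter-projection estimation idea is shown below: because the grid patterns of neighboring projections are complementary, the blocked pixels of one projection can be estimated from the two neighbors. The function name, the simple averaging rule, and the synthetic data are assumptions for illustration, not the authors' IPSF algorithm.

```python
# Hypothetical sketch of inter-projection filling: rows blocked by the grid in
# the current projection are open in the two neighboring projections, so a
# simple estimate is their average at those rows.
import numpy as np

def ipsf_fill(prev_proj, cur_proj, next_proj, blocked):
    """Fill grid-blocked pixels of cur_proj from two neighboring projections.

    blocked: boolean mask, True where cur_proj is blocked; those rows are open
    in prev_proj and next_proj because the grid patterns are complementary.
    """
    filled = cur_proj.copy()
    filled[blocked] = 0.5 * (prev_proj[blocked] + next_proj[blocked])
    return filled

# Tiny synthetic test: three similar neighboring projections, alternating
# blocked rows in the middle one.
rng = np.random.default_rng(3)
base = rng.random((6, 8))
truth = base + 0.02 * rng.standard_normal((3, 6, 8))
blocked = np.zeros((6, 8), dtype=bool)
blocked[::2, :] = True
observed = np.where(blocked, 0.0, truth[1])

estimate = ipsf_fill(truth[0], observed, truth[2], blocked)
print("mean absolute error in blocked rows:",
      float(np.abs(estimate[blocked] - truth[1][blocked]).mean()))
```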

  17. Community Petascale Project for Accelerator Science and Simulation

    SciTech Connect

    Warren B. Mori

    2013-02-01

    The UCLA Plasma Simulation Group is a major partner of the "Community Petascale Project for Accelerator Science and Simulation." This is the final technical report. We include an overall summary, a list of publications, and individual progress reports for each year. During the past five years we have made tremendous progress in enhancing the capabilities of OSIRIS and QuickPIC, in developing new algorithms and data structures for PIC codes to run on GPUs and future many-core architectures, and in using these codes to model experiments and to make new scientific discoveries. Here we summarize some highlights for which SciDAC was a major contributor.

  18. Adaptive quantum computation in changing environments using projective simulation

    PubMed Central

    Tiersch, M.; Ganahl, E. J.; Briegel, H. J.

    2015-01-01

    Quantum information processing devices need to be robust and stable against external noise and internal imperfections to ensure correct operation. In a setting of measurement-based quantum computation, we explore how an intelligent agent endowed with a projective simulator can act as controller to adapt measurement directions to an external stray field of unknown magnitude in a fixed direction. We assess the agent’s learning behavior in static and time-varying fields and explore composition strategies in the projective simulator to improve the agent’s performance. We demonstrate the applicability by correcting for stray fields in a measurement-based algorithm for Grover’s search. Thereby, we lay out a path for adaptive controllers based on intelligent agents for quantum information tasks. PMID:26260263
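
    To make the projective-simulation control loop concrete, the toy sketch below implements the basic h-value update (damping plus reinforcement) for an agent that learns which discrete measurement-angle correction best compensates an unknown stray-field rotation. The action set, reward shape, and parameters are invented for illustration and are far simpler than the agent studied in the paper.

```python
# Toy projective-simulation-style learner (a sketch, not the paper's agent).
import numpy as np

rng = np.random.default_rng(4)
actions = np.linspace(-0.5, 0.5, 21)   # candidate angle corrections (rad)
h = np.ones_like(actions)              # h-values of the percept-action edges
gamma = 0.01                           # damping (forgetting) parameter
true_offset = 0.23                     # unknown stray-field rotation (rad)

for step in range(2000):
    k = rng.choice(len(actions), p=h / h.sum())   # hop probability ~ h-value
    # Reward is large when the chosen correction compensates the offset well.
    reward = np.exp(-((actions[k] - true_offset) / 0.05) ** 2)
    h = h - gamma * (h - 1.0)          # damp all edges toward their initial value
    h[k] += reward                     # reinforce the edge that was used

best = actions[np.argmax(h)]
print(f"learned correction: {best:+.2f} rad (true offset: {true_offset:+.2f} rad)")
```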

  19. Adaptive quantum computation in changing environments using projective simulation.

    PubMed

    Tiersch, M; Ganahl, E J; Briegel, H J

    2015-08-11

    Quantum information processing devices need to be robust and stable against external noise and internal imperfections to ensure correct operation. In a setting of measurement-based quantum computation, we explore how an intelligent agent endowed with a projective simulator can act as controller to adapt measurement directions to an external stray field of unknown magnitude in a fixed direction. We assess the agent's learning behavior in static and time-varying fields and explore composition strategies in the projective simulator to improve the agent's performance. We demonstrate the applicability by correcting for stray fields in a measurement-based algorithm for Grover's search. Thereby, we lay out a path for adaptive controllers based on intelligent agents for quantum information tasks.

  20. Adaptive quantum computation in changing environments using projective simulation

    NASA Astrophysics Data System (ADS)

    Tiersch, M.; Ganahl, E. J.; Briegel, H. J.

    2015-08-01

    Quantum information processing devices need to be robust and stable against external noise and internal imperfections to ensure correct operation. In a setting of measurement-based quantum computation, we explore how an intelligent agent endowed with a projective simulator can act as controller to adapt measurement directions to an external stray field of unknown magnitude in a fixed direction. We assess the agent’s learning behavior in static and time-varying fields and explore composition strategies in the projective simulator to improve the agent’s performance. We demonstrate the applicability by correcting for stray fields in a measurement-based algorithm for Grover’s search. Thereby, we lay out a path for adaptive controllers based on intelligent agents for quantum information tasks.

  1. The AGORA High-resolution Galaxy Simulations Comparison Project

    NASA Astrophysics Data System (ADS)

    Kim, Ji-hoon; Abel, Tom; Agertz, Oscar; Bryan, Greg L.; Ceverino, Daniel; Christensen, Charlotte; Conroy, Charlie; Dekel, Avishai; Gnedin, Nickolay Y.; Goldbaum, Nathan J.; Guedes, Javiera; Hahn, Oliver; Hobbs, Alexander; Hopkins, Philip F.; Hummels, Cameron B.; Iannuzzi, Francesca; Keres, Dusan; Klypin, Anatoly; Kravtsov, Andrey V.; Krumholz, Mark R.; Kuhlen, Michael; Leitner, Samuel N.; Madau, Piero; Mayer, Lucio; Moody, Christopher E.; Nagamine, Kentaro; Norman, Michael L.; Onorbe, Jose; O'Shea, Brian W.; Pillepich, Annalisa; Primack, Joel R.; Quinn, Thomas; Read, Justin I.; Robertson, Brant E.; Rocha, Miguel; Rudd, Douglas H.; Shen, Sijing; Smith, Britton D.; Szalay, Alexander S.; Teyssier, Romain; Thompson, Robert; Todoroki, Keita; Turk, Matthew J.; Wadsley, James W.; Wise, John H.; Zolotov, Adi; the AGORA Collaboration

    2014-01-01

    We introduce the Assembling Galaxies Of Resolved Anatomy (AGORA) project, a comprehensive numerical study of well-resolved galaxies within the ΛCDM cosmology. Cosmological hydrodynamic simulations with force resolutions of ~100 proper pc or better will be run with a variety of code platforms to follow the hierarchical growth, star formation history, morphological transformation, and the cycle of baryons in and out of eight galaxies with halo masses M_vir ~= 10^10, 10^11, 10^12, and 10^13 M_⊙ at z = 0 and two different ("violent" and "quiescent") assembly histories. The numerical techniques and implementations used in this project include the smoothed particle hydrodynamics codes GADGET and GASOLINE, and the adaptive mesh refinement codes ART, ENZO, and RAMSES. The codes share common initial conditions and common astrophysics packages including UV background, metal-dependent radiative cooling, metal and energy yields of supernovae, and stellar initial mass function. These are described in detail in the present paper. Subgrid star formation and feedback prescriptions will be tuned to provide a realistic interstellar and circumgalactic medium using a non-cosmological disk galaxy simulation. Cosmological runs will be systematically compared with each other using a common analysis toolkit and validated against observations to verify that the solutions are robust—i.e., that the astrophysical assumptions are responsible for any success, rather than artifacts of particular implementations. The goals of the AGORA project are, broadly speaking, to raise the realism and predictive power of galaxy simulations and the understanding of the feedback processes that regulate galaxy "metabolism." The initial conditions for the AGORA galaxies as well as simulation outputs at various epochs will be made publicly available to the community. The proof-of-concept dark-matter-only test of the formation of a galactic halo with a z = 0 mass of M_vir ~= 1.7 × 10^11 M_⊙ by nine different

  2. IERAPSI project: simulation of a canal wall-up mastoidectomy.

    PubMed

    Neri, E; Sellari Franceschini, S; Berrettini, S; Caramella, D; Bartolozzi, C

    2006-03-01

    Among the various EU research projects concerning the medical application of virtual reality, the project Ist-1999-12175, called IERAPSI (Integrated Environment for the Rehearsal and Planning of Surgical Interventions), specifically addressed the creation of a virtual and interactive surgical field for the temporal bone using three-dimensional images derived from CT data. We report on the experience obtained in the IERAPSI project in simulating a canal wall-up mastoidectomy. A surgeon with extensive experience in surgery of the petrous bone performed the mastoidectomy. The operative field included the mastoid, with its substantial differences in density between the cortex and the pneumatized bone, together with soft tissue structures, both on the border and inside the bone. The simulation is better in the first part of the operation than in the second part, suffering from a lack of haptic feedback from soft tissue and the surgical tool in deeper contexts, and under-representation of the variability inherent in pneumatized bone. This said, the excellent representation of dust production and removal, 3D simulation through color, and very good visual and haptic feedback in the early stage of the procedure are impressive. IERAPSI represents a potential surgical planning theater for the training of students and young surgeons, but is also expected to aid expert surgeons in the preoperative planning of difficult cases.

  3. Progress of the NASA/USGS Lunar Regolith Simulant Project

    NASA Technical Reports Server (NTRS)

    Rickman, Douglas; McLemore, C.; Stoeser, D.; Schrader, C.; Fikes, J.; Street, K.

    2009-01-01

    Beginning in 2004 personnel at MSFC began serious efforts to develop a new generation of lunar simulants. The first two products were a replication of the previous JSC-1 simulant under a contract to Orbitec and a major workshop in 2005 on future simulant development. It was recognized in early 2006 that there were serious limitations with the standard approach of simply taking a single terrestrial rock and grinding it. To a geologist, even a cursory examination of the Lunar Sourcebook shows that matching lunar heterogeneity, crystal size, relative mineral abundances, lack of H2O, plagioclase chemistry and glass abundance simply cannot be done with any simple combination of terrestrial rocks. Thus the project refocused its efforts and approached simulant development in a new and more comprehensive manner, examining new approaches in simulant development and ways to more accurately compare simulants to actual lunar materials. This led to a multi-year effort with five major tasks running in parallel. The five tasks are Requirements, Lunar Analysis, Process Development, Feed Stocks, and Standards.

  4. Equation Free Projective Integration and its Applicability for Simulating Plasma

    NASA Astrophysics Data System (ADS)

    Jemella, B.; Shay, M. A.; Drake, J. F.; Dorland, W.

    2004-12-01

    We examine a novel simulation scheme called equation free projective integration [1] which has the potential to allow global simulations of plasmas while still including the global effects of microscale physics. These simulation codes would be ideal for such multiscale problems as the Earth's magnetosphere, tokamaks, and the solar corona. In this method, the global plasma variables stepped forward in time are not time-integrated directly using dynamical differential equations, hence the name "equation free." Instead, these variables are represented on a microgrid using a kinetic simulation. This microsimulation is integrated forward long enough to determine the time derivatives of the global plasma variables, which are then used to integrate forward the global variables with much larger time steps. We are exploring the feasibility of applying this scheme to simulate plasma, and we will present the results of exploratory test problems including the development of 1-D shocks and magnetic reconnection. [1] I. G. Kevrekidis et al., "Equation-free multiscale computation: Enabling microscopic simulators to perform system-level tasks," arXiv:physics/0209043.
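
    The coarse time-stepping loop described above can be sketched in a few lines. The toy below (all parameters and the fast-slow model are assumptions for illustration, not the kinetic micro-simulation used by the authors) runs a short burst of fine-scale steps, estimates the slow time derivative from the last two micro states, and then takes a large projective step.

```python
# Minimal sketch of equation-free projective integration on a toy fast-slow
# system: the macro loop only calls the micro-simulator briefly and never
# evaluates the governing equations itself.
import numpy as np

def micro_simulator(state, dt, n_steps):
    """Stand-in for the fine-scale (kinetic) simulation: explicit Euler steps for
    dx/dt = -50 (x - y)   (fast relaxation onto the slow manifold x ~ y)
    dy/dt = -0.1 y        (slow dynamics of interest)."""
    traj = [state]
    for _ in range(n_steps):
        x, y = traj[-1]
        traj.append(np.array([x - dt * 50.0 * (x - y), y - dt * 0.1 * y]))
    return np.array(traj)

state = np.array([5.0, 1.0])           # x starts far from the slow manifold
t, dt_micro, n_micro, dt_macro = 0.0, 0.002, 50, 0.4
while t < 10.0:
    traj = micro_simulator(state, dt_micro, n_micro)
    # Estimate the slow time derivative from the last two micro states ...
    deriv = (traj[-1] - traj[-2]) / dt_micro
    # ... and take one large projective (extrapolation) step with it.
    state = traj[-1] + dt_macro * deriv
    t += n_micro * dt_micro + dt_macro

print(f"projective y(10) = {state[1]:.4f}   exact y(10) = {np.exp(-1.0):.4f}")
```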

  5. Equation free projective integration and its applicability for simulating plasma

    NASA Astrophysics Data System (ADS)

    Shay, Michael A.; Drake, James F.; Dorland, William; Swisdak, Marc

    2004-11-01

    We examine a novel simulation scheme called equation free projective integration^1 which has the potential to allow global simulations of plasmas while still including the global effects of microscale physics. These simulation codes would be ideal for such multiscale problems as tokamaks, the Earth's magnetosphere, and the solar corona. In this method, the global plasma variables stepped forward in time are not time-integrated directly using dynamical differential equations, hence the name ``equation free.'' Instead, these variables are represented on a microgrid using a kinetic simulation. This microsimulation is integrated forward long enough to determine the time derivatives of the global plasma variables, which are then used to integrate forward the global variables with much larger time steps. We are exploring the feasibility of applying this scheme to simulate plasma, and we will present the results of exploratory test problems including the development of 1-D shocks and magnetic reconnection. ^1 I. G. Kevrekidis et al., ``Equation-free multiscale computation: Enabling microscopic simulators to perform system-level tasks,'' arXiv:physics/0209043.

  6. Three-dimensional gyrokinetic particle-in-cell simulation of plasmas on a massively parallel computer: Final report on LDRD Core Competency Project, FY 1991--FY 1993

    SciTech Connect

    Byers, J.A.; Williams, T.J.; Cohen, B.I.; Dimits, A.M.

    1994-04-27

    One of the programs of the Magnetic Fusion Energy (MFE) Theory and Computations Program is studying the anomalous transport of thermal energy across the field lines in the core of a tokamak. We use the method of gyrokinetic particle-in-cell simulation in this study. For this LDRD project we employed massively parallel processing, new algorithms, and new formal techniques to improve this research. Specifically, we sought to take steps toward: researching experimentally relevant parameters in our simulations, learning parallel computing to have as a resource for our group, and achieving a 100× speedup over our starting-point Cray-2 simulation code's performance.

  7. SciDAC - Center for Plasma Edge Simulation - Project Summary

    SciTech Connect

    Parker, Scott

    2014-11-03

    Final Technical Report: Center for Plasma Edge Simulation (CPES). Principal Investigator: Scott Parker, University of Colorado, Boulder. First-principles simulations of edge pedestal micro-turbulence are performed with the global gyrokinetic turbulence code GEM for both low- and high-confinement tokamak plasmas. The high-confinement plasmas show a larger growth rate but, nonlinearly, a lower particle and heat flux. Numerical profiles are obtained from the XGC0 neoclassical code. XGC0/GEM code coupling is implemented under the EFFIS (“End-to-end Framework for Fusion Integrated Simulation”) framework. Investigations are under way to clearly identify the micro-instabilities in the edge pedestal using global and flux-tube gyrokinetic simulation with realistic experimental high-confinement profiles. We use both experimental profiles and those obtained using the EFFIS XGC0/GEM coupled code framework. We find there are three types of instabilities at the edge: a low-n, high-frequency electron mode; a high-n, low-frequency ion mode; and possibly an ion mode resembling the kinetic ballooning mode (KBM). Investigations of the effects of the radial electric field are also under way. Finally, we have been investigating how, in plasmas dominated by ion-temperature-gradient (ITG) driven turbulence, cold deuterium and tritium ions near the edge naturally pinch radially inward toward the core. We call this mechanism “natural fueling.” It is due to the quasi-neutral, heat-flux-dominated nature of the turbulence and still applies when trapped and passing kinetic electron effects are included. To understand this mechanism, consider the situation where the electrons are adiabatic and there is an ion heat flux. In such a case, lower-energy particles move inward and higher-energy particles move outward. If a trace amount of cold particles is added, they will move inward.

  8. Review of fusion synfuels

    SciTech Connect

    Fillo, J.A.

    1980-01-01

    Thermonuclear fusion offers an inexhaustible source of energy for the production of hydrogen from water. Depending on design, electric generation efficiencies of approx. 40 to 60% and hydrogen production efficiencies by high-temperature electrolysis of approx. 50 to 65% are projected for fusion reactors using high-temperature blankets. Fusion/coal symbiotic systems appear economically promising for the first generation of commercial fusion synfuels plants. Coal production requirements and the environmental effects of large-scale coal usage would be greatly reduced by a fusion/coal system. In the long term, there could be a gradual transition to an inexhaustible energy system based solely on fusion. 16 references. (DCK)

  9. 15 MW HArdware-in-the-loop Grid Simulation Project

    SciTech Connect

    Rigas, Nikolaos; Fox, John Curtiss; Collins, Randy; Tuten, James; Salem, Thomas; McKinney, Mark; Hadidi, Ramtin; Gislason, Benjamin; Boessneck, Eric; Leonard, Jesse

    2014-10-31

    The 15MW Hardware-in-the-loop (HIL) Grid Simulator project was to (1) design, (2) construct and (3) commission a state-of-the-art grid integration testing facility for testing of multi-megawatt devices through a ‘shared facility’ model open to all innovators to promote the rapid introduction of new technology in the energy market to lower the cost of energy delivered. The 15 MW HIL Grid Simulator project now serves as the cornerstone of the Duke Energy Electric Grid Research, Innovation and Development (eGRID) Center. This project leveraged the 24 kV utility interconnection and electrical infrastructure of the US DOE EERE funded WTDTF project at the Clemson University Restoration Institute in North Charleston, SC. Additionally, the project has spurred interest from other technology sectors, including large PV inverter and energy storage testing and several leading edge research proposals dealing with smart grid technologies, grid modernization and grid cyber security. The key components of the project are the power amplifier units capable of providing up to 20MW of defined power to the research grid. The project has also developed a one of a kind solution to performing fault ride-through testing by combining a reactive divider network and a large power converter into a hybrid method. This unique hybrid method of performing fault ride-through analysis will allow for the research team at the eGRID Center to investigate the complex differences between the alternative methods of performing fault ride-through evaluations and will ultimately further the science behind this testing. With the final goal of being able to perform HIL experiments and demonstration projects, the eGRID team undertook a significant challenge with respect to developing a control system that is capable of communicating with several different pieces of equipment with different communication protocols in real-time. The eGRID team developed a custom fiber optical network that is based upon FPGA

  10. Fusing simulation and experiment: The effect of mutations on the structure and activity of the influenza fusion peptide

    PubMed Central

    Lousa, Diana; Pinto, Antónia R. T.; Victor, Bruno L.; Laio, Alessandro; Veiga, Ana S.; Castanho, Miguel A. R. B.; Soares, Cláudio M.

    2016-01-01

    During the infection process, the influenza fusion peptide (FP) inserts into the host membrane, playing a crucial role in the fusion process between the viral and host membranes. In this work we used a combination of simulation and experimental techniques to analyse the molecular details of this process, which are largely unknown. Although the FP structure has been obtained by NMR in detergent micelles, there is no atomic structure information in membranes. To answer this question, we performed bias-exchange metadynamics (BE-META) simulations, which showed that the lowest energy states of the membrane-inserted FP correspond to helical-hairpin conformations similar to that observed in micelles. BE-META simulations of the G1V, W14A, G12A/G13A and G4A/G8A/G16A/G20A mutants revealed that all the mutations affect the peptide’s free energy landscape. A FRET-based analysis showed that all the mutants had a reduced fusogenic activity relative to the WT, in particular the mutants G12A/G13A and G4A/G8A/G16A/G20A. According to our results, one of the major causes of the lower activity of these mutants is their lower membrane affinity, which results in a lower concentration of peptide in the bilayer. These findings contribute to a better understanding of the influenza fusion process and open new routes for future studies. PMID:27302370

  11. A Particle-in-Cell Simulation for the Traveling Wave Direct Energy Converter (TWDEC) for Fusion Propulsion

    NASA Technical Reports Server (NTRS)

    Chap, Andrew; Tarditi, Alfonso G.; Scott, John H.

    2013-01-01

    A Particle-in-cell simulation model has been developed to study the physics of the Traveling Wave Direct Energy Converter (TWDEC) applied to the conversion of charged fusion products into electricity. In this model the availability of a beam of collimated fusion products is assumed; the simulation is focused on the conversion of the beam kinetic energy into alternating current (AC) electric power. The model is electrostatic, as the electro-dynamics of the relatively slow ions can be treated in the quasistatic approximation. A two-dimensional, axisymmetric (radial-axial coordinates) geometry is considered. Ion beam particles are injected on one end and travel along the axis through ring-shaped electrodes with externally applied time-varying voltages, thus modulating the beam by forming a sinusoidal pattern in the beam density. Further downstream, the modulated beam passes through another set of ring electrodes, now electrically floating. The modulated beam induces a time-alternating potential difference between adjacent electrodes. Power can be drawn from the electrodes by connecting a resistive load. As energy is dissipated in the load, a corresponding drop in beam energy is measured. The simulation encapsulates the TWDEC process by reproducing the time-dependent transfer of energy and the particle deceleration due to the electric field phase time variations.

  12. Verification of gyrokinetic particle simulation of current-driven instability in fusion plasmas. III. Collisionless tearing mode

    SciTech Connect

    Liu, Dongjian; Bao, Jian; Han, Tao; Wang, Jiaqi; Lin, Zhihong

    2016-02-15

    A finite-mass electron fluid model for low frequency electromagnetic fluctuations, particularly the collisionless tearing mode, has been implemented in the gyrokinetic toroidal code. Using this fluid model, linear properties of the collisionless tearing mode have been verified. Simulations verify that the linear growth rate of the single collisionless tearing mode is proportional to D_e^2, where D_e is the electron skin depth. On the other hand, the growth rate of a double tearing mode is proportional to D_e in the parameter regime of fusion plasmas.
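
    For scale, the sketch below evaluates the electron skin depth that enters the verified scaling, D_e = c/ω_pe, for an illustrative density; the density value is an assumption, not a parameter taken from the paper.

```python
# Worked example of the length scale behind the scaling law (not from the paper).
import math

c, e = 2.998e8, 1.602e-19         # m/s, C
m_e, eps0 = 9.109e-31, 8.854e-12  # kg, F/m

n_e = 1.0e19                      # electron density in m^-3 (illustrative value)
omega_pe = math.sqrt(n_e * e**2 / (eps0 * m_e))   # electron plasma frequency
D_e = c / omega_pe                                 # electron skin depth
print(f"electron skin depth D_e = {D_e * 1e3:.2f} mm")

# gamma ~ D_e^2 for the single tearing mode and D_e ~ n_e^(-1/2), so doubling
# the density halves the single-mode growth rate in this scaling.
print("growth-rate ratio when n_e doubles:", round((1.0 / math.sqrt(2.0)) ** 2, 3))
```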

  13. Reconstruction of accurate 3-D surfaces with sharp edges using digital structured light projection and multi-dimensional image fusion

    NASA Astrophysics Data System (ADS)

    Le, Manh-Trung; Chen, Liang-Chia; Lin, Chih-Jer

    2017-09-01

    The study presents a novel method that uses structured illumination imaging and data fusion to address one of the most difficult problems in 3-D optical measurement, where an accurate 3-D sharp edge must be reconstructed to allow automated inspection and reconstruction of a 3-D object. An innovative algorithm for reconstructing a 3-D surface profile with a sharp-edge boundary using multi-dimensional data fusion is proposed. An accurate 2-D surface edge is extracted from a high-spatial-resolution image reconstructed using structured illumination imaging (SIM), so that the projected 2-D edge contour along the optical imaging axis can be accurately determined. The neighboring surface between the detected 2-D edge and the identified 3-D surface contour is reconstructed by extrapolating the surface using NURBS surface fitting to detect the intersecting edges. Experiments are performed to confirm the feasibility, effectiveness and accuracy of the developed method, and the results for a reconstructed 3-D sharp edge are compared with those from a pre-calibrated high-precision instrument. The proposed method ensures that the maximum deviation between the reference target and the reconstructed critical dimension is 3 μm, so a resolution of less than 0.5 pixel for the optical imaging system can be achieved. The experimental results demonstrate that the proposed method is both effective and accurate.

  14. Uncertainties of soil moisture in historical simulations and future projections

    NASA Astrophysics Data System (ADS)

    Cheng, Shanjun; Huang, Jianping; Ji, Fei; Lin, Lei

    2017-02-01

    Uncertainties of soil moisture in historical simulations (1920-2005) and future projections (2006-2080) were investigated by using the outputs from the Coupled Model Intercomparison Project Phase 5 and Community Earth System Model. The results showed that soil moisture climatology varies greatly among models despite the good agreement between the ensemble mean of simulated soil moisture and the Global Land Data Assimilation System data. The uncertainties of initial conditions and model structure showed similar spatial patterns and magnitudes, with high uncertainties in dry regions and low uncertainties in wet regions. In addition, the long-term variability of model structure uncertainty rapidly decreased before 1980 and increased thereafter, but the uncertainty in initial conditions showed an upward trend over the entire time span. The model structure and initial conditions can cause uncertainties at all time scales. Despite these large uncertainties, almost all of the simulations showed significant decreasing linear trends in soil moisture for the 21st century, especially in the Mediterranean region, northeast and southwest South America, southern Africa, and southwestern USA.
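
    A sketch of the kind of ensemble bookkeeping behind these uncertainty estimates is shown below, with synthetic stand-in arrays: the spread across initial-condition members of one model versus the spread across different models, plus the long-term trend of the multi-model mean. The shapes, values, and imposed drying trend are invented for illustration.

```python
# Illustrative decomposition of ensemble spread (synthetic stand-in data).
import numpy as np

rng = np.random.default_rng(5)
years = np.arange(1920, 2081)

# Hypothetical data: 30 initial-condition members of one model, and 20 models.
ic_members = 0.25 + 0.02 * rng.standard_normal((30, years.size))
models = (0.25 + 0.05 * rng.standard_normal((20, years.size))
          - 0.0002 * (years - 1920))              # imposed drying trend

ic_spread = ic_members.std(axis=0, ddof=1)        # initial-condition uncertainty
model_spread = models.std(axis=0, ddof=1)         # model-structure uncertainty

print("mean initial-condition spread:", round(float(ic_spread.mean()), 3))
print("mean model-structure spread:  ", round(float(model_spread.mean()), 3))

# Linear trend of the multi-model mean, expressed per decade.
slope = np.polyfit(years, models.mean(axis=0), 1)[0]
print("multi-model mean trend:", round(10 * slope, 4), "per decade")
```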

  15. Progress of the NASA/USGS Lunar Regolith Simulant Project

    NASA Technical Reports Server (NTRS)

    Rickman, Doug; MLemore, Carole; Wilson, Steve; Stoeser, Doug; Schrader, Christian; Fikes, John; Street, Kenneth

    2009-01-01

    Beginning in 2004 personnel at MSFC began serious efforts to develop a new generation of lunar simulants. The first two products were a replication of the previous JSC-1 simulant under a contract to Orbitec and a major workshop in 2005 on future simulant development. Beginning in 2006 the project refocused its efforts and approached simulant development in a new and more comprehensive manner, examining new approaches in simulant development and ways to more accurately compare simulants to actual lunar materials. This led to a multi-year effort with five major tasks running in parallel. The five tasks are Requirements, Lunar Analysis, Process Development, Feed Stocks, and Standards. Major progress has been made in all five areas. A substantial draft of a formal requirements document now exists and has been largely stable since 2007. It does evolve as specific details of the standards and Lunar Analysis efforts proceed. Lunar Analysis has turned out to be vastly more difficult than anticipated. After great effort to mine existing published and gray literature, the team has realized the necessity of making new measurements of the Apollo samples, an effort that is currently in progress. Process development is substantially ahead of the expectations of 2006. It is now practical to synthesize glasses of appropriate composition and purity. It is also possible to make agglutinate particles in significant quantities. A series of minerals commonly found on the Moon has been synthesized. Separation of mineral constituents from starting rock material is also proceeding. Customized grinding and mixing processes have been developed and tested and are now being documented. Identification and development of appropriate feedstocks has been both easier and more difficult than anticipated. The Stillwater Mining Company, operating in the Stillwater layered mafic intrusive complex of Montana, has been an amazing resource for the project, but finding adequate sources for some of the components

  17. Three-dimensional simulation strategy to determine the effects of turbulent mixing on inertial-confinement-fusion capsule performance.

    PubMed

    Haines, Brian M; Grinstein, Fernando F; Fincke, James R

    2014-05-01

    In this paper, we present and justify an effective strategy for performing three-dimensional (3D) inertial-confinement-fusion (ICF) capsule simulations. We have evaluated a frequently used strategy in which two-dimensional (2D) simulations are rotated to 3D once sufficient relevant 2D flow physics has been captured and fine resolution requirements can be restricted to relatively small regions. This addresses situations typical of ICF capsules which are otherwise prohibitively intensive computationally. We tested this approach for our previously reported fully 3D simulations of laser-driven reshock experiments where we can use the available 3D data as reference. Our studies indicate that simulations that begin as purely 2D lead to significant underprediction of mixing and turbulent kinetic energy production at later time when compared to the fully 3D simulations. If, however, additional suitable nonuniform perturbations are applied at the time of rotation to 3D, we show that one can obtain good agreement with the purely 3D simulation data, as measured by vorticity distributions as well as integrated mixing and turbulent kinetic energy measurements. Next, we present results of simulations of a simple OMEGA-type ICF capsule using the developed strategy. These simulations are in good agreement with available experimental data and suggest that the dominant mechanism for yield degradation in ICF implosions is hydrodynamic instability growth seeded by long-wavelength surface defects. This effect is compounded by drive asymmetries and amplified by repeated shock interactions with an increasingly distorted shell, which results in further yield reduction. Our simulations are performed with and without drive asymmetries in order to compare the importance of these effects to those of surface defects; our simulations indicate that long-wavelength surface defects degrade yield by approximately 60% and short-wavelength drive asymmetry degrades yield by a further 30%.

  18. Integrated fusion simulation with self-consistent core-pedestal coupling

    SciTech Connect

    Meneghini, O.; Snyder, P. B.; Smith, S. P.; Candy, J.; Staebler, G. M.; Belli, E. A.; Lao, L. L.; Park, J. M.; Green, D. L.; Elwasif, W.; Grierson, B. A.; Holland, C.

    2016-04-20

    In this study, accurate prediction of fusion performance in present and future tokamaks requires taking into account the strong interplay between core transport, pedestal structure, current profile and plasma equilibrium. An integrated modeling workflow capable of calculating the steady-state self-consistent solution to this strongly-coupled problem has been developed. The workflow leverages state-of-the-art components for collisional and turbulent core transport, equilibrium and pedestal stability. Validation against DIII-D discharges shows that the workflow is capable of robustly predicting the kinetic profiles (electron and ion temperature and electron density) from the axis to the separatrix in good agreement with the experiments. An example application is presented, showing self-consistent optimization for the fusion performance of the 15 MA D-T ITER baseline scenario as functions of the pedestal density and ion effective charge Zeff.
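
    The self-consistent coupling described above amounts to a fixed-point problem between a core transport solution and a pedestal structure model. The toy sketch below illustrates only that iteration pattern; both model functions, their coefficients, and the convergence tolerance are invented and stand in for the real transport, equilibrium, and pedestal-stability components.

```python
# Schematic fixed-point iteration between a toy "core" and a toy "pedestal"
# model (not the actual integrated workflow).
def core_model(pedestal_pressure):
    """Toy core transport: core pressure stiffly tied to the pedestal top."""
    return 3.0 * pedestal_pressure

def pedestal_model(core_pressure):
    """Toy pedestal structure: height grows weakly with core pressure
    (a stand-in for an EPED-like stability limit)."""
    return 20.0 + 0.05 * core_pressure

p_ped = 25.0                           # initial guess (arbitrary units)
for it in range(50):
    p_core = core_model(p_ped)
    p_ped_new = pedestal_model(p_core)
    if abs(p_ped_new - p_ped) < 1e-6:  # converged: core and pedestal consistent
        break
    p_ped = p_ped_new

print(f"converged in {it} iterations: pedestal = {p_ped:.3f}, core = {p_core:.3f}")
```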

  19. Integrated fusion simulation with self-consistent core-pedestal coupling

    DOE PAGES

    Meneghini, O.; Snyder, P. B.; Smith, S. P.; ...

    2016-04-20

    In this study, accurate prediction of fusion performance in present and future tokamaks requires taking into account the strong interplay between core transport, pedestal structure, current profile and plasma equilibrium. An integrated modeling workflow capable of calculating the steady-state self-consistent solution to this strongly-coupled problem has been developed. The workflow leverages state-of-the-art components for collisional and turbulent core transport, equilibrium and pedestal stability. Validation against DIII-D discharges shows that the workflow is capable of robustly predicting the kinetic profiles (electron and ion temperature and electron density) from the axis to the separatrix in good agreement with the experiments. An example application is presented, showing self-consistent optimization for the fusion performance of the 15 MA D-T ITER baseline scenario as functions of the pedestal density and ion effective charge Zeff.

  20. Integrated fusion simulation with self-consistent core-pedestal coupling

    NASA Astrophysics Data System (ADS)

    Meneghini, O.; Snyder, P. B.; Smith, S. P.; Candy, J.; Staebler, G. M.; Belli, E. A.; Lao, L. L.; Park, J. M.; Green, D. L.; Elwasif, W.; Grierson, B. A.; Holland, C.

    2016-04-01

    Accurate prediction of fusion performance in present and future tokamaks requires taking into account the strong interplay between core transport, pedestal structure, current profile, and plasma equilibrium. An integrated modeling workflow capable of calculating the steady-state self-consistent solution to this strongly coupled problem has been developed. The workflow leverages state-of-the-art components for collisional and turbulent core transport, equilibrium and pedestal stability. Testing against a DIII-D discharge shows that the workflow is capable of robustly predicting the kinetic profiles (electron and ion temperature and electron density) from the axis to the separatrix in good agreement with the experiments. An example application is presented, showing self-consistent optimization for the fusion performance of the 15 MA D-T ITER baseline scenario as functions of the pedestal density and ion effective charge Zeff.

  2. Inertial electrostatic confinement and DD fusion at interelectrode media of nanosecond vacuum discharge. PIC simulations and experiment

    NASA Astrophysics Data System (ADS)

    Kurilenkov, Yu K.; Tarakanov, V. P.; Skowronek, M.; Guskov, S. Yu; Dufty, J.

    2009-05-01

    The generation of energetic ions and DD neutrons from microfusion in the interelectrode space of a low-energy nanosecond vacuum discharge has been demonstrated recently [1, 2]. However, the physics of the fusion processes and some results regarding the neutron yield in the accumulated database were poorly understood. The present work presents a detailed particle-in-cell (PIC) simulation of the experimental discharge conditions using a fully electrodynamic code. The dynamics of all charged particles was reconstructed in time and in anode-cathode (AC) space. The principal role of a virtual cathode (VC) and the corresponding single and double potential wells formed in the interelectrode space are recognized. The calculated depth of the quasistationary potential well (PW) of the VC is about 50-60 keV, and the D+ ions trapped by this well are accelerated up to the energy values needed to provide collisional DD nuclear synthesis. The correlation between the calculated potential well structures (and dynamics) and the observed neutron yield is discussed. In particular, ions in the potential well undergo high-frequency (~80 MHz) harmonic oscillations accompanied by a corresponding regime of oscillatory neutron yield. Both experiment and PIC simulations illustrate favorable scaling of the fusion power density for the chosen IECF scheme based on a nanosecond vacuum discharge.

  3. The Jefferson Project: Large-eddy simulations of a watershed

    NASA Astrophysics Data System (ADS)

    Watson, C.; Cipriani, J.; Praino, A. P.; Treinish, L. A.; Tewari, M.; Kolar, H.

    2015-12-01

    The Jefferson Project is a new endeavor at Lake George, NY by IBM Research, Rensselaer Polytechnic Institute (RPI) and The Fund for Lake George. Lake George is an oligotrophic lake - one of low nutrients - and a 30-year study recently published by RPI's Darrin Fresh Water Institute highlighted that the renowned water quality is declining from the injection of salt (from runoff), algae, and invasive species. In response, the Jefferson Project is developing a system to provide extensive data on relevant physical, chemical and biological parameters that drive ecosystem function. The system will be capable of real-time observations and interactive modeling of the atmosphere, watershed hydrology, lake circulation and food web dynamics. In this presentation, we describe the development of the operational forecast system used to simulate the atmosphere in the model stack, Deep Thunder™ (a configuration of the ARW-WRF model). The model performs 48-hr forecasts twice daily in a nested configuration, and in this study we present results from ongoing tests where the innermost domains are dx = 333 m and 111 m. We discuss the model's ability to simulate boundary layer processes, lake surface conditions (an input into the lake model), and precipitation (an input into the hydrology model) during different weather regimes, and the challenges of data assimilation and validation at this scale. We also explore the potential for additional nests over select regions of the watershed to better capture turbulent boundary layer motions.

  4. Laser fusion

    SciTech Connect

    Smit, W.A.; Boskma, P.

    1980-12-01

    Unrestricted laser fusion offers nations an opportunity to circumvent arms control agreements and develop thermonuclear weapons. Early laser weapons research sought a clean radiation-free bomb to replace the fission bomb, but this was deceptive because a fission bomb was needed to trigger the fusion reaction and additional radioactivity was induced by generating fast neutrons. As laser-implosion experiments focused on weapons physics, simulating weapons effects, and applications for new weapons, the military interest shifted from developing a laser-ignited hydrogen bomb to more sophisticated weapons and civilian applications for power generation. Civilian and military research now overlap, making it possible for several countries to continue weapons activities and permitting proliferation of nuclear weapons. These countries are reluctant to include inertial confinement fusion research in the Non-Proliferation Treaty. 16 references. (DCK)

  5. Toward a Designable Extracellular Matrix: Molecular Dynamics Simulations of an Engineered Laminin-Mimetic, Elastin-Like Fusion Protein.

    PubMed

    Tang, James D; McAnany, Charles E; Mura, Cameron; Lampe, Kyle J

    2016-10-10

    Native extracellular matrices (ECMs) exhibit networks of molecular interactions between specific matrix proteins and other tissue components. Guided by these naturally self-assembling supramolecular systems, we have designed a matrix-derived protein chimera that contains a laminin globular-like (LG) domain fused to an elastin-like polypeptide (ELP). This bipartite design offers a flexible protein engineering platform: (i) laminin is a key multifunctional component of the ECM in human brains and other neural tissues, making it an ideal bioactive component of our fusion, and (ii) ELPs, known to be well-tolerated in vivo, provide a self-assembly scaffold with tunable physicochemical (viscoelastic, thermoresponsive) properties. Experimental characterization of novel proteins is resource-intensive, and examining many conceivable designs would be a formidable challenge in the laboratory. Computational approaches offer a way forward: molecular dynamics (MD) simulations can be used to analyze the structural/physical behavior of candidate LG-ELP fusion proteins, particularly in terms of conformational properties salient to our design goals, such as assembly propensity in a temperature range spanning the inverse temperature transition. As a first step in examining the physical characteristics of a model LG-ELP fusion protein, including its temperature-dependent structural behavior, we simulated the protein over a range of physiologically relevant temperatures (290-320 K). We find that the ELP region, built upon the archetypal (VPGXG)5 scaffold, is quite flexible and has a propensity for β-rich secondary structures near physiological (310-315 K) temperatures. Our trajectories indicate that the temperature-dependent burial of hydrophobic patches in the ELP region, coupled to the local water structure dynamics and mediated by intramolecular contacts between aliphatic side chains, correlates with the temperature-dependent structural transitions in known ELP polymers. Because of

  6. Predictive Simulation of Process Windows for Powder Bed Fusion Additive Manufacturing: Influence of the Powder Bulk Density.

    PubMed

    Rausch, Alexander M; Küng, Vera E; Pobel, Christoph; Markl, Matthias; Körner, Carolin

    2017-09-22

    The resulting properties of parts fabricated by powder bed fusion additive manufacturing processes are determined by their porosity, local composition, and microstructure. The objective of this work is to examine the influence of the stochastic powder bed on the process window for dense parts by means of numerical simulation. The investigations demonstrate the unique capability of simulating macroscopic domains in the range of millimeters with a mesoscopic approach, which resolves the powder bed and the hydrodynamics of the melt pool. A simulated process window reveals the influence of the stochastic powder layer. The numerical results are verified with an experimental process window for selective electron beam-melted Ti-6Al-4V. Furthermore, the influence of the powder bulk density is investigated numerically. The simulations predict an increase in porosity and surface roughness for samples produced with lower powder bulk densities. Due to its higher probability for unfavorable powder arrangements, the process stability is also decreased. This shrinks the actual parameter range in a process window for producing dense parts.

  7. NASA GRC UAS Project: Communications Modeling and Simulation Status

    NASA Technical Reports Server (NTRS)

    Kubat, Greg

    2013-01-01

    The integration of Unmanned Aircraft Systems (UAS) in the National Airspace represents new operational concepts required in civil aviation. These new concepts are evolving as the nation moves toward the Next Generation Air Transportation System (NextGen) under the leadership of the Joint Planning and Development Office (JPDO), and through ongoing work by the Federal Aviation Administration (FAA). The desire and ability to fly UAS in the National Air Space (NAS) in the near term has increased dramatically, and this multi-agency effort to develop and implement a national plan to successfully address the challenges of UAS access to the NAS in a safe and timely manner is well underway. As part of the effort to integrate UAS in the National Airspace, NASA Glenn Research Center is currently involved with providing research into Communications systems and Communication system operations in order to assist with developing requirements for this implementation. In order to provide data and information regarding communication systems performance that will be necessary, NASA GRC is tasked with developing and executing plans for simulations of candidate future UAS command and control communications, in line with architectures and communications technologies being developed and/or proposed by NASA and relevant aviation organizations (in particular, RTCA SC-203). The simulations and related analyses will provide insight into the ability of proposed communications technologies and system architectures to enable safe operation of UAS, meeting UAS in the NAS project goals (including performance requirements, scalability, and interoperability), and ultimately leading to a determination of the ability of NextGen communication systems to accommodate UAS. This presentation, compiled by the NASA GRC team, will provide a view of the overall planned simulation effort and objectives, a description of the simulation concept and status of the design and development that has occurred to date.

  8. An Electrothermal Plasma Source Developed for Simulation of Transient Heat Loads in Future Large Fusion Devices

    NASA Astrophysics Data System (ADS)

    Gebhart, Trey; Baylor, Larry; Winfrey, Leigh

    2016-10-01

    The realization of fusion energy requires materials that can withstand high heat and particle fluxes at the plasma material interface. In this work, an electrothermal (ET) plasma source has been designed as a possible transient heat flux source for a linear plasma material interaction device. An ET plasma source operates in the ablative arc regime, which is driven by a DC capacitive discharge. The current travels through the 4 mm bore of a boron nitride liner and subsequently ablates and ionizes the liner material. This results in a high density plasma with a large unidirectional bulk flow out of the source exit. The pulse length for the ET source has been optimized using a pulse forming network to have a duration of 1 ms at full width at half maximum. The peak currents and maximum source energies seen in this system are 2 kA and 5 kJ. The goal of this work is to show that the ET source produces electron densities and heat fluxes that are comparable to transient events in future large magnetic confinement fusion devices. Heat flux, plasma temperature, and plasma density were determined for each test shot using infrared imaging and optical spectroscopy techniques. This work will compare the ET source output (heat flux, temperature, and density) with and without an applied magnetic field. Research sponsored by the Laboratory Directed Research and Development Program of Oak Ridge National Laboratory, managed by UT-Battelle, LLC, for the U. S. Department of Energy.

  9. Tomographic data fusion with CFD simulations associated with a planar sensor

    NASA Astrophysics Data System (ADS)

    Liu, J.; Liu, S.; Sun, S.; Zhou, W.; Schlaberg, I. H. I.; Wang, M.; Yan, Y.

    2017-04-01

    Tomographic techniques have a great ability to interrogate combustion processes, especially when they are combined with physical models of the combustion itself. In this study, a data fusion algorithm is developed to investigate the flame distribution of a swirl-induced environmental (EV) burner, a new type of burner for low NOx combustion. An electrical capacitance tomography (ECT) system is used to acquire 3D flame images and computational fluid dynamics (CFD) is applied to calculate an initial distribution of the temperature profile for the EV burner. Experiments were also carried out to visualize flames at a series of locations above the burner. While the ECT images essentially agree with the CFD temperature distribution, discrepancies exist at a certain height. When data fusion is applied, the discrepancy is visibly reduced and the ECT images are improved. The methods used in this study can lead to a new route where combustion visualization can be much improved and applied to clean energy conversion and new burner development.
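
    A generic weighted-fusion sketch, offered only as an illustration of the idea of correcting a tomographic image with a model prior (it is not the algorithm used in this paper): blend a normalized ECT reconstruction with a CFD temperature prior defined on the same grid, with the weight controlling how strongly the prior corrects the measurement.

      import numpy as np

      def normalize(field):
          """Scale a 2D field to [0, 1] so measurement and prior are comparable."""
          lo, hi = field.min(), field.max()
          return (field - lo) / (hi - lo + 1e-12)

      def fuse(ect_image, cfd_prior, weight=0.3):
          """Convex combination of the normalized ECT image and the CFD prior (0 <= weight <= 1)."""
          return (1.0 - weight) * normalize(ect_image) + weight * normalize(cfd_prior)

      # Hypothetical 32x32 cross-sections at one height above the burner.
      rng = np.random.default_rng(1)
      cfd_prior = np.exp(-((np.indices((32, 32)) - 16) ** 2).sum(axis=0) / 60.0)  # smooth flame core
      ect_image = cfd_prior + 0.2 * rng.normal(size=(32, 32))                     # noisy measurement

      fused = fuse(ect_image, cfd_prior, weight=0.4)
      print("discrepancy before:", np.abs(normalize(ect_image) - normalize(cfd_prior)).mean())
      print("discrepancy after: ", np.abs(fused - normalize(cfd_prior)).mean())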

  10. Membrane insertion of fusion peptides from Ebola and Marburg viruses studied by replica-exchange molecular dynamics simulations.

    PubMed

    Olson, Mark A; Lee, Michael S; Yeh, In-Chul

    2017-01-28

    This work presents replica-exchange molecular dynamics simulations of inserting a 16-residue Ebola virus fusion peptide into a membrane bilayer. A computational approach is applied for modeling the peptide at the explicit all-atom level and the membrane-aqueous bilayer by a generalized Born continuum model with a smoothed switching function (GBSW). We provide an assessment of the model calculations in terms of three metrics: (1) the ability to reproduce the NMR structure of the peptide determined in the presence of SDS micelles and comparable structural data on other fusion peptides; (2) determination of the effects of the mutation Trp-8 to Ala and sequence discrimination of the homologous Marburg virus; and (3) calculation of potentials of mean force for estimating the partitioning free energy and their comparison to predictions from the Wimley-White interfacial hydrophobicity scale. We found the GBSW implicit membrane model to produce results of limited accuracy in conformational properties of the peptide when compared to the NMR structure, yet the model resolution is sufficient to determine the effect of sequence differentiation on peptide-membrane integration. © 2016 Wiley Periodicals, Inc.
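
    For readers unfamiliar with the replica-exchange scheme this record relies on, here is a minimal, generic sketch of the Metropolis swap criterion between neighboring temperature replicas; the temperature ladder and energies are invented and this is not the authors' simulation code.

      import numpy as np

      K_B = 0.0019872041  # Boltzmann constant in kcal/(mol*K)

      def swap_accepted(E_i, E_j, T_i, T_j, rng):
          """Metropolis criterion for exchanging configurations between replicas i and j."""
          beta_i, beta_j = 1.0 / (K_B * T_i), 1.0 / (K_B * T_j)
          delta = (beta_i - beta_j) * (E_i - E_j)
          return rng.random() < min(1.0, np.exp(delta))

      # Hypothetical temperature ladder and instantaneous potential energies (kcal/mol).
      temps = [300.0, 310.0, 320.0, 330.0]
      energies = [-1250.0, -1238.0, -1231.0, -1220.0]
      rng = np.random.default_rng(2)

      for i in range(len(temps) - 1):
          ok = swap_accepted(energies[i], energies[i + 1], temps[i], temps[i + 1], rng)
          print(f"attempt swap {temps[i]:.0f} K <-> {temps[i+1]:.0f} K: {'accepted' if ok else 'rejected'}")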

  11. NASA GRC UAS Project - Communications Modeling and Simulation Development Status

    NASA Technical Reports Server (NTRS)

    Apaza, Rafael; Bretmersky, Steven; Dailey, Justin; Satapathy, Goutam; Ditzenberger, David; Ye, Chris; Kubat, Greg; Chevalier, Christine; Nguyen, Thanh

    2014-01-01

    The integration of Unmanned Aircraft Systems (UAS) in the National Airspace represents new operational concepts required in civil aviation. These new concepts are evolving as the nation moves toward the Next Generation Air Transportation System (NextGen) under the leadership of the Joint Planning and Development Office (JPDO), and through ongoing work by the Federal Aviation Administration (FAA). The desire and ability to fly UAS in the National Air Space (NAS) in the near term has increased dramatically, and this multi-agency effort to develop and implement a national plan to successfully address the challenges of UAS access to the NAS in a safe and timely manner is well underway. As part of the effort to integrate UAS in the National Airspace, NASA Glenn Research Center is currently involved with providing research into Communications systems and Communication system operations in order to assist with developing requirements for this implementation. In order to provide data and information regarding communication systems performance that will be necessary, NASA GRC is tasked with developing and executing plans for simulations of candidate future UAS command and control communications, in line with architectures and communications technologies being developed and/or proposed by NASA and relevant aviation organizations (in particular, RTCA SC-203). The simulations and related analyses will provide insight into the ability of proposed communications technologies and system architectures to enable safe operation of UAS, meeting UAS in the NAS project goals (including performance requirements, scalability, and interoperability), and ultimately leading to a determination of the ability of NextGen communication systems to accommodate UAS. This presentation, compiled by the NASA GRC Modeling and Simulation team, will provide an update to this ongoing effort at NASA GRC as follow-up to the overview of the planned simulation effort presented at ICNS in 2013. The objective

  12. GUMICS4 Synthetic and Dynamic Simulations of the ECLAT Project

    NASA Astrophysics Data System (ADS)

    Facsko, G.; Palmroth, M. M.; Gordeev, E.; Hakkinen, L. V.; Honkonen, I. J.; Janhunen, P.; Sergeev, V. A.; Kauristie, K.; Milan, S. E.

    2012-12-01

    The European Commission funded the European Cluster Assimilation Techniques (ECLAT) project as a collaboration of five leading European universities and research institutes. A main contribution of the Finnish Meteorological Institute (FMI) is to provide a wide range of global MHD runs with the Grand Unified Magnetosphere Ionosphere Coupling simulation (GUMICS). The runs are divided into two categories: synthetic runs investigating the extent of solar wind drivers that can influence magnetospheric dynamics, as well as dynamic runs using measured solar wind data as input. Here we consider the first set of runs with synthetic solar wind input. The solar wind density, velocity and the interplanetary magnetic field had different magnitudes and orientations; furthermore, two F10.7 flux values were selected for solar radiation minimum and maximum values. The solar wind parameter values were constant such that a constant, stable solution was achieved. All configurations were run several times with three different (-15°, 0°, +15°) tilt angles in the GSE X-Z plane. The Cray XT supercomputer of the FMI provides a unique opportunity in global magnetohydrodynamic simulation: running GUMICS-4 on one year of real solar wind data. Solar wind magnetic field, density, temperature and velocity data based on Advanced Composition Explorer (ACE) and WIND measurements are downloaded from the OMNIWeb open database and a special input file is created for each Cluster orbit. All data gaps are replaced with linear interpolations between the last and first valid data values before and after the data gap. A minimum variance transformation is applied to the interplanetary magnetic field data to clean it and to keep the code from diverging. The Cluster orbits are divided into slices, allowing parallel computation, and each slice has an average tilt angle value. The file timestamps start one hour before the perigee to provide time for building up a magnetosphere in the simulation space. The real
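
    A minimal sketch of the gap-filling step described above, assuming a 1D time series with NaN gaps: missing samples are replaced by linear interpolation between the last valid value before and the first valid value after each gap (synthetic numbers, not the FMI processing chain).

      import numpy as np

      def fill_gaps_linear(t, values):
          """Linearly interpolate NaN gaps in a 1D time series; endpoints are held constant."""
          values = np.asarray(values, dtype=float)
          valid = ~np.isnan(values)
          return np.interp(t, t[valid], values[valid])

      # Hypothetical 1-minute solar wind density series with a data gap.
      t = np.arange(0, 10, 1.0)                       # minutes
      density = np.array([5.1, 5.0, np.nan, np.nan, np.nan, 4.2, 4.3, np.nan, 4.6, 4.5])
      print(fill_gaps_linear(t, density))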

  13. The benefits of dynamic simulation to the Mensa development project

    SciTech Connect

    Lamey, M.F.; Lang, P.; Rainey, M.; Turner, S.; Wasden, F.

    1998-12-31

    Deepwater production and subsea tiebacks have presented new operational challenges for production operators. Shell Deepwater Development Inc. recently developed a dynamic simulation of its record-setting Mensa gas field subsea development. The project has set world records for water depth and pipeline length. The model revealed several transient conditions that were not readily apparent from earlier work and/or development tools. Changes to the start-up plan and normal operating guidelines resulted. The model was also used to train operating personnel in the dynamic response of the system, which is significantly different from conventional offshore systems. A data collection plan was developed and implemented to compare actual vs. predicted operating conditions. Results and recommendations for future developments are presented.

  14. Neutral Buoyancy Simulator-EASE Project (NB32)

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Once the United States' space program had progressed from Earth's orbit into outer space, the prospect of building and maintaining a permanent presence in space was realized. To accomplish this feat, NASA launched a temporary workstation, Skylab, to discover the effects of low gravity and weightlessness on the human body, and also to develop tools and equipment that would be needed in the future to build and maintain a more permanent space station. The structures, techniques, and work schedules had to be carefully designed to fit this unique construction site. The components had to be lightweight for transport into orbit, yet durable. The station also had to be made with removable parts for easy servicing and repairs by astronauts. All of the tools necessary for service and repairs had to be designed for easy manipulation by a suited astronaut. Construction methods had to be efficient due to the limited time the astronauts could remain outside their controlled environment. In light of all the specific needs of this project, a facility had to be developed on Earth that could simulate a low-gravity environment. A Neutral Buoyancy Simulator (NBS) was constructed by NASA's Marshall Space Flight Center (MSFC) in 1968. Since then, NASA scientists have used this facility to understand how humans work best in low gravity and also to provide information about the different kinds of structures that can be built. Pictured is a Massachusetts Institute of Technology (MIT) student working in a spacesuit on the Experimental Assembly of Structures in Extravehicular Activity (EASE) project, which was developed as a joint effort between MSFC and MIT. The EASE experiment required that crew members assemble small components to form larger components, working from the payload bay of the space shuttle. The MIT student in this photo is assembling two six-beam tetrahedrons.

  15. Computer simulations for minds-on learning with "Project Spectra!"

    NASA Astrophysics Data System (ADS)

    Wood, E. L.; Renfrow, S.; Marks, N.; Christofferson, R.

    2010-12-01

    How do we gain information about the Sun? How do we know Mars has CO2 or that Titan has a nitrogen-rich atmosphere? How do we use light in astronomy? These concepts are something education professionals generally struggle with because they are abstract. Making use of visualizations and presenting material so it can be manipulated is the easiest way to conquer abstractions to bring them home to students. Using simulations and computer interactives (games) where students experience and manipulate the information makes concepts accessible. “Project Spectra!” is a science and engineering program that uses computer-based Flash interactives to expose students to astronomical spectroscopy and actual data in a way that is not possible with traditional in-class activities. Visualizing lessons with multi-media is a way to solidify understanding and retention of knowledge and is completely unlike its paper-and-pencil counterpart. To engage students in “Project Spectra!”, students are given a mission, which connects them with the research at hand. Missions range from exploring remote planetary atmospheres and surfaces, experimenting with the Sun using different filters, and comparing spectroscopic atmospheric features between different bodies. Additionally, students have an opportunity to learn about NASA missions, view movies, and see images connected with their mission. In the end, students are asked critical thinking questions and conduct web-based research. These interactives complement the in-class activities where students engineer spectrographs and explore the electromagnetic spectrum.

  16. Susitna Hydroelectric Project: terrestrial environmental workshop and preliminary simulation model

    USGS Publications Warehouse

    Everitt, Robert R.; Sonntag, Nicholas C.; Auble, Gregory T.; Roelle, James E.; Gazey, William

    1982-01-01

    The technical feasibility, economic viability, and environmental impacts of a hydroelectric development project in the Susitna River Basin are being studied by Acres American, Inc. on behalf of the Alaska Power Authority. As part of these studies, Acres American recently contracted LGL Alaska Research Associates, Inc. to coordinate the terrestrial environmental studies being performed by the Alaska Department of Fish and Game and, as subcontractors to LGL, several University of Alaska research groups. LGL is responsible for further quantifying the potential impacts of the project on terrestrial wildlife and vegetation, and for developing a plan to mitigate adverse impacts on the terrestrial environment. The impact assessment and mitigation plan will be included as part of a license application to the Federal Energy Regulatory Commission (FERC) scheduled for the first quarter of 1983. The quantification of impacts, mitigation planning, and design of future research is being organized using a computer simulation modelling approach. Through a series of workshops attended by researchers, resource managers, and policy-makers, a computer model is being developed and refined for use in the quantification of impacts on terrestrial wildlife and vegetation, and for evaluating different mitigation measures such as habitat enhancement and the designation of replacement lands to be managed for wildlife habitat. This report describes the preliminary model developed at the first workshop held August 23-27, 1982, in Anchorage.

  17. A fully non-linear multi-species Fokker-Planck-Landau collision operator for simulation of fusion plasma

    NASA Astrophysics Data System (ADS)

    Hager, Robert; Yoon, E. S.; Ku, S.; D'Azevedo, E. F.; Worley, P. H.; Chang, C. S.

    2016-06-01

    Fusion edge plasmas can be far from thermal equilibrium and require the use of a non-linear collision operator for accurate numerical simulations. In this article, the non-linear single-species Fokker-Planck-Landau collision operator developed by Yoon and Chang (2014) [9] is generalized to include multiple particle species. The finite volume discretization used in this work naturally yields exact conservation of mass, momentum, and energy. The implementation of this new non-linear Fokker-Planck-Landau operator in the gyrokinetic particle-in-cell codes XGC1 and XGCa is described, and results of a verification study are discussed. Finally, the numerical techniques that make our non-linear collision operator viable on high-performance computing systems are described, including specialized load balancing algorithms and nested OpenMP parallelization. The collision operator's good weak and strong scaling behavior is shown.
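
    The conservation property claimed for the finite-volume discretization can be illustrated with a toy check: compute discrete density, momentum, and energy moments on a 1D velocity grid before and after a placeholder relaxation step that is built to preserve them. This is a schematic sanity check, not the XGC implementation.

      import numpy as np

      def moments(f, v, dv):
          """Discrete density, momentum, and kinetic energy moments on a 1D velocity grid."""
          n = np.sum(f) * dv
          p = np.sum(f * v) * dv
          e = np.sum(0.5 * f * v**2) * dv
          return n, p, e

      v = np.linspace(-5.0, 5.0, 201)
      dv = v[1] - v[0]
      f0 = np.exp(-(v - 0.5) ** 2)                      # initial non-Maxwellian-looking distribution

      # Placeholder "collision step": relax toward a Maxwellian built to carry the same moments.
      n0, p0, e0 = moments(f0, v, dv)
      u = p0 / n0                                       # mean velocity
      T = 2.0 * e0 / n0 - u**2                          # temperature-like quantity (m = k_B = 1)
      f_max = n0 / np.sqrt(2 * np.pi * T) * np.exp(-(v - u) ** 2 / (2 * T))
      f1 = 0.7 * f0 + 0.3 * f_max                       # one relaxation step

      print("before:", moments(f0, v, dv))
      print("after: ", moments(f1, v, dv))              # should agree to discretization accuracy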

  18. Accuracy and convergence of coupled finite-volume/Monte Carlo codes for plasma edge simulations of nuclear fusion reactors

    SciTech Connect

    Ghoos, K.; Dekeyser, W.; Samaey, G.; Börner, P.; Baelmans, M.

    2016-10-01

    The plasma and neutral transport in the plasma edge of a nuclear fusion reactor is usually simulated using coupled finite volume (FV)/Monte Carlo (MC) codes. However, under conditions of future reactors like ITER and DEMO, convergence issues become apparent. This paper examines the convergence behaviour and the numerical error contributions with a simplified FV/MC model for three coupling techniques: Correlated Sampling, Random Noise and Robbins-Monro. Also, practical procedures to estimate the errors in complex codes are proposed. Moreover, first results with more complex models show that an order-of-magnitude speedup can be achieved without any loss in accuracy by making use of averaging in the Random Noise coupling technique.
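
    To make the averaging idea concrete, here is a generic sketch (an assumption about the technique family, not the actual code-coupling scheme) of a noisy fixed-point iteration in which a Robbins-Monro-style decaying average suppresses Monte Carlo noise without biasing the converged state.

      import numpy as np

      def noisy_map(x, rng, sigma=0.05):
          """Stand-in for one FV solve driven by a Monte Carlo source: a contraction plus noise."""
          return 0.5 * x + 1.0 + sigma * rng.normal()

      rng = np.random.default_rng(3)
      x_plain, x_avg = 0.0, 0.0
      for n in range(1, 201):
          x_plain = noisy_map(x_plain, rng)                     # raw iteration: noise persists
          sample = noisy_map(x_avg, rng)
          x_avg += (sample - x_avg) / n                         # Robbins-Monro-style averaging

      print("fixed point (noise-free):", 2.0)                   # x* solves x = 0.5 x + 1
      print("plain iteration:", round(x_plain, 4))
      print("averaged iteration:", round(x_avg, 4))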

  19. A fully non-linear multi-species Fokker–Planck–Landau collision operator for simulation of fusion plasma

    SciTech Connect

    Hager, Robert; Yoon, E. S.; Ku, S.; D'Azevedo, E. F.; Worley, P. H.; Chang, C. S.

    2016-04-04

    Fusion edge plasmas can be far from thermal equilibrium and require the use of a non-linear collision operator for accurate numerical simulations. The non-linear single-species Fokker–Planck–Landau collision operator developed by Yoon and Chang (2014) [9] is generalized to include multiple particle species. Moreover, the finite volume discretization used in this work naturally yields exact conservation of mass, momentum, and energy. The implementation of this new non-linear Fokker–Planck–Landau operator in the gyrokinetic particle-in-cell codes XGC1 and XGCa is described, and results of a verification study are discussed. Finally, the numerical techniques that make our non-linear collision operator viable on high-performance computing systems are described, including specialized load balancing algorithms and nested OpenMP parallelization. As a result, the collision operator's good weak and strong scaling behavior is shown.

  20. A fully non-linear multi-species Fokker–Planck–Landau collision operator for simulation of fusion plasma

    DOE PAGES

    Hager, Robert; Yoon, E. S.; Ku, S.; ...

    2016-04-04

    Fusion edge plasmas can be far from thermal equilibrium and require the use of a non-linear collision operator for accurate numerical simulations. The non-linear single-species Fokker–Planck–Landau collision operator developed by Yoon and Chang (2014) [9] is generalized to include multiple particle species. Moreover, the finite volume discretization used in this work naturally yields exact conservation of mass, momentum, and energy. The implementation of this new non-linear Fokker–Planck–Landau operator in the gyrokinetic particle-in-cell codes XGC1 and XGCa is described, and results of a verification study are discussed. Finally, the numerical techniques that make our non-linear collision operator viable on high-performance computing systems are described, including specialized load balancing algorithms and nested OpenMP parallelization. As a result, the collision operator's good weak and strong scaling behavior is shown.

  1. Transition from Beam-Target to Thermonuclear Fusion in High-Current Deuterium Z-Pinch Simulations.

    PubMed

    Offermann, Dustin T; Welch, Dale R; Rose, Dave V; Thoma, Carsten; Clark, Robert E; Mostrom, Chris B; Schmidt, Andrea E W; Link, Anthony J

    2016-05-13

    Fusion yields from dense, Z-pinch plasmas are known to scale with the drive current, which is favorable for many potential applications. Decades of experimental studies, however, show an unexplained drop in yield for currents above a few mega-ampere (MA). In this work, simulations of DD Z-Pinch plasmas have been performed in 1D and 2D for a constant pinch time and initial radius using the code Lsp, and observations of a shift in scaling are presented. The results show that yields below 3 MA are enhanced relative to pure thermonuclear scaling by beamlike particles accelerated in the Rayleigh-Taylor induced electric fields, while yields above 3 MA are reduced because of energy lost by the instability and the inability of the beamlike ions to enter the pinch region.

  2. A fully non-linear multi-species Fokker–Planck–Landau collision operator for simulation of fusion plasma

    SciTech Connect

    Hager, Robert; Yoon, E.S.; Ku, S.; D'Azevedo, E.F.; Worley, P.H.; Chang, C.S.

    2016-06-15

    Fusion edge plasmas can be far from thermal equilibrium and require the use of a non-linear collision operator for accurate numerical simulations. In this article, the non-linear single-species Fokker–Planck–Landau collision operator developed by Yoon and Chang (2014) [9] is generalized to include multiple particle species. The finite volume discretization used in this work naturally yields exact conservation of mass, momentum, and energy. The implementation of this new non-linear Fokker–Planck–Landau operator in the gyrokinetic particle-in-cell codes XGC1 and XGCa is described, and results of a verification study are discussed. Finally, the numerical techniques that make our non-linear collision operator viable on high-performance computing systems are described, including specialized load balancing algorithms and nested OpenMP parallelization. The collision operator's good weak and strong scaling behavior is shown.

  3. Mixed Waste Treatment Project: Computer simulations of integrated flowsheets

    SciTech Connect

    Dietsche, L.J.

    1993-12-01

    The disposal of mixed waste, that is, waste containing both hazardous and radioactive components, is a challenging waste management problem of particular concern to DOE sites throughout the United States. Traditional technologies used for the destruction of hazardous wastes need to be re-evaluated for their ability to handle mixed wastes, and in some cases new technologies need to be developed. The Mixed Waste Treatment Project (MWTP) was set up by DOE's Waste Operations Program (EM30) to provide guidance on mixed waste treatment options. One of MWTP's charters is to develop flowsheets for prototype integrated mixed waste treatment facilities which can serve as models for sites developing their own treatment strategies. Evaluation of these flowsheets is being facilitated through the use of computer modelling. The objective of the flowsheet simulations is to provide mass and energy balances, product compositions, and equipment sizing (leading to cost) information. The modelled flowsheets need to be easily modified to examine how alternative technologies and varying feed streams affect the overall integrated process. One such commercially available simulation program is ASPEN PLUS. This report contains details of the Aspen Plus program.

  4. MULTI-IFE-A one-dimensional computer code for Inertial Fusion Energy (IFE) target simulations

    NASA Astrophysics Data System (ADS)

    Ramis, R.; Meyer-ter-Vehn, J.

    2016-06-01

    The code MULTI-IFE is a numerical tool devoted to the study of Inertial Fusion Energy (IFE) microcapsules. It includes the relevant physics for the implosion and thermonuclear ignition and burning: hydrodynamics of two-component plasmas (ions and electrons), three-dimensional laser light ray-tracing, thermal diffusion, multigroup radiation transport, deuterium-tritium burning, and alpha particle diffusion. The corresponding differential equations are discretized in spherical one-dimensional Lagrangian coordinates. Two typical application examples, a high-gain laser-driven capsule and a low-gain radiation-driven marginally igniting capsule, are discussed. In addition to phenomena relevant for IFE, the code also includes components (planar and cylindrical geometries, transport coefficients at low temperature, explicit treatment of Maxwell's equations) that extend its range of applicability to laser-matter interaction at moderate intensities (<10^16 W cm^-2). The source code design has been kept simple and structured with the aim to encourage users' modifications for specialized purposes.

  5. A general CFD framework for fault-resilient simulations based on multi-resolution information fusion

    NASA Astrophysics Data System (ADS)

    Lee, Seungjoon; Kevrekidis, Ioannis G.; Karniadakis, George Em

    2017-10-01

    We develop a general CFD framework for multi-resolution simulations that targets multiscale problems as well as resilience in exascale simulations, where faulty processors may lead to simulated fields that are gappy in space-time. We combine approximation theory and domain decomposition together with statistical learning techniques, e.g. coKriging, to estimate boundary conditions and minimize communications by performing independent parallel runs. To demonstrate this new simulation approach, we consider two benchmark problems. First, we solve the heat equation (a) on a small number of spatial "patches" distributed across the domain, simulated by finite differences at fine resolution and (b) on the entire domain simulated at very low resolution, thus fusing multi-resolution models to obtain the final answer. Second, we simulate the flow in a lid-driven cavity in an analogous fashion, by fusing finite difference solutions obtained with fine and low resolution assuming gappy data sets. We investigate the influence of various parameters for this framework, including the correlation kernel, the size of a buffer employed in estimating boundary conditions, the coarseness of the resolution of auxiliary data, and the communication frequency across different patches in fusing the information at different resolution levels. In addition to its robustness and resilience, the new framework can be employed to generalize previous multiscale approaches involving heterogeneous discretizations or even fundamentally different flow descriptions, e.g. in continuum-atomistic simulations.
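
    As a small, self-contained stand-in for the statistical-learning step (ordinary Gaussian-process regression here rather than the coKriging used in the paper), the sketch below fills a gappy 1D field with scikit-learn; the grid, kernel, and gap location are invented.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF

      # Hypothetical 1D temperature field with a "faulty processor" gap in the middle.
      x = np.linspace(0.0, 1.0, 50)
      truth = np.sin(2 * np.pi * x)
      observed = truth.copy()
      observed[20:30] = np.nan                                  # lost patch

      valid = ~np.isnan(observed)
      gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), alpha=1e-6)
      gp.fit(x[valid, None], observed[valid])
      filled, std = gp.predict(x[:, None], return_std=True)

      print("max reconstruction error in the gap:",
            np.max(np.abs(filled[~valid] - truth[~valid])).round(4))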

  6. Software development infrastructure for the HYBRID modeling and simulation project

    SciTech Connect

    Aaron S. Epiney; Robert A. Kinoshita; Jong Suk Kim; Cristian Rabiti; M. Scott Greenwood

    2016-09-01

    One of the goals of the HYBRID modeling and simulation project is to assess the economic viability of hybrid systems in a market that contains renewable energy sources like wind. The idea is that it is possible for the nuclear plant to sell non-electric energy cushions, which absorb (at least partially) the volatility introduced by the renewable energy sources. This system is currently modeled in the Modelica programming language. To assess the economics of the system, an optimization procedure is trying to find the minimal cost of electricity production. The RAVEN code is used as a driver for the whole problem. It is assumed that at this stage, the HYBRID modeling and simulation framework can be classified as non-safety "research and development" software. The associated quality level is Quality Level 3 software. This imposes low requirements on quality control, testing and documentation. The quality level could change as the application development continues. Despite the low quality requirement level, a workflow for the HYBRID developers has been defined that includes a coding standard and some documentation and testing requirements. The repository performs automated unit testing of contributed models. The automated testing is achieved via an open-source Python package called BuildingsPy from Lawrence Berkeley National Lab. BuildingsPy runs Modelica simulation tests using Dymola in an automated manner and generates and runs unit tests from Modelica scripts written by developers. In order to assure effective communication between the different national laboratories, a biweekly videoconference has been set up, where developers can report their progress and issues. In addition, periodic face-to-face meetings are organized, intended to discuss high-level strategy decisions with management. A second means of communication is the developer email list. This is a list to which everybody can send emails that will be received by the collective of the developers and managers.

  7. QUANTIFYING OBSERVATIONAL PROJECTION EFFECTS USING MOLECULAR CLOUD SIMULATIONS

    SciTech Connect

    Beaumont, Christopher N.; Offner, Stella S.R.; Shetty, Rahul; Glover, Simon C. O.; Goodman, Alyssa A.

    2013-11-10

    The physical properties of molecular clouds are often measured using spectral-line observations, which provide the only probes of the clouds' velocity structure. It is hard, though, to assess whether and to what extent intensity features in position-position-velocity (PPV) space correspond to 'real' density structures in position-position-position (PPP) space. In this paper, we create synthetic molecular cloud spectral-line maps of simulated molecular clouds, and present a new technique for measuring the reality of individual PPV structures. Using a dendrogram algorithm, we identify hierarchical structures in both PPP and PPV space. Our procedure projects density structures identified in PPP space into corresponding intensity structures in PPV space and then measures the geometric overlap of the projected structures with structures identified from the synthetic observation. The fractional overlap between a PPP and PPV structure quantifies how well the synthetic observation recovers information about the three-dimensional structure. Applying this machinery to a set of synthetic observations of CO isotopes, we measure how well spectral-line measurements recover mass, size, velocity dispersion, and virial parameter for a simulated star-forming region. By disabling various steps of our analysis, we investigate how much opacity, chemistry, and gravity affect measurements of physical properties extracted from PPV cubes. For the simulations used here, which offer a decent, but not perfect, match to the properties of a star-forming region like Perseus, our results suggest that superposition induces a ∼40% uncertainty in masses, sizes, and velocity dispersions derived from ¹³CO (J = 1-0). As would be expected, superposition and confusion are worst in regions where the filling factor of emitting material is large. The virial parameter is most affected by superposition, such that estimates of the virial parameter derived from PPV and PPP information typically disagree
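
    A toy version of the overlap metric (schematic only, not the dendrogram-based pipeline of the paper): project a thresholded 3D density structure into a position-position-velocity cube using a line-of-sight velocity field, then measure its fractional overlap with a structure identified in PPV space.

      import numpy as np

      def project_to_ppv(density, v_los, v_edges):
          """Bin density along the line of sight (last axis) into velocity channels."""
          ny, nx, nz = density.shape
          ppv = np.zeros((ny, nx, len(v_edges) - 1))
          for k in range(nz):
              channel = np.clip(np.digitize(v_los[:, :, k], v_edges) - 1, 0, ppv.shape[2] - 1)
              for c in range(ppv.shape[2]):
                  ppv[:, :, c] += np.where(channel == c, density[:, :, k], 0.0)
          return ppv

      rng = np.random.default_rng(4)
      density = rng.random((16, 16, 16))
      v_los = rng.normal(0.0, 1.0, (16, 16, 16))                # toy line-of-sight velocity field
      v_edges = np.linspace(-3, 3, 13)

      ppp_mask = density > 0.8                                   # a "real" PPP structure
      ppv_cube = project_to_ppv(density, v_los, v_edges)
      ppv_mask = ppv_cube > np.percentile(ppv_cube, 80)          # an "observed" PPV structure
      projected = project_to_ppv(np.where(ppp_mask, density, 0.0), v_los, v_edges) > 0

      overlap = np.logical_and(projected, ppv_mask).sum() / projected.sum()
      print(f"fractional overlap of projected PPP structure with PPV structure: {overlap:.2f}")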

  8. Enhancing chemical identification efficiency by SAW sensor transients through a data enrichment and information fusion strategy—a simulation study

    NASA Astrophysics Data System (ADS)

    Singh, Prashant; Yadava, R. D. S.

    2013-05-01

    The paper proposes a new approach for improving the odor recognition efficiency of a surface acoustic wave (SAW) transient sensor system based on a single polymer coating. The vapor identity information is hidden in transient response shapes through dependences on specific vapor solvation and diffusion parameters in the polymer coating. The variations in the vapor exposure and purge durations and the sensor operating frequency have been used to create diversity in transient shapes via termination of the vapor-polymer equilibration process up to different stages. The transient signals were analyzed by the discrete wavelet transform using Daubechies-4 mother wavelet basis. The wavelet approximation coefficients were then processed by principal component analysis for creating feature space. The set of principal components define the vapor identity information. In an attempt to enhance vapor class separability we analyze two types of information fusion methods. In one, the sensor operation frequency is fixed and the sensing and purge durations are varied, and in the second, the sensing and purge durations are fixed and the sensor operating frequency is varied. The fusion is achieved by concatenation of discrete wavelet coefficients corresponding to various transients prior to the principal component analysis. The simulation experiments with polyisobutylene SAW sensor coating for operation frequencies over [55-160] MHz and sensing durations over [5-60] s were analyzed. The target vapors are seven volatile organics: chloroform, chlorobenzene, o-dichlorobenzene, n-heptane, toluene, n-hexane and n-octane whose concentrations were varied over [10-100] ppm. The simulation data were generated using a SAW sensor transient response model that incorporates the viscoelastic effects due to polymer coating and an additive noise source in the output. The analysis reveals that: (i) in single transient analysis the class separability increases with sensing duration for a given
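
    A minimal sketch of the feature-extraction chain described above (Daubechies-4 approximation coefficients, concatenation across transients, then PCA); the transient signals are synthetic and the use of PyWavelets and scikit-learn is an assumption about tooling, not the authors' code.

      import numpy as np
      import pywt
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(5)

      def synthetic_transient(tau, n=512):
          """Toy sorption transient with noise, parameterized by a time constant (s)."""
          t = np.linspace(0, 60, n)
          return (1 - np.exp(-t / tau)) + 0.01 * rng.normal(size=n)

      def approx_coeffs(signal, wavelet="db4", level=4):
          """Keep only the level-4 approximation coefficients of the DWT."""
          return pywt.wavedec(signal, wavelet, level=level)[0]

      # Two "operating conditions" per vapor; features are concatenated before PCA (fusion step).
      vapors = {"toluene": 8.0, "n-hexane": 3.0, "chloroform": 1.5}
      features, labels = [], []
      for name, tau in vapors.items():
          for _ in range(10):
              fused = np.concatenate([approx_coeffs(synthetic_transient(tau)),
                                      approx_coeffs(synthetic_transient(tau * 1.3))])
              features.append(fused)
              labels.append(name)

      scores = PCA(n_components=2).fit_transform(np.array(features))
      for name in vapors:
          pts = scores[[l == name for l in labels]]
          print(f"{name:>11}: PC1 mean = {pts[:, 0].mean():+.3f}, PC2 mean = {pts[:, 1].mean():+.3f}")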

  9. Simulation and Experimental Study on the Efficiency of Traveling Wave Direct Energy Conversion for Application to Aneutronic Fusion Reactions

    NASA Astrophysics Data System (ADS)

    Tarditi, Alfonso; Chap, Andrew; Miley, George; Scott, John

    2013-10-01

    A study based on both Particle-in-cell (PIC) simulation and experiments is being developed to study the physics of the Traveling Wave Direct Energy Converter (TWDEC), with the perspective of application to aneutronic fusion reaction products and space propulsion. The PIC model investigates in detail the key TWDEC physics processes by simulating the time-dependent transfer of energy from the ion beam to an electric load connected to ring-type electrodes in cylindrical symmetry. An experimental effort is in progress on a TWDEC test article at NASA Johnson Space Center with the purpose of studying the conditions for improving the efficiency of the direct energy conversion process. Using a scaled-down ion energy source, the experiment is primarily focused on the effect of the (bunched) beam density on the efficiency and on the optimization of the electrode design. The simulation model is guiding the development of the experimental configuration and will provide details of the beam dynamics for direct comparison with experimental diagnostics. Work supported by NASA Johnson Space Center.

  10. Models and Simulations of C60-Fullerene Plasma Jets for Disruption Mitigation and Magneto-Inertial Fusion

    NASA Astrophysics Data System (ADS)

    Bogatu, Ioan-Niculae; Galkin, Sergei A.; Kim, Jin-Soo

    2009-11-01

    We present the models and simulation results of C60-fullerene plasma jets proposed to be used for disruption mitigation on ITER and for magneto-inertial fusion (MIF). The model describing the fast production of a large mass of C60 molecular gas in the pulsed power source by explosive sublimation of C60 micro-grains is detailed. Several aspects of the magnetic "piston" model and the 2D interchange (magnetic Rayleigh-Taylor) instability in the rail gun arc dynamics are described. A plasma jet adiabatic expansion model is used to investigate the in-flight three-body recombination during jet transport to the plasma boundary. Our LSP PIC code 3D simulations show that a heavy C60 plasmoid penetrates deeply through a transverse magnetic barrier, demonstrating self-polarization and magnetic field expulsion effects. The LSP code 3D simulation of two plasma jets injected head-on along magnetic field lines for MIF is also discussed.

  11. Appreciating the Complexity of Project Management Execution: Using Simulation in the Classroom

    ERIC Educational Resources Information Center

    Hartman, Nathan S.; Watts, Charles A.; Treleven, Mark D.

    2013-01-01

    As the popularity and importance of project management increase, so does the need for well-prepared project managers. This article discusses our experiences using a project management simulation in undergraduate and MBA classes to help students better grasp the complexity of project management. This approach gives students hands-on experience with…

  13. Plasma viscosity with mass transport in spherical inertial confinement fusion implosion simulations

    NASA Astrophysics Data System (ADS)

    Vold, E. L.; Joglekar, A. S.; Ortega, M. I.; Moll, R.; Fenn, D.; Molvig, K.

    2015-11-01

    The effects of viscosity and small-scale atomic-level mixing on plasmas in inertial confinement fusion (ICF) currently represent challenges in ICF research. Many current ICF hydrodynamic codes ignore the effects of viscosity though recent research indicates viscosity and mixing by classical transport processes may have a substantial impact on implosion dynamics. We have implemented a Lagrangian hydrodynamic code in one-dimensional spherical geometry with plasma viscosity and mass transport and including a three temperature model for ions, electrons, and radiation treated in a gray radiation diffusion approximation. The code is used to study ICF implosion differences with and without plasma viscosity and to determine the impacts of viscosity on temperature histories and neutron yield. It was found that plasma viscosity has substantial impacts on ICF shock dynamics characterized by shock burn timing, maximum burn temperatures, convergence ratio, and time history of neutron production rates. Plasma viscosity reduces the need for artificial viscosity to maintain numerical stability in the Lagrangian formulation and also modifies the flux-limiting needed for electron thermal conduction.
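
    For context on the artificial-viscosity term that physical plasma viscosity partly supplants, here is a minimal von Neumann-Richtmyer-style sketch for a single 1D Lagrangian cell (generic textbook form with made-up coefficients and states, not the authors' code).

      import numpy as np

      def artificial_viscosity(rho, du, c_sound, c_quad=2.0, c_lin=1.0):
          """Von Neumann-Richtmyer-style pseudo-viscous pressure for one Lagrangian cell.
          rho: cell density, du: velocity jump across the cell (u_right - u_left),
          c_sound: local sound speed. Active only under compression (du < 0)."""
          if du >= 0.0:
              return 0.0
          return rho * (c_quad * du**2 + c_lin * c_sound * abs(du))

      # Hypothetical cells behind a converging shock.
      for rho, du, cs in [(1.0, -0.3, 1.2), (2.5, -0.8, 1.6), (2.5, +0.2, 1.6)]:
          print(f"rho={rho}, du={du:+.1f}: q = {artificial_viscosity(rho, du, cs):.3f}")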

  14. Plasma viscosity with mass transport in spherical inertial confinement fusion implosion simulations

    DOE PAGES

    Vold, Erik Lehman; Joglekar, Archis S.; Ortega, Mario I.; ...

    2015-11-20

    The effects of viscosity and small-scale atomic-level mixing on plasmas in inertial confinement fusion (ICF) currently represent challenges in ICF research. Many current ICF hydrodynamic codes ignore the effects of viscosity though recent research indicates viscosity and mixing by classical transport processes may have a substantial impact on implosion dynamics. In this paper, we have implemented a Lagrangian hydrodynamic code in one-dimensional spherical geometry with plasma viscosity and mass transport and including a three temperature model for ions, electrons, and radiation treated in a gray radiation diffusion approximation. The code is used to study ICF implosion differences with and without plasma viscosity and to determine the impacts of viscosity on temperature histories and neutron yield. It was found that plasma viscosity has substantial impacts on ICF shock dynamics characterized by shock burn timing, maximum burn temperatures, convergence ratio, and time history of neutron production rates. Finally, plasma viscosity reduces the need for artificial viscosity to maintain numerical stability in the Lagrangian formulation and also modifies the flux-limiting needed for electron thermal conduction.

  15. Hydrogen species in diamond: Molecular dynamics simulation in bulk diamond for fusion applications

    NASA Astrophysics Data System (ADS)

    Delgado, D.; Vila, R.

    2014-09-01

    For an electron cyclotron resonance heating system, a diamond window seems to be the only material able to withstand the high microwave power and radiation effects and at the same time act as a tritium barrier. Therefore it is important to understand the evolution of hydrogen isotopes in diamond. Both hydrogen content and radiation can quite rapidly degrade its excellent properties. Hydrogen isotopes can be introduced in the material by two processes: (1) during the growth process of synthetic samples and (2) as a neutron radiation effect when devices are exposed to a fusion irradiation environment. In the latter case, both device performance (thermal, optical and dielectric properties degradation) and hands-on maintenance of the window (tritium inventory) demand a good knowledge of hydrogen species concentrations and their evolution with lattice damage. In this paper, a classical molecular dynamics study analyses the hydrogen equilibrium sites in diamond, and also their bulk and interstitial vibrational characteristics, including isotopic shifts. Some interesting results are presented and discussed. We confirm that the bond-centred site is the most stable configuration for H. Vibrational studies show lines in the C-H stretching region. Isotopic studies reveal ratios close to the theoretical ones for BC and ET sites. On the contrary, the AB site vibrations obtained suggest the existence of a local carbon oscillation.

  16. Plasma viscosity with mass transport in spherical inertial confinement fusion implosion simulations

    SciTech Connect

    Vold, Erik Lehman; Joglekar, Archis S.; Ortega, Mario I.; Moll, Ryan; Fenn, Daniel; Molvig, Kim

    2015-11-20

    The effects of viscosity and small-scale atomic-level mixing on plasmas in inertial confinement fusion (ICF) currently represent challenges in ICF research. Many current ICF hydrodynamic codes ignore the effects of viscosity though recent research indicates viscosity and mixing by classical transport processes may have a substantial impact on implosion dynamics. In this paper, we have implemented a Lagrangian hydrodynamic code in one-dimensional spherical geometry with plasma viscosity and mass transport and including a three temperature model for ions, electrons, and radiation treated in a gray radiation diffusion approximation. The code is used to study ICF implosion differences with and without plasma viscosity and to determine the impacts of viscosity on temperature histories and neutron yield. It was found that plasma viscosity has substantial impacts on ICF shock dynamics characterized by shock burn timing, maximum burn temperatures, convergence ratio, and time history of neutron production rates. Finally, plasma viscosity reduces the need for artificial viscosity to maintain numerical stability in the Lagrangian formulation and also modifies the flux-limiting needed for electron thermal conduction.

  17. Plasma viscosity with mass transport in spherical inertial confinement fusion implosion simulations

    SciTech Connect

    Vold, E. L.; Molvig, K.; Joglekar, A. S.; Ortega, M. I.; Moll, R.; Fenn, D.

    2015-11-15

    The effects of viscosity and small-scale atomic-level mixing on plasmas in inertial confinement fusion (ICF) currently represent challenges in ICF research. Many current ICF hydrodynamic codes ignore the effects of viscosity though recent research indicates viscosity and mixing by classical transport processes may have a substantial impact on implosion dynamics. We have implemented a Lagrangian hydrodynamic code in one-dimensional spherical geometry with plasma viscosity and mass transport and including a three temperature model for ions, electrons, and radiation treated in a gray radiation diffusion approximation. The code is used to study ICF implosion differences with and without plasma viscosity and to determine the impacts of viscosity on temperature histories and neutron yield. It was found that plasma viscosity has substantial impacts on ICF shock dynamics characterized by shock burn timing, maximum burn temperatures, convergence ratio, and time history of neutron production rates. Plasma viscosity reduces the need for artificial viscosity to maintain numerical stability in the Lagrangian formulation and also modifies the flux-limiting needed for electron thermal conduction.

  18. Concept Development of the Eindhoven Diabetes Education Simulator Project.

    PubMed

    Maas, Anne H; van der Molen, Pieta; van de Vijver, Reinier; Chen, Wei; van Pul, Carola; Cottaar, Eduardus J E; van Riel, Natal A W; Hilbers, Peter A J; Haak, Harm R

    2016-04-01

    This study was designed to define the concept of an educational diabetes game following a user-centered design approach. The concept development of the Eindhoven Diabetes Education Simulator (E-DES) project can be divided into two phases: concept generation and concept evaluation. Four concepts were designed by the multidisciplinary development team based on the outcomes of user interviews. Four other concepts resulted from the Diabetes Game Jam. Several users and experts evaluated the concepts. These user evaluations and a feasibility analysis served as input for an overall evaluation and discussion by the development team, resulting in the final concept choice. The four concepts of the development team are a digital board game, a quiz platform, a lifestyle simulator, and a puzzle game. The Diabetes Game Jam resulted in another digital board game, two mobile swipe games, and a fairy tale-themed adventure game. The combined user evaluations and feasibility analysis ranked the quiz platform and the digital board game equally high. Each of these games fits one specific subgroup of users best: the quiz platform best fits an eager-to-learn, more individualistic patient, whereas the board game best fits a less-eager-to-learn, family-oriented patient. The choice for a specific concept is therefore highly dependent on the choice of our specific target audience. The user-centered design approach with multiple evaluations has enabled us to choose the most promising concept from eight different options. A digital board game is chosen for further development because the target audience for E-DES is the less-motivated, family-oriented patients.

  19. Projections of African drought extremes in CORDEX regional climate simulations

    NASA Astrophysics Data System (ADS)

    Gbobaniyi, Emiola; Nikulin, Grigory; Jones, Colin; Kjellström, Erik

    2013-04-01

    We investigate trends in drought extremes for different climate regions of the African continent over a combined historical and future period 1951-2100. Eight CMIP5 coupled atmospheric global climate models (CanESM2, CNRM-CM5, HadGEM2-ES, NorESM1-M, EC-EARTH, MIROC5, GFDL-ESM2M and MPI-ESM-LR) under two forcing scenarios, the representative concentration pathways (RCPs) 4.5 and 8.5, with spatial resolution varying from about 1° to 3°, are downscaled to 0.44° resolution by the Rossby Centre (SMHI) regional climate model RCA4. We use data from the ensuing ensembles of CORDEX-Africa regional climate simulations to explore three drought indices, namely: standardized precipitation index (SPI), moisture index (MI) and difference in precipitation and evaporation (P-E). Meteorological and agricultural drought conditions are assessed in our analyses and a climate change signal is obtained for the SPI by calculating gamma functions for future SPI with respect to a baseline present climate. Results for the RCP4.5 and RCP8.5 scenarios are inter-compared to assess uncertainties in the future projections. We show that there is a pronounced sensitivity to the choice of forcing GCM, which indicates that assessments of future drought conditions in Africa would benefit from large model ensembles. We also note that the results are sensitive to the choice of drought index. We discuss both spatial and temporal variability of drought extremes for different climate zones of Africa and the importance of the ensemble mean. Our study highlights the usefulness of CORDEX simulations in identifying possible future impacts of climate at local and regional scales.
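
    A highly simplified SPI sketch illustrating the gamma-fit-and-transform idea the abstract refers to; it omits the aggregation window and zero-precipitation handling of the full index, and all numbers are synthetic.

      import numpy as np
      from scipy import stats

      def spi(precip, baseline):
          """Standardized precipitation index: fit a gamma law to the baseline period,
          then map precipitation totals through its CDF to standard-normal quantiles."""
          a, loc, scale = stats.gamma.fit(baseline, floc=0)      # shape/scale from baseline climate
          cdf = stats.gamma.cdf(precip, a, loc=loc, scale=scale)
          cdf = np.clip(cdf, 1e-6, 1 - 1e-6)                     # avoid infinities at the tails
          return stats.norm.ppf(cdf)

      rng = np.random.default_rng(8)
      baseline = rng.gamma(shape=2.0, scale=40.0, size=50 * 12)  # monthly totals, present climate (mm)
      future = rng.gamma(shape=2.0, scale=32.0, size=50 * 12)    # drier future scenario (mm)

      print("mean SPI, baseline:", round(float(spi(baseline, baseline).mean()), 2))
      print("mean SPI, future:  ", round(float(spi(future, baseline).mean()), 2))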

  20. Effects of preoperative simulation on minimally invasive Hybrid Lumbar Interbody Fusion (MIS-HLIF).

    PubMed

    Rieger, Bernhard; Jiang, Hongzhen; Reinshagen, Clemens; Molcanyi, Marek; Zivcak, Jozef; Grönemeyer, Dietrich; Bosche, Bert; Schackert, Gabriele; Ruess, Daniel

    2017-07-10

    The main focus of this study was to evaluate how preoperative simulation affects the surgical workflow, radiation exposure and outcome of MIS-HLIF. 132 patients who underwent single-level MIS-HLIF were enrolled in a cohort study design. Dose area product was analyzed in addition to surgical data. Once preoperative simulation was established, 66 cases (SIM cohort) were compared with 66 patients who had previously undergone MIS-HLIF without preoperative simulation (NO-SIM cohort). Dose area product was reduced considerably in the SIM cohort (320 cGy·cm²; NO-SIM cohort: 470 cGy·cm²; p<0.01). Surgical time was shorter for the SIM cohort (155 minutes; NO-SIM cohort: 182 minutes; p<0.05). The SIM cohort had a better outcome on the numeric rating scale (NRS) for back pain at 6 months follow-up when compared to the NO-SIM cohort (p<0.05). Preoperative simulation reduced radiation exposure and resulted in less back pain at the 6 months follow-up time point. Preoperative simulation provided guidance in determining the correct cage height. Outcome controls enabled the surgeon to improve the procedure and the software algorithm. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. Experiments with Memory-to-Memory Coupling for End-to-End fusion Simulation Workflows

    SciTech Connect

    Docan, Ciprian; Zhang, Fan; Parashar, Manish; Cummings, Julian; Podhorszki, Norbert; Klasky, Scott A

    2010-01-01

    Scientific applications are striving to accurately simulate multiple interacting physical processes that comprise complex phenomena being modeled. Efficient and scalable parallel implementations of these coupled simulations present challenging interaction and coordination requirements, especially when the coupled physical processes are computationally heterogeneous and progress at different speeds. In this paper, we present the design, implementation and evaluation of a memory-to-memory coupling framework for coupled scientific simulations on high-performance parallel computing platforms. The framework is driven by the coupling requirements of the Center for Plasma Edge Simulation, and it provides simple coupling abstractions as well as efficient asynchronous (RDMA-based) memory-to-memory data transport mechanisms that complement existing parallel programming systems and data sharing frameworks. The framework enables flexible coupling behaviors that are asynchronous in time and space, and it supports dynamic coupling between heterogeneous simulation processes without enforcing any synchronization constraints. We evaluate the performance and scalability of the coupling framework using a specific coupling scenario, on the Jaguar Cray XT5 system at Oak Ridge National Laboratory.

  2. Analysis of the sEMG/force relationship using HD-sEMG technique and data fusion: A simulation study.

    PubMed

    Al Harrach, Mariam; Carriou, Vincent; Boudaoud, Sofiane; Laforet, Jeremy; Marin, Frederic

    2017-04-01

    The relationship between the surface Electromyogram (sEMG) signal and the force of an individual muscle is still ambiguous due to the complexity of experimental evaluation. However, understanding this relationship should be useful for the assessment of neuromuscular system in healthy and pathological contexts. In this study, we present a global investigation of the factors governing the shape of this relationship. Accordingly, we conducted a focused sensitivity analysis of the sEMG/force relationship form with respect to neural, functional and physiological parameters variation. For this purpose, we used a fast generation cylindrical model for the simulation of an 8×8 High Density-sEMG (HD-sEMG) grid and a twitch based force model for the muscle force generation. The HD-sEMG signals as well as the corresponding force signals were simulated in isometric non-fatiguing conditions and were based on the Biceps Brachii (BB) muscle properties. A total of 10 isometric constant contractions of 5s were simulated for each configuration of parameters. The Root Mean Squared (RMS) value was computed in order to quantify the sEMG amplitude. Then, an image segmentation method was used for data fusion of the 8×8 RMS maps. In addition, a comparative study between recent modeling propositions and the model proposed in this study is presented. The evaluation was made by computing the Normalized Root Mean Squared Error (NRMSE) of their fitting to the simulated relationship functions. Our results indicated that the relationship between the RMS (mV) and muscle force (N) can be modeled using a 3rd degree polynomial equation. Moreover, it appears that the obtained coefficients are patient-specific and dependent on physiological, anatomical and neural parameters. Copyright © 2017 Elsevier Ltd. All rights reserved.
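
    To illustrate the reported functional form, here is a short sketch that fits a 3rd-degree polynomial to simulated RMS-versus-force pairs and scores it with a normalized RMSE; the data are synthetic placeholders, not the HD-sEMG model output.

      import numpy as np

      rng = np.random.default_rng(6)

      # Hypothetical simulated data: force (N) and sEMG RMS amplitude (mV) for one configuration.
      force = np.linspace(5, 120, 40)
      rms = 0.002 * force + 1.5e-5 * force**2 - 4e-8 * force**3 + 0.01 * rng.normal(size=force.size)

      coeffs = np.polyfit(rms, force, deg=3)               # model force as a cubic in RMS
      force_hat = np.polyval(coeffs, rms)

      nrmse = np.sqrt(np.mean((force - force_hat) ** 2)) / (force.max() - force.min())
      print("cubic coefficients:", np.round(coeffs, 3))
      print("NRMSE:", round(float(nrmse), 4))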

  3. A simulation-based and analytic analysis of the off-Hugoniot response of alternative inertial confinement fusion ablator materials

    NASA Astrophysics Data System (ADS)

    Moore, Alastair S.; Prisbrey, Shon; Baker, Kevin L.; Celliers, Peter M.; Fry, Jonathan; Dittrich, Thomas R.; Wu, Kuang-Jen J.; Kervin, Margaret L.; Schoff, Michael E.; Farrell, Mike; Nikroo, Abbas; Hurricane, Omar A.

    2016-09-01

    The attainment of self-propagating fusion burn in an inertial confinement target at the National Ignition Facility will require the use of an ablator with high rocket efficiency and ablation pressure. The ablation material used during the National Ignition Campaign (Lindl et al. 2014) [1], a glow-discharge polymer (GDP), does not couple as efficiently as simulations indicated to the multiple-shock-inducing radiation drive environment created by the laser power profile (Robey et al., 2012). We investigate the performance of two other ablators, boron carbide (B4C) and high-density carbon (HDC), compared to the performance of GDP under the same hohlraum conditions. Ablation performance is determined through measurement of the shock speed produced in planar samples of the ablator material subjected to the identical multiple-shock-inducing radiation drive environments that are similar to a generic three-shock ignition drive. Simulations are in better agreement with the off-Hugoniot performance of B4C than either HDC or GDP, and analytic estimations of the ablation pressure indicate that while the pressure produced by B4C and GDP is similar when the ablator is allowed to release, the pressure reached by B4C seems to exceed that of HDC when backed by a Au/quartz layer.

  4. Negative ion extraction via particle simulation for fusion: critical assessment of recent contributions

    NASA Astrophysics Data System (ADS)

    Garrigues, L.; Fubiani, G.; Boeuf, J. P.

    2017-01-01

    Particle-in-cell (PIC) models have been extensively used in the last few years to describe negative ion extraction for neutral beam injection applications. We show that some of these models have been employed in conditions far from the requirements of particle simulations and that questionable conclusions about negative ion extraction, not supported by experimental evidence, have been obtained. We present a critical analysis of the method that has led to these conclusions and propose directions toward a more accurate and realistic description of negative ion extraction. We show in particular that, as expected in PIC simulations, mesh convergence is reached only if the grid spacing is on the order of or smaller than the minimum Debye length in the simulation domain, and that strong aberrations in the extracted beam are observed if this constraint is not respected. The method of injection of charged particles in the simulated plasma is also discussed, and we show that some injection methods used in the literature lead to unphysical results.
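
    The mesh-convergence criterion quoted above (cell size on the order of, or below, the minimum Debye length) can be checked before committing to a PIC run. The snippet below is a minimal sketch with assumed source-like plasma parameters, not values from the paper; it simply compares a candidate grid spacing with the electron Debye length.

    ```python
    import math

    EPS0 = 8.8541878128e-12   # vacuum permittivity, F/m
    QE = 1.602176634e-19      # elementary charge, C

    def debye_length(n_e_m3, T_e_eV):
        """Electron Debye length lambda_D = sqrt(eps0 * k_B * T_e / (n_e * e^2))."""
        return math.sqrt(EPS0 * (T_e_eV * QE) / (n_e_m3 * QE**2))

    # Assumed, illustrative negative-ion-source-like parameters.
    n_e = 1.0e17     # electron density, m^-3
    T_e = 2.0        # electron temperature, eV
    dx = 5.0e-5      # candidate grid spacing, m

    lam_d = debye_length(n_e, T_e)
    print(f"Debye length = {lam_d:.2e} m, grid spacing = {dx:.2e} m")
    if dx > lam_d:
        print("Grid spacing exceeds the Debye length: expect unphysical results.")
    ```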

  5. Teaching Engineering Statistics with Technology, Group Learning, Contextual Projects, Simulation Models and Student Presentations

    ERIC Educational Resources Information Center

    Romeu, Jorge Luis

    2008-01-01

    This article discusses our teaching approach in graduate level Engineering Statistics. It is based on the use of modern technology, learning groups, contextual projects, simulation models, and statistical and simulation software to entice student motivation. The use of technology to facilitate group projects and presentations, and to generate,…

  7. Model-data fusion across ecosystems: from multi-site optimizations to global simulations

    NASA Astrophysics Data System (ADS)

    Kuppel, S.; Peylin, P.; Maignan, F.; Chevallier, F.; Kiely, G.; Montagnani, L.; Cescatti, A.

    2014-05-01

    This study uses a variational data assimilation framework to simultaneously constrain a global ecosystem model with eddy covariance measurements of daily net carbon (NEE) and latent heat (LE) fluxes from a large number of sites grouped in seven plant functional types (PFTs). It is an attempt to bridge the gap between the numerous site-specific parameter optimization works found in the literature and the generic parameterization used by most land surface models within each PFT. The present multi-site approach allows deriving PFT-generic sets of optimized parameters enhancing the agreement between measured and simulated fluxes at most of the sites considered, with performances often comparable to those of the corresponding site-specific optimizations. Besides reducing the PFT-averaged model-data root-mean-square difference (RMSD) and the associated daily output uncertainty, the optimization improves the simulated CO2 balance at tropical and temperate forest sites. The major site-level NEE adjustments at the seasonal scale are: reduced amplitude in C3 grasslands and boreal forests, increased seasonality in temperate evergreen forests, and better model-data phasing in temperate deciduous broadleaf forests. Conversely, the poorer performances in tropical evergreen broadleaf forests point to deficiencies regarding the modeling of phenology and soil water stress for this PFT. An evaluation with data-oriented estimates of photosynthesis (GPP) and ecosystem respiration (Reco) rates indicates distinctively improved simulations of both gross fluxes. The multi-site parameter sets are then tested against CO2 concentrations measured at 53 locations around the globe, showing significant adjustments of the modeled seasonality of atmospheric CO2 concentration, whose relevance seems PFT-dependent, along with an improved interannual variability. Lastly, a global scale evaluation with remote sensing NDVI measurements indicates an improvement of the simulated seasonal variations of
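
    The quantity minimized in such multi-site optimizations is essentially a sum of model-data misfits over sites and flux types. The sketch below is schematic only: it assumes equal weighting, synthetic daily NEE/LE arrays, and a stand-in `simulate` function rather than the actual assimilation system's cost function.

    ```python
    import numpy as np

    def multisite_cost(params, sites, simulate):
        """Sum of per-site, per-flux RMSDs for one PFT-generic parameter vector.
        `sites` maps a site name to observed daily {"NEE": ..., "LE": ...} arrays;
        `simulate(params, site)` must return matching simulated arrays."""
        cost = 0.0
        for site, obs in sites.items():
            sim = simulate(params, site)
            for flux in ("NEE", "LE"):
                cost += np.sqrt(np.mean((sim[flux] - obs[flux]) ** 2))
        return cost

    # Tiny synthetic illustration: a two-parameter "model" evaluated at two fake sites.
    rng = np.random.default_rng(0)
    days = 365
    sites = {name: {"NEE": rng.normal(2.0, 1.0, days), "LE": rng.normal(50.0, 10.0, days)}
             for name in ("site_A", "site_B")}
    simulate = lambda p, s: {"NEE": np.full(days, p[0]), "LE": np.full(days, p[1])}
    print(f"cost at (2, 50): {multisite_cost([2.0, 50.0], sites, simulate):.2f}")
    ```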

  8. Model-data fusion across ecosystems: from multisite optimizations to global simulations

    NASA Astrophysics Data System (ADS)

    Kuppel, S.; Peylin, P.; Maignan, F.; Chevallier, F.; Kiely, G.; Montagnani, L.; Cescatti, A.

    2014-11-01

    This study uses a variational data assimilation framework to simultaneously constrain a global ecosystem model with eddy covariance measurements of daily net ecosystem exchange (NEE) and latent heat (LE) fluxes from a large number of sites grouped in seven plant functional types (PFTs). It is an attempt to bridge the gap between the numerous site-specific parameter optimization works found in the literature and the generic parameterization used by most land surface models within each PFT. The present multisite approach allows deriving PFT-generic sets of optimized parameters enhancing the agreement between measured and simulated fluxes at most of the sites considered, with performances often comparable to those of the corresponding site-specific optimizations. Besides reducing the PFT-averaged model-data root-mean-square difference (RMSD) and the associated daily output uncertainty, the optimization improves the simulated CO2 balance at tropical and temperate forest sites. The major site-level NEE adjustments at the seasonal scale are reduced amplitude in C3 grasslands and boreal forests, increased seasonality in temperate evergreen forests, and better model-data phasing in temperate deciduous broadleaf forests. Conversely, the poorer performances in tropical evergreen broadleaf forests point to deficiencies regarding the modelling of phenology and soil water stress for this PFT. An evaluation with data-oriented estimates of photosynthesis (GPP - gross primary productivity) and ecosystem respiration (Reco) rates indicates distinctively improved simulations of both gross fluxes. The multisite parameter sets are then tested against CO2 concentrations measured at 53 locations around the globe, showing significant adjustments of the modelled seasonality of atmospheric CO2 concentration, whose relevance seems PFT-dependent, along with an improved interannual variability. Lastly, a global-scale evaluation with remote sensing NDVI (normalized difference vegetation index

  9. Simulation of self-generated magnetic fields in an inertial fusion hohlraum environment

    NASA Astrophysics Data System (ADS)

    Farmer, W. A.; Koning, J. M.; Strozzi, D. J.; Hinkel, D. E.; Berzak Hopkins, L. F.; Jones, O. S.; Rosen, M. D.

    2017-05-01

    We present radiation-hydrodynamic simulations of self-generated magnetic field in a hohlraum, which show an increased temperature in large regions of the underdense fill. Non-parallel gradients in electron density and temperature in a laser-heated plasma give rise to a self-generated field by the "Biermann battery" mechanism. Here, HYDRA simulations of three hohlraum designs on the National Ignition Facility are reported, which use a partial magnetohydrodynamic (MHD) description that includes the self-generated source term, resistive dissipation, and advection of the field due to both the plasma flow and the Nernst term. Anisotropic electron heat conduction parallel and perpendicular to the field is included, but not the Righi-Leduc heat flux. The field strength is too small to compete significantly with the plasma pressure, but affects plasma conditions by reducing electron heat conduction perpendicular to the field. Significant reductions in heat flux can occur, especially for high-Z plasma, at modest values of the Hall parameter, Ω_e τ_ei ≲ 1, where Ω_e = eB/m_e c and τ_ei is the electron-ion collision time. The inclusion of MHD in the simulations leads to 1 keV hotter electron temperatures in the laser entrance hole and the high-Z wall blowoff, which reduces inverse-bremsstrahlung absorption of the laser beam. This improves propagation of the inner beams pointed at the hohlraum equator, shifting the symmetry of the resulting capsule implosion towards a more prolate shape. The time of peak x-ray production in the capsule shifts later by only 70 ps (within experimental uncertainty), but a decomposition of the hot-spot shape into Legendre moments indicates a shift of P2/P0 by ~20%. This indicates that MHD cannot explain why simulated x-ray drive exceeds measured levels, but may be partially responsible for failures to correctly model the symmetry.
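
    For orientation, the Hall parameter quoted above can be estimated from formulary-style fits. The snippet below is a rough sketch with assumed gold-blowoff-like values (field strength, density, temperature, charge state and Coulomb logarithm are illustrative choices, not HYDRA output), using the usual NRL-formulary form of the collision rate.

    ```python
    def hall_parameter(B_tesla, n_e_cm3, T_e_eV, Z=1.0, ln_lambda=7.0):
        """Omega_e * tau_ei from NRL-formulary-style fits (all inputs assumed)."""
        omega_ce = 1.76e7 * (B_tesla * 1.0e4)                        # rad/s, B in gauss
        nu_ei = 2.91e-6 * Z * n_e_cm3 * ln_lambda * T_e_eV**-1.5     # collision rate, 1/s
        return omega_ce / nu_ei

    # Illustrative high-Z wall-blowoff numbers: ~10 T, 1e21 cm^-3, 2 keV, Z ~ 50.
    print(f"Hall parameter ~ {hall_parameter(10.0, 1e21, 2000.0, Z=50.0, ln_lambda=2.0):.2f}")
    ```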

  10. Fusion Studies in Japan

    NASA Astrophysics Data System (ADS)

    Ogawa, Yuichi

    2016-05-01

    A new strategic energy plan adopted by the Japanese Cabinet in 2014 strongly supports the steady promotion of nuclear fusion development activities, including the ITER project and the Broader Approach activities, from a long-term viewpoint. The Atomic Energy Commission (AEC) of Japan formulated the Third Phase Basic Program to promote an experimental fusion reactor project. In 2005 the AEC reviewed this Program and discussed selection and concentration among the many fusion reactor development projects. In addition to the promotion of the ITER project, advanced tokamak research with JT-60SA, helical plasma experiments with LHD, the FIREX project in laser fusion research, and fusion engineering with IFMIF were highly prioritized. Although the basic concepts of tokamak, helical and laser fusion research are quite different, they share many common features, such as plasma physics in 3-D magnetic geometry and high-power heat loads on plasma-facing components. Therefore, a synergetic scenario for fusion reactor development among the various plasma confinement concepts would be important.

  11. Simulation of plasma-surface interactions in a fusion reactor by means of QSPA plasma streams: recent results and prospects

    NASA Astrophysics Data System (ADS)

    Garkusha, I. E.; Aksenov, N. N.; Byrka, O. V.; Makhlaj, V. A.; Herashchenko, S. S.; Malykhin, S. V.; Petrov, Yu V.; Staltsov, V. V.; Surovitskiy, S. V.; Wirtz, M.; Linke, J.; Sadowski, M. J.; Skladnik-Sadowska, E.

    2016-09-01

    This paper is devoted to plasma-surface interaction issues at the high heat loads typical of fusion reactors. For the International Thermonuclear Experimental Reactor (ITER), which is now under construction, knowledge of the erosion processes and of the behaviour of various structural materials under extreme conditions is a critical issue that will determine the successful realization of the project. The most important plasma-surface interaction (PSI) effects in 3D geometry have been studied using the powerful quasi-stationary plasma accelerator QSPA Kh-50. Mechanisms of droplet and dust generation have been investigated in detail. It was found that droplet emission from castellated surfaces has a threshold character and a cyclic nature. It begins only after a certain number of irradiating plasma pulses, when molten, displaced material has accumulated at the edges of the castellated structure. This new erosion mechanism, connected with edge effects, results in an increase in the size of the emitted droplets (as compared with those emitted from a flat surface). This mechanism can even induce the ejection of sub-mm particles. A concept of a new-generation QSPA facility, the current maintenance status of this device, and prospects for further experiments are also presented.

  12. Cyclokinetic models and simulations for high-frequency turbulence in fusion plasmas

    NASA Astrophysics Data System (ADS)

    Deng, Zhao; Waltz, R. E.; Wang, Xiaogang

    2016-10-01

    Gyrokinetics is widely applied in plasma physics. However, this framework is limited to weak turbulence levels and low drift-wave frequencies because the high-frequency gyro-motion is reduced by the gyro-phase averaging. In order to test where gyrokinetics breaks down, Waltz and Zhao developed a new theory, called cyclokinetics [R. E. Waltz and Zhao Deng, Phys. Plasmas 20, 012507 (2013)]. Cyclokinetics dynamically follows the high-frequency ion gyro-motion which is nonlinearly coupled to the low-frequency drift-waves, interrupting and suppressing gyro-averaging. Cyclokinetics is valid in the high-frequency (ion cyclotron frequency) regime or for high turbulence levels. The ratio of the cyclokinetic perturbed distribution function to the equilibrium distribution function, δf/F, can approach 1. This work presents, for the first time, a numerical simulation of nonlinear cyclokinetic theory for ions, and describes the first attempt to completely solve the ion gyro-phase motion in a nonlinear turbulence system. Simulations are performed [Zhao Deng and R. E. Waltz, Phys. Plasmas 22(5), 056101 (2015)] in a local flux-tube geometry, with the parallel motion and variation suppressed, by using a newly developed code named rCYCLO, which is executed in parallel by using an implicit time-advanced Eulerian (or continuum) scheme [Zhao Deng and R. E. Waltz, Comp. Phys. Comm. 195, 23 (2015)]. A novel numerical treatment of the magnetic moment velocity space derivative operator guarantees accurate conservation of incremental entropy. By comparing the more fundamental cyclokinetic simulations with the corresponding gyrokinetic simulations, the gyrokinetics breakdown condition is quantitatively tested. Gyrokinetic transport and turbulence levels recover those of cyclokinetics at high relative ion cyclotron frequencies and low turbulence levels, as required. Cyclokinetic transport and turbulence levels are found to be lower than those of gyrokinetics at high turbulence levels and low Ω* values

  13. Subcascade formation in displacement cascade simulations: Implications for fusion reactor materials

    SciTech Connect

    Stoller, R.E.; Greenwood, L.R.

    1998-03-01

    Primary radiation damage formation in iron has been investigated by the method of molecular dynamics (MD) for cascade energies up to 40 keV. The initial energy E_MD given to the simulated PKA is approximately equivalent to the damage energy in the standard secondary displacement model by Norgett, Robinson, and Torrens (NRT); hence, E_MD is less than the corresponding PKA energy. Using the values of E_MD in Table 1, the corresponding E_PKA and the NRT defects in iron have been calculated using the procedure described in Ref. 1 with the recommended 40 eV displacement threshold. These values are also listed in Table 1. Note that the difference between E_MD and the PKA energy increases as the PKA energy increases, and that the highest simulated PKA energy of 61.3 keV is the average for a collision with a 1.77 MeV neutron. Thus, these simulations have reached well into the fast neutron energy regime. For purposes of comparison, the parameters for the maximum DT neutron energy of 14.1 MeV are also included in Table 1. Although the primary damage parameters derived from the MD cascades exhibited a strong dependence on cascade energy up to 10 keV, this dependence was diminished and slightly reversed between 20 and 40 keV, apparently due to the formation of well-defined subcascades in this energy region. Such an explanation is only qualitative at this time, and additional analysis of the high-energy cascades is underway in an attempt to obtain a quantitative measure of the relationship between cascade morphology and defect survival.
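
    For reference, the NRT estimate mentioned above relates the damage (MD) energy to a displacement count through ν_NRT = 0.8 E_MD / (2 E_d), with E_d the displacement threshold (40 eV recommended for iron here). A minimal sketch, using assumed cascade energies rather than the values of Table 1:

    ```python
    def nrt_defects(e_damage_keV, e_d_eV=40.0):
        """Norgett-Robinson-Torrens estimate: nu_NRT = 0.8 * E_dam / (2 * E_d)."""
        return 0.8 * (e_damage_keV * 1.0e3) / (2.0 * e_d_eV)

    # Assumed illustrative damage energies (keV), not the entries of Table 1.
    for e_md in (1.0, 10.0, 20.0, 40.0):
        print(f"E_MD = {e_md:5.1f} keV  ->  NRT displacements ~ {nrt_defects(e_md):6.0f}")
    ```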

  14. Magnetic fusion reactor economics

    SciTech Connect

    Krakowski, R.A.

    1995-12-01

    An almost primordial trend in the conversion and use of energy is the increased complexity and cost of conversion systems designed to utilize cheaper and more abundant fuels; this trend is exemplified by the progression fossil → fission → fusion. Present projections of the latter indicate that the capital costs of the fusion "burner" far exceed any commensurate savings associated with the cheapest and most abundant of fuels. These projections suggest competitive fusion power only if internal costs associated with the use of fossil or fission fuels emerge to make them either uneconomic, unacceptable, or both with respect to expensive fusion systems. This "implementation-by-default" plan for fusion is re-examined by identifying in general terms fusion power-plant embodiments that might compete favorably under conditions where the internal costs (both economic and environmental) of fossil and/or fission are not as great as is needed to justify the contemporary vision for fusion power. Competitive fusion power in this context will require a significant broadening of an overly focused program to explore the physics and symbiotic technologies leading to the more compact, simplified, and efficient plasma-confinement configurations that reside at the heart of an attractive fusion power plant.

  15. Big fusion, little fusion

    NASA Astrophysics Data System (ADS)

    Chen, Frank; ddtuttle

    2016-08-01

    In reply to correspondence from George Scott and Adam Costley about the Physics World focus issue on nuclear energy, and to news of construction delays at ITER, the fusion reactor being built in France.

  16. Recent advances in modeling and simulation of the exposure and response of tungsten to fusion energy conditions

    NASA Astrophysics Data System (ADS)

    Marian, Jaime; Becquart, Charlotte S.; Domain, Christophe; Dudarev, Sergei L.; Gilbert, Mark R.; Kurtz, Richard J.; Mason, Daniel R.; Nordlund, Kai; Sand, Andrea E.; Snead, Lance L.; Suzudo, Tomoaki; Wirth, Brian D.

    2017-09-01

    Under the anticipated operating conditions for demonstration magnetic fusion reactors beyond ITER, structural and plasma-facing materials will be exposed to unprecedented conditions of irradiation, heat flux, and temperature. While such extreme environments remain inaccessible experimentally, computational modeling and simulation can provide qualitative and quantitative insights into materials response and complement the available experimental measurements with carefully validated predictions. For plasma-facing components such as the first wall and the divertor, tungsten (W) has been selected as the leading candidate material due to its superior high-temperature and irradiation properties, as well as for its low retention of implanted tritium. In this paper we provide a review of recent efforts in computational modeling of W both as a plasma-facing material exposed to He deposition as well as a bulk material subjected to fast neutron irradiation. We use a multiscale modeling approach—commonly used as the materials modeling paradigm—to define the outline of the paper and highlight recent advances using several classes of techniques and their interconnection. We highlight several of the most salient findings obtained via computational modeling and point out a number of remaining challenges and future research directions.

  17. Kinetic simulations of stimulated Raman backscattering and related processes for the shock-ignition approach to inertial confinement fusion

    SciTech Connect

    Riconda, C.; Weber, S.; Tikhonchuk, V. T.; Heron, A.

    2011-09-15

    A detailed description of stimulated Raman backscattering and related processes for the purpose of inertial confinement fusion requires multi-dimensional kinetic simulations of a full speckle in a high-temperature, large-scale, inhomogeneous plasma. In particular for the shock-ignition scheme operating at high laser intensities, kinetic aspects are predominant. High-intensity (Iλ₀² ≈ 5×10¹⁵ W μm²/cm²) as well as low-intensity (Iλ₀² ≈ 10¹⁵ W μm²/cm²) cases show the predominance of collisionless, collective processes for the interaction. While the two-plasmon decay instability and the cavitation scenario are hardly affected by intensity variation, inflationary Raman backscattering proves to be very sensitive. Brillouin backscattering evolves on longer time scales and dominates the reflectivities, although it is sensitive to the intensity. Filamentation and self-focusing do occur for all cases but on time scales too long to affect Raman backscattering.

  18. Fission thrust sail as booster for high Δv fusion based propulsion

    NASA Astrophysics Data System (ADS)

    Ceyssens, Frederik; Wouters, Kristof; Driesen, Maarten

    2015-12-01

    The fission thrust sail as booster for nuclear fusion-based rocket propulsion for future starships is introduced and studied. First-order calculations are used together with Monte Carlo simulations to assess system performance. If a D-D fusion rocket, such as that considered in, e.g., Project Icarus, has a relatively low efficiency (~30%) in converting fusion fuel to a directed exhaust, adding a fission sail is shown to be beneficial for the obtainable delta-v. In addition, this type of fission-fusion hybrid propulsion has the potential to improve acceleration and act as a micrometeorite shield.
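
    The delta-v argument above can be framed with the ideal rocket equation, Δv = v_ex ln(m0/m1). The sketch below is purely illustrative (assumed exhaust velocity, mass ratio and efficiency scaling; it is not the paper's first-order model or Monte Carlo code) and only shows how a ~30% conversion efficiency erodes the attainable Δv.

    ```python
    import math

    C = 2.998e8   # speed of light, m/s

    def delta_v(v_exhaust, m0, m1):
        """Ideal (Tsiolkovsky) rocket equation."""
        return v_exhaust * math.log(m0 / m1)

    v_ideal = 0.05 * C                # assumed ideal D-D exhaust velocity (~0.05 c)
    eta = 0.30                        # assumed conversion efficiency, as quoted above
    v_eff = math.sqrt(eta) * v_ideal  # kinetic-energy scaling of the exhaust velocity
    m0, m1 = 100.0, 20.0              # assumed initial and final masses (arbitrary units)

    print(f"ideal exhaust: dv = {delta_v(v_ideal, m0, m1) / C:.3f} c")
    print(f"30%-efficient: dv = {delta_v(v_eff, m0, m1) / C:.3f} c")
    ```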

  19. Exponential yield sensitivity to long-wavelength asymmetries in three-dimensional simulations of inertial confinement fusion capsule implosions

    SciTech Connect

    Haines, Brian M.

    2015-08-15

    In this paper, we perform a series of high-resolution 3D simulations of an OMEGA-type inertial confinement fusion (ICF) capsule implosion with varying levels of initial long-wavelength asymmetries in order to establish the physical energy loss mechanism for observed yield degradation due to long-wavelength asymmetries in symcap (gas-filled capsule) implosions. These simulations demonstrate that, as the magnitude of the initial asymmetries is increased, shell kinetic energy is increasingly retained in the shell instead of being converted to fuel internal energy. This is caused by the displacement of fuel mass away from and shell material into the center of the implosion due to complex vortical flows seeded by the long-wavelength asymmetries. These flows are not fully turbulent, but demonstrate mode coupling through non-linear instability development during shell stagnation and late-time shock interactions with the shell interface. We quantify this effect by defining a separation lengthscale between the fuel mass and internal energy and show that this is correlated with yield degradation. The yield degradation shows an exponential sensitivity to the RMS magnitude of the long-wavelength asymmetries. This strong dependence may explain the lack of repeatability frequently observed in OMEGA ICF experiments. In contrast to previously reported mechanisms for yield degradation due to turbulent instability growth, yield degradation is not correlated with mixing between shell and fuel material. Indeed, an integrated measure of mixing decreases with increasing initial asymmetry magnitude due to delayed shock interactions caused by growth of the long-wavelength asymmetries without a corresponding delay in disassembly.

  20. Simulation of motions of the plasma in a fusion reactor for obtaining of future energy

    NASA Astrophysics Data System (ADS)

    Zhumabekov, Askhat

    2017-01-01

    According to even the most conservative estimates, world energy consumption will double by the middle of the 21st century. This will be a consequence of global economic development, population growth, and other geopolitical and economic factors. Energy consumption in the world is growing much faster than its production, and the industrial use of new advanced technologies in the energy sector will, for objective reasons, not begin until 2030. This paper discusses how nuclear energy can be obtained and developed, drawing on the experience of the National Nuclear Center. A model for the problem of plasma confinement is implemented, and the main achievements of modern construction and the Megaproject of the National Nuclear Center in Kurchatov, Republic of Kazakhstan, are presented. A social survey was conducted in the East Kazakhstan region on the theme “Prospects for the development of nuclear energy in Kazakhstan”, and citizens’ opinions are reported. New priorities announced on May 22, 2015 in Ust-Kamenogorsk at the industrial park “Altai”, in the framework of a competition of innovative green-technology projects at the international exhibition “OSKEMEN EXPO - 2015”, are also described, with the participation of the regional authorities of the Republic of Kazakhstan, representatives of JSC NC “Astana Expo” and delegations from Japan, Russia, Canada, the USA and South Korea.

  1. Pedestal transport in H-mode plasmas for fusion gain

    NASA Astrophysics Data System (ADS)

    Kotschenreuther, M.; Hatch, D. R.; Mahajan, S.; Valanju, P.; Zheng, L.; Liu, X.

    2017-06-01

    The first high fidelity gyrokinetic simulations of the energy losses in the transport barriers of large tokamaks in pursuit of fusion gain are presented. These simulations calculate the turbulent energy losses with an extensive treatment of relevant physical effects—fully kinetic, non-linear, electromagnetic—inclusive of all major plasma species, and in equilibria with relevant shape and local bootstrap current for fusion-relevant cases. We find that large plasmas with a small normalized gyroradius lie in an unexpected regime of enhanced losses that can prevent the projected energy gain. Our simulations are qualitatively consistent with recent experiments on JET with an ITER-like wall. Interestingly and very importantly, the simulations predict parameter regimes of reduced transport that are quite fusion-favorable.

  2. Experimental Characterization of a Plasma Deflagration Accelerator for Simulating Fusion Wall Response to Disruption Events

    NASA Astrophysics Data System (ADS)

    Underwood, Thomas; Loebner, Keith; Cappelli, Mark

    2016-10-01

    In this work, the suitability of a pulsed deflagration accelerator to simulate the interaction of edge-localized modes with plasma first-wall materials is investigated. Experimental measurements derived from a suite of diagnostics are presented that focus on both the properties of the plasma jet and the manner in which such jets couple with material interfaces. Detailed measurements of the thermodynamic plasma state variables within the jet are presented using a quadruple Langmuir probe operating in current-saturation mode. These data, in conjunction with spectroscopic measurements of Hα Stark broadening via a fast-framing, intensified CCD camera, provide spatial and temporal measurements of how the plasma density and temperature scale as a function of input energy. Using these measurements, estimates for the energy flux associated with the deflagration accelerator are found to be completely tunable over a range spanning 150 MW m⁻² to 30 GW m⁻². The plasma-material interface is investigated using tungsten tokens exposed to the plasma plume under variable conditions. Visualizations of the resulting shock structures are achieved through Schlieren cinematography, and energy transfer dynamics are discussed by presenting temperature measurements of the exposed materials. This work is supported by the U.S. Department of Energy Stewardship Science Academic Program in addition to the National Defense Science and Engineering Graduate Fellowship.

  3. Framework for Modernization and Componentization of Fusion Modules

    NASA Astrophysics Data System (ADS)

    Carlsson, J.; Shasharina, S.; Cary, J. R.; Kruger, S.; Strand, P.

    2006-10-01

    There are several ongoing efforts to develop a software framework for integrated fusion simulations in the US (SWIM and CPES), the EU (ITM-TF) and Japan (BPSI). The European and Japanese efforts emphasize a standardization of the interfaces to codes within each subgroup (transport, equilibrium, linear stability, MHD, RF, turbulence, et cetera). The US efforts emphasize pairwise coupling of specific codes. The project "Framework for Modernization and Componentization of Fusion Modules" (FMCFM) primarily aims to complement the ongoing and future US integrated-modeling efforts by developing standard interfaces to US fusion codes and implementing these interfaces by writing wrapper code for existing fusion libraries (transport, equilibrium and linear stability). Standardized interfaces will make it easier to validate codes and will simplify the maintenance of integrated fusion simulations by allowing drop-in replacement of components. FMCFM will also continue to liaise with related European projects and, as far as possible, try to ensure compatible standard interfaces. Some effort will also be spent on the implementation of more robust and scalable solvers for both the Grad-Shafranov and transport equations. A comprehensive test suite for transport solvers will be developed. We will present results from the successfully concluded Phase I project and our plans for the Phase II project in more detail. *This work is funded by DOE through an SBIR grant.

  4. Simulation Software's Effect on College Students Spreadsheet Project Scores

    ERIC Educational Resources Information Center

    Atkinson, J. Kirk; Thrasher, Evelyn H.; Coleman, Phillip D.

    2011-01-01

    The purpose of this study is to explore the potential impact of support materials on student spreadsheet skill acquisition. Specifically, this study examines the use of an online spreadsheet simulation tool versus a printed book across two independent student groups. This study hypothesizes that the online spreadsheet simulation tool will have a…

  5. Secretarial Administration: Project In/Vest: Insurance Simulation Insures Learning

    ERIC Educational Resources Information Center

    Geier, Charlene

    1978-01-01

    Describes a simulated model office to replicate various insurance occupations set up in Greenfield High School, Wisconsin. Local insurance agents and students from other disciplines, such as distributive education, are involved in the simulation. The training is applicable to other business office positions, as it models not only an insurance…

  6. Project Shuttle simulation math model coordination catalog, revision 1

    NASA Technical Reports Server (NTRS)

    1974-01-01

    A catalog is presented of subsystem and environment math models used or planned for space shuttle simulations. The purpose is to facilitate sharing of similar math models between shuttle simulations. It provides information on math model requirements, formulations, schedules, and contact persons for further information.

  7. The Maya Project: Numerical Simulations of Black Hole Collisions

    NASA Astrophysics Data System (ADS)

    Smith, Kenneth; Calabrese, Gioel; Garrison, David; Kelly, Bernard; Laguna, Pablo; Lockitch, Keith; Pullin, Jorge; Shoemaker, Deirdre; Tiglio, Manuel

    2001-04-01

    The main objective of the MAYA project is the development of a numerical code to solve the vacuum Einstein's field equations for spacetimes containing multiple black hole singularities. Incorporating knowledge gained from previous similar efforts (Binary Black Holes Alliance and the AGAVE project) as well as one-dimensional numerical studies, MAYA has been built from the ground up within the architecture of Cactus 4.0, with particular attention paid to the software engineering aspects of code development. The goal of this new effort is to ultimately have a robust, efficient, readable, and stable numerical code for black hole evolution. This poster presents an overview of the project, focusing on the innovative aspects of the project as well as its current development status.

  8. A Student Project to use Geant4 Simulations for a TMS-PET combination

    SciTech Connect

    Altamirano, A.; Chamorro, A.; Hurtado, K.; Romero, C.; Wahl, D.; Zamudio, A.; Rueda, A.; Solano Salinas, C. J.

    2007-10-26

    Geant4 is one of the most powerful tools for MC simulation of detectors and their applications. We present a student project to simulate a combined Transcranial Magnetic Stimulation-Positron Emission Tomography (TMS-PET) system using Geant4. This project aims to study PET-TMS systems by implementing a model for the brain response to the TMS pulse and studying the simulated PET response. In order to increase the speed of the simulations we parallelise our programs and investigate the possibility of using GRID computing.

  9. Numerical analysis of applied magnetic field dependence in Malmberg-Penning Trap for compact simulator of energy driver in heavy ion fusion

    NASA Astrophysics Data System (ADS)

    Sato, T.; Park, Y.; Soga, Y.; Takahashi, K.; Sasaki, T.; Kikuchi, T.; Harada, Nob

    2016-05-01

    To simulate the pulse compression process of space-charge-dominated beams in heavy ion fusion, we have demonstrated a multi-particle numerical simulation of an equivalent beam using the Malmberg-Penning trap device. The results show that both the transverse and longitudinal velocities increase as a function of external magnetic field strength during the longitudinal compression. The influence of the space-charge effect, which is related to the external magnetic field, was observed as an increase of high-velocity particles at weak external magnetic field.

  10. Designing the Sniper: Improving Targeted Human Cytolytic Fusion Proteins for Anti-Cancer Therapy via Molecular Simulation

    PubMed Central

    Bochicchio, Anna; Jordaan, Sandra; Losasso, Valeria; Chetty, Shivan; Casasnovas Perera, Rodrigo; Ippoliti, Emiliano; Barth, Stefan; Carloni, Paolo

    2017-01-01

    Targeted human cytolytic fusion proteins (hCFPs) are humanized immunotoxins for selective treatment of different diseases including cancer. They are composed of a ligand specifically binding to target cells genetically linked to a human apoptosis-inducing enzyme. hCFPs target cancer cells via an antibody or derivative (scFv) specifically binding to e.g., tumor associated antigens (TAAs). After internalization and translocation of the enzyme from endocytosed endosomes, the human enzymes introduced into the cytosol are efficiently inducing apoptosis. Under in vivo conditions such enzymes are subject to tight regulation by native inhibitors in order to prevent inappropriate induction of cell death in healthy cells. Tumor cells are known to up-regulate these inhibitors as a survival mechanism resulting in escape of malignant cells from elimination by immune effector cells. Cytosolic inhibitors of Granzyme B and Angiogenin (Serpin P9 and RNH1, respectively), reduce the efficacy of hCFPs with these enzymes as effector domains, requiring detrimentally high doses in order to saturate inhibitor binding and rescue cytolytic activity. Variants of Granzyme B and Angiogenin might feature reduced affinity for their respective inhibitors, while retaining or even enhancing their catalytic activity. A powerful tool to design hCFPs mutants with improved potency is given by in silico methods. These include molecular dynamics (MD) simulations and enhanced sampling methods (ESM). MD and ESM allow predicting the enzyme-protein inhibitor binding stability and the associated conformational changes, provided that structural information is available. Such “high-resolution” detailed description enables the elucidation of interaction domains and the identification of sites where particular point mutations may modify those interactions. This review discusses recent advances in the use of MD and ESM for hCFP development from the viewpoints of scientists involved in both fields. PMID

  11. Computer Simulation Performed for Columbia Project Cooling System

    NASA Technical Reports Server (NTRS)

    Ahmad, Jasim

    2005-01-01

    This demo shows a high-fidelity simulation of the air flow in the main computer room housing the Columbia (10,024 Intel Itanium processors) system. The simulation assesses the performance of the cooling system, identifies deficiencies, and recommends modifications to eliminate them. It used two in-house software packages on NAS supercomputers: Chimera Grid Tools to generate a geometric model of the computer room, and the OVERFLOW-2 code for the fluid and thermal simulation. This state-of-the-art technology can be easily extended to provide a general capability for air flow analyses in any modern computer room.

  12. Warm starting the projected Gauss-Seidel algorithm for granular matter simulation

    NASA Astrophysics Data System (ADS)

    Wang, Da; Servin, Martin; Berglund, Tomas

    2016-03-01

    The effect of warm starting the projected Gauss-Seidel solver for nonsmooth discrete element simulation of granular matter on the convergence is investigated. It is found that the computational performance can be increased by a factor of 2-5.
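
    As a concrete illustration of the idea, the sketch below implements a generic projected Gauss-Seidel iteration for a non-negativity-constrained linear system and warm starts it from the previous time step's solution. This is an assumed, simplified formulation, not the nonsmooth DEM contact solver used in the study.

    ```python
    import numpy as np

    def projected_gauss_seidel(A, b, x0=None, n_iter=50):
        """Projected Gauss-Seidel for A x = b with x >= 0; x0 enables warm starting."""
        x = np.zeros(len(b)) if x0 is None else x0.copy()
        for _ in range(n_iter):
            for i in range(len(b)):
                r = b[i] - A[i].dot(x) + A[i, i] * x[i]   # residual excluding x_i
                x[i] = max(0.0, r / A[i, i])              # update and project onto x_i >= 0
        return x

    rng = np.random.default_rng(1)
    M = rng.normal(size=(20, 20))
    A = M @ M.T + 20.0 * np.eye(20)     # symmetric positive definite test matrix
    b0 = rng.normal(size=20)

    x_cold = projected_gauss_seidel(A, b0)                        # "first time step"
    b1 = b0 + 0.01 * rng.normal(size=20)                          # slightly changed RHS
    x_warm = projected_gauss_seidel(A, b1, x0=x_cold, n_iter=10)  # warm-started solve
    print("change between steps:", np.linalg.norm(x_warm - x_cold))
    ```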

  13. The Tokamak Fusion Test Reactor decontamination and decommissioning project and the Tokamak Physics Experiment at the Princeton Plasma Physics Laboratory. Environmental Assessment

    SciTech Connect

    1994-05-27

    If the US is to meet the energy needs of the future, it is essential that new technologies emerge to compensate for dwindling supplies of fossil fuels and the eventual depletion of fissionable uranium used in present-day nuclear reactors. Fusion energy has the potential to become a major source of energy for the future. Power from fusion energy would provide a substantially reduced environmental impact as compared with other forms of energy generation. Since fusion utilizes no fossil fuels, there would be no release of chemical combustion products to the atmosphere. Additionally, there are no fission products formed to present handling and disposal problems, and runaway fuel reactions are impossible due to the small amounts of deuterium and tritium present. The purpose of the TPX Project is to support the development of the physics and technology to extend tokamak operation into the continuously operating (steady-state) regime, and to demonstrate advances in fundamental tokamak performance. The purpose of TFTR D&D is to ensure compliance with DOE Order 5820.2A "Radioactive Waste Management" and to remove environmental and health hazards posed by the TFTR in a non-operational mode. There are two proposed actions evaluated in this environmental assessment (EA). The actions are related because one must take place before the other can proceed. The proposed actions assessed in this EA are: the decontamination and decommissioning (D&D) of the Tokamak Fusion Test Reactor (TFTR); to be followed by the construction and operation of the Tokamak Physics Experiment (TPX). Both of these proposed actions would take place primarily within the TFTR Test Cell Complex at the Princeton Plasma Physics Laboratory (PPPL). The TFTR is located on "D-site" at the James Forrestal Campus of Princeton University in Plainsboro Township, Middlesex County, New Jersey, and is operated by PPPL under contract with the United States Department of Energy (DOE).

  14. How historic simulation-observation discrepancy affects future warming projections in a very large model ensemble

    NASA Astrophysics Data System (ADS)

    Goodwin, Philip

    2016-10-01

    Projections of future climate made by model-ensembles have credibility because the historic simulations by these models are consistent with, or near-consistent with, historic observations. However, it is not known how small inconsistencies between the ranges of observed and simulated historic climate change affects the future projections made by a model ensemble. Here, the impact of historical simulation-observation inconsistencies on future warming projections is quantified in a 4-million member Monte Carlo ensemble from a new efficient Earth System Model (ESM). Of the 4-million ensemble members, a subset of 182,500 are consistent with historic ranges of warming, heat uptake and carbon uptake simulated by the Climate Model Intercomparison Project 5 (CMIP5) ensemble. This simulation-consistent subset projects similar future warming ranges to the CMIP5 ensemble for all four RCP scenarios, indicating the new ESM represents an efficient tool to explore parameter space for future warming projections based on historic performance. A second subset of 14,500 ensemble members are consistent with historic observations for warming, heat uptake and carbon uptake. This observation-consistent subset projects a narrower range for future warming, with the lower bounds of projected warming still similar to CMIP5, but the upper warming bounds reduced by 20-35 %. These findings suggest that part of the upper range of twenty-first century CMIP5 warming projections may reflect historical simulation-observation inconsistencies. However, the agreement of lower bounds for projected warming implies that the likelihood of warming exceeding dangerous levels over the twenty-first century is unaffected by small discrepancies between CMIP5 models and observations.
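
    The two consistency subsets described above amount to filtering a large Monte Carlo ensemble with interval checks on a few historical diagnostics. A minimal sketch of that filtering step (synthetic ensemble and made-up acceptance windows, not the actual ESM output or the CMIP5/observed ranges):

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n_members = 1_000_000   # scaled-down stand-in for the 4-million-member ensemble

    # Synthetic "historic" diagnostics per member: warming (K), heat uptake and
    # carbon uptake (normalized units). Real values would come from the ESM runs.
    warming = rng.normal(1.0, 0.5, n_members)
    heat_uptake = rng.normal(1.0, 0.4, n_members)
    carbon_uptake = rng.normal(1.0, 0.4, n_members)

    def consistent(warm_rng, heat_rng, carbon_rng):
        """Mask of members whose historic diagnostics fall inside all three ranges."""
        return ((warming > warm_rng[0]) & (warming < warm_rng[1]) &
                (heat_uptake > heat_rng[0]) & (heat_uptake < heat_rng[1]) &
                (carbon_uptake > carbon_rng[0]) & (carbon_uptake < carbon_rng[1]))

    # Hypothetical wider "simulation-consistent" vs narrower "observation-consistent" windows.
    sim_mask = consistent((0.4, 1.6), (0.4, 1.6), (0.4, 1.6))
    obs_mask = consistent((0.7, 1.1), (0.7, 1.1), (0.7, 1.1))
    print("simulation-consistent members:", int(sim_mask.sum()))
    print("observation-consistent members:", int(obs_mask.sum()))
    ```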

  15. Non-Gaussian fluctuations and non-Markovian effects in the nuclear fusion process: Langevin dynamics emerging from quantum molecular dynamics simulations.

    PubMed

    Wen, Kai; Sakata, Fumihiko; Li, Zhu-Xia; Wu, Xi-Zhen; Zhang, Ying-Xun; Zhou, Shan-Gui

    2013-07-05

    Macroscopic parameters as well as precise information on the random force characterizing the Langevin-type description of the nuclear fusion process around the Coulomb barrier are extracted from the microscopic dynamics of individual nucleons by exploiting the numerical simulation of the improved quantum molecular dynamics. It turns out that the dissipation dynamics of the relative motion between two fusing nuclei is caused by a non-Gaussian distribution of the random force. We find that the friction coefficient as well as the time correlation function of the random force takes particularly large values in a region a little bit inside of the Coulomb barrier. A clear non-Markovian effect is observed in the time correlation function of the random force. It is further shown that an emergent dynamics of the fusion process can be described by the generalized Langevin equation with memory effects by appropriately incorporating the microscopic information of individual nucleons through the random force and its time correlation function.
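
    For reference, the generalized Langevin equation with memory referred to above can be written (in generic notation, not necessarily that of the paper) as

    ```latex
    \mu\,\ddot{q}(t) = -\frac{\partial V(q)}{\partial q}
      - \int_{0}^{t} \gamma(t - t')\,\dot{q}(t')\,dt' + \xi(t),
    ```

    where q is the relative coordinate of the two fusing nuclei, μ the reduced mass, V(q) the interaction (Coulomb-barrier) potential, γ the memory (friction) kernel, and ξ(t) the random force whose time correlation function is extracted from the microscopic simulations.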

  16. The Redesign of PROJECT SIMULATION for Microcomputer-Assisted Instruction in Psychology and Research Methodology.

    ERIC Educational Resources Information Center

    King, Alan R.; King, Barry F.

    1988-01-01

    This article offers guidelines to assist simulation developers in maximizing the lifespan of their software products through structured designs and creative attempts at integrating their programs into standard courses. Current efforts to redesign PROJECT SIMULATION, a computer-assisted instructional software package for teaching methodology in…

  17. Improving Faculty Perceptions of and Intent to Use Simulation: An Intervention Project

    ERIC Educational Resources Information Center

    Tucker, Charles

    2013-01-01

    Human patient simulation is an innovative teaching strategy that can facilitate practice development and preparation for entry into today's healthcare environment for nursing students. Unfortunately, the use of human patient simulation has been limited due to the perceptions of nursing faculty members. This project sought to explore those…

  19. The Multi-SAG project: filling the MultiDark simulations with semi-analytic galaxies

    NASA Astrophysics Data System (ADS)

    Vega-Martínez, C. A.; Cora, S. A.; Padilla, N. D.; Muñoz Arancibia, A. M.; Orsi, A. A.; Ruiz, A. N.

    2016-08-01

    The semi-analytical model sag is a code for galaxy formation and evolution which is applied to halo catalogs and merger trees extracted from cosmological N-body simulations of dark matter. This contribution describes the project of constructing a catalog of simulated galaxies by adapting and applying the sag model to two publicly available dark matter simulations of the Spanish MultiDark Project. Those simulations are carried out in boxes with sizes of 1000 Mpc and 400 Mpc, respectively, with Planck cosmological parameters. They cover a large range of halo masses, and each simulation is able to produce more than 150 million simulated galaxies. A detailed description of the method is given, and the first statistical results are shown.

  20. Ship Navigation Simulation Study, Pascagoula Harbor Improvement Project, Pascagoula, Mississippi

    DTIC Science & Technology

    1994-08-01

    between the proposed channels during the LASH ship runs (scenarios 5 and 6) through the two channel reaches. These results tend to support the findings...area simulations. Appendix A: Pilot Comments and Questionnaires

  1. Final Technical Report for Center for Plasma Edge Simulation Research

    SciTech Connect

    Pankin, Alexei Y.; Bateman, Glenn; Kritz, Arnold H.

    2012-02-29

    The CPES research carried out by the Lehigh fusion group has sought to satisfy the evolving requirements of the CPES project. Overall, the Lehigh group has focused on verification and validation of the codes developed and/or integrated in the CPES project. Consequently, contacts and interaction with experimentalists have been maintained during the course of the project. Prof. Arnold Kritz, the leader of the Lehigh Fusion Group, has participated in the executive management of the CPES project. The code development and simulation studies carried out by the Lehigh fusion group are described in more detail in the sections below.

  2. Revitalizing Fusion via Fission Fusion

    NASA Astrophysics Data System (ADS)

    Manheimer, Wallace

    2001-10-01

    Existing tokamaks could generate significant nuclear fuel. TFTR, operating steady state with DT, might generate enough fuel for a 300 MW nuclear reactor. The immediate goals of the magnetic fusion program would necessarily shift from the study of advanced plasma regimes in larger devices to mostly known plasma regimes, but at steady-state or high-duty-cycle operation in DT plasmas. The science and engineering of breeding blankets would be equally important. Follow-on projects could possibly produce nuclear fuel in large quantity at low price. Although today there is strong opposition to nuclear power in the United States, in a 21st century world of 10 billion people, all of whom will demand a middle-class life style, nuclear energy will be important. Concern over greenhouse gases will also drive the world toward nuclear power. There are studies indicating that the world will need 10 TW of carbon-free energy by 2050. It is difficult to see how this can be achieved without the breeding of nuclear fuel. By using the thorium cycle, proliferation risks are minimized [1], [2]. [1] W. Manheimer, Fusion Technology 36, 1 (1999). [2] W. Manheimer, Physics and Society 29(3), 5 (July 2000).

  3. A system simulation development project: Leveraging resources through partnerships

    NASA Technical Reports Server (NTRS)

    Drummond, Colin K.; Owen, A. Karl; Davis, Milt W.

    1995-01-01

    Partnerships between government agencies are an intellectually attractive method of conducting scientific research; the goal is to establish mutually beneficial participant roles for technology exchange that ultimately pays off in a stronger R&D program for each partner. Anticipated and current aerospace research budgetary pressures through the 90's provide additional impetus for Government research agencies to candidly assess their R&D for those simulation activities no longer unique enough to warrant 'going it alone,' or for those elements where partnerships or teams can offset development costs. This paper describes a specific inter-agency system simulation activity that leverages the development cost of mutually beneficial R&D. While the direct positive influence of partnerships on complex technology developments is our main thesis, we also address ongoing teaming issues and hope to impart to the reader the immense indirect (sometimes immeasurable) benefits that meaningful interagency partnerships can produce.

  4. Fast simulation of x-ray projections of spline-based surfaces using an append buffer.

    PubMed

    Maier, Andreas; Hofmann, Hannes G; Schwemmer, Chris; Hornegger, Joachim; Keil, Andreas; Fahrig, Rebecca

    2012-10-07

    Many scientists in the field of x-ray imaging rely on the simulation of x-ray images. As the phantom models become more and more realistic, their projection requires high computational effort. Since x-ray images are based on transmission, many standard graphics acceleration algorithms cannot be applied to this task. However, if adapted properly, the simulation speed can be increased dramatically using state-of-the-art graphics hardware. A custom graphics pipeline that simulates transmission projections for tomographic reconstruction was implemented based on moving spline surface models. All steps from tessellation of the splines, projection onto the detector and drawing are implemented in OpenCL. We introduced a special append buffer for increased performance in order to store the intersections with the scene for every ray. Intersections are then sorted and resolved to materials. Lastly, an absorption model is evaluated to yield an absorption value for each projection pixel. Projection of a moving spline structure is fast and accurate. Projections of size 640 × 480 can be generated within 254 ms. Reconstructions using the projections show errors below 1 HU with a sharp reconstruction kernel. Traditional GPU-based acceleration schemes are not suitable for our reconstruction task. Even in the absence of noise, they result in errors up to 9 HU on average, although projection images appear to be correct under visual examination. Projections generated with our new method are suitable for the validation of novel CT reconstruction algorithms. For complex simulations, such as the evaluation of motion-compensated reconstruction algorithms, this kind of x-ray simulation will reduce the computation time dramatically.
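
    The per-ray bookkeeping described above (append all surface hits, sort them, resolve them to materials, then evaluate an absorption model) reduces, for a monoenergetic simplification, to a Beer-Lambert sum over the traversed segments. The CPU-side sketch below is illustrative only: it assumes a toy intersection list and attenuation coefficients, treats overlapping materials as attenuating additively, and does not reproduce the paper's OpenCL pipeline.

    ```python
    import math
    from collections import defaultdict

    def ray_intensity(hits, mu, i0=1.0):
        """hits: unsorted per-ray (distance, material) intersections, as collected in an
        append buffer; each material contributes an even number of hits per ray.
        mu: material -> linear attenuation coefficient (1/mm).
        Returns the transmitted intensity I = I0 * exp(-sum of mu * segment length)."""
        per_material = defaultdict(list)
        for t, material in hits:
            per_material[material].append(t)
        optical_path = 0.0
        for material, ts in per_material.items():
            ts.sort()                                      # sort hits along the ray
            for t_in, t_out in zip(ts[0::2], ts[1::2]):    # pair entry/exit crossings
                optical_path += mu[material] * (t_out - t_in)
        return i0 * math.exp(-optical_path)

    # Hypothetical ray: a "tissue" object crossed from 10 to 60 mm containing a
    # "bone" insert crossed from 25 to 40 mm; coefficients are assumed values.
    hits = [(60.0, "tissue"), (25.0, "bone"), (10.0, "tissue"), (40.0, "bone")]
    mu = {"tissue": 0.02, "bone": 0.05}
    print(f"I/I0 = {ray_intensity(hits, mu):.3f}")   # exp(-(0.02*50 + 0.05*15))
    ```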

  5. Fast Simulation of X-ray Projections of Spline-based Surfaces using an Append Buffer

    PubMed Central

    Maier, Andreas; Hofmann, Hannes G.; Schwemmer, Chris; Hornegger, Joachim; Keil, Andreas; Fahrig, Rebecca

    2012-01-01

    Many scientists in the field of x-ray imaging rely on the simulation of x-ray images. As the phantom models become more and more realistic, their projection requires high computational effort. Since x-ray images are based on transmission, many standard graphics acceleration algorithms cannot be applied to this task. However, if adapted properly, simulation speed can be increased dramatically using state-of-the-art graphics hardware. A custom graphics pipeline that simulates transmission projections for tomographic reconstruction was implemented based on moving spline surface models. All steps from tessellation of the splines, projection onto the detector, and drawing are implemented in OpenCL. We introduced a special append buffer for increased performance in order to store the intersections with the scene for every ray. Intersections are then sorted and resolved to materials. Lastly, an absorption model is evaluated to yield an absorption value for each projection pixel. Projection of a moving spline structure is fast and accurate. Projections of size 640×480 can be generated within 254 ms. Reconstructions using the projections show errors below 1 HU with a sharp reconstruction kernel. Traditional GPU-based acceleration schemes are not suitable for our reconstruction task. Even in the absence of noise, they result in errors up to 9 HU on average, although projection images appear to be correct under visual examination. Projections generated with our new method are suitable for the validation of novel CT reconstruction algorithms. For complex simulations, such as the evaluation of motion-compensated reconstruction algorithms, this kind of x-ray simulation will reduce the computation time dramatically. Source code is available at http://conrad.stanford.edu/ PMID:22975431

  6. Fast simulation of x-ray projections of spline-based surfaces using an append buffer

    NASA Astrophysics Data System (ADS)

    Maier, Andreas; Hofmann, Hannes G.; Schwemmer, Chris; Hornegger, Joachim; Keil, Andreas; Fahrig, Rebecca

    2012-10-01

    Many scientists in the field of x-ray imaging rely on the simulation of x-ray images. As the phantom models become more and more realistic, their projection requires high computational effort. Since x-ray images are based on transmission, many standard graphics acceleration algorithms cannot be applied to this task. However, if adapted properly, the simulation speed can be increased dramatically using state-of-the-art graphics hardware. A custom graphics pipeline that simulates transmission projections for tomographic reconstruction was implemented based on moving spline surface models. All steps from tessellation of the splines, projection onto the detector and drawing are implemented in OpenCL. We introduced a special append buffer for increased performance in order to store the intersections with the scene for every ray. Intersections are then sorted and resolved to materials. Lastly, an absorption model is evaluated to yield an absorption value for each projection pixel. Projection of a moving spline structure is fast and accurate. Projections of size 640 × 480 can be generated within 254 ms. Reconstructions using the projections show errors below 1 HU with a sharp reconstruction kernel. Traditional GPU-based acceleration schemes are not suitable for our reconstruction task. Even in the absence of noise, they result in errors up to 9 HU on average, although projection images appear to be correct under visual examination. Projections generated with our new method are suitable for the validation of novel CT reconstruction algorithms. For complex simulations, such as the evaluation of motion-compensated reconstruction algorithms, this kind of x-ray simulation will reduce the computation time dramatically.

  7. WE-EF-207-08: Improve Cone Beam CT Using a Synchronized Moving Grid, An Inter-Projection Sensor Fusion and a Probability Total Variation Reconstruction

    SciTech Connect

    Zhang, H; Kong, V; Jin, J; Ren, L; Zhang, Y; Giles, W

    2015-06-15

    Purpose: To present a cone beam computed tomography (CBCT) system which uses a synchronized moving grid (SMOG) to reduce and correct scatter, an inter-projection sensor fusion (IPSF) algorithm to estimate the missing information blocked by the grid, and a probability total variation (pTV) algorithm to reconstruct the CBCT image. Methods: A prototype SMOG-equipped CBCT system was developed and used to acquire gridded projections with complementary grid patterns in two neighboring projections. Scatter was reduced by the grid, and the remaining scatter was corrected by measuring it under the grid. An IPSF algorithm was used to estimate the missing information in a projection from the data in its two neighboring projections. The Feldkamp-Davis-Kress (FDK) algorithm was used to reconstruct the initial CBCT image for pTV from the projections after IPSF processing. A probability map was generated based on the confidence of the IPSF estimation in the regions of missing data and penumbra. pTV was finally used to reconstruct the CBCT image of a Catphan phantom, and the result was compared to a conventional CBCT image acquired without SMOG, images reconstructed without IPSF (SMOG + FDK and SMOG + mask-TV), and an image reconstructed without pTV (SMOG + IPSF + FDK). Results: The conventional CBCT without SMOG shows apparent scatter-induced cup artifacts. The approaches with SMOG but without IPSF show severe (SMOG + FDK) or additional (SMOG + mask-TV) artifacts, possibly due to using projections with missing data. The two approaches with SMOG + IPSF remove the cup artifacts, and the pTV approach is superior to FDK, substantially reducing the noise. Using the SMOG also reduces the imaging dose by half. Conclusion: The proposed technique is promising for improving CBCT image quality while reducing imaging dose.
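
    The IPSF step described above can be caricatured as filling each grid-blocked pixel of a projection from the same pixel in its two neighbors, which carry the complementary grid pattern. The snippet below is a schematic stand-in (plain averaging over a synthetic mask), not the published IPSF algorithm.

    ```python
    import numpy as np

    def ipsf_estimate(prev_proj, cur_proj, next_proj, blocked):
        """Fill grid-blocked pixels of the current projection from its two neighbors.
        `blocked` is True where the SMOG grid blocked the current projection; the
        neighboring projections (complementary pattern) are assumed unblocked there."""
        filled = cur_proj.copy()
        filled[blocked] = 0.5 * (prev_proj[blocked] + next_proj[blocked])
        return filled

    # Synthetic 8x8 projections with alternating rows blocked in the current one.
    rng = np.random.default_rng(3)
    prev_p, cur_p, next_p = (rng.random((8, 8)) for _ in range(3))
    blocked = np.zeros((8, 8), dtype=bool)
    blocked[::2, :] = True                 # assumed grid pattern, for illustration only
    estimate = ipsf_estimate(prev_p, cur_p, next_p, blocked)
    print(estimate.shape, estimate[0, :3])
    ```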

  8. Tactical Aviation Mission System Simulation Situational Awareness Project

    DTIC Science & Technology

    2004-04-01

    CH-146 Griffon helicopter flown by the Department of National Defence (DND). The Carleton...the CH-146 Griffon. HELISIM accepts inputs from the collective, cyclic and pedals of the simulator and, using the defined flight model, updates... helicopters, (d) one wrecked CH149, (e) two CH149s flying in small loop formations, (f) two hovering CH-146 Griffon helicopters, (g) one formation of

  9. Community Project for Accelerator Science and Simulation (ComPASS)

    SciTech Connect

    Simmons, Christopher; Carey, Varis

    2016-10-12

    After concluding our initial exercise (solving a simplified statistical inverse problem with laser intensity as the unknown parameter) of coupling Vorpal and our parallel statistical library QUESO, we shifted the application focus to DLA. Our efforts focused on developing a Gaussian process (GP) emulator within QUESO for efficient optimization of power couplers within woodpiles. The smaller simulation size (compared with LPA) allows for sufficient “training runs” to develop a reasonable GP statistical emulator for a parameter space of moderate dimension.
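
    The GP emulator itself was built inside QUESO (a C++ library); the sketch below only illustrates the workflow in Python with scikit-learn: fit a GP to a modest number of training simulations and use its inexpensive predictions to search a moderate-dimensional parameter space. The kernel choice, sample counts, and the stand-in response surface are assumptions.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, ConstantKernel

      rng = np.random.default_rng(0)
      X_train = rng.uniform(0.0, 1.0, size=(40, 3))              # parameter settings of the "training runs"
      y_train = np.sin(3 * X_train[:, 0]) + X_train[:, 1] ** 2   # stand-in for the simulated figure of merit

      gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=[0.3] * 3),
                                    normalize_y=True)
      gp.fit(X_train, y_train)

      # cheap emulator evaluations replace expensive simulations during optimization
      X_cand = rng.uniform(0.0, 1.0, size=(10_000, 3))
      mean, std = gp.predict(X_cand, return_std=True)
      best = X_cand[np.argmax(mean)]                             # candidate for the next expensive run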

  10. Spinal Fusion

    MedlinePlus

    ... concept of fusion is similar to that of welding in industry. Spinal fusion surgery, however, does not ... bone taken from the patient has a long history of use and results in predictable healing. Autograft ...

  12. Inertial electrostatic confinement and nuclear fusion in the interelectrode plasma of a nanosecond vacuum discharge. II: Particle-in-cell simulations

    SciTech Connect

    Kurilenkov, Yu. K.; Tarakanov, V. P.; Gus'kov, S. Yu.

    2010-12-15

    Results of particle-in-cell simulations of ion acceleration using the KARAT code in a cylindrical geometry, in the problem formulation corresponding to an actual experiment with a low-energy vacuum discharge with a hollow cathode, are presented. The fundamental role of the formed virtual cathode is analyzed. The space-time dynamics of potential wells related to the formation of the virtual cathode is discussed. Quasi-steady potential wells (with a depth of ≈80% of the applied voltage) cause acceleration of deuterium ions to energies about the electron beam energy (≈50 keV). In the well, a quasi-isotropic velocity distribution function of fast ions forms. The results obtained are compared with available data on inertial electrostatic confinement fusion (IECF). In particular, similar correlations between the structure of potential wells and the neutron yield, as well as the scaling of the fusion power density, which increases with decreasing virtual cathode radius and increasing potential well depth, are considered. The chosen electrode configuration and potential well parameters provide power densities of nuclear DD fusion in a nanosecond vacuum discharge noticeably higher than those achieved in other similar IECF systems.

  13. A GPU tool for efficient, accurate, and realistic simulation of cone beam CT projections

    PubMed Central

    Jia, Xun; Yan, Hao; Cerviño, Laura; Folkerts, Michael; Jiang, Steve B.

    2012-01-01

    Purpose: Simulation of x-ray projection images plays an important role in cone beam CT (CBCT) related research projects, such as the design of reconstruction algorithms or scanners. A projection image contains primary signal, scatter signal, and noise. It is computationally demanding to perform accurate and realistic computations for all of these components. In this work, the authors develop a package on graphics processing unit (GPU), called gDRR, for the accurate and efficient computations of x-ray projection images in CBCT under clinically realistic conditions. Methods: The primary signal is computed by a trilinear ray-tracing algorithm. A Monte Carlo (MC) simulation is then performed, yielding the primary signal and the scatter signal, both with noise. A denoising process specifically designed for Poisson noise removal is applied to obtain a smooth scatter signal. The noise component is then obtained by combining the difference between the MC primary and the ray-tracing primary signals, and the difference between the MC simulated scatter and the denoised scatter signals. Finally, a calibration step converts the calculated noise signal into a realistic one by scaling its amplitude according to a specified mAs level. The computations of gDRR include a number of realistic features, e.g., a bowtie filter, a polyenergetic spectrum, and detector response. The implementation is fine-tuned for a GPU platform to yield high computational efficiency. Results: For a typical CBCT projection with a polyenergetic spectrum, the calculation time for the primary signal using the ray-tracing algorithms is 1.2–2.3 s, while the MC simulations take 28.1–95.3 s, depending on the voxel size. Computation time for all other steps is negligible. The ray-tracing primary signal matches well with the primary part of the MC simulation result. The MC simulated scatter signal using gDRR is in agreement with EGSnrc results with a relative difference of 3.8%. A noise calibration process is
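
    The composition of the final projection from the computed components can be summarized in a short hedged sketch: the noise term is the difference between the noisy Monte Carlo signals and their noise-free counterparts, rescaled for the requested mAs level before being added back. The scalar mas_scale stands in for the paper's calibration step, whose exact form is not given in the abstract.

      import numpy as np

      def compose_projection(rt_primary, mc_primary, mc_scatter, denoised_scatter, mas_scale=1.0):
          # rt_primary: noise-free ray-traced primary; mc_primary/mc_scatter: noisy MC estimates;
          # denoised_scatter: smoothed scatter. mas_scale is an assumed amplitude factor from the mAs level.
          noise = (mc_primary - rt_primary) + (mc_scatter - denoised_scatter)
          return rt_primary + denoised_scatter + mas_scale * noise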

  14. A GPU tool for efficient, accurate, and realistic simulation of cone beam CT projections.

    PubMed

    Jia, Xun; Yan, Hao; Cervino, Laura; Folkerts, Michael; Jiang, Steve B

    2012-12-01

    Simulation of x-ray projection images plays an important role in cone beam CT (CBCT) related research projects, such as the design of reconstruction algorithms or scanners. A projection image contains primary signal, scatter signal, and noise. It is computationally demanding to perform accurate and realistic computations for all of these components. In this work, the authors develop a package on graphics processing unit (GPU), called gDRR, for the accurate and efficient computations of x-ray projection images in CBCT under clinically realistic conditions. The primary signal is computed by a trilinear ray-tracing algorithm. A Monte Carlo (MC) simulation is then performed, yielding the primary signal and the scatter signal, both with noise. A denoising process specifically designed for Poisson noise removal is applied to obtain a smooth scatter signal. The noise component is then obtained by combining the difference between the MC primary and the ray-tracing primary signals, and the difference between the MC simulated scatter and the denoised scatter signals. Finally, a calibration step converts the calculated noise signal into a realistic one by scaling its amplitude according to a specified mAs level. The computations of gDRR include a number of realistic features, e.g., a bowtie filter, a polyenergetic spectrum, and detector response. The implementation is fine-tuned for a GPU platform to yield high computational efficiency. For a typical CBCT projection with a polyenergetic spectrum, the calculation time for the primary signal using the ray-tracing algorithms is 1.2-2.3 s, while the MC simulations take 28.1-95.3 s, depending on the voxel size. Computation time for all other steps is negligible. The ray-tracing primary signal matches well with the primary part of the MC simulation result. The MC simulated scatter signal using gDRR is in agreement with EGSnrc results with a relative difference of 3.8%. A noise calibration process is conducted to calibrate g

  15. Scenario Based Education as a Framework for Understanding Students Engagement and Learning in a Project Management Simulation Game

    ERIC Educational Resources Information Center

    Misfeldt, Morten

    2015-01-01

    In this paper I describe how students use a project management simulation game based on an attack-defense mechanism where two teams of players compete by challenging each other's projects. The project management simulation game is intended to be played by pre-service construction workers and engineers. The gameplay has two parts: a planning part,…

  16. Projection collimator optics for DMD-based infrared scene simulator

    NASA Astrophysics Data System (ADS)

    Zheng, Yawei; Hu, Yu; Li, Junnan; Huang, Meili; Gao, Jiaobo; Wang, Jun; Sun, Kefeng; Li, Jianjun; Zhang, Fang

    2016-10-01

    The design of a collimator for dynamic infrared (IR) scene simulation based on digital micro-mirror devices (DMD) is presented in this paper. The collimator adopts a reimaging configuration to limit physical size and cost. An aspheric lens is used in the relay optics to improve image quality and simplify the optical configuration. A total internal reflection (TIR) prism is located between the last surface of the optics and the DMD to fold the ray paths of the IR light source. The optics collimates the output from a 1024×768-element DMD in the 8–10.3 μm waveband and enables an imaging system to be tested over an 8° field of view (FOV). The long pupil distance of 800 mm accommodates seekers under test at a remote location.

  17. Radioscapholunate Fusions

    PubMed Central

    McGuire, Duncan Thomas; Bain, Gregory Ian

    2012-01-01

    Radiocarpal fusions are performed for a variety of indications, most commonly for debilitating painful arthritis. The goal of a wrist fusion is to fuse the painful, diseased joints and to preserve motion through the healthy joints. Depending on the extent of the disease process, radiocarpal fusions may take the form of radiolunate, radioscapholunate, or total wrist fusions. Surgical techniques and instrumentation have advanced over the last few decades, and consequently the functional outcomes have improved and complications decreased. Techniques for partial carpal fusions have improved and now include distal scaphoid and triquetrum excision, which improves range of motion and fusion rates. In this article we discuss the various surgical techniques and fixation methods available and review the corresponding evidence in the literature. The authors' preferred surgical technique of radioscapholunate fusion with distal scaphoid and triquetrum excision is outlined. New implants and new concepts are also discussed. PMID:24179717

  18. Modeling and Simulation Optimization and Feasibility Studies for the Neutron Detection without Helium-3 Project

    SciTech Connect

    Ely, James H.; Siciliano, Edward R.; Swinhoe, Martyn T.; Lintereur, Azaree T.

    2013-01-01

    This report details the results of the modeling and simulation work accomplished for the ‘Neutron Detection without Helium-3’ project during the 2011 and 2012 fiscal years. The primary focus of the project is to investigate commercially available technologies that might be used in safeguards applications in the relatively near term. Other technologies that are being developed may be more applicable in the future, but are outside the scope of this study.

  19. Project ARGO: Gas phase formation in simulated microgravity

    NASA Technical Reports Server (NTRS)

    Powell, Michael R.; Waligora, James M.; Norfleet, William T.; Kumar, K. Vasantha

    1993-01-01

    The ARGO study investigated the reduced incidence of joint pain decompression sickness (DCS) encountered in microgravity as compared with an expected incidence of joint pain DCS experienced by test subjects in Earth-based laboratories (unit gravity) with similar protocols. Individuals who are decompressed from saturated conditions usually acquire joint pain DCS in the lower extremities. Our hypothesis is that the incidence of joint pain DCS can be limited by a significant reduction in the tissue gas micronuclei formed by stress-assisted nucleation. Reductions in dynamic and kinetic stresses in vivo are linked to hypokinetic and adynamic conditions of individuals in zero g. We employed the Doppler ultrasound bubble detection technique in simulated microgravity studies to determine quantitatively the degree of gas phase formation in the upper and lower extremities of test subjects during decompression. We found no evidence of right-to-left shunting through pulmonary vasculature. The volume of gas bubbles following decompression was examined and compared with the number following saline contrast injection. From this, we predict a reduced incidence of DCS on orbit, although the incidence of predicted mild DCS still remains larger than that encountered on orbit.

  1. Projected 2050 Model Simulations for the Chesapeake Bay ...

    EPA Pesticide Factsheets

    The Chesapeake Bay Program has been tasked with assessing how changes in climate systems are expected to alter key variables and processes within the Watershed in concurrence with land use changes. EPA’s Office of Research and Development will be conducting historical and future (2050) Weather Research and Forecasting (WRF) meteorological and Community Multiscale Air Quality (CMAQ) chemical transport model simulations to provide meteorological and nutrient deposition estimates for inclusion in the Chesapeake Bay Program’s assessment of how climate and land use change may impact water quality and ecosystem health. This presentation will present the timeline and research updates. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  2. Application of scene projection technologies in the AEDC cryo-vacuum space simulation chambers

    NASA Astrophysics Data System (ADS)

    Lowry, H. S.; Crider, D. H.; Breeden, M. F.; Goethert, W. H.; Bertrand, W. T.; Steely, S. L.

    2006-05-01

    The space simulation chambers at the Arnold Engineering Development Center (AEDC) have performed space sensor characterization, calibration, and mission simulation testing on space-based, interceptor, and air-borne sensors for more than three decades. A continual effort to implement the latest scene simulation and projection technologies into these ground-based space sensor test chambers is necessary to properly manage the development of space defense systems. This requires the integration of high-fidelity, complex, dynamic scene projection systems that can provide the simulation of the desired target temperatures and ranges. The technologies to accomplish this include multiple-band source subsystems and special spectral tailoring methods, as well as comprehensive analysis and optical properties measurements of the components involved. Implementation of such techniques in the AEDC space sensor test facilities is discussed in this paper.

  3. A Strategy for Autogeneration of Space Shuttle Ground Processing Simulation Models for Project Makespan Estimations

    NASA Technical Reports Server (NTRS)

    Madden, Michael G.; Wyrick, Roberta; O'Neill, Dale E.

    2005-01-01

    Space Shuttle Processing is a complicated and highly variable project. The planning and scheduling problem, categorized as a Resource Constrained - Stochastic Project Scheduling Problem (RC-SPSP), has a great deal of variability in the Orbiter Processing Facility (OPF) process flow from one flight to the next. Simulation Modeling is a useful tool in estimation of the makespan of the overall process. However, simulation requires a model to be developed, which itself is a labor and time consuming effort. With such a dynamic process, often the model would potentially be out of synchronization with the actual process, limiting the applicability of the simulation answers in solving the actual estimation problem. Integration of TEAMS model enabling software with our existing schedule program software is the basis of our solution. This paper explains the approach used to develop an auto-generated simulation model from planning and schedule efforts and available data.

  4. Advances in POST2 End-to-End Descent and Landing Simulation for the ALHAT Project

    NASA Technical Reports Server (NTRS)

    Davis, Jody L.; Striepe, Scott A.; Maddock, Robert W.; Hines, Glenn D.; Paschall, Stephen, II; Cohanim, Babak E.; Fill, Thomas; Johnson, Michael C.; Bishop, Robert H.; DeMars, Kyle J.; hide

    2008-01-01

    Program to Optimize Simulated Trajectories II (POST2) is used as a basis for an end-to-end descent and landing trajectory simulation that is essential in determining design and integration capability and system performance of the lunar descent and landing system and environment models for the Autonomous Landing and Hazard Avoidance Technology (ALHAT) project. The POST2 simulation provides a six degree-of-freedom capability necessary to test, design and operate a descent and landing system for successful lunar landing. This paper presents advances in the development and model-implementation of the POST2 simulation, as well as preliminary system performance analysis, used for the testing and evaluation of ALHAT project system models.

  5. A simulation study on the focal plane detector of the LAUE project

    NASA Astrophysics Data System (ADS)

    Khalil, M.; Frontera, F.; Caroli, E.; Virgilli, E.; Valsan, V.

    2015-06-01

    The LAUE project, supported by the Italian Space Agency (ASI), is devoted to the development of a long focal length (even 20 m or longer) Laue lens for gamma-ray astronomy between 80 and 600 keV. These lenses take advantage of Bragg diffraction to focus radiation onto a small spot, drastically improving the signal-to-noise ratio as well as significantly reducing the required size of the detector. In this paper we present a Monte Carlo simulation study with MEGALIB to optimize, for space applications, the detector size to achieve high detection efficiency, and to optimize the position resolution of the detector to reconstruct the Point Spread Function of the lens considered for the LAUE project. We then show simulations, using the SILVACO semiconductor simulation toolkit, of the optimized detector to estimate its capacitance per channel and depletion voltage. In all of the simulations, two materials were compared: a low-density material (silicon) and a high-density material (germanium).

  6. Ten years of computer visual simulations on large scale projects in the western United States

    SciTech Connect

    Ellsworth, J.C.

    1999-07-01

    Computer visual simulations are used to portray proposed landscape changes with true color, photo-realistic quality, and high levels of accuracy and credibility. This sophisticated technology is a valuable tool for planners, landscape architects, architects, engineers, environmental consultants, government agencies, and private operators in the design and planning of surface mining operations. This paper presents examples of the application of computer visual simulations to large-scale projects in the western United States, including those that generally require an environmental impact statement under the National Environmental Policy Act of 1969 (e.g., open pit coal mines, gold surface mines, highways and bridges, oil and gas development, and alpine ski areas). This presentation will describe the development criteria, process, and use of the computer visual simulations of these types of projects. The issues of computer visual simulation accuracy, bias, credibility, ethics, and realism will be discussed with emphasis on application in real-world situations. The use of computer visual simulations as a tool in the planning and design of these types of projects will be presented, along with discussion of their use in project permitting and public involvement.

  7. Final Report for LDRD Project on Rapid Problem Setup for Mesh-Based Simulation (Rapsodi)

    SciTech Connect

    Brown, D L; Henshaw, W; Petersson, N A; Fast, P; Chand, K

    2003-02-07

    Under LLNL Exploratory Research LDRD funding, the Rapsodi project developed rapid setup technology for computational physics and engineering problems that require computational representations of complex geometry. Many simulation projects at LLNL involve the solution of partial differential equations in complex 3-D geometries. A significant bottleneck in carrying out these simulations arises in converting some specification of a geometry, such as a computer-aided design (CAD) drawing to a computationally appropriate 3-D mesh that can be used for simulation and analysis. Even using state-of-the-art mesh generation software, this problem setup step typically has required weeks or months, which is often much longer than required to carry out the computational simulation itself. The Rapsodi project built computational tools and designed algorithms that help to significantly reduce this setup time to less than a day for many realistic problems. The project targeted rapid setup technology for computational physics and engineering problems that use mixed-element unstructured meshes, overset meshes or Cartesian-embedded boundary (EB) meshes to represent complex geometry. It also built tools that aid in constructing computational representations of geometry for problems that do not require a mesh. While completely automatic mesh generation is extremely difficult, the amount of manual labor required can be significantly reduced. By developing novel, automated, component-based mesh construction procedures and automated CAD geometry repair and cleanup tools, Rapsodi has significantly reduced the amount of hand crafting required to generate geometry and meshes for scientific simulation codes.

  8. Projection of retirement adequacy under wealth-needs model using simulated data

    NASA Astrophysics Data System (ADS)

    Alaudin, Ros Idayuwati; Ismail, Noriszura; Isa, Zaidi

    2017-04-01

    The objective of this study is to estimate the retirement wealth adequacy of future retirees under a defined contribution (DC) plan in Malaysia using a wealth-needs model and a simulation approach. Several feasible scenarios are projected, including the best-case (optimistic) and worst-case (pessimistic) scenarios, based on several related assumptions. The results show that 36% of households are projected to have adequate retirement benefits under the pessimistic scenario, while 84% of households are projected to have adequate retirement benefits under the optimistic scenario.
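
    As a hedged illustration of a wealth-needs simulation of this kind, the sketch below accumulates defined-contribution savings under random returns and reports the share of simulated households whose projected wealth covers projected needs. All rates and distributions are illustrative assumptions, not the study's calibrated Malaysian parameters.

      import numpy as np

      rng = np.random.default_rng(1)
      n_households, years = 10_000, 30
      salary = rng.lognormal(mean=10.5, sigma=0.4, size=n_households)    # illustrative salary levels
      wealth = np.zeros(n_households)
      for _ in range(years):
          returns = rng.normal(0.05, 0.10, size=n_households)            # assumed annual investment returns
          wealth = wealth * (1 + returns) + 0.23 * salary                # assumed DC contribution rate
          salary *= 1.03                                                 # assumed wage growth
      needs = 0.7 * salary * 20          # assumed 70% replacement ratio over 20 retirement years
      adequacy = np.mean(wealth >= needs)
      print(f"{adequacy:.0%} of simulated households projected to be adequate")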

  9. Final Report for "Community Petascale Project for Accelerator Science and Simulations".

    SciTech Connect

    Cary, J. R.; Bruhwiler, D. L.; Stoltz, P. H.; Cormier-Michel, E.; Cowan, B.; Schwartz, B. T.; Bell, G.; Paul, K.; Veitzer, S.

    2013-04-19

    This final report describes the work that has been accomplished over the past 5 years under the Community Petascale Project for Accelerator Science and Simulations (ComPASS) at Tech-X Corporation. Tech-X has been involved in the full range of ComPASS activities: simulation of laser plasma accelerator concepts, mainly in collaboration with the LOASIS program at LBNL; simulation of coherent electron cooling in collaboration with BNL; modeling of electron clouds in high-intensity accelerators in collaboration with researchers at Fermilab; and accurate modeling of superconducting RF cavities in collaboration with Fermilab, JLab, and the Cockcroft Institute in the UK.

  10. Retrieval process development and enhancements project Fiscal year 1995: Simulant development technology task progress report

    SciTech Connect

    Golcar, G.R.; Bontha, J.R.; Darab, J.G.

    1997-01-01

    The mission of the Retrieval Process Development and Enhancements (RPD&E) project is to develop an understanding of retrieval processes, including emerging and existing technologies, gather data on these technologies, and relate the data to specific tank problems such that end-users have the requisite technical bases to make retrieval and closure decisions. The development of waste simulants is an integral part of this effort. The work of the RPD&E simulant-development task is described in this document. The key FY95 accomplishments of the RPD&E simulant-development task are summarized below.

  11. Fusion breeder

    SciTech Connect

    Moir, R.W.

    1982-04-20

    The fusion breeder is a fusion reactor designed with special blankets to maximize the transmutation by 14 MeV neutrons of uranium-238 to plutonium or thorium to uranium-233 for use as a fuel for fission reactors. Breeding fissile fuels has not been a goal of the US fusion energy program. This paper suggests it is time for a policy change to make the fusion breeder a goal of the US fusion program and the US nuclear energy program. The purpose of this paper is to suggest this policy change be made and tell why it should be made, and to outline specific research and development goals so that the fusion breeder will be developed in time to meet fissile fuel needs.

  12. Fusion breeder

    SciTech Connect

    Moir, R.W.

    1982-02-22

    The fusion breeder is a fusion reactor designed with special blankets to maximize the transmutation by 14 MeV neutrons of uranium-238 to plutonium or thorium to uranium-233 for use as a fuel for fission reactors. Breeding fissile fuels has not been a goal of the US fusion energy program. This paper suggests it is time for a policy change to make the fusion breeder a goal of the US fusion program and the US nuclear energy program. The purpose of this paper is to suggest this policy change be made and tell why it should be made, and to outline specific research and development goals so that the fusion breeder will be developed in time to meet fissile fuel needs.

  13. The Use of Physiological Indices in Simulation Research: A Report on Project CORES (Covert and Overt Responses to Educational Simulations). A Symposium.

    ERIC Educational Resources Information Center

    Dyrenfurth, Michael; And Others

    In two separate reports, the founding and setup of Project CORES were outlined, and then a specific research project was described. Project CORES began in the efforts of three men who felt a more systematic investigation of simulation effects was needed. The criteria felt to be most sensitive were the physiological activities of galvanic skin potential…

  14. Information integration for data fusion

    SciTech Connect

    Bray, O.H.

    1997-01-01

    Data fusion has been identified by the Department of Defense as a critical technology for the U.S. defense industry. Data fusion requires combining expertise in two areas - sensors and information integration. Although data fusion is a rapidly growing area, there is little synergy and use of common, reusable, and/or tailorable objects and models, especially across different disciplines. The Laboratory-Directed Research and Development project had two purposes: to see if a natural language-based information modeling methodology could be used for data fusion problems, and if so, to determine whether this methodology would help identify commonalities across areas and achieve greater synergy. The project confirmed both of the initial hypotheses: that the natural language-based information modeling methodology could be used effectively in data fusion areas and that commonalities could be found that would allow synergy across various data fusion areas. The project found five common objects that are the basis for all of the data fusion areas examined: targets, behaviors, environments, signatures, and sensors. Many of the objects and the specific facts related to these objects were common across several areas and could easily be reused. In some cases, even the terminology remained the same. In other cases, different areas had their own terminology, but the concepts were the same. This commonality is important with the growing use of multisensor data fusion. Data fusion is much more difficult if each type of sensor uses its own objects and models rather than building on a common set. This report introduces data fusion, discusses how the synergy generated by this LDRD would have benefited an earlier successful project and contains a summary information model from that project, describes a preliminary management information model, and explains how information integration can facilitate cross-treaty synergy for various arms control treaties.

  15. The Virtual Liver Project: Modeling Tissue Response To Chemicals Through Multiscale Simulation

    EPA Science Inventory

    The US EPA Virtual Liver Project is aimed at simulating the risk of toxic effects from environmental chemicals in silico. The computational systems model of organ injury due to chronic chemical exposure is based on: (i) the dynamics of perturbed molecular pathways, (ii) their lin...

  16. Simulating Limb Formation in the U.S. EPA Virtual Embryo - Risk Assessment Project

    EPA Science Inventory

    The U.S. EPA’s Virtual Embryo project (v-Embryo™) is a computer model simulation of morphogenesis that integrates cell and molecular level data from mechanistic and in vitro assays with knowledge about normal development processes to assess in silico the effects of chemicals on d...

  18. The Virtual Liver Project: Simulating Tissue Injury Through Molecular and Cellular Processes

    EPA Science Inventory

    Efficiently and humanely testing the safety of thousands of environmental chemicals is a challenge. The US EPA Virtual Liver Project (v-Liver™) is aimed at simulating the effects of environmental chemicals computationally in order to estimate the risk of toxic outcomes in humans...

  1. Visual Analysis of Multi-Run Spatio-Temporal Simulations Using Isocontour Similarity for Projected Views.

    PubMed

    Fofonov, Alexey; Molchanov, Vladimir; Linsen, Lars

    2016-08-01

    Multi-run simulations are widely used to investigate how simulated processes evolve depending on varying initial conditions. Frequently, such simulations model the change of spatial phenomena over time. Isocontours have proven to be effective for the visual representation and analysis of 2D and 3D spatial scalar fields. We propose a novel visualization approach for multi-run simulation data based on isocontours. By introducing a distance function for isocontours, we generate a distance matrix used for a multidimensional scaling projection. Multiple simulation runs are represented by polylines in the projected view displaying change over time. We propose a fast calculation of isocontour differences based on a quasi-Monte Carlo approach. For interactive visual analysis, we support filtering and selection mechanisms on the multi-run plot and on linked views to physical space visualizations. Our approach can be effectively used for the visual representation of ensembles, for pattern and outlier detection, for the investigation of the influence of simulation parameters, and for a detailed analysis of the features detected. The proposed method is applicable to data of any spatial dimensionality and any spatial representation (gridded or unstructured). We validate our approach by performing a user study on synthetic data and applying it to different types of multi-run spatio-temporal simulation data.
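
    A hedged sketch of the core computation follows: estimate pairwise isocontour differences with quasi-Monte Carlo samples (here a simple symmetric-difference measure stands in for the paper's distance function) and project the resulting distance matrix with multidimensional scaling; points of the same run are then joined into a time polyline. The sampling scheme and field interface are assumptions.

      import numpy as np
      from scipy.stats import qmc
      from sklearn.manifold import MDS

      def isocontour_distance_matrix(field_fns, level, n_samples=4096, dim=2):
          # field_fns: one callable per (run, time step), mapping sample points to scalar values
          pts = qmc.Sobol(d=dim, scramble=True).random(n_samples)       # quasi-Monte Carlo samples
          inside = np.array([(f(pts) >= level) for f in field_fns])     # indicator of each isocontour interior
          # symmetric-difference estimate between every pair of isocontours
          return np.array([[np.mean(a != b) for b in inside] for a in inside])

      # proj = MDS(n_components=2, dissimilarity='precomputed').fit_transform(
      #     isocontour_distance_matrix(field_fns, level=0.5))
      # rows of proj belonging to the same run are connected into a polyline over time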

  2. A fusion of minds

    NASA Astrophysics Data System (ADS)

    Corfield, Richard

    2013-02-01

    Mystery still surrounds the visit of the astronomer Sir Bernard Lovell to the Soviet Union in 1963. But his collaboration - and that of other British scientists - eased geopolitical tensions at the height of the Cold War and paved the way for today's global ITER fusion project, as Richard Corfield explains.

  3. "Polarized" Fusion

    NASA Astrophysics Data System (ADS)

    Schieck, Hans Paetz Gen.

    Increasing energy demand in view of limited supply, together with environmental and nuclear-safety concerns that have led to increased emphasis on renewable energy sources such as solar or wind energy, is expected to focus public and scientific interest increasingly on fusion energy as well. With the decision to build ITER (low-density magnetic confinement) and continuing research on (high-density) inertial-confinement fusion (cf. the inauguration of the laser fusion facility at the Lawrence Livermore National Laboratory), prospects of fusion energy have probably entered a new era.

  4. Simulator verification effort at the South Texas project electric generating station

    SciTech Connect

    Bellmore, P.E.; Albury, C.R.

    1987-01-01

    This paper presents the work being done at Houston Lighting and Power Company to verify the South Texas Project Electric Generating Station (STPEGS) simulator. The purpose of that work is to assure that the STPEGS simulator adequately reflects plant response during normal and abnormal transients. An enhanced understanding of the engineering and organizational needs of a simulator verification program is significant. This paper presents the techniques used to develop a best-estimate model. The best-estimate model generates plant response data for comparison with the STPEGS simulator. A typical licensing model is inadequate for this work because of the conservative assumptions in the model. The authors examine, in this paper, the interaction between the various groups responsible for simulator verification.

  5. The Multiscale Systems Immunology project: software for cell-based immunological simulation

    PubMed Central

    Mitha, Faheem; Lucas, Timothy A; Feng, Feng; Kepler, Thomas B; Chan, Cliburn

    2008-01-01

    Background Computer simulations are of increasing importance in modeling biological phenomena. Their purpose is to predict behavior and guide future experiments. The aim of this project is to model the early immune response to vaccination by an agent based immune response simulation that incorporates realistic biophysics and intracellular dynamics, and which is sufficiently flexible to accurately model the multi-scale nature and complexity of the immune system, while maintaining the high performance critical to scientific computing. Results The Multiscale Systems Immunology (MSI) simulation framework is an object-oriented, modular simulation framework written in C++ and Python. The software implements a modular design that allows for flexible configuration of components and initialization of parameters, thus allowing simulations to be run that model processes occurring over different temporal and spatial scales. Conclusion MSI addresses the need for a flexible and high-performing agent based model of the immune system. PMID:18442405

  6. Project Report on DOE Young Investigator Grant (Contract No. DE-FG02-02ER25525) Dynamic Scheduling and Fusion of Irregular Computation (August 15, 2002 to August 14, 2005)

    SciTech Connect

    Ding, Chen

    2005-08-16

    Computer simulation has become increasingly important in many scientific disciplines, but its performance and scalability are severely limited by the memory throughput on today's computer systems. With the support of this grant, we first designed training-based prediction, which accurately predicts the memory performance of large applications before their execution. Then we developed optimization techniques using dynamic computation fusion and large-scale data transformation. The research work has three major components. The first is modeling and prediction of cache behavior. We have developed a new technique, which uses reuse distance information from training inputs and then extracts a parameterized model of the program's cache miss rates for any input size and for any size of fully associative cache. Using the model we have built a web-based tool using three-dimensional visualization. The new model can help to build cost-effective computer systems, design better benchmark suites, and improve task scheduling on heterogeneous systems. The second component is global computation for improving cache performance. We have developed an algorithm for dynamic data partitioning using sampling theory and probability distribution. Recent work from a number of groups shows that manual or semi-manual computation fusion has significant benefits in physical, mechanical, and biological simulations as well as information retrieval and machine verification. We have developed an automatic tool that measures the potential of computation fusion. The new system can be used by high-performance application programmers to estimate the potential of locality improvement for a program before trying complex transformations for a specific cache system. The last component studies models of spatial locality and the problem of data layout. In scientific programs, most data are stored in arrays. Grand challenge problems such as hydrodynamics simulation and data mining may use
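
    The cache model rests on a standard property of reuse (stack) distances: for a fully associative LRU cache of capacity C, a reference misses exactly when the number of distinct data elements touched since the previous access to the same element is at least C. The hedged sketch below computes reuse distances naively and derives miss ratios for any cache size; the quadratic implementation is for clarity only and is not the grant's actual tooling.

      import math

      def reuse_distances(trace):
          # reuse distance = number of distinct addresses touched since the last
          # access to the same address (infinity for first accesses)
          last_pos, dists = {}, []
          for i, addr in enumerate(trace):
              if addr in last_pos:
                  dists.append(len(set(trace[last_pos[addr] + 1:i])))   # O(N*M), for clarity only
              else:
                  dists.append(math.inf)
              last_pos[addr] = i
          return dists

      def miss_ratio(trace, cache_size):
          # fully associative LRU: a reference misses iff its reuse distance >= cache size
          d = reuse_distances(trace)
          return sum(1 for x in d if x >= cache_size) / len(d)

      trace = ['a', 'b', 'c', 'a', 'b', 'd', 'a']
      print(miss_ratio(trace, 2), miss_ratio(trace, 4))   # 1.0 and ~0.57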

  7. Laser fusion monthly -- August 1980

    SciTech Connect

    Ahlstrom, H.G.

    1980-08-01

    This report documents the monthly progress of the laser fusion research at Lawrence Livermore National Laboratory. It first gives facilities reports for both the Shiva and Argus projects. Topics discussed include: the laser system for the Nova Project; the fusion experiments analysis facility; the optical/x-ray streak camera; the Shiva Dante System temporal response; the 2ω₀ experiment; and planning for an ICF engineering test facility.

  8. Toward the credibility of Northeast United States summer precipitation projections in CMIP5 and NARCCAP simulations

    NASA Astrophysics Data System (ADS)

    Thibeault, Jeanne M.; Seth, A.

    2015-10-01

    Precipitation projections for the northeast United States and nearby Canada (Northeast) are examined for 15 Fifth Phase of the Coupled Model Intercomparison Project (CMIP5) models. A process-based evaluation of atmospheric circulation features associated with wet Northeast summers is performed to examine whether credibility can be differentiated within the multimodel ensemble. Based on these evaluations, and an analysis of the interannual statistical properties of area-averaged precipitation, model subsets were formed. Multimodel precipitation projections from each subset were compared to the multimodel projection from all of the models. Higher-resolution North American Regional Climate Change Assessment Program (NARCCAP) regional climate models (RCMs) were subjected to a similar evaluation, grouping into subsets, and examination of future projections. CMIP5 models adequately simulate most large-scale circulation features associated with wet Northeast summers, though all have errors in simulating observed sea level pressure and moisture divergence anomalies in the western tropical Atlantic/Gulf of Mexico. Relevant large-scale processes simulated by the RCMs resemble those of their driving global climate models (GCMs), which are not always realistic. Future RCM studies could benefit from a process analysis of potential driving GCMs prior to dynamical downscaling. No CMIP5 or NARCCAP models were identified as clearly more credible, but six GCMs and four RCMs performed consistently better. Among the "Better" models, there is no consistency in the direction of future summer precipitation change. CMIP5 projections suggest that the Northeast precipitation response depends on the dynamics of the North Atlantic anticyclone and associated circulation and moisture convergence patterns, which vary among "Better" models. Even when model credibility cannot be clearly differentiated, examination of simulated processes provides important insights into their evolution under

  9. Verification of gyrokinetic particle simulation of current-driven instability in fusion plasmas. I. Internal kink mode

    SciTech Connect

    McClenaghan, J.; Lin, Z.; Holod, I.; Deng, W.; Wang, Z.

    2014-12-15

    The gyrokinetic toroidal code (GTC) capability has been extended for simulating internal kink instability with kinetic effects in toroidal geometry. The global simulation domain covers the magnetic axis, which is necessary for simulating current-driven instabilities. GTC simulation in the fluid limit of the kink modes in cylindrical geometry is verified by benchmarking with a magnetohydrodynamic eigenvalue code. Gyrokinetic simulations of the kink modes in the toroidal geometry find that ion kinetic effects significantly reduce the growth rate even when the banana orbit width is much smaller than the radial width of the perturbed current layer at the mode rational surface.

  10. ITER Fusion Energy

    ScienceCinema

    Dr. Norbert Holtkamp

    2016-07-12

    ITER (in Latin “the way”) is designed to demonstrate the scientific and technological feasibility of fusion energy. Fusion is the process by which two light atomic nuclei combine to form a heavier one and thus release energy. In the fusion process two isotopes of hydrogen – deuterium and tritium – fuse together to form a helium atom and a neutron. Fusion could thus provide large-scale energy production without greenhouse effects; essentially limitless fuel would be available all over the world. The principal goals of ITER are to generate 500 megawatts of fusion power for periods of 300 to 500 seconds with a fusion power multiplication factor, Q, of at least 10, i.e. Q ≥ 10 (an output power of 500 MW from an input power of 50 MW). The ITER Organization was officially established in Cadarache, France, on 24 October 2007. The seven members engaged in the project – China, the European Union, India, Japan, Korea, Russia and the United States – represent more than half the world’s population. The costs for ITER are shared by the seven members. The cost of construction will be approximately 5.5 billion Euros; a similar amount is foreseen for the twenty-year phase of operation and the subsequent decommissioning.

  11. Comparisons of Simulated Hydrodynamics and Water Quality for Projected Demands in 2046, Pueblo Reservoir, Southeastern Colorado

    USGS Publications Warehouse

    Ortiz, Roderick F.; Galloway, Joel M.; Miller, Lisa D.; Mau, David P.

    2008-01-01

    Pueblo Reservoir is one of southeastern Colorado's most valuable water resources. The reservoir provides irrigation, municipal, and industrial water to various entities throughout the region. The reservoir also provides flood control, recreational activities, sport fishing, and wildlife enhancement to the region. The Bureau of Reclamation is working to meet its goal to issue a Final Environmental Impact Statement (EIS) on the Southern Delivery System project (SDS). SDS is a regional water-delivery project that has been proposed to provide a safe, reliable, and sustainable water supply through the foreseeable future (2046) for Colorado Springs, Fountain, Security, and Pueblo West. Discussions with the Bureau of Reclamation and the U.S. Geological Survey led to a cooperative agreement to simulate the hydrodynamics and water quality of Pueblo Reservoir. This work has been completed and described in a previously published report, U.S. Geological Survey Scientific Investigations Report 2008-5056. Additionally, there was a need to make comparisons of simulated hydrodynamics and water quality for projected demands associated with the various EIS alternatives and plans by Pueblo West to discharge treated water into the reservoir. Plans by Pueblo West are fully independent of the SDS project. This report compares simulated hydrodynamics and water quality for projected demands in Pueblo Reservoir resulting from changes in inflow and water quality entering the reservoir, and from changes to withdrawals from the reservoir as projected for the year 2046. Four of the seven EIS alternatives were selected for scenario simulations. The four U.S. Geological Survey simulation scenarios were the No Action scenario (EIS Alternative 1), the Downstream Diversion scenario (EIS Alternative 2), the Upstream Return-Flow scenario (EIS Alternative 4), and the Upstream Diversion scenario (EIS Alternative 7). Additionally, the results of an Existing Conditions scenario (water years 2000 through

  12. Plasma fusion and cold fusion

    SciTech Connect

    Kozima, Hideo

    1996-12-31

    Fundamental problems of plasma fusion (controlled thermonuclear fusion), due to the contradicting demands of magnetic confinement of the plasma and suppression of instabilities occurring on and in the plasma, are surveyed in contrast with problems of cold fusion. Problems in cold fusion due to the complicated constituents and types of force are explained. Typical cold fusion events are explained by a model based on the presence of trapped neutrons in cold fusion materials. The events include the Pons-Fleischmann effect, the tritium anomaly, helium-4 production, and nuclear transmutation. The fundamental hypothesis of the model is the effectiveness of a new concept--the neutron affinity of elements. The neutron affinity is defined and some bases supporting it are explained. A possible justification of the concept by a statistical approach is given.

  13. The Physics Education Technology Project: Web-based interactive simulations to support student learning

    NASA Astrophysics Data System (ADS)

    Adams, Wendy; Perkins, Kathy; Finkelstein, Noah; Lemaster, Ron; Reid, Sam; Dubson, Mike; Wieman, Carl

    2004-05-01

    We introduce the Physics Education Technology (PhET) Project^1,2, a new initiative to provide a suite of online tools for teaching and learning introductory physics at the high school and college levels. The project focuses on the development of relatively elaborate Java- and Flash-based animated simulations that are designed to help students develop visual and conceptual models of physical phenomena. We are also developing guiding questions that will utilize the simulation to address specific conceptual difficulties, help students experience the relationships among variables, and connect physics to real-world experiences and observations. These simulations create an interactive experience for the student that is designed to promote active thinking and encourage experimentation. We have implemented the simulations as lecture demonstrations, homework tools, a replacement for laboratory equipment, and as a preparation activity for class. We will present a summary of the simulations currently available and our preliminary research results on what makes a simulation effective and how it can be used effectively as an educational tool. 1. See http://www.colorado.edu/physics/phet 2. Supported by NSF, the Kavli Foundation, and CU.

  14. Heavy Ion Fusion Systems Assessment study

    SciTech Connect

    Dudziak, D.J.; Herrmannsfeldt, W.B.

    1986-07-01

    The Heavy Ion Fusion Systems Assessment (HIFSA) study was conducted with the specific objective of evaluating the prospects of using induction linac drivers to generate economical electrical power from inertial confinement fusion. The study used algorithmic models of representative components of a fusion system to identify favored areas in the multidimensional parameter space. The resulting cost-of-electricity (COE) projections are comparable to those from other (magnetic) fusion scenarios, at a plant size of 100 MWe.

  15. Magnetized Target Fusion

    NASA Technical Reports Server (NTRS)

    Griffin, Steven T.

    2002-01-01

    Magnetized target fusion (MTF) is under consideration as a means of building a low mass, high specific impulse, and high thrust propulsion system for interplanetary travel. This unique combination is the result of the generation of a high temperature plasma by the nuclear fusion process. This plasma can then be deflected by magnetic fields to provide thrust. Fusion is initiated by a small fraction of the energy generated in the magnetic coils due to the plasma's compression of the magnetic field. The power gain from a fusion reaction is such that inefficiencies due to thermal neutrons and coil losses can be overcome. Since the fusion reaction products are directly used for propulsion and the power to initiate the reaction is directly obtained from the thrust generation, no massive power supply for energy conversion is required. The result should be a low engine mass, high specific impulse and high thrust system. The key is to successfully initiate fusion as a proof-of-principle for this application. Currently MSFC is implementing MTF proof-of-principle experiments. This involves many technical details and ancillary investigations. Of these, selected pertinent issues include the properties, orientation and timing of the plasma guns and the convergence and interface development of the "pusher" plasma. Computer simulations of the target plasma's behavior under compression and the convergence and mixing of the gun plasma are under investigation. This work is to focus on the gun characterization and development as it relates to plasma initiation and repeatability.

  17. Image fusion

    NASA Technical Reports Server (NTRS)

    Pavel, M.

    1993-01-01

    The topics covered include the following: a system overview of the basic components of a system designed to improve the ability of a pilot to fly through low-visibility conditions such as fog; the role of visual sciences; fusion issues; sensor characterization; sources of information; image processing; and image fusion.

  18. The fusion of gerontology and technology in nursing education: History and demonstration of the Gerontological Informatics Reasoning Project--GRIP.

    PubMed

    Dreher, H Michael; Cornelius, Fran; Draper, Judy; Pitkar, Harshad; Manco, Janet; Song, Il-Yeol

    2006-01-01

    Phase I of our Gerontological Reasoning Informatics Project (GRIP) began in the summer of 2002 when all 37 senior undergraduate nursing students in our accelerated BSN nursing program were given PDAs. These students were oriented to use a digitalized geriatric nursing assessment tool embedded into their PDA in a variety of geriatric clinical agencies. This informatics project was developed to make geriatric nursing more technology oriented and focused on seven modules of geriatric assessment: intellect (I), nutrition (N), self-concept (S), physical activity (P), interpersonal functioning (I), restful sleep (R), and elimination (E)--INSPIRE. Through phase II and now phase III, the GRIP Project has become a major collaboration between the College of Nursing & Health Professions and College of Information Science and Technology at Drexel University. The digitalized geriatric nursing health assessment tool has undergone a second round of reliability and validity testing and is now used to conduct a 20 minute comprehensive geriatric health assessment on the PDA, making our undergraduate gerontology course the most high tech clinical course in our nursing curriculum.

  19. Post-processing of multi-hydrologic model simulations for improved streamflow projections

    NASA Astrophysics Data System (ADS)

    khajehei, sepideh; Ahmadalipour, Ali; Moradkhani, Hamid

    2016-04-01

    Hydrologic model outputs are prone to bias and uncertainty due to knowledge deficiencies in models and data. Uncertainty in hydroclimatic projections arises from uncertainty in the hydrologic model as well as from the epistemic or aleatory uncertainties in GCM parameterization and development. This study is conducted to: 1) evaluate the recently developed multi-variate post-processing method for historical simulations and 2) assess the effect of post-processing on the uncertainty and reliability of future streamflow projections in both high-flow and low-flow conditions. The first objective is addressed for the historical period of 1970-1999. Future streamflow projections are generated for 10 statistically downscaled GCMs from two widely used downscaling methods, Bias Corrected Statistically Downscaled (BCSD) and Multivariate Adaptive Constructed Analogs (MACA), over the period 2010-2099 for two representative concentration pathways, RCP4.5 and RCP8.5. Three semi-distributed hydrologic models were employed and calibrated at 1/16 degree latitude-longitude resolution for over 100 points across the Columbia River Basin (CRB) in the Pacific Northwest, USA. Streamflow outputs are post-processed through a Bayesian framework based on copula functions. The post-processing approach relies on a transfer function developed from the bivariate joint distribution between observations and simulations in the historical period. Results show that application of the post-processing technique leads to considerably higher accuracy in historical simulations and also reduces model uncertainty in future streamflow projections.
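
    As a hedged sketch of a copula-based transfer function, the code below maps simulated and observed flows to normal scores through empirical CDFs, fits the correlation of a bivariate Gaussian copula on the historical period, and corrects a new simulated value via the conditional median mapped back through the observed marginal. The Gaussian copula, empirical marginals, and single-site setup are simplifications; the study's Bayesian framework is richer than this.

      import numpy as np
      from scipy.stats import norm, rankdata

      def fit_gaussian_copula(sim_hist, obs_hist):
          # probability integral transform via empirical CDFs, then normal scores
          u = rankdata(sim_hist) / (len(sim_hist) + 1)
          v = rankdata(obs_hist) / (len(obs_hist) + 1)
          rho = np.corrcoef(norm.ppf(u), norm.ppf(v))[0, 1]     # Gaussian-copula dependence
          return rho, np.sort(sim_hist), np.sort(obs_hist)

      def post_process(sim_new, rho, sim_sorted, obs_sorted):
          # CDF of the new simulated value under the historical simulation marginal
          u = np.clip(np.searchsorted(sim_sorted, sim_new) / (len(sim_sorted) + 1), 1e-6, 1 - 1e-6)
          z_cond = rho * norm.ppf(u)                            # conditional median in normal-score space
          return np.quantile(obs_sorted, norm.cdf(z_cond))      # back-transform through observed marginal

      rho, s_sorted, o_sorted = fit_gaussian_copula(np.random.gamma(2, 50, 300),
                                                    np.random.gamma(2, 60, 300))
      print(post_process(120.0, rho, s_sorted, o_sorted))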

  20. Asian summer monsoon onset in simulations and CMIP5 projections using four Chinese climate models

    NASA Astrophysics Data System (ADS)

    Zou, Liwei; Zhou, Tianjun

    2015-06-01

    The reproducibility and future changes of the Asian summer monsoon onset were analyzed based on all realizations from four Chinese models that participated in the Coupled Model Intercomparison Project Phase 5 (CMIP5), using their historical simulations and their projections under the Representative Concentration Pathway in which anthropogenic emissions continue to rise throughout the 21st century (RCP8.5). Delayed onset of the monsoon over the Arabian Sea was evident in all simulations for present-day climate, which was associated with a too weak simulation of the low-level Somali jet in May. A consistent advanced onset of the monsoon was found only over the Arabian Sea in the projections, where the advanced onset of the monsoon was accompanied by an increase of rainfall and an anomalous anticyclone over the northern Indian Ocean. In all the models except FGOALS-g2, the enhanced low-level Somali jet transported more water vapor to the Arabian Sea, whereas in FGOALS-g2 the enhanced rainfall was determined more by the increased wind convergence. Furthermore, and again in all models except FGOALS-g2, the equatorial SST warming, with maximum increase over the eastern Pacific, enhanced convection in the central West Pacific and reduced convection over the eastern Indian Ocean and Maritime Continent region, which drove the anomalous anticyclonic circulation over the western Indian Ocean. In contrast, in FGOALS-g2, there was minimal (near-zero) warming of projected SST in the central equatorial Pacific, with decreased convection in the central West Pacific and enhanced convection over the Maritime Continent. The broader-scale differences among the models across the Pacific were related to both the differences in the projected SST pattern and in the present-day simulations.

  1. The SIMRAND methodology: Theory and application for the simulation of research and development projects

    NASA Technical Reports Server (NTRS)

    Miles, R. F., Jr.

    1986-01-01

    A research and development (R&D) project often involves a number of decisions that must be made concerning which subset of systems or tasks are to be undertaken to achieve the goal of the R&D project. To help in this decision making, SIMRAND (SIMulation of Research ANd Development Projects) is a methodology for the selection of the optimal subset of systems or tasks to be undertaken on an R&D project. Using alternative networks, the SIMRAND methodology models the alternative subsets of systems or tasks under consideration. Each path through an alternative network represents one way of satisfying the project goals. Equations are developed that relate the system or task variables to the measure of preference. Uncertainty is incorporated by treating the variables of the equations probabilistically as random variables, with cumulative distribution functions assessed by technical experts. Analytical techniques of probability theory are used to reduce the complexity of the alternative networks. Cardinal utility functions over the measure of preference are assessed for the decision makers. A run of the SIMRAND I computer program combines, in a Monte Carlo simulation model, the network structure, the equations, the cumulative distribution functions, and the utility functions.
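
    A minimal sketch of the SIMRAND idea follows: Monte Carlo sampling of task variables along alternative paths, scored by the expected utility of the preference measure. The triangular distributions and the exponential utility below are hypothetical stand-ins for the expert-assessed cumulative distribution functions and decision-maker utilities described in the paper.

```python
# Sketch: Monte Carlo comparison of alternative task networks, SIMRAND-style.
import numpy as np

rng = np.random.default_rng(42)

# Each alternative path is a list of tasks; each task cost ~ triangular(low, mode, high).
alternatives = {
    "path_A": [(2.0, 4.0, 9.0), (1.0, 2.0, 3.0)],   # two cheaper but riskier tasks
    "path_B": [(3.0, 5.0, 6.0)],                    # one task, narrower spread
}

def utility(cost, risk_aversion=0.15):
    """Risk-averse (exponential) utility over total cost; higher is better."""
    return -np.expm1(risk_aversion * cost)

def expected_utility(tasks, n_trials=100_000):
    total = np.zeros(n_trials)
    for low, mode, high in tasks:
        total += rng.triangular(low, mode, high, size=n_trials)
    return utility(total).mean()

scores = {name: expected_utility(tasks) for name, tasks in alternatives.items()}
print(scores, "-> choose", max(scores, key=scores.get))
```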

  2. Accelerated SPECT Monte Carlo Simulation Using Multiple Projection Sampling and Convolution-Based Forced Detection

    PubMed Central

    Liu, Shaoying; King, Michael A.; Brill, Aaron B.; Stabin, Michael G.; Farncombe, Troy H.

    2010-01-01

    Monte Carlo (MC) is a well-utilized tool for simulating photon transport in single photon emission computed tomography (SPECT) due to its ability to accurately model the physical processes of photon transport. As a consequence of this accuracy, it suffers from relatively low detection efficiency and long computation time. One technique used to improve the speed of MC modeling is the effective and well-established variance reduction technique (VRT) known as forced detection (FD). With this method, photons are followed as they traverse the object under study but are then forced to travel in the direction of the detector surface, whereby they are detected at a single detector location. Another method, called convolution-based forced detection (CFD), is based on the fundamental idea of FD with the exception that photons are detected at multiple detector locations, with their distribution determined by a distance-dependent blurring kernel. In order to further increase the speed of MC, a method named multiple projection convolution-based forced detection (MP-CFD) is presented. Rather than forcing photons to hit a single detector, the MP-CFD method follows the photon transport through the object but then, at each scatter site, forces the photon to interact with a number of detectors at a variety of angles surrounding the object. This way, it is possible to simulate all the projection images of a SPECT simulation in parallel, rather than as independent projections. The result is vastly improved simulation time, as much of the computational load of simulating photon transport through the object is incurred only once for all projection angles. The results of the proposed MP-CFD method agree well with experimental data in measurements of the point spread function (PSF), producing a correlation coefficient (r²) of 0.99. MP-CFD is shown to be about 60 times faster than a regular forced-detection MC program, with similar results. PMID:20811587
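
    The sketch below illustrates only the central MP-CFD idea, splatting each scatter site into every projection angle at once with a distance-dependent Gaussian blur; attenuation, collimator response and the actual photon transport are omitted, and the geometry, blur model and scatter sites are all assumptions for illustration.

```python
# Sketch: splat each scatter site into every projection angle with a
# distance-dependent Gaussian blur (no attenuation or transport physics).
import numpy as np

n_proj, n_bins, radius = 64, 128, 30.0                 # detector ring radius (cm, assumed)
angles = np.linspace(0.0, 2.0 * np.pi, n_proj, endpoint=False)
bin_centers = np.linspace(-20.0, 20.0, n_bins)
projections = np.zeros((n_proj, n_bins))

rng = np.random.default_rng(1)
sites = rng.normal(0.0, 5.0, size=(1000, 2))           # scatter-site positions (x, y)
weights = rng.uniform(0.5, 1.0, size=1000)             # photon weights at those sites

for (x, y), w in zip(sites, weights):
    for i, a in enumerate(angles):
        t = x * np.cos(a) + y * np.sin(a)              # transverse detector coordinate
        d = radius - (-x * np.sin(a) + y * np.cos(a))  # distance to this detector
        sigma = 0.4 + 0.03 * d                         # assumed distance-dependent blur
        kernel = np.exp(-0.5 * ((bin_centers - t) / sigma) ** 2)
        projections[i] += w * kernel / (kernel.sum() + 1e-12)

print(projections.shape, float(projections.sum()))
```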

  3. Description of convective-scale numerical weather simulation use in a flight simulator within the Flysafe project

    NASA Astrophysics Data System (ADS)

    Pradier-Vabre, S.; Forster, C.; Heesbeen, W. W. M.; Pagé, C.; Sénési, S.; Tafferner, A.; Bernard-Bouissières, I.; Caumont, O.; Drouin, A.; Ducrocq, V.; Guillou, Y.; Josse, P.

    2009-03-01

    Within the framework of the Flysafe project, dedicated tools aiming at improving flight safety are developed. In particular, efforts are directed towards the development of the Next Generation-Integrated Surveillance System (NG-ISS), i.e. a combination of new on-board systems and ground-based tools which provides the pilot with integrated information on three risks playing a major role in aircraft accidents: collision with another aircraft, collision with terrain, and adverse weather conditions. For the latter, Weather Information Management Systems (WIMSs) based on nowcasts of atmospheric hazards are developed. This paper describes the set-up of a test-bed for the NG-ISS incorporating two types of WIMS data, those related to aircraft in-flight icing and thunderstorm risks. The test-bed is based on convective-scale numerical simulations of a particular weather scenario with thunderstorms and icing in the area of the Innsbruck airport. Raw simulated fields as well as more elaborate diagnostics (synthetic reflectivity and satellite brightness temperature) feed both the flight simulator including the NG-ISS and the algorithms in charge of producing WIMS data. WIMS outputs based on the synthetic data are discussed, and it is indicated that the high-resolution simulated fields are beneficial for the NG-ISS test-bed purposes and its technical feasibility.

  4. A Hybrid Parachute Simulation Environment for the Orion Parachute Development Project

    NASA Technical Reports Server (NTRS)

    Moore, James W.

    2011-01-01

    A parachute simulation environment (PSE) has been developed that aims to take advantage of legacy parachute simulation codes and modern object-oriented programming techniques. This hybrid simulation environment provides the parachute analyst with a natural and intuitive way to construct simulation tasks while preserving the pedigree and authority of established parachute simulations. NASA currently employs four simulation tools for developing and analyzing air-drop tests performed by the CEV Parachute Assembly System (CPAS) Project. These tools were developed at different times, in different languages, and with different capabilities in mind. As a result, each tool has a distinct interface and set of inputs and outputs. However, regardless of the simulation code that is most appropriate for the type of test, engineers typically perform similar tasks for each drop test, such as prediction of loads, assessment of altitude, and sequencing of disreefs or cut-aways. An object-oriented approach to simulation configuration allows the analyst to choose models of real physical test articles (parachutes, vehicles, etc.) and sequence them to achieve the desired test conditions. Once configured, these objects are translated into traditional input lists and processed by the legacy simulation codes. This approach minimizes the number of simulation inputs that the engineer must track while configuring an input file. An object-oriented approach to simulation output allows a common set of post-processing functions to perform routine tasks such as plotting and timeline generation with minimal sensitivity to the simulation that generated the data. Flight test data may also be translated into the common output class to simplify test reconstruction and analysis.
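
    A minimal sketch of the configuration-object idea, assuming hypothetical class and keyword names rather than the actual CPAS tool interfaces: configured objects are flattened into the card-style input list a legacy code would consume.

```python
# Sketch: configuration objects translated into a flat legacy input list.
# Class names and keywords are hypothetical, not the actual CPAS tool interfaces.
from dataclasses import dataclass

@dataclass
class Parachute:
    name: str
    drag_area_ft2: float
    reef_ratio: float = 1.0          # 1.0 = fully open

@dataclass
class Vehicle:
    name: str
    mass_lbm: float

@dataclass
class DeployEvent:
    time_s: float
    parachute: Parachute

def to_legacy_input(vehicle, events):
    """Flatten configured objects into card-style lines for a legacy simulation."""
    cards = [f"VEHICLE {vehicle.name} MASS={vehicle.mass_lbm}"]
    for ev in sorted(events, key=lambda e: e.time_s):
        p = ev.parachute
        cards.append(f"DEPLOY T={ev.time_s} CHUTE={p.name} CDS={p.drag_area_ft2} REEF={p.reef_ratio}")
    return cards

capsule = Vehicle("test_article", 20000.0)
events = [DeployEvent(0.0, Parachute("drogue", 500.0, reef_ratio=0.6)),
          DeployEvent(20.0, Parachute("main", 5000.0))]
print("\n".join(to_legacy_input(capsule, events)))
```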

  5. Beam dynamics simulations and measurements at the Project X Test Facility

    SciTech Connect

    Gianfelice-Wendt, E.; Scarpine, V.E.; Webber, R.C.; /Fermilab

    2011-03-01

    Project X, under study at Fermilab, is a multitask high-power superconducting RF proton beam facility, aiming to provide high-intensity protons for rare-process experiments and nuclear physics at low energy, and simultaneously for the production of neutrinos, as well as muon beams in the long term. A beam test facility - formerly known as the High Intensity Neutrino Source (HINS) - is under commissioning for testing critical components of the project, e.g. dynamics and diagnostics at low beam energies, broadband beam chopping, and RF power generation and distribution. In this paper we describe the layout of the test facility and present beam dynamics simulations and measurements.

  6. Final Project Report: Data Locality Enhancement of Dynamic Simulations for Exascale Computing

    SciTech Connect

    Shen, Xipeng

    2016-04-27

    The goal of this project is to develop a set of techniques and software tools to enhance the matching between memory accesses in dynamic simulations and the prominent features of modern and future manycore systems, alleviating the memory performance issues for exascale computing. In the first three years, the PI and his group achieved significant progress towards this goal, producing a set of novel techniques for improving the memory performance and data locality in manycore systems, yielding 18 conference and workshop papers and 4 journal papers, and graduating 6 Ph.D. students. This report summarizes the research results of this project through that period.

  7. Cosmic rays Monte Carlo simulations for the Extreme Energy Events Project

    NASA Astrophysics Data System (ADS)

    Abbrescia, M.; Agocs, A.; Aiola, S.; Antolini, R.; Avanzini, C.; Baldini Ferroli, R.; Bencivenni, G.; Bossini, E.; Bressan, E.; Chiavassa, A.; Cicalò, C.; Cifarelli, L.; Coccia, E.; De Gruttola, D.; De Pasquale, S.; Di Giovanni, A.; D'Incecco, M.; Dreucci, M.; Fabbri, F. L.; Frolov, V.; Garbini, M.; Gemme, G.; Gnesi, I.; Gustavino, C.; Hatzifotiadou, D.; La Rocca, P.; Li, S.; Librizzi, F.; Maggiora, A.; Massai, M.; Miozzi, S.; Panareo, M.; Paoletti, R.; Perasso, L.; Pilo, F.; Piragino, G.; Regano, A.; Riggi, F.; Righini, G. C.; Sartorelli, G.; Scapparone, E.; Scribano, A.; Selvi, M.; Serci, S.; Siddi, E.; Spandre, G.; Squarcia, S.; Taiuti, M.; Tosello, F.; Votano, L.; Williams, M. C. S.; Yánez, G.; Zichichi, A.; Zuyeuski, R.

    2014-08-01

    The Extreme Energy Events Project (EEE Project) is an innovative experiment to study very high energy cosmic rays by means of the detection of the associated air shower muon component. It consists of a network of tracking detectors installed inside Italian High Schools. Each tracking detector, called EEE telescope, is composed of three Multigap Resistive Plate Chambers (MRPCs). At present, 43 telescopes are installed and taking data, opening the way for the detection of far away coincidences over a total area of about 3 × 10⁵ km². In this paper we present the Monte Carlo simulations that have been performed to predict the expected coincidence rate between distant EEE telescopes.

  8. EDITORIAL: Plasma Surface Interactions for Fusion

    NASA Astrophysics Data System (ADS)

    2006-05-01

    Because plasma-boundary physics encompasses some of the most important unresolved issues for both the International Thermonuclear Experimental Reactor (ITER) project and future fusion power reactors, there is a strong interest in the fusion community for better understanding and characterization of plasma wall interactions. Chemical and physical sputtering cause the erosion of the limiters/divertor plates and vacuum vessel walls (made of C, Be and W, for example) and degrade fusion performance by diluting the fusion fuel and excessively cooling the core, while carbon redeposition could produce long-term in-vessel tritium retention, degrading the superior thermo-mechanical properties of the carbon materials. Mixed plasma-facing materials are proposed, requiring optimization for different power and particle flux characteristics. Knowledge of material properties as well as characteristics of the plasma material interaction are prerequisites for such optimizations. Computational power will soon reach hundreds of teraflops, so that theoretical and plasma science expertise can be matched with new experimental capabilities in order to mount a strong response to these challenges. To begin to address such questions, a Workshop on New Directions for Advanced Computer Simulations and Experiments in Fusion-Related Plasma Surface Interactions for Fusion (PSIF) was held at the Oak Ridge National Laboratory from 21 to 23 March, 2005. The purpose of the workshop was to bring together researchers in fusion related plasma wall interactions in order to address these topics and to identify the most needed and promising directions for study, to exchange opinions on the present depth of knowledge of surface properties for the main fusion-related materials, e.g., C, Be and W, especially for sputtering, reflection, and deuterium (tritium) retention properties. The goal was to suggest the most important next steps needed for such basic computational and experimental work to be facilitated

  9. Integrated Vehicle Health Management Project-Modeling and Simulation for Wireless Sensor Applications

    NASA Technical Reports Server (NTRS)

    Wallett, Thomas M.; Mueller, Carl H.; Griner, James H., Jr.

    2009-01-01

    This paper describes the efforts in modeling and simulating electromagnetic transmission and reception as in a wireless sensor network through a realistic wing model for the Integrated Vehicle Health Management project at the Glenn Research Center. A computer model in a standard format for an S-3 Viking aircraft was obtained, converted to a Microwave Studio software format, and scaled to proper dimensions in Microwave Studio. The left wing portion of the model was used with two antenna models, one transmitting and one receiving, to simulate radio frequency transmission through the wing. Transmission and reception results were inconclusive.

  10. Hardware Accelerated Simulated Radiography

    SciTech Connect

    Laney, D; Callahan, S; Max, N; Silva, C; Langer, S; Frank, R

    2005-04-12

    We present the application of hardware accelerated volume rendering algorithms to the simulation of radiographs as an aid to scientists designing experiments, validating simulation codes, and understanding experimental data. The techniques presented take advantage of 32 bit floating point texture capabilities to obtain validated solutions to the radiative transport equation for X-rays. An unsorted hexahedron projection algorithm is presented for curvilinear hexahedra that produces simulated radiographs in the absorption-only regime. A sorted tetrahedral projection algorithm is presented that simulates radiographs of emissive materials. We apply the tetrahedral projection algorithm to the simulation of experimental diagnostics for inertial confinement fusion experiments on a laser at the University of Rochester. We show that the hardware accelerated solution is faster than the current technique used by scientists.
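
    In the absorption-only regime the quantity being computed is the Beer-Lambert line integral, I = I0 exp(-∫ mu ds); a minimal CPU sketch on a synthetic voxel phantom is given below. The paper's GPU-accelerated hexahedral and tetrahedral projection algorithms are not reproduced here, and the phantom and step length are assumptions.

```python
# Sketch: absorption-only radiograph of a voxel phantom via Beer-Lambert attenuation.
import numpy as np

n = 64
mu = np.zeros((n, n, n))               # attenuation coefficients (1/cm, assumed)
mu[20:44, 20:44, 20:44] = 0.05         # a denser cube embedded in vacuum
ds = 0.1                               # path length per voxel along the ray (cm)

def radiograph(mu, ds):
    """Parallel rays along z: one detector pixel per (x, y) voxel column."""
    optical_depth = mu.sum(axis=2) * ds      # line integral of mu along the ray
    return np.exp(-optical_depth)            # transmitted intensity, I0 = 1

image = radiograph(mu, ds)
print(image.shape, float(image.min()), float(image.max()))
```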

  11. Effects of non-local electron transport in one-dimensional and two-dimensional simulations of shock-ignited inertial confinement fusion targets

    SciTech Connect

    Marocchino, A.; Atzeni, S.; Schiavi, A.

    2014-01-15

    In some regions of a laser driven inertial fusion target, the electron mean-free path can become comparable to or even longer than the electron temperature gradient scale-length. This can be particularly important in shock-ignited (SI) targets, where the laser-spike heated corona reaches temperatures of several keV. In this case, thermal conduction cannot be described by a simple local conductivity model and a Fick's law. Fluid codes usually employ flux-limited conduction models, which preserve causality but lose important features of the thermal flow. A more accurate thermal flow modeling requires convolution-like non-local operators. In order to improve the simulation of SI targets, the non-local electron transport operator proposed by Schurtz-Nicolaï-Busquet [G. P. Schurtz et al., Phys. Plasmas 7, 4238 (2000)] has been implemented in the DUED fluid code. Both one-dimensional (1D) and two-dimensional (2D) simulations of SI targets have been performed. 1D simulations of the ablation phase highlight that, while the shock profile and timing might be mocked up with a flux limiter, the electron temperature profiles exhibit a relatively different behavior, with no major effects on the final gain. The spike, instead, can only roughly be reproduced with a fixed flux-limiter value. 1D target gain is however unaffected, provided some minor tuning of the laser pulses. 2D simulations show that the use of a non-local thermal conduction model does not affect the robustness to mispositioning of targets driven by quasi-uniform laser irradiation. 2D simulations performed with only two final polar intense spikes yield encouraging results and support further studies.
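
    For orientation, the sketch below illustrates only the simple local flux-limited conduction that the paper contrasts with the non-local treatment: a harmonic-mean limiter between the Spitzer-Härm flux and a fraction f of the free-streaming flux, in normalized units. The Schurtz-Nicolaï-Busquet operator actually implemented in DUED is considerably more involved, and all constants and profiles here are assumed for illustration.

```python
# Sketch: local flux-limited conduction in normalized units (harmonic-mean limiter).
import numpy as np

x = np.linspace(0.0, 1.0, 200)
T = 1.0 + 4.0 * np.exp(-((x - 0.8) / 0.1) ** 2)    # hot laser-spike region (normalized)
n = np.ones_like(x)                                # normalized electron density
kappa0, f, vth0 = 1.0e-3, 0.06, 1.0                # assumed normalized constants

dTdx = np.gradient(T, x)
q_sh = -kappa0 * T ** 2.5 * dTdx                   # Spitzer-Harm local heat flux
q_fs = n * T * vth0 * np.sqrt(T)                   # free-streaming flux scale
q_lim = q_sh / (1.0 + np.abs(q_sh) / (f * q_fs))   # harmonic-mean flux limiter

print(float(np.abs(q_sh).max()), float(np.abs(q_lim).max()))
```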

  12. Overview of High-Fidelity Modeling Activities in the Numerical Propulsion System Simulations (NPSS) Project

    NASA Technical Reports Server (NTRS)

    Veres, Joseph P.

    2002-01-01

    A high-fidelity simulation of a commercial turbofan engine has been created as part of the Numerical Propulsion System Simulation Project. The high-fidelity computer simulation utilizes computer models that were developed at NASA Glenn Research Center in cooperation with turbofan engine manufacturers. The average-passage (APNASA) Navier-Stokes based viscous flow computer code is used to simulate the 3D flow in the compressors and turbines of the advanced commercial turbofan engine. The 3D National Combustion Code (NCC) is used to simulate the flow and chemistry in the advanced aircraft combustor. The APNASA turbomachinery code and the NCC combustor code exchange boundary conditions at the interface planes at the combustor inlet and exit. This computer simulation technique can evaluate engine performance at steady operating conditions. The 3D flow models provide detailed knowledge of the airflow within the fan and compressor, the high and low pressure turbines, and the flow and chemistry within the combustor. The models simulate the performance of the engine at operating conditions that include sea level takeoff and the altitude cruise condition.

  13. Modelling and Simulation of National Electronic Product Code Network Demonstrator Project

    NASA Astrophysics Data System (ADS)

    Mo, John P. T.

    The National Electronic Product Code (EPC) Network Demonstrator Project (NDP) was the first large-scale consumer goods track and trace investigation in the world using the full EPC protocol system for applying RFID technology in supply chains. The NDP demonstrated the methods of sharing information securely using the EPC Network, providing authentication to interacting parties, and enhancing the ability to track and trace the movement of goods within the entire supply chain involving transactions among multiple enterprises. Due to project constraints, the actual run of the NDP lasted only 3 months, which was not long enough to consolidate quantitative results. This paper discusses the modelling and simulation of activities in the NDP in a discrete event simulation environment and provides an estimation of the potential benefits that could be derived from the NDP if it were continued for one whole year.

  14. Early Career. Harnessing nanotechnology for fusion plasma-material interface research in an in-situ particle-surface interaction facility

    SciTech Connect

    Allain, Jean Paul

    2014-08-08

    This project consisted of fundamental and applied research on advanced in-situ particle-beam interactions with surfaces/interfaces to discover novel materials able to tolerate intense conditions at the plasma-material interface (PMI) in future fusion burning plasma devices. The project established a novel facility that is capable not only of characterizing new fusion nanomaterials but, more importantly, of probing and manipulating materials at the nanoscale while performing subsequent single-effect in-situ testing of their performance under simulated fusion PMI environments.

  15. Simulation of diffraction on a layer using the method of projections

    NASA Astrophysics Data System (ADS)

    Knyazkov, Dmitri

    2017-07-01

    A problem of simulating the incidence of a plane wave on a layer is considered. A C++ computer program was written that uses the method of projections to solve the scattering problem. The adjusted layer width was used to calculate a layer irradiation defect. The method was tested for normal and oblique (skew) incidence on a homogeneous layer, cases which can be solved analytically. An example application to the field of sea surface radiometry is shown.
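
    The analytic reference for normal incidence on a homogeneous layer is the standard single-film (Airy) reflectance formula; a minimal sketch with illustrative refractive indices follows. This is the textbook benchmark solution, not the paper's projection method.

```python
# Sketch: normal-incidence reflectance of a homogeneous layer (single-film formula).
import numpy as np

def layer_reflectance(n1, n2, n3, thickness, wavelength):
    """Reflectance of a layer with index n2 between semi-infinite media n1 and n3."""
    r12 = (n1 - n2) / (n1 + n2)
    r23 = (n2 - n3) / (n2 + n3)
    beta = 2.0 * np.pi * n2 * thickness / wavelength          # one-way phase thickness
    phase = np.exp(2j * beta)
    r = (r12 + r23 * phase) / (1.0 + r12 * r23 * phase)
    return np.abs(r) ** 2

# Example: a quarter-wave layer between air (n=1.0) and water (n=1.33)
print(layer_reflectance(1.0, 1.5, 1.33, thickness=0.25 / 1.5, wavelength=1.0))
```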

  16. Haughton-Mars Project (HMP)/NASA 2006 Lunar Medical Contingency Simulation: An Overview

    NASA Technical Reports Server (NTRS)

    Scheuring, R. A.; Jones, J. A.; Lee, P.; Comtois, J. M.; Chappell, S.; Rafiq, A.; Braham, S.; Hodgson, E.; Sullivan, P.; Wilkinson, N.

    2006-01-01

    Medical requirements are currently being developed for NASA's space exploration program. Lunar surface operations for crews returning to the Moon will be performed on a daily basis to conduct scientific research and construct a lunar habitat. Inherent to aggressive surface activities is the potential risk of injury to crew members. To develop an evidence base for handling medical contingencies on the lunar surface, a simulation project was conducted using the moon-Mars analog environment at Devon Island, Nunavut, in the high Canadian Arctic. A review of the Apollo lunar surface activities and personal communications with Apollo lunar crew members provided a knowledge base of plausible scenarios that could potentially injure an astronaut during a lunar extravehicular activity. Objectives were established to 1) demonstrate stabilization, field extraction, and transfer of an injured crew member to the habitat and 2) evaluate audio, visual, and biomedical communication capabilities with ground controllers at multiple mission control centers. The simulation project's objectives were achieved. Among these objectives were 1) extracting a crew member from a sloped terrain by a two-member team in a 1-g analog environment, 2) establishing real-time communication to multiple space centers, 3) providing biomedical data to flight controllers and crew members, and 4) establishing a medical diagnosis and treatment plan from a remote site. The simulation project provided evidence for the types of equipment and methods needed for planetary space exploration. During the project, the crew members were confronted with a number of unexpected scenarios including environmental, communications, EVA suit, and navigation challenges. These trials provided insight into the challenges of carrying out a medical contingency in an austere environment. The knowledge gained from completing the objectives of this project will be incorporated into the exploration medical requirements involving an incapacitated crew member.

  17. Simulated bat populations erode when exposed to climate change projections for western North America.

    PubMed

    Hayes, Mark A; Adams, Rick A

    2017-01-01

    Recent research has demonstrated that temperature and precipitation conditions correlate with successful reproduction in some insectivorous bat species that live in arid and semiarid regions, and that hot and dry conditions correlate with reduced lactation and reproductive output by females of some species. However, the potential long-term impacts of climate-induced reproductive declines on bat populations in western North America are not well understood. We combined results from long-term field monitoring and experiments in our study area with information on vital rates to develop stochastic age-structured population dynamics models and analyzed how simulated fringed myotis (Myotis thysanodes) populations changed under projected future climate conditions in our study area near Boulder, Colorado (Boulder Models) and throughout western North America (General Models). Each simulation consisted of an initial population of 2,000 females and an approximately stable age distribution at the beginning of the simulation. We allowed each population to be influenced by the mean annual temperature and annual precipitation for our study area and a generalized range-wide model projected through year 2086, for each of four carbon emission scenarios (representative concentration pathways RCP2.6, RCP4.5, RCP6.0, RCP8.5). Each population simulation was repeated 10,000 times. Of the 8 Boulder Model simulations, 1 increased (+29.10%), 3 stayed approximately stable (+2.45%, +0.05%, -0.03%), and 4 simulations decreased substantially (-44.10%, -44.70%, -44.95%, -78.85%). All General Model simulations for western North America decreased by >90% (-93.75%, -96.70%, -96.70%, -98.75%). These results suggest that a changing climate in western North America has the potential to quickly erode some forest bat populations including species of conservation concern, such as fringed myotis.
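
    A minimal sketch of a stochastic age-structured projection of this general kind is shown below: binomial survival, Poisson recruitment, and a crude penalty on reproduction in hot/dry years. The vital rates, the climate penalty and the drying trend are illustrative placeholders, not the parameterization used in the study.

```python
# Sketch: stochastic age-structured projection with a crude climate penalty on
# reproduction; all vital rates and the drying trend are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(7)
n_ages, years, n_reps = 15, 70, 1000
survival = np.full(n_ages, 0.80)           # assumed annual adult survival
survival[0] = 0.55                         # assumed first-year survival
base_repro = 0.45                          # female pups per adult female per year

final_sizes = []
for _ in range(n_reps):
    ages = np.full(n_ages, 2000 // n_ages)             # ~2000 females, roughly even ages
    for t in range(years):
        hot_dry = rng.random() < 0.4 + 0.004 * t       # assumed drying climate trend
        repro = base_repro * (0.6 if hot_dry else 1.0)
        recruits = rng.poisson(repro * ages[1:].sum())
        survivors = rng.binomial(ages, survival)       # element-wise binomial survival
        ages = np.concatenate(([recruits], survivors[:-1]))
        ages[-1] += survivors[-1]                      # terminal plus-group
    final_sizes.append(ages.sum())

print(np.mean(final_sizes), np.percentile(final_sizes, [5, 95]))
```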

  18. Simulated bat populations erode when exposed to climate change projections for western North America

    PubMed Central

    Adams, Rick A.

    2017-01-01

    Recent research has demonstrated that temperature and precipitation conditions correlate with successful reproduction in some insectivorous bat species that live in arid and semiarid regions, and that hot and dry conditions correlate with reduced lactation and reproductive output by females of some species. However, the potential long-term impacts of climate-induced reproductive declines on bat populations in western North America are not well understood. We combined results from long-term field monitoring and experiments in our study area with information on vital rates to develop stochastic age-structured population dynamics models and analyzed how simulated fringed myotis (Myotis thysanodes) populations changed under projected future climate conditions in our study area near Boulder, Colorado (Boulder Models) and throughout western North America (General Models). Each simulation consisted of an initial population of 2,000 females and an approximately stable age distribution at the beginning of the simulation. We allowed each population to be influenced by the mean annual temperature and annual precipitation for our study area and a generalized range-wide model projected through year 2086, for each of four carbon emission scenarios (representative concentration pathways RCP2.6, RCP4.5, RCP6.0, RCP8.5). Each population simulation was repeated 10,000 times. Of the 8 Boulder Model simulations, 1 increased (+29.10%), 3 stayed approximately stable (+2.45%, +0.05%, -0.03%), and 4 simulations decreased substantially (-44.10%, -44.70%, -44.95%, -78.85%). All General Model simulations for western North America decreased by >90% (-93.75%, -96.70%, -96.70%, -98.75%). These results suggest that a changing climate in western North America has the potential to quickly erode some forest bat populations including species of conservation concern, such as fringed myotis. PMID:28686737

  19. Hanford Waste Simulants Created to Support the Research and Development on the River Protection Project - Waste Treatment Plant

    SciTech Connect

    Eibling, R.E.

    2001-07-26

    The development of nonradioactive waste simulants to support River Protection Project - Waste Treatment Plant bench- and pilot-scale testing is crucial to the design of the facility. This report documents the development of the simulants to support the SRTC programs and the strategies used to produce them.

  20. The diffusive finite state projection algorithm for efficient simulation of the stochastic reaction-diffusion master equation

    PubMed Central

    Drawert, Brian; Lawson, Michael J.; Petzold, Linda; Khammash, Mustafa

    2010-01-01

    We have developed a computational framework for accurate and efficient simulation of stochastic spatially inhomogeneous biochemical systems. The new computational method employs a fractional step hybrid strategy. A novel formulation of the finite state projection (FSP) method, called the diffusive FSP method, is introduced for the efficient and accurate simulation of diffusive transport. Reactions are handled by the stochastic simulation algorithm. PMID:20170209
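
    The reaction step of such hybrid schemes is the Gillespie stochastic simulation algorithm; a minimal sketch for a birth-death system follows. The diffusive finite state projection treatment of transport is not reproduced here, and the rate constants are illustrative.

```python
# Sketch: Gillespie SSA for a birth-death process (the reaction step of the hybrid scheme).
import numpy as np

rng = np.random.default_rng(3)

def ssa_birth_death(k_birth=10.0, k_death=0.1, x0=0, t_end=100.0):
    t, x = 0.0, x0
    times, states = [t], [x]
    while t < t_end:
        a = np.array([k_birth, k_death * x])   # propensities: birth, death
        a0 = a.sum()
        if a0 == 0.0:
            break
        t += rng.exponential(1.0 / a0)         # exponentially distributed waiting time
        x += 1 if rng.random() < a[0] / a0 else -1
        times.append(t)
        states.append(x)
    return np.array(times), np.array(states)

times, states = ssa_birth_death()
print(states[-1], "vs steady-state mean", 10.0 / 0.1)
```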

  1. Towards the petaflop for Lattice QCD simulations the PetaQCD project

    NASA Astrophysics Data System (ADS)

    Anglès d'Auriac, Jean-Christian; Barthou, Denis; Becirevic, Damir; Bilhaut, René; Bodin, François; Boucaud, Philippe; Brand-Foissac, Olivier; Carbonell, Jaume; Eisenbeis, Christine; Gallard, Pascal; Grosdidier, Gilbert; Guichon, Pierre; Honoré, Pierre-François; Le Meur, Guy; Pène, Olivier; Rilling, Louis; Roudeau, Patrick; Seznec, André; Stocchi, Achille; Touze, François

    2010-04-01

    The study and design of a very ambitious petaflop cluster exclusively dedicated to Lattice QCD simulations started in early '08 among a consortium of 7 laboratories (IN2P3, CNRS, INRIA, CEA) and 2 SMEs. This consortium received a grant from the French ANR agency in July '08, and the PetaQCD project kickoff took place in January '09. Building upon several years of fruitful collaborative studies in this area, the aim of this project is to demonstrate that the simulation of a 256 x 128³ lattice can be achieved through the HMC/ETMC software, using a machine with efficient speed/cost/reliability/power consumption ratios. It is expected that this machine can be built out of a rather limited number of processors (e.g. between 1000 and 4000), although capable of sustained petaflop CPU performance. The proof-of-concept should be a mock-up cluster built as much as possible with off-the-shelf components, and two particularly attractive axes will be investigated, in addition to fast all-purpose multi-core processors: the use of the new brand of IBM Cell processors (with on-chip accelerators) and the very recent Nvidia GP-GPUs (off-chip co-processors). This cluster will obviously be massively parallel, and heterogeneous. Communication issues between processors, implied by the physics of the simulation and the lattice partitioning, will certainly be a major key to the project.

  2. High-resolution global climate modelling: the UPSCALE project, a large-simulation campaign

    NASA Astrophysics Data System (ADS)

    Mizielinski, M. S.; Roberts, M. J.; Vidale, P. L.; Schiemann, R.; Demory, M.-E.; Strachan, J.; Edwards, T.; Stephens, A.; Lawrence, B. N.; Pritchard, M.; Chiu, P.; Iwi, A.; Churchill, J.; del Cano Novales, C.; Kettleborough, J.; Roseblade, W.; Selwood, P.; Foster, M.; Glover, M.; Malcolm, A.

    2014-08-01

    The UPSCALE (UK on PRACE: weather-resolving Simulations of Climate for globAL Environmental risk) project constructed and ran an ensemble of HadGEM3 (Hadley Centre Global Environment Model 3) atmosphere-only global climate simulations over the period 1985-2011, at resolutions of N512 (25 km), N216 (60 km) and N96 (130 km) as used in current global weather forecasting, seasonal prediction and climate modelling respectively. Alongside these present climate simulations a parallel ensemble looking at extremes of future climate was run, using a time-slice methodology to consider conditions at the end of this century. These simulations were primarily performed using a 144 million core hour, single year grant of computing time from PRACE (the Partnership for Advanced Computing in Europe) in 2012, with additional resources supplied by the Natural Environment Research Council (NERC) and the Met Office. Almost 400 terabytes of simulation data were generated on the HERMIT supercomputer at the High Performance Computing Center Stuttgart (HLRS), and transferred to the JASMIN super-data cluster provided by the Science and Technology Facilities Council Centre for Data Archival (STFC CEDA) for analysis and storage. In this paper we describe the implementation of the project, present the technical challenges in terms of optimisation, data output, transfer and storage that such a project involves and include details of the model configuration and the composition of the UPSCALE data set. This data set is available for scientific analysis to allow assessment of the value of model resolution in both present and potential future climate conditions.

  3. Progress in Mirror-Based Fusion Neutron Source Development.

    PubMed

    Anikeev, A V; Bagryansky, P A; Beklemishev, A D; Ivanov, A A; Kolesnikov, E Yu; Korzhavina, M S; Korobeinikova, O A; Lizunov, A A; Maximov, V V; Murakhtin, S V; Pinzhenin, E I; Prikhodko, V V; Soldatkina, E I; Solomakhin, A L; Tsidulko, Yu A; Yakovlev, D V; Yurov, D V

    2015-12-04

    The Budker Institute of Nuclear Physics, in a worldwide collaboration, has developed a project of a 14 MeV neutron source for fusion material studies and other applications. The projected neutron source of the plasma type is based on the gas dynamic trap (GDT), which is a special magnetic mirror system for plasma confinement. Essential progress in plasma parameters has been achieved in recent experiments at the GDT facility in the Budker Institute, which is a hydrogen (deuterium) prototype of the source. Stable confinement of hot-ion plasmas with relative pressure exceeding 0.5 was demonstrated. The electron temperature was increased up to 0.9 keV in the regime with additional electron cyclotron resonance heating (ECRH) of moderate power. These parameters are record values for axisymmetric open mirror traps. These achievements bring the GDT-based neutron source projects to a higher level of competitiveness and make it possible to construct a source with parameters suitable for materials testing today. The paper presents the progress in experimental studies and numerical simulations of the mirror-based fusion neutron source and its possible applications, including a fusion material test facility and a fusion-fission hybrid system.

  4. Progress in Mirror-Based Fusion Neutron Source Development

    PubMed Central

    Anikeev, A. V.; Bagryansky, P. A.; Beklemishev, A. D.; Ivanov, A. A.; Kolesnikov, E. Yu.; Korzhavina, M. S.; Korobeinikova, O. A.; Lizunov, A. A.; Maximov, V. V.; Murakhtin, S. V.; Pinzhenin, E. I.; Prikhodko, V. V.; Soldatkina, E. I.; Solomakhin, A. L.; Tsidulko, Yu. A.; Yakovlev, D. V.; Yurov, D. V.

    2015-01-01

    The Budker Institute of Nuclear Physics, in a worldwide collaboration, has developed a project of a 14 MeV neutron source for fusion material studies and other applications. The projected neutron source of the plasma type is based on the gas dynamic trap (GDT), which is a special magnetic mirror system for plasma confinement. Essential progress in plasma parameters has been achieved in recent experiments at the GDT facility in the Budker Institute, which is a hydrogen (deuterium) prototype of the source. Stable confinement of hot-ion plasmas with relative pressure exceeding 0.5 was demonstrated. The electron temperature was increased up to 0.9 keV in the regime with additional electron cyclotron resonance heating (ECRH) of moderate power. These parameters are record values for axisymmetric open mirror traps. These achievements bring the GDT-based neutron source projects to a higher level of competitiveness and make it possible to construct a source with parameters suitable for materials testing today. The paper presents the progress in experimental studies and numerical simulations of the mirror-based fusion neutron source and its possible applications, including a fusion material test facility and a fusion-fission hybrid system. PMID:28793722

  5. Fusion Power.

    ERIC Educational Resources Information Center

    Dingee, David A.

    1979-01-01

    Discusses the extraordinary potential, the technical difficulties, and the financial problems that are associated with research and development of fusion power plants as a major source of energy. (GA)

  7. Fusion of cone-beam CT and 3D photographic images for soft tissue simulation in maxillofacial surgery

    NASA Astrophysics Data System (ADS)

    Chung, Soyoung; Kim, Joojin; Hong, Helen

    2016-03-01

    During maxillofacial surgery, prediction of the facial outcome after surgery is a main concern for both surgeons and patients. However, registration of facial CBCT images and 3D photographic images has some difficulties: regions around the eyes and mouth are affected by facial expressions, and registration speed is low due to the dense clouds of points on the surfaces. Therefore, we propose a framework for the fusion of facial CBCT images and 3D photographs with skin segmentation and two-stage surface registration. Our method is composed of three major steps. First, to obtain a CBCT skin surface for registration with the 3D photographic surface, skin is automatically segmented from the CBCT images and the skin surface is generated by surface modeling. Second, to roughly align the scale and orientation of the CBCT skin surface and the 3D photographic surface, point-based registration with four corresponding landmarks located around the mouth is performed. Finally, to merge the CBCT skin surface and the 3D photographic surface, Gaussian-weight-based surface registration is performed within a narrow band of the 3D photographic surface.
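
    The second step can be implemented with a closed-form similarity (scale, rotation, translation) fit to the four landmark pairs; whether the authors use exactly the Umeyama estimator sketched below is an assumption, and the landmark coordinates are synthetic.

```python
# Sketch: closed-form similarity transform (scale, rotation, translation) from
# four corresponding landmarks, using the Umeyama/Procrustes solution.
import numpy as np

def similarity_from_landmarks(src, dst):
    """Return c, R, t such that dst_i ~= c * R @ src_i + t."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    xs, xd = src - mu_s, dst - mu_d
    cov = xd.T @ xs / len(src)
    U, D, Vt = np.linalg.svd(cov)
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1.0                                        # guard against reflections
    R = U @ S @ Vt
    c = np.trace(np.diag(D) @ S) / xs.var(axis=0).sum()       # scale from centroid variance
    t = mu_d - c * R @ mu_s
    return c, R, t

# Synthetic mouth-region landmarks (mm): CBCT surface vs. a scaled and shifted 3D photo
src = np.array([[0.0, 0.0, 0.0], [40.0, 0.0, 0.0], [20.0, 25.0, 0.0], [20.0, 10.0, 15.0]])
dst = 1.08 * src + np.array([5.0, -3.0, 2.0])
c, R, t = similarity_from_landmarks(src, dst)
print(round(c, 4), np.round(t, 2))
```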

  8. Quantum Controlled Nuclear Fusion

    NASA Astrophysics Data System (ADS)

    Gruebele, Martin

    2017-06-01

    Laser-assisted nuclear fusion is a potential means for providing short, well-controlled particle bursts in the lab, such as neutron or alpha particle pulses. I will discuss computational results on how coherent control by shaped, amplified 800 nm laser pulses can be used to enhance the nuclear fusion cross section of diatomic molecules such as BH or DT. Quantum dynamics simulations show that a strong laser pulse can simultaneously field-bind the diatomic molecule after electron ejection and increase the amplitude of the vibrational wave function at small internuclear distances. When shaped VUV laser pulses become available, coherent laser control may also be extended to muonic molecules such as D-mu-T, held together by muons instead of electrons. Muonic fusion has been extensively investigated for many decades, but without coherent laser control it falls slightly short of the break-even point.

  9. Climate simulations and projections with a super-parameterized climate model

    DOE PAGES

    Stan, Cristiana; Xu, Li

    2014-07-01

    The mean climate and its variability are analyzed in a suite of numerical experiments with a fully coupled general circulation model in which subgrid-scale moist convection is explicitly represented through embedded 2D cloud-system resolving models. Control simulations forced by the present-day, fixed atmospheric carbon dioxide concentration are conducted using two horizontal resolutions and validated against observations and reanalyses. The mean state simulated by the higher resolution configuration has smaller biases. Climate variability also shows some sensitivity to resolution, but not as uniform as in the case of the mean state. The interannual and seasonal variability are better represented in the simulation at lower resolution, whereas the subseasonal variability is more accurate in the higher resolution simulation. The equilibrium climate sensitivity of the model is estimated from a simulation forced by an abrupt quadrupling of the atmospheric carbon dioxide concentration. The equilibrium climate sensitivity temperature of the model is 2.77 °C, and this value is slightly smaller than the mean value (3.37 °C) of contemporary models using conventional representations of cloud processes. As a result, the climate change simulation forced by the representative concentration pathway 8.5 scenario projects an increase in the frequency of severe droughts over most of North America.

  10. On the transverse-traceless projection in lattice simulations of gravitational wave production

    SciTech Connect

    Figueroa, Daniel G.; García-Bellido, Juan

    2011-11-01

    It has recently been pointed out that the usual procedure employed in order to obtain the transverse-traceless (TT) part of metric perturbations in lattice simulations was inconsistent with the fact that those fields live in the lattice and not in the continuum. It was claimed that this could lead to a larger amplitude and a wrong shape for the gravitational wave (GW) spectra obtained in numerical simulations of (p)reheating. In order to address this issue, we have defined a consistent prescription in the lattice for extracting the TT part of the metric perturbations. We demonstrate explicitly that the GW spectra obtained with the old continuum-based TT projection only differ marginally in amplitude and shape with respect to the new lattice-based ones. We conclude that one can therefore trust the predictions appearing in the literature on the spectra of GW produced during (p)reheating and similar scenarios simulated on a lattice.
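
    For reference, the continuum momentum-space transverse-traceless projector that such extractions are built on is the standard one below; the paper's contribution is a lattice-consistent prescription replacing it, which is not reproduced here.

$$
P_{ij}(\hat{\mathbf{k}}) = \delta_{ij} - \hat{k}_i \hat{k}_j, \qquad
\Lambda_{ij,kl}(\hat{\mathbf{k}}) = P_{ik}(\hat{\mathbf{k}})\,P_{jl}(\hat{\mathbf{k}}) - \tfrac{1}{2}\,P_{ij}(\hat{\mathbf{k}})\,P_{kl}(\hat{\mathbf{k}}), \qquad
h_{ij}^{\mathrm{TT}}(\mathbf{k}) = \Lambda_{ij,kl}(\hat{\mathbf{k}})\, h_{kl}(\mathbf{k}),
$$

    with $\hat{\mathbf{k}} = \mathbf{k}/|\mathbf{k}|$, so that $\hat{k}_i\, h_{ij}^{\mathrm{TT}} = 0$ and $h_{ii}^{\mathrm{TT}} = 0$.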

  11. Final report on LDRD project: Simulation/optimization tools for system variability analysis

    SciTech Connect

    R. L. Bierbaum; R. F. Billau; J. E. Campbell; K. D. Marx; R. J. Sikorski; B. M. Thompson; S. D. Wix

    1999-10-01

    This work was conducted during FY98 (Proposal Number 98-0036) and FY99 (Proposal Number 99-0818) under the auspices of the Sandia National Laboratories Laboratory-Directed Research and Development (LDRD) program. Electrical simulation typically treats a single data point in the very large input space of component properties. For electrical simulation to reach its full potential as a design tool, it must be able to address the unavoidable variability and uncertainty in component properties. Component variability is strongly related to the design margin (and reliability) of the end product. During the course of this project, both tools and methodologies were developed to enable analysis of variability in the context of electrical simulation tools. Two avenues to link relevant tools were also developed, and the resultant toolset was applied to a major component.
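
    A minimal sketch of this kind of variability analysis follows, using a hypothetical voltage-divider "simulation" with assumed tolerances and acceptance window rather than the actual component studied in the LDRD project.

```python
# Sketch: Monte Carlo variability analysis of a hypothetical voltage divider.
import numpy as np

rng = np.random.default_rng(11)
n = 100_000
r1 = rng.normal(10e3, 10e3 * 0.01 / 3, n)     # 10 kOhm, ~1% at 3 sigma (assumed)
r2 = rng.normal(20e3, 20e3 * 0.01 / 3, n)     # 20 kOhm, ~1% at 3 sigma (assumed)
vin = rng.normal(5.0, 0.05 / 3, n)            # 5 V supply, +/- 50 mV (assumed)

vout = vin * r2 / (r1 + r2)                   # the "electrical simulation"
lo, hi = 3.28, 3.38                           # assumed acceptance window (V)
yield_frac = np.mean((vout > lo) & (vout < hi))
print(f"mean={vout.mean():.4f} V  std={vout.std():.4f} V  yield={yield_frac:.4%}")
```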

  12. Microstructural Evolution and Mechanical Properties of Fusion Welds and Simulated Heat-Affected Zones in an Iron-Copper Based Multi-Component Steel

    NASA Astrophysics Data System (ADS)

    Farren, Jeffrey David

    NUCu-140 is a copper-precipitation strengthened steel that exhibits excellent mechanical properties with a relatively simple chemical composition and processing schedule. As a result, NUCu-140 is a candidate material for use in many naval and structural applications. Before NUCu-140 can be implemented as a replacement for currently utilized materials, a comprehensive welding strategy must be developed under a wide range of welding conditions. This research represents an initial step toward understanding the microstructural and mechanical property evolution that occurs during fusion welding of NUCu-140. The following dissertation is presented as a series of four chapters. Chapter one is a review of the relevant literature on the iron-copper system, including the precipitation of copper in steel, the development of the NUCu family of alloys, and the formation of acicular ferrite in steel weldments. Chapter two is a detailed study of the precipitate, microstructural, and mechanical property evolution of NUCu-140 fusion welds. Microhardness testing, tensile testing, local-electrode atom probe (LEAP) tomography, MatCalc kinetic simulations, and Russell-Brown strengthening results for gas-tungsten and gas-metal arc welds are presented. Chapter three is a thorough study of the microstructural and mechanical property evolution that occurs in the four critical regions of the HAZ. Simulated HAZ specimens were produced and evaluated using microhardness, tensile testing, and Charpy impact testing. MatCalc simulations and R-B strengthening calculations were also performed in an effort to model the experimentally observed mechanical property trends. Chapter four is a brief investigation into the capabilities of MatCalc and the R-B model to determine if the two techniques could be used as predictive tools for a series of binary iron-copper alloys without the aid of experimentally measured precipitate data. The mechanical property results show that local softening occurs in the heat-affected zone.

  13. Fusion of psychiatric and medical high fidelity patient simulation scenarios: effect on nursing student knowledge, retention of knowledge, and perception.

    PubMed

    Kameg, Kirstyn M; Englert, Nadine Cozzo; Howard, Valerie M; Perozzi, Katherine J

    2013-12-01

    High fidelity patient simulation (HFPS) has become an increasingly popular teaching methodology in nursing education. To date, there have not been any published studies investigating HFPS scenarios incorporating medical and psychiatric nursing content. This study utilized a quasi-experimental design to assess if HFPS improved student knowledge and retention of knowledge utilizing three parallel 30-item Elsevier HESI(TM) Custom Exams. A convenience sample of 37 senior level nursing students participated in the study. The results of the study revealed the mean HESI test scores decreased following the simulation intervention although an analysis of variance (ANOVA) determined the difference was not statistically significant (p = .297). Although this study did not reveal improved student knowledge following the HFPS experiences, the findings did provide preliminary evidence that HFPS may improve knowledge in students who are identified as "at-risk." Additionally, students responded favorably to the simulations and viewed them as a positive learning experience.

  14. Constraining a complex biogeochemical model for CO2 and N2O emission simulations from various land uses by model-data fusion

    NASA Astrophysics Data System (ADS)

    Houska, Tobias; Kraus, David; Kiese, Ralf; Breuer, Lutz

    2017-07-01

    This study presents the results of a combined measurement and modelling strategy to analyse N2O and CO2 emissions from adjacent arable land, forest and grassland sites in Hesse, Germany. The measured emissions reveal seasonal patterns and management effects, including fertilizer application, tillage, harvest and grazing. The measured annual N2O fluxes are 4.5, 0.4 and 0.1 kg N ha⁻¹ a⁻¹, and the CO2 fluxes are 20.0, 12.2 and 3.0 t C ha⁻¹ a⁻¹ for the arable land, grassland and forest sites, respectively. An innovative model-data fusion concept based on a multicriteria evaluation (soil moisture at different depths, yield, CO2 and N2O emissions) is used to rigorously test the LandscapeDNDC biogeochemical model. The model is run in a Latin-hypercube-based uncertainty analysis framework to constrain model parameter uncertainty and derive behavioural model runs. The results indicate that the model is generally capable of predicting trace gas emissions, as evaluated with RMSE as the objective function. The model shows a reasonable performance in simulating the ecosystem C and N balances. The model-data fusion concept helps to detect remaining model errors, such as missing (e.g. freeze-thaw cycling) or incomplete model processes (e.g. respiration rates after harvest). This concept further elucidates the identification of missing model input sources (e.g. the uptake of N through shallow groundwater on grassland during the vegetation period) and uncertainty in the measured validation data (e.g. forest N2O emissions in winter months). Guidance is provided to improve the model structure and field measurements to further advance landscape-scale model predictions.
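
    A minimal sketch of the Latin-hypercube / behavioural-run idea follows, using a toy exponential model standing in for LandscapeDNDC; the parameter ranges, synthetic observations and RMSE acceptance threshold are assumptions.

```python
# Sketch: Latin hypercube sampling plus behavioural-run selection by an RMSE threshold.
import numpy as np

rng = np.random.default_rng(5)

def latin_hypercube(n_samples, bounds):
    """One stratified sample per interval and dimension, columns independently permuted."""
    d = len(bounds)
    u = (rng.random((n_samples, d)) + np.arange(n_samples)[:, None]) / n_samples
    for j in range(d):
        u[:, j] = rng.permutation(u[:, j])
    lo, hi = np.array(bounds).T
    return lo + u * (hi - lo)

def toy_model(params, t):
    a, b = params
    return a * np.exp(-b * t)                  # stand-in for a LandscapeDNDC output

t = np.linspace(0.0, 10.0, 50)
obs = toy_model((3.0, 0.4), t) + rng.normal(0.0, 0.1, t.size)

samples = latin_hypercube(2000, bounds=[(1.0, 5.0), (0.05, 1.0)])
rmse = np.array([np.sqrt(np.mean((toy_model(p, t) - obs) ** 2)) for p in samples])
behavioural = samples[rmse < 0.15]             # assumed acceptance threshold
print(len(behavioural), "behavioural parameter sets out of", len(samples))
```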

  15. Ion-kinetic simulations of D-3He gas-filled inertial confinement fusion target implosions with moderate to large Knudsen number

    DOE PAGES

    Larroche, O.; Rinderknecht, H. G.; Rosenberg, M. J.; ...

    2016-01-06

    Experiments designed to investigate the transition to non-collisional behavior in D3He-gas inertial confinement fusion target implosions display increasingly large discrepancies with respect to simulations by standard hydrodynamics codes as the expected ion mean-free-paths λc increase with respect to the target radius R (i.e., when the Knudsen number NK = λc/R grows). To take properly into account large NK's, multi-ion-species Vlasov-Fokker-Planck computations of the inner gas in the capsules have been performed, for two different values of NK, one moderate and one large. The results, including nuclear yield, reactivity-weighted ion temperatures, nuclear emissivities, and surface brightness, have been compared with the experimental data and with the results of hydrodynamical simulations, some of which include an ad hoc modeling of kinetic effects. The experimental results are quite accurately rendered by the kinetic calculations in the smaller-NK case, much better than by the hydrodynamical calculations. The kinetic effects at play in this case are thus correctly understood. However, in the higher-NK case, the agreement is much worse. Furthermore, the remaining discrepancies are shown to arise from kinetic phenomena (e.g., inter-species diffusion) occurring at the gas-pusher interface, which should be investigated in future work.

  16. A comparative study on generating simulated Landsat NDVI images using data fusion and regression method-the case of the Korean Peninsula.

    PubMed

    Lee, Mi Hee; Lee, Soo Bong; Eo, Yang Dam; Kim, Sun Woong; Woo, Jung-Hun; Han, Soo Hee

    2017-07-01

    Landsat optical images have enough spatial and spectral resolution to analyze vegetation growth characteristics. However, clouds and water vapor often degrade image quality, which limits the availability of usable images for time-series vegetation vitality measurements. To overcome this shortcoming, simulated images are used as an alternative. In this study, the weighted average method, the spatial and temporal adaptive reflectance fusion model (STARFM) method, and the multilinear regression analysis method have been tested to produce simulated Landsat normalized difference vegetation index (NDVI) images of the Korean Peninsula. The test results showed that the weighted average method produced the images most similar to the actual images, provided that the input images were available within 1 month before and after the target date. The STARFM method gives good results when the input image date is close to the target date. Careful regional and seasonal consideration is required in selecting input images. During the summer season, due to clouds, it is very difficult to get images close enough to the target date. Multilinear regression analysis gives meaningful results even when the input image date is not so close to the target date. Average R² values for the weighted average method, STARFM, and multilinear regression analysis were 0.741, 0.70, and 0.61, respectively.
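
    A minimal sketch of the weighted average method follows, using inverse temporal distance as the weight for two acquisitions bracketing the target date; the array contents and day offsets are synthetic, and STARFM and the regression approach are not reproduced.

```python
# Sketch: inverse-temporal-distance weighted average of two bracketing NDVI rasters.
import numpy as np

def weighted_average_ndvi(ndvi_before, ndvi_after, days_before, days_after):
    """Weight each acquisition by the inverse of its distance in days to the target date."""
    w_before, w_after = 1.0 / days_before, 1.0 / days_after
    return (w_before * ndvi_before + w_after * ndvi_after) / (w_before + w_after)

rng = np.random.default_rng(2)
ndvi_june = np.clip(rng.normal(0.55, 0.10, (100, 100)), -1.0, 1.0)     # synthetic scene
ndvi_august = np.clip(rng.normal(0.45, 0.10, (100, 100)), -1.0, 1.0)   # synthetic scene
ndvi_mid_july = weighted_average_ndvi(ndvi_june, ndvi_august, days_before=20, days_after=25)
print(float(ndvi_mid_july.mean()))
```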

  17. Ion-kinetic simulations of D-3He gas-filled inertial confinement fusion target implosions with moderate to large Knudsen number

    NASA Astrophysics Data System (ADS)

    Larroche, O.; Rinderknecht, H. G.; Rosenberg, M. J.; Hoffman, N. M.; Atzeni, S.; Petrasso, R. D.; Amendt, P. A.; Séguin, F. H.

    2016-01-01

    Experiments designed to investigate the transition to non-collisional behavior in D3He-gas inertial confinement fusion target implosions display increasingly large discrepancies with respect to simulations by standard hydrodynamics codes as the expected ion mean-free-paths λc increase with respect to the target radius R (i.e., when the Knudsen number NK=λc/R grows). To take properly into account large NK's, multi-ion-species Vlasov-Fokker-Planck computations of the inner gas in the capsules have been performed, for two different values of NK, one moderate and one large. The results, including nuclear yield, reactivity-weighted ion temperatures, nuclear emissivities, and surface brightness, have been compared with the experimental data and with the results of hydrodynamical simulations, some of which include an ad hoc modeling of kinetic effects. The experimental results are quite accurately rendered by the kinetic calculations in the smaller-NK case, much better than by the hydrodynamical calculations. The kinetic effects at play in this case are thus correctly understood. However, in the higher-NK case, the agreement is much worse. The remaining discrepancies are shown to arise from kinetic phenomena (e.g., inter-species diffusion) occurring at the gas-pusher interface, which should be investigated in future work.

  18. Ion-kinetic simulations of D-3He gas-filled inertial confinement fusion target implosions with moderate to large Knudsen number

    SciTech Connect

    Larroche, O.; Rinderknecht, H. G.; Rosenberg, M. J.; Hoffman, N. M.; Atzeni, S.; Petrasso, R. D.; Amendt, P. A.; Seguin, F. H.

    2016-01-06

    Experiments designed to investigate the transition to non-collisional behavior in D3He-gas inertial confinement fusion target implosions display increasingly large discrepancies with respect to simulations by standard hydrodynamics codes as the expected ion mean-free-paths λc increase with respect to the target radius R (i.e., when the Knudsen number NK = λc/R grows). To take properly into account large NK's, multi-ion-species Vlasov-Fokker-Planck computations of the inner gas in the capsules have been performed, for two different values of NK, one moderate and one large. The results, including nuclear yield, reactivity-weighted ion temperatures, nuclear emissivities, and surface brightness, have been compared with the experimental data and with the results of hydrodynamical simulations, some of which include an ad hoc modeling of kinetic effects. The experimental results are quite accurately rendered by the kinetic calculations in the smaller-NK case, much better than by the hydrodynamical calculations. The kinetic effects at play in this case are thus correctly understood. However, in the higher-NK case, the agreement is much worse. Furthermore, the remaining discrepancies are shown to arise from kinetic phenomena (e.g., inter-species diffusion) occurring at the gas-pusher interface, which should be investigated in future work.

  19. Education and Public Outreach at The Pavilion Lake Research Project: Fusion of Science and Education using Web 2.0

    NASA Astrophysics Data System (ADS)

    Cowie, B. R.; Lim, D. S.; Pendery, R.; Laval, B.; Slater, G. F.; Brady, A. L.; Dearing, W. L.; Downs, M.; Forrest, A.; Lees, D. S.; Lind, R. A.; Marinova, M.; Reid, D.; Seibert, M. A.; Shepard, R.; Williams, D.

    2009-12-01

    The Pavilion Lake Research Project (PLRP) is an international multi-disciplinary science and exploration effort to explain the origin and preservation potential of freshwater microbialites in Pavilion Lake, British Columbia, Canada. Using multiple exploration platforms including one-person DeepWorker submersibles, Autonomous Underwater Vehicles, and SCUBA divers, the PLRP acts as an analogue research site for conducting science in extreme environments, such as the Moon or Mars. In 2009, the PLRP integrated several Web 2.0 technologies to provide a pilot-scale Education and Public Outreach (EPO) program targeting the internet-savvy generation. The seamless integration of multiple technologies, including Google Earth, Wordpress, Youtube, Twitter and Facebook, facilitated the rapid distribution of exciting and accessible science and exploration information over multiple channels. Field updates, science reports, and multimedia including videos, interactive maps, and immersive visualization were rapidly made available through multiple social media channels, partly due to the ease of integration of these technologies. Additionally, the successful application of videoconferencing via a readily available technology (Skype) has greatly increased the capacity of our team to conduct real-time education and public outreach from remote locations. The improved communication afforded by Web 2.0 has increased the quality of EPO provided by the PLRP and has enabled a higher level of interaction between the science team and the community at large. Feedback from these online interactions suggests that remote communication via Web 2.0 technologies was an effective tool for increasing public discourse and awareness of the science and exploration activity at Pavilion Lake.

  20. A National Collaboratory to Advance the Science of High Temperature Plasma Physics for Magnetic Fusion

    SciTech Connect

    Schissel, David P.; Abla, G.; Burruss, J. R.; Feibush, E.; Fredian, T. W.; Goode, M. M.; Greenwald, M. J.; Keahey, K.; Leggett, T.; Li, K.; McCune, D. C.; Papka, M. E.; Randerson, L.; Sanderson, A.; Stillerman, J.; Thompson, M. R.; Uram, T.; Wallace, G.

    2012-12-20

    This report summarizes the work of the National Fusion Collaboratory (NFC) Project to develop a persistent infrastructure to enable scientific collaboration for magnetic fusion research. The original objective of the NFC project was to develop and deploy a national FES Grid (FusionGrid) that would be a system for secure sharing of computation, visualization, and data resources over the Internet. The goal of FusionGrid was to allow scientists at remote sites to participate as fully in experiments and computational activities as if they were working on site, thereby creating a unified virtual organization of the geographically dispersed U.S. fusion community. The vision for FusionGrid was that experimental and simulation data, computer codes, analysis routines, visualization tools, and remote collaboration tools were to be thought of as network services. In this model, an application service provider (ASP) provides and maintains software resources as well as the necessary hardware resources. The project would create a robust, user-friendly collaborative software environment and make it available to the US FES community. This Grid's resources would be protected by a shared security infrastructure including strong authentication to identify users and authorization to allow stakeholders to control their own resources. In this environment, access to services is stressed rather than data or software portability.

  1. High Level Information Fusion (HLIF) with nested fusion loops

    NASA Astrophysics Data System (ADS)

    Woodley, Robert; Gosnell, Michael; Fischer, Amber

    2013-05-01

    Situation modeling and threat prediction require higher levels of data fusion in order to provide actionable information. Beyond the sensor data and sources the analyst has access to, the use of out-sourced and re-sourced data is becoming common. Through the years, some common frameworks have emerged for dealing with information fusion—perhaps the most ubiquitous being the JDL Data Fusion Group's initial 4-level data fusion model. Since these initial developments, numerous models of information fusion have emerged, hoping to better capture the human-centric process of data analysis within a machine-centric framework. 21st Century Systems, Inc. has developed Fusion with Uncertainty Reasoning using Nested Assessment Characterizer Elements (FURNACE) to address challenges of high level information fusion and handle bias, ambiguity, and uncertainty (BAU) for Situation Modeling, Threat Modeling, and Threat Prediction. It combines JDL fusion levels with nested fusion loops and state-of-the-art data reasoning. Initial research has shown that FURNACE is able to reduce BAU and improve the fusion process by allowing high level information fusion (HLIF) to affect lower levels without double counting of information or other biasing issues. The initial FURNACE project was focused on the underlying algorithms to produce a fusion system able to handle BAU and repurposed data in a cohesive manner. FURNACE supports analysts' efforts to develop situation models, threat models, and threat predictions to increase situational awareness of the battlespace. FURNACE will not only revolutionize the military intelligence realm, but also benefit the larger homeland defense, law enforcement, and business intelligence markets.

  2. Hybrid-view programming of nuclear fusion simulation code in the PGAS parallel programming language XcalableMP

    DOE PAGES

    Tsugane, Keisuke; Boku, Taisuke; Murai, Hitoshi; ...

    2016-06-01

    Recently, the Partitioned Global Address Space (PGAS) parallel programming model has emerged as a usable distributed memory programming model. XcalableMP (XMP) is a PGAS parallel programming language that extends base languages such as C and Fortran with directives in OpenMP-like style. XMP supports a global-view model that allows programmers to define global data and to map them to a set of processors, which execute the distributed global data as a single thread. In XMP, the concept of a coarray is also employed for local-view programming. In this study, we port Gyrokinetic Toroidal Code - Princeton (GTC-P), which is a three-dimensional gyrokinetic PIC code developed at Princeton University to study the microturbulence phenomenon in magnetically confined fusion plasmas, to XMP as an example of hybrid memory model coding with the global-view and local-view programming models. In local-view programming, the coarray notation is simple and intuitive compared with Message Passing Interface (MPI) programming while the performance is comparable to that of the MPI version. Thus, because the global-view programming model is suitable for expressing the data parallelism for a field of grid space data, we implement a hybrid-view version using a global-view programming model to compute the field and a local-view programming model to compute the movement of particles. Finally, the performance is degraded by 20% compared with the original MPI version, but the hybrid-view version facilitates more natural data expression for static grid space data (in the global-view model) and dynamic particle data (in the local-view model), and it also increases the readability of the code for higher productivity.

  3. Hybrid-view programming of nuclear fusion simulation code in the PGAS parallel programming language XcalableMP

    SciTech Connect

    Tsugane, Keisuke; Boku, Taisuke; Murai, Hitoshi; Sato, Mitsuhisa; Tang, William; Wang, Bei

    2016-06-01

    Recently, the Partitioned Global Address Space (PGAS) parallel programming model has emerged as a usable distributed memory programming model. XcalableMP (XMP) is a PGAS parallel programming language that extends base languages such as C and Fortran with directives in OpenMP-like style. XMP supports a global-view model that allows programmers to define global data and to map them to a set of processors, which execute the distributed global data as a single thread. In XMP, the concept of a coarray is also employed for local-view programming. In this study, we port Gyrokinetic Toroidal Code - Princeton (GTC-P), which is a three-dimensional gyrokinetic PIC code developed at Princeton University to study the microturbulence phenomenon in magnetically confined fusion plasmas, to XMP as an example of hybrid memory model coding with the global-view and local-view programming models. In local-view programming, the coarray notation is simple and intuitive compared with Message Passing Interface (MPI) programming while the performance is comparable to that of the MPI version. Thus, because the global-view programming model is suitable for expressing the data parallelism for a field of grid space data, we implement a hybrid-view version using a global-view programming model to compute the field and a local-view programming model to compute the movement of particles. Finally, the performance is degraded by 20% compared with the original MPI version, but the hybrid-view version facilitates more natural data expression for static grid space data (in the global-view model) and dynamic particle data (in the local-view model), and it also increases the readability of the code for higher productivity.

  5. Using historical and projected future climate model simulations as drivers of agricultural and biological models (Invited)

    NASA Astrophysics Data System (ADS)

    Stefanova, L. B.

    2013-12-01

    Climate model evaluation is frequently performed as a first step in analyzing climate change simulations. Atmospheric scientists are accustomed to evaluating climate models through the assessment of model climatology and biases, the models' representation of large-scale modes of variability (such as ENSO, PDO, and AMO), and the relationship between these modes and local variability (e.g. the connection between ENSO and the wintertime precipitation in the Southeast US). While these provide valuable information about the fidelity of historical and projected climate model simulations from an atmospheric scientist's point of view, the application of climate model data to fields such as agriculture, ecology and biology may require additional analyses focused on the particular application's requirements and sensitivities. Typically, historical climate simulations are used to determine a mapping between the model and observed climate, either through a simple (additive for temperature or multiplicative for precipitation) or a more sophisticated (such as quantile matching) bias correction on a monthly or seasonal time scale. Plants, animals, and humans, however, are not directly affected by monthly or seasonal means. To assess the impact of projected climate change on living organisms and related industries (e.g. agriculture, forestry, conservation, utilities, etc.), derivative measures such as heating degree-days (HDD), cooling degree-days (CDD), growing degree-days (GDD), accumulated chill hours (ACH), wet season onset (WSO) and duration (WSD), among others, are frequently useful. We will present a comparison of the projected changes in such derivative measures calculated by applying: (a) the traditional temperature/precipitation bias correction described above versus (b) a bias correction based on the mapping between the historical model and observed derivative measures themselves. In addition, we will present and discuss examples of various application-based climate
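
    The sketch below (synthetic daily data) contrasts the two strategies described above for a single derivative measure, growing degree-days: correcting the temperature series and then deriving GDD, versus correcting the historically derived GDD directly. The warm bias, warming signal, and 10 degree C base temperature are assumptions for illustration only.

        # Minimal sketch (synthetic data, illustrative thresholds) contrasting the
        # two approaches mentioned above: (a) bias-correct daily temperature, then
        # compute growing degree-days (GDD); (b) bias-correct the GDD values
        # themselves using the historical model-vs-observation mapping.
        import numpy as np

        rng = np.random.default_rng(0)
        days = 365
        obs_hist   = 15 + 10 * np.sin(2 * np.pi * np.arange(days) / 365) + rng.normal(0, 3, days)
        mod_hist   = obs_hist + 2.0 + rng.normal(0, 3, days)      # model runs warm by ~2 C
        mod_future = mod_hist + 1.5                                # assumed future warming signal

        def gdd(t_daily, base=10.0):
            """Growing degree-days: sum of daily exceedance above a base temperature."""
            return np.sum(np.maximum(t_daily - base, 0.0))

        # (a) additive bias correction of temperature, then derive GDD
        bias = mod_hist.mean() - obs_hist.mean()
        gdd_a = gdd(mod_future - bias)

        # (b) correct the derived measure directly: scale future GDD by the
        #     historical ratio of observed to modeled GDD
        gdd_b = gdd(mod_future) * (gdd(obs_hist) / gdd(mod_hist))

        print(f"GDD, corrected temperature then derived: {gdd_a:.0f}")
        print(f"GDD, derived measure corrected directly: {gdd_b:.0f}")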

  6. Cloud radiative effects and changes simulated by the Coupled Model Intercomparison Project Phase 5 models

    NASA Astrophysics Data System (ADS)

    Shin, Sun-Hee; Kim, Ok-Yeon; Kim, Dongmin; Lee, Myong-In

    2017-07-01

    Using 32 CMIP5 (Coupled Model Intercomparison Project Phase 5) models, this study examines how faithfully the models simulate cloud amount and the associated cloud radiative effects (CREs) in the historical run driven by observed external radiative forcing for 1850-2005, and their future changes in the RCP (Representative Concentration Pathway) 4.5 scenario runs for 2006-2100. Validation metrics for the historical run are designed to examine the accuracy in the representation of spatial patterns for the climatological mean, and annual and interannual variations of clouds and CREs. The models show large spread in the simulation of cloud amounts, specifically in the low cloud amount. The observed relationships between cloud amount and the controlling large-scale environment are also reproduced with widely varying fidelity across the models. Based on the validation metrics, four models—ACCESS1.0, ACCESS1.3, HadGEM2-CC, and HadGEM2-ES—are selected as the best models, and the average of the four models performs more skillfully than the multimodel ensemble average. All models project global-mean SST warming as greenhouse gases increase, but the magnitude varies across the simulations between 1 and 2 K, which is largely attributable to differences in the change of cloud amount and distribution. The models that simulate more SST warming show a greater increase in the net CRE due to reduced low cloud and increased incoming shortwave radiation, particularly over the regions of the marine boundary layer in the subtropics. The selected best-performing models project a significant reduction in global-mean cloud amount of about -0.99% K-1 and net radiative warming of 0.46 W m-2 K-1, suggesting a positive cloud feedback on global warming.

  7. Numerical tokamak turbulence project (OFES grand challenge)

    SciTech Connect

    Beer, M; Cohen, B I; Crotinger, J; Dawson, J; Decyk, V; Dimits, A M; Dorland, W D; Hammett, G W; Kerbel, G D; Leboeuf, J N; Lee, W W; Lin, Z; Nevins, W M; Reynders, J; Shumaker, D E; Smith, S; Sydora, R; Waltz, R E; Williams, T

    1999-08-27

    The primary research objective of the Numerical Tokamak Turbulence Project (NTTP) is to develop a predictive ability in modeling turbulent transport due to drift-type instabilities in the core of tokamak fusion experiments, through the use of three-dimensional kinetic and fluid simulations and the derivation of reduced models.

  8. Fusion - An energy source for synthetic fuels

    NASA Astrophysics Data System (ADS)

    Fillo, J. A.; Powell, J.; Steinberg, M.

    1980-05-01

    An important first step in the synthesis of liquid and gaseous fuels is the production of hydrogen. Thermonuclear fusion offers an inexhaustible source of energy for the production of hydrogen from water. Depending on design, electric generation efficiencies of 40 to 60% and hydrogen production efficiencies by high temperature electrolysis of 50 to 70% are projected for fusion reactors using high temperature blankets. Fusion/coal symbiotic systems appear economically promising for the first generation of commercial fusion synfuels plants. In the long term, there could be a gradual transition to an inexhaustible energy system based solely on fusion.

  9. (Fusion energy research)

    SciTech Connect

    Phillips, C.A.

    1988-01-01

    This report discusses the following topics: principal parameters achieved in experimental devices (FY88); tokamak fusion test reactor; Princeton beta Experiment-Modification; S-1 Spheromak; current drive experiment; x-ray laser studies; spacecraft glow experiment; plasma deposition and etching of thin films; theoretical plasma; tokamak modeling; compact ignition tokamak; international thermonuclear experimental reactor; Engineering Department; Project Planning and Safety Office; quality assurance and reliability; and technology transfer.

  10. Future of Inertial Fusion Energy

    SciTech Connect

    Nuckolls, J H; Wood, L L

    2002-09-04

    In the past 50 years, fusion R&D programs have made enormous technical progress. Projected billion-dollar scale research facilities are designed to approach net energy production. In this century, scientific and engineering progress must continue until the economics of fusion power plants improves sufficiently to win large scale private funding in competition with fission and non-nuclear energy systems. This economic advantage must be sustained: trillion dollar investments will be required to build enough fusion power plants to generate ten percent of the world's energy. For Inertial Fusion Energy, multi-billion dollar driver costs must be reduced by up to an order of magnitude, to a small fraction of the total cost of the power plant. Major cost reductions could be achieved via substantial improvements in target performance: both higher gain and reduced ignition energy. Large target performance improvements may be feasible through a combination of design innovations, e.g., ''fast ignition,'' propagation down density gradients, and compression of fusion fuel with a combination of driver and chemical energy. The assumptions that limit projected performance of fusion targets should be carefully examined. The National Ignition Facility will enable development and testing of revolutionary targets designed to make possible economically competitive fusion power plants.

  11. Toward Unanimous Projections for Sea Ice Using CMIP5 Multi-model Simulations

    NASA Astrophysics Data System (ADS)

    Yang, S.; Christensen, J. H.; Langen, P. P.; Thejll, P.

    2015-12-01

    Coupled global climate models have been used to provide future climate projections as major objective tools based on physical laws that govern the dynamics and thermodynamics of the climate system. However, while climate models in general predict declines in Arctic sea ice cover (i.e., ice extent and volume) from the late 20th century through the next decades in response to the increase of anthropogenic forcing, the model-simulated Arctic sea ice demonstrates considerable biases in both the mean and the declining trend in comparison with the observations over the satellite era (1979-present). The models also show wide inter-model spread in hindcast and projected sea ice decline, raising the question of uncertainty in model-predicted polar climate. In order to address the model uncertainty in the Arctic sea ice projection, we analyze the Arctic sea ice extent in the context of surface air temperature (SAT) as simulated in the historical, RCP4.5 and RCP8.5 experiments by 27 CMIP5 models. These 27 models are all we could obtain from the CMIP5 archive with sufficient grid information for processing the sea ice data. Unlike many previous studies in which only a limited number of models were selected based on metrics of modeled sea ice characteristics to obtain projections with reduced uncertainty, our analysis is applied to all model simulations without discrimination. It is found that the changes in total Arctic sea ice in various seasons from one model are closely related to the changes in global mean SAT in the corresponding model. This relationship appears very similar in all models and agrees well with that in the observational data. In particular, the relative changes of the total Arctic sea ice in March, September, and the annual mean with respect to the baseline climatology (1979-2008) are seen to correlate linearly with the global mean annual SAT anomaly, suggesting that a unanimous projection of the sea ice extent may be possible with this relationship. Further analysis is
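
    A minimal illustration of the kind of cross-model regression described above is sketched below, with synthetic ensemble values standing in for CMIP5 output: the fractional September sea-ice change is regressed on the global-mean SAT anomaly, and the fitted line is then used to map an assumed warming onto an ice change.

        # Sketch of the kind of cross-model relationship described above: relative
        # September sea-ice change versus global-mean SAT anomaly, fitted with a
        # single linear regression across all ensemble members.  The numbers are
        # synthetic placeholders, not CMIP5 output.
        import numpy as np

        rng = np.random.default_rng(1)
        n_models = 27
        sat_anomaly = rng.uniform(0.5, 3.0, n_models)                      # K, relative to 1979-2008
        ice_change  = -0.30 * sat_anomaly + rng.normal(0, 0.05, n_models)  # fractional change

        slope, intercept = np.polyfit(sat_anomaly, ice_change, 1)
        r = np.corrcoef(sat_anomaly, ice_change)[0, 1]
        print(f"slope = {slope:.2f} per K, intercept = {intercept:.2f}, r = {r:.2f}")

        # A "unanimous" projection would then map any assumed SAT anomaly onto an
        # ice change via the fitted line, e.g. for 2 K of warming:
        print(f"implied September ice change at +2 K: {slope * 2 + intercept:+.2f}")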

  12. An actuator line model simulation with optimal body force projection length scales

    NASA Astrophysics Data System (ADS)

    Martinez-Tossas, Luis; Churchfield, Matthew J.; Meneveau, Charles

    2016-11-01

    In recent work (Martínez-Tossas et al. "Optimal smoothing length scale for actuator line models of wind turbine blades", preprint), an optimal body force projection length-scale for an actuator line model has been obtained. This optimization is based on 2-D aerodynamics and is done by comparing an analytical solution of inviscid linearized flow over a Gaussian body force to the potential flow solution of flow over a Joukowski airfoil. The optimization provides a non-dimensional optimal scale ɛ / c for different Joukowski airfoils, where ɛ is the width of the Gaussian kernel and c is the chord. A Gaussian kernel with different widths in the chord and thickness directions can further reduce the error. The 2-D theory developed is extended by simulating a full scale rotor using the optimal body force projection length scales. Using these values, the tip losses are captured by the LES and thus, no additional explicit tip-loss correction is needed for the actuator line model. The simulation with the optimal values provides excellent agreement with Blade Element Momentum Theory. This research is supported by the National Science Foundation (Grant OISE-1243482, the WINDINSPIRE project).
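    The sketch below illustrates the basic ingredient discussed above, the Gaussian body-force projection of an actuator line section onto a flow grid; the chord, the ɛ/c ratio, and the sectional force are placeholders rather than the optimal values derived in the cited work, and only the isotropic (single-width) kernel is shown.

        # Sketch of the Gaussian body-force projection used in actuator line
        # models: a point (per-unit-span) aerodynamic force is smeared onto a 2-D
        # grid with kernel width epsilon.  The chord and epsilon/c ratio below are
        # placeholders, not the optimal values derived in the cited work.
        import numpy as np

        chord = 1.0                      # blade section chord (m), assumed
        eps = 0.25 * chord               # Gaussian width; optimal eps/c is airfoil-dependent
        force = np.array([0.0, 100.0])   # sectional force per unit span (N/m), assumed

        # Uniform 2-D grid around the actuator point
        x = np.linspace(-2.0, 2.0, 81)
        y = np.linspace(-2.0, 2.0, 81)
        X, Y = np.meshgrid(x, y, indexing="ij")
        r2 = X**2 + Y**2

        # 2-D Gaussian kernel (integrates to 1 over the plane)
        eta = np.exp(-r2 / eps**2) / (eps**2 * np.pi)

        # Body-force field per unit span; integrating it over the plane
        # recovers the sectional force.
        fx = force[0] * eta
        fy = force[1] * eta
        dA = (x[1] - x[0]) * (y[1] - y[0])
        print(f"recovered force: ({fx.sum()*dA:.1f}, {fy.sum()*dA:.1f}) N/m")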

  13. 3-D simulation of urban warming in Tokyo and proposal of air-cooled city project

    SciTech Connect

    Saitoh, T.S.; Yamada, Noboru

    1999-07-01

    Recent computer projections of urban warming in the Tokyo metropolitan area around the year 2030 showed that the urban temperature near Otemachi, the heart of Tokyo, will exceed 43{+-}2 degree Celsius (110 degree Fahrenheit) at 6 p.m. in the summer. In the present paper, modeling and 3-D simulation results of urban warming in the Tokyo metropolitan area were presented and discussed. Furthermore, the effect of the reduction of carbon dioxide (CO{sub 2}) emissions was discussed by using a newly developed 3-D simulation code. Finally, the authors proposed a new concept, the cool-air-ventilated city project, which alleviates urban warming, air pollution, and urban discomfort. In this project, the urban outdoor and indoor spaces are ventilated by clean cooled air, which is produced in rural or mountainous regions located far away from the urban area. The water of a huge reservoir is cooled below 4 degree Celsius in winter by utilizing sky radiation cooling and is kept until the summer for indoor and outdoor space cooling. In this study, the feasibility of this system was discussed.

  14. The Geoengineering Model Intercomparison Project Phase 6 (GeoMIP6): simulation design and preliminary results

    NASA Astrophysics Data System (ADS)

    Kravitz, B.; Robock, A.; Tilmes, S.; Boucher, O.; English, J. M.; Irvine, P. J.; Jones, A.; Lawrence, M. G.; MacCracken, M.; Muri, H.; Moore, J. C.; Niemeier, U.; Phipps, S. J.; Sillmann, J.; Storelvmo, T.; Wang, H.; Watanabe, S.

    2015-10-01

    We present a suite of new climate model experiment designs for the Geoengineering Model Intercomparison Project (GeoMIP). This set of experiments, named GeoMIP6 (to be consistent with the Coupled Model Intercomparison Project Phase 6), builds on the previous GeoMIP project simulations, and has been expanded to address several further important topics, including key uncertainties in extreme events, the use of geoengineering as part of a portfolio of responses to climate change, and the relatively new idea of cirrus cloud thinning to allow more longwave radiation to escape to space. We discuss experiment designs, as well as the rationale for those designs, showing preliminary results from individual models when available. We also introduce a new feature, called the GeoMIP Testbed, which provides a platform for simulations that will be performed with a few models and subsequently assessed to determine whether the proposed experiment designs will be adopted as core (Tier 1) GeoMIP experiments. This is meant to encourage various stakeholders to propose new targeted experiments that address their key open science questions, with the goal of making GeoMIP more relevant to a broader set of communities.

  15. Examinations of cloud variability and future change in the coupled model intercomparison project phase 3 simulations

    NASA Astrophysics Data System (ADS)

    Shin, Sun-Hee; Lee, Myong-In; Kim, Ok-Yeon

    2014-08-01

    Low-level cloud variability is critical to the radiation balance of Earth due to its wide spatial coverage. Using the adjusted International Satellite Cloud Climatology Project (ISCCP) observations of Clement et al. (2009) and the Coupled Model Intercomparison Project Phase 3 (CMIP3) model simulations, this study examines the observed and the simulated low-cloud variations and their relationships with large-scale environmental variables. From the observational analysis, significant correlations are found between low-cloud variations and those of sea surface temperature (SST), lower tropospheric stability (LTS), and sea level pressure (SLP) over tropical marine areas where low clouds prevail during most of the year. An increase of SST coincides with a reduction of LTS and increased vertical motion, which tends to reduce low-level clouds over the subtropical oceans. Among the 14 models investigated, CGCM3 and HadGEM1 exhibit a more realistic representation of the observed relationship between low-level clouds and large-scale environments. In the future climate projection, these two models show good agreement in the reduction of low cloud throughout much of the global oceans in response to greenhouse gas forcing, suggesting a positive low-cloud feedback in a climate change context.
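
    A minimal sketch of the LTS diagnostic used in this line of work is given below, with synthetic monthly series standing in for ISCCP and CMIP3 data: LTS is computed in the standard way as the potential-temperature difference between 700 hPa and the surface, and its correlation with a low-cloud series is evaluated.

        # Sketch of the LTS diagnostic mentioned above and its correlation with a
        # low-cloud series.  LTS is computed as the potential-temperature
        # difference between 700 hPa and the surface; the monthly data here are
        # synthetic placeholders, not ISCCP or CMIP3 output.
        import numpy as np

        def potential_temperature(T_K, p_hPa, p0_hPa=1000.0, kappa=0.286):
            return T_K * (p0_hPa / p_hPa) ** kappa

        rng = np.random.default_rng(2)
        n_months = 240
        T_sfc = 298.0 + rng.normal(0, 1.0, n_months)          # surface air temperature (K)
        T_700 = 283.0 + rng.normal(0, 1.0, n_months)          # 700 hPa temperature (K)

        lts = potential_temperature(T_700, 700.0) - potential_temperature(T_sfc, 1000.0)

        # Assume low cloud fraction increases with LTS (as observed over
        # subtropical oceans), plus noise.
        low_cloud = 0.35 + 0.02 * (lts - lts.mean()) + rng.normal(0, 0.02, n_months)

        r = np.corrcoef(lts, low_cloud)[0, 1]
        print(f"mean LTS = {lts.mean():.1f} K, corr(LTS, low cloud) = {r:.2f}")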

  16. The Geoengineering Model Intercomparison Project Phase 6 (GeoMIP6): Simulation design and preliminary results

    DOE PAGES

    Kravitz, Benjamin S.; Robock, Alan; Tilmes, S.; ...

    2015-10-27

    We present a suite of new climate model experiment designs for the Geoengineering Model Intercomparison Project (GeoMIP). This set of experiments, named GeoMIP6 (to be consistent with the Coupled Model Intercomparison Project Phase 6), builds on the previous GeoMIP project simulations, and has been expanded to address several further important topics, including key uncertainties in extreme events, the use of geoengineering as part of a portfolio of responses to climate change, and the relatively new idea of cirrus cloud thinning to allow more long wave radiation to escape to space. We discuss experiment designs, as well as the rationale for those designs, showing preliminary results from individual models when available. We also introduce a new feature, called the GeoMIP Testbed, which provides a platform for simulations that will be performed with a few models and subsequently assessed to determine whether the proposed experiment designs will be adopted as core (Tier 1) GeoMIP experiments. In conclusion, this is meant to encourage various stakeholders to propose new targeted experiments that address their key open science questions, with the goal of making GeoMIP more relevant to a broader set of communities.

  17. The Geoengineering Model Intercomparison Project Phase 6 (GeoMIP6): Simulation design and preliminary results

    SciTech Connect

    Kravitz, Benjamin S.; Robock, Alan; Tilmes, S.; Boucher, Olivier; English, J. M.; Irvine, Peter J.; Jones, Andrew; Lawrence, M. G.; MacCracken, Michael C.; Muri, Helene O.; Moore, John C.; Niemeier, Ulrike; Phipps, Steven J.; Sillmann, Jana; Storelvmo, Trude; Wang, Hailong; Watanabe, Shingo

    2015-10-27

    We present a suite of new climate model experiment designs for the Geoengineering Model Intercomparison Project (GeoMIP). This set of experiments, named GeoMIP6 (to be consistent with the Coupled Model Intercomparison Project Phase 6), builds on the previous GeoMIP project simulations, and has been expanded to address several further important topics, including key uncertainties in extreme events, the use of geoengineering as part of a portfolio of responses to climate change, and the relatively new idea of cirrus cloud thinning to allow more long wave radiation to escape to space. We discuss experiment designs, as well as the rationale for those designs, showing preliminary results from individual models when available. We also introduce a new feature, called the GeoMIP Testbed, which provides a platform for simulations that will be performed with a few models and subsequently assessed to determine whether the proposed experiment designs will be adopted as core (Tier 1) GeoMIP experiments. In conclusion, this is meant to encourage various stakeholders to propose new targeted experiments that address their key open science questions, with the goal of making GeoMIP more relevant to a broader set of communities.

  18. Simulations of Aperture Synthesis Imaging Radar for the EISCAT_3D Project

    NASA Astrophysics Data System (ADS)

    La Hoz, C.; Belyey, V.

    2012-12-01

    EISCAT_3D is a project to build the next generation of incoherent scatter radars endowed with multiple 3-dimensional capabilities that will replace the current EISCAT radars in Northern Scandinavia. Aperture Synthesis Imaging Radar (ASIR) is one of the technologies adopted by the EISCAT_3D project to endow it with imaging capabilities in 3-dimensions that includes sub-beam resolution. Complemented by pulse compression, it will provide 3-dimensional images of certain types of incoherent scatter radar targets resolved to about 100 metres at 100 km range, depending on the signal-to-noise ratio. This ability will open new research opportunities to map small structures associated with non-homogeneous, unstable processes such as aurora, summer and winter polar radar echoes (PMSE and PMWE), Natural Enhanced Ion Acoustic Lines (NEIALs), structures excited by HF ionospheric heating, meteors, space debris, and others. To demonstrate the feasibility of the antenna configurations and the imaging inversion algorithms a simulation of synthetic incoherent scattering data has been performed. The simulation algorithm incorporates the ability to control the background plasma parameters with non-homogeneous, non-stationary components over an extended 3-dimensional space. Control over the positions of a number of separated receiving antennas, their signal-to-noise-ratios and arriving phases allows realistic simulation of a multi-baseline interferometric imaging radar system. The resulting simulated data is fed into various inversion algorithms. This simulation package is a powerful tool to evaluate various antenna configurations and inversion algorithms. Results applied to realistic design alternatives of EISCAT_3D will be described.

  19. 55Fe effect on enhancing ferritic steel He/dpa ratio in fission reactor irradiations to simulate fusion conditions

    SciTech Connect

    Liu, Haibo; Abdou, Mohamed A.; Greenwood, Lawrence R.

    2013-11-01

    How to increase the ferritic steel He(appm)/dpa ratio in a fission reactor neutron spectrum is an important question for fusion reactor material testing. An early experiment showed that an accelerated He(appm)/dpa ratio of about 2.3 was achieved for 96% enriched 54Fe in iron after 458.2 effective full power days (EFPD) of irradiation in the High Flux Isotope Reactor (HFIR), ORNL. Greenwood suggested that the transmutation-produced 55Fe has a thermal-neutron helium production cross section that may have contributed to this result. In the current work, the ferritic steel He(appm)/dpa ratio is studied in the neutron spectrum of HFIR with 55Fe thermal-neutron helium production taken into account. The available ENDF-b format 55Fe incident neutron cross section file from TENDL, Netherlands, is first input into the calculation model. A benchmark calculation for the same sample as used in the aforementioned experiment was used to adjust and evaluate the TENDL 55Fe (n, a) cross section values. The analysis shows that a decrease by a factor of 6700 in the TENDL 55Fe (n, a) cross section in the intermediate- and low-energy regions is required to fit the experimental results. The best fit to the cross section value at thermal neutron energy is about 27 mb. With the adjusted 55Fe (n, a) cross sections, calculations show that the 54Fe and 55Fe isotopes can be enriched by the isotopic tailoring technique in a ferritic steel sample irradiated in HFIR to significantly enhance the helium production rate. The results show that a 70% enriched 54Fe and 30% enriched 55Fe ferritic steel sample would produce a He(appm)/dpa ratio of about 13 initially in the HFIR peripheral target position (PTP). After one year of irradiation, the ratio decreases to about 10. This new calculation can be used to guide future isotopic tailoring experiments designed to increase the He(appm)/dpa ratio in fission reactors. A benchmark experiment is suggested to be performed to evaluate the 55Fe (n, a) cross section
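
    For orientation, the back-of-the-envelope sketch below integrates the two-step 54Fe(n,g)55Fe(n,a) path with a simple explicit time step; the thermal flux and the 54Fe capture cross section are assumed, representative values, only the ~27 mb 55Fe(n,a) figure is taken from the abstract, and resonance/fast-spectrum contributions and dpa are ignored.

        # Back-of-the-envelope sketch of the two-step 54Fe(n,g)55Fe(n,a) helium
        # production path discussed above, integrated with a simple explicit time
        # step.  Flux and 54Fe capture cross section are assumed values; only the
        # ~27 mb 55Fe(n,a) figure comes from the abstract.
        import math

        phi        = 1.0e15          # thermal neutron flux (n/cm^2/s), assumed HFIR-like
        sig_ng_54  = 2.3e-24         # 54Fe(n,g) thermal cross section (cm^2), assumed ~2.3 b
        sig_na_55  = 27e-27          # 55Fe(n,a) thermal cross section (cm^2), ~27 mb (best fit above)
        lam_55     = math.log(2) / (2.74 * 3.156e7)   # 55Fe decay constant (1/s), t1/2 = 2.74 yr

        n54, n55, he = 1.0, 0.0, 0.0          # atom fractions, starting from pure 54Fe
        dt, t_end = 3600.0, 3.156e7           # 1 h steps for one year
        t = 0.0
        while t < t_end:
            r_capture = phi * sig_ng_54 * n54
            r_alpha   = phi * sig_na_55 * n55
            n54 -= r_capture * dt
            n55 += (r_capture - r_alpha - lam_55 * n55) * dt
            he  += r_alpha * dt
            t   += dt

        print(f"after 1 year: 55Fe fraction = {n55:.3e}, He = {he*1e6:.1f} appm")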

  20. Fusion Power measurement at ITER

    SciTech Connect

    Bertalot, L.; Barnsley, R.; Krasilnikov, V.; Stott, P.; Suarez, A.; Vayakis, G.; Walsh, M.

    2015-07-01

    Nuclear fusion research aims to provide energy for the future in a sustainable way, and the scope of the ITER project is to demonstrate the feasibility of nuclear fusion energy. ITER is a nuclear experimental reactor based on a large scale fusion plasma (tokamak type) device generating Deuterium - Tritium (DT) fusion reactions with emission of 14 MeV neutrons, producing up to 700 MW of fusion power. The measurement of fusion power, i.e. total neutron emissivity, will play an important role in achieving the ITER goals, in particular the fusion gain factor Q, which is related to reactor performance. Particular attention is also given to the development of the neutron calibration strategy, whose main goal is to achieve the required accuracy of 10% for the measurement of fusion power. Neutron Flux Monitors located in diagnostic ports and inside the vacuum vessel will measure ITER total neutron emissivity, expected to range from 10{sup 14} n/s in Deuterium - Deuterium (DD) plasmas up to almost 10{sup 21} n/s in DT plasmas. The neutron detection systems, as well as all other ITER diagnostics, have to withstand high nuclear radiation and electromagnetic fields as well as ultrahigh vacuum and thermal loads. (authors)
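
    The quoted neutron-rate range can be cross-checked with simple arithmetic: each DT reaction releases about 17.6 MeV and one 14 MeV neutron, so the total neutron emissivity follows directly from the fusion power, as in the short sketch below.

        # Quick consistency check of the neutron-rate figures quoted above: each
        # DT reaction releases ~17.6 MeV and one 14 MeV neutron, so the total
        # neutron emissivity follows directly from the fusion power.
        MEV_TO_J = 1.602e-13
        E_DT_MEV = 17.6                      # energy per DT reaction (MeV)

        def neutron_rate(fusion_power_W):
            """Neutrons per second for a pure-DT plasma (one neutron per reaction)."""
            return fusion_power_W / (E_DT_MEV * MEV_TO_J)

        for p_MW in (50, 500, 700):
            print(f"{p_MW:4d} MW  ->  {neutron_rate(p_MW * 1e6):.2e} n/s")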

  1. Projected strengthening of Amazonian dry season by constrained climate model simulations

    NASA Astrophysics Data System (ADS)

    Boisier, Juan P.; Ciais, Philippe; Ducharne, Agnès; Guimberteau, Matthieu

    2015-07-01

    The vulnerability of Amazonian rainforest, and the ecological services it provides, depends on an adequate supply of dry-season water, either as precipitation or stored soil moisture. How the rain-bearing South American monsoon will evolve across the twenty-first century is thus a question of major interest. Extensive savanization, with its loss of forest carbon stock and uptake capacity, is an extreme although very uncertain scenario. We show that the contrasting rainfall projections simulated for Amazonia by 36 global climate models (GCMs) can be reproduced with empirical precipitation models, calibrated with historical GCM data as functions of the large-scale circulation. A set of these simple models was therefore calibrated with observations and used to constrain the GCM simulations. In agreement with the current hydrologic trends, the resulting projection towards the end of the twenty-first century is for a strengthening of the monsoon seasonal cycle and a dry-season lengthening in southern Amazonia. With this approach, the increase in the area subjected to lengthy, savannah-prone dry seasons is substantially larger than the GCM-simulated one. Our results confirm the dominant picture shown by the state-of-the-art GCMs, but suggest that the 'model democracy' view may significantly underestimate these impacts.

  2. Projected changes in atmospheric river events in Arizona as simulated by global and regional climate models

    NASA Astrophysics Data System (ADS)

    Rivera, Erick R.; Dominguez, Francina

    2016-09-01

    Inland-penetrating atmospheric rivers (ARs) affect the United States Southwest and significantly contribute to cool season precipitation. In this study, we examine the results from an ensemble of dynamically downscaled simulations from the North American Regional Climate Change Assessment Program (NARCCAP) and their driving general circulation models (GCMs) in order to determine statistically significant changes in the intensity of the cool season ARs impacting Arizona and the associated precipitation. Future greenhouse gas emissions follow the A2 emission scenario from the Intergovernmental Panel on Climate Change Fourth Assessment Report simulations. We find that there is a consistent and clear intensification of the AR-related water vapor transport in both the global and regional simulations which reflects the increase in water vapor content due to warmer atmospheric temperatures, according to the Clausius-Clapeyron relationship. However, the response of AR-related precipitation intensity to increased moisture flux and column-integrated water vapor is weak and no significant changes are projected either by the GCMs or the NARCCAP models. This lack of robust precipitation variations can be explained in part by the absence of meaningful changes in both the large-scale water vapor flux convergence and the maximum positive relative vorticity in the GCMs. Additionally, some global models show a robust decrease in relative humidity which may also be responsible for the projected precipitation patterns.
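
    The Clausius-Clapeyron argument invoked above can be illustrated with the standard Magnus approximation for saturation vapor pressure, which gives the familiar increase of roughly 6-7% per kelvin near typical surface temperatures; the short sketch below evaluates this fractional sensitivity at a few temperatures.

        # The intensification of AR water vapor transport discussed above is tied
        # to the Clausius-Clapeyron increase of saturation vapor pressure with
        # temperature.  This sketch uses the Magnus approximation to show the
        # familiar ~7 %/K scaling near typical surface temperatures.
        import math

        def e_sat_hPa(T_C):
            """Saturation vapor pressure over water (Magnus approximation)."""
            return 6.112 * math.exp(17.62 * T_C / (T_C + 243.12))

        for T_C in (0.0, 15.0, 25.0):
            dT = 1.0
            frac = (e_sat_hPa(T_C + dT) - e_sat_hPa(T_C)) / e_sat_hPa(T_C)
            print(f"T = {T_C:4.1f} C: d(es)/es per K ~ {100 * frac:.1f} %")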

  3. Simulated hydrologic response to projected changes in precipitation and temperature in the Congo River basin

    NASA Astrophysics Data System (ADS)

    Aloysius, Noel; Saiers, James

    2017-08-01

    Despite their global significance, the impacts of climate change on water resources and associated ecosystem services in the Congo River basin (CRB) have been understudied. Of particular need for decision makers is the availability of spatial and temporal variability of runoff projections. Here, with the aid of a spatially explicit hydrological model forced with precipitation and temperature projections from 25 global climate models (GCMs) under two greenhouse gas emission scenarios, we explore the variability in modeled runoff in the near future (2016-2035) and mid-century (2046-2065). We find that total runoff from the CRB is projected to increase by 5 % [-9 %; 20 %] (mean - min and max - across model ensembles) over the next two decades and by 7 % [-12 %; 24 %] by mid-century. Projected changes in runoff from subwatersheds distributed within the CRB vary in magnitude and sign. Over the equatorial region and in parts of northern and southwestern CRB, most models project an overall increase in precipitation and, subsequently, runoff. A simulated decrease in precipitation leads to a decline in runoff from headwater regions located in the northeastern and southeastern CRB. Climate model selection plays an important role in future projections for both magnitude and direction of change. The multimodel ensemble approach reveals that precipitation and runoff changes under business-as-usual and avoided greenhouse gas emission scenarios (RCP8.5 vs. RCP4.5) are relatively similar in the near term but deviate in the midterm, which underscores the need for rapid action on climate change adaptation. Our assessment demonstrates the need to include uncertainties in climate model and emission scenario selection during decision-making processes related to climate change mitigation and adaptation.

  4. Accelerator & Fusion Research Division 1991 summary of activities

    SciTech Connect

    Not Available

    1991-12-01

    This report discusses research projects in the following areas: Heavy-ion fusion accelerator research; magnetic fusion energy; advanced light source; center for x-ray optics; exploratory studies; superconducting magnets; and bevalac operations.

  5. Accelerator Fusion Research Division 1991 summary of activities

    SciTech Connect

    Berkner, Klaus H.

    1991-12-01

    This report discusses research projects in the following areas: Heavy-ion fusion accelerator research; magnetic fusion energy; advanced light source; center for x-ray optics; exploratory studies; superconducting magnets; and bevalac operations.

  6. Time-Dependent Neutronics in Structural Materials of Inertial Fusion Reactors and Simulation of Defect Accumulation in Pulsed Fe and SiC

    SciTech Connect

    Perlado, J.M.; Lodi, D.; Marian, J.; Plata, A. Gonzalez; Salvador, M.; Caturla, M.J.; Rubia, T. Diaz de la; Colombo, L

    2003-05-15

    New results are presented on the time-dependent neutron intensities and energy spectra from compressed inertial fusion energy (IFE) targets and in structural Fe walls behind typical IFE chamber protection schemes. Protection schemes of LiPb and Flibe have been considered with two different thicknesses, and neutron fluxes in the outer Fe layer as a function of the time from target emission are given. Differences between the two solutions are noted and explained, and the effect of thickness is quantitatively shown. Time-dependent defect characterization of the Fe layer under pulse irradiation is presented. A new well-established multiscale modeling procedure injects, at the appropriate dose rate, damage cascades in a kinetic Monte Carlo lattice (microscopic) to study defect diffusion, clustering, and disintegration. The differences with a continuous irradiation for a still low fluence of irradiation are presented. Experimental validation of a multiscale modeling approach has been recognized and proposed in the Spanish VENUS-II project by using Fe ions on pure and ultrapure Fe. To study similar problems in SiC, new tools are needed to quantify the kinetic defects; results leading to the validation of a new tight binding molecular dynamics code for SiC are presented.

  7. Projected changes in haze pollution potential in China: an ensemble of regional climate model simulations

    NASA Astrophysics Data System (ADS)

    Han, Zhenyu; Zhou, Botao; Xu, Ying; Wu, Jia; Shi, Ying

    2017-08-01

    Based on the dynamic downscaling by the regional climate model RegCM4 from three CMIP5 global models under the historical and the RCP4.5 simulations, this article evaluated the performance of the RegCM4 downscaling simulations on the air environment carrying capacity (AEC) and weak ventilation days (WVDs) in China, which are applied to measure haze pollution potential. Their changes during the middle and the end of the 21st century were also projected. The evaluations show that the RegCM4 downscaling simulations can generally capture the observed features of the AEC and WVD distributions over the period 1986-2005. The projections indicate that the annual AEC tends to decrease and the annual WVDs tend to increase over almost the whole country except central China, concurrent with greater change by the late 21st century than by the middle of the 21st century. It suggests that annual haze pollution potential would be enlarged under the RCP4.5 scenario compared to the present. For seasonal change in the four main economic zones of China, it is projected consistently that there would be a higher probability of haze pollution risk over the Beijing-Tianjin-Hebei (BTH) region and the Yangtze River Delta (YRD) region in winter and over the Pearl River Delta (PRD) region in spring and summer in the context of the warming scenario. Over Northeast China (NEC), future climate change might reduce the AEC or increase the WVDs throughout the whole year, which favours the occurrence of haze pollution and thus the haze pollution risk would be aggravated. The relative contribution of different components related to the AEC change further indicates that changes in the boundary layer depth and the wind speed play leading roles in the AEC change over the BTH and NEC regions. In addition to those two factors, the precipitation change also exerts important impacts on the AEC change over the YRD and PRD zones.

  8. Fusion Data Grid Service

    NASA Astrophysics Data System (ADS)

    Shasharina, Svetlana; Wang, Nanbor

    2004-11-01

    Simulations and experiments in the fusion and plasma physics community generate large datasets at remote sites. Visualization and analysis of these datasets are difficult because of the incompatibility among the various data formats adopted by simulation, experiments, and analysis tools, and because of the large sizes of the analyzed data. Grid and Web Services technologies are capable of providing solutions for such heterogeneous settings, but need to be customized to the field-specific needs and merged with distributed technologies currently used by the community. This paper describes how we are addressing these issues in the Fusion Grid Service under development. We also present performance results of relevant data transfer mechanisms including binary SOAP, DIME, GridFTP, MDSplus, and CORBA. We will describe the status of data converters (between HDF5 and MDSplus data types), developed in collaboration with MIT (J. Stillerman). Finally, we will analyze bottlenecks of the MDSplus data transfer mechanism (work performed in collaboration with General Atomics: D. Schissel and M. Qian).

  9. Numerical simulations of the ablative Rayleigh-Taylor instability in planar inertial-confinement-fusion targets using the FastRad3D code

    NASA Astrophysics Data System (ADS)

    Bates, J. W.; Schmitt, A. J.; Karasik, M.; Zalesak, S. T.

    2016-12-01

    The ablative Rayleigh-Taylor (RT) instability is a central issue in the performance of laser-accelerated inertial-confinement-fusion targets. Historically, the accurate numerical simulation of this instability has been a challenging task for many radiation hydrodynamics codes, particularly when it comes to capturing the ablatively stabilized region of the linear dispersion spectrum and modeling ab initio perturbations. Here, we present recent results from two-dimensional numerical simulations of the ablative RT instability in planar laser-ablated foils that were performed using the Eulerian code FastRad3D. Our study considers polystyrene, (cryogenic) deuterium-tritium, and beryllium target materials, quarter- and third-micron laser light, and low and high laser intensities. An initial single-mode surface perturbation is modeled in our simulations as a small modulation to the target mass density and the ablative RT growth-rate is calculated from the time history of areal-mass variations once the target reaches a steady-state acceleration. By performing a sequence of such simulations with different perturbation wavelengths, we generate a discrete dispersion spectrum for each of our examples and find that in all cases the linear RT growth-rate γ is well described by an expression of the form γ = α [k g / (1 + ε k Lm)]^(1/2) - β k Va, where k is the perturbation wavenumber, g is the acceleration of the target, Lm is the minimum density scale-length, Va is the ablation velocity, and ε is either one or zero. The dimensionless coefficients α and β in the above formula depend on the particular target and laser parameters and are determined from two-dimensional simulation results through the use of a nonlinear curve-fitting procedure. While our findings are generally consistent with those of Betti et al. (Phys. Plasmas 5, 1446 (1998)), the ablative RT growth-rates predicted in this investigation are somewhat smaller than the values previously reported for the
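
    As an illustration of how a fitted dispersion relation of this form behaves, the sketch below evaluates γ(k) for a few perturbation wavelengths; the coefficients α, β, ε and the values of g, Lm, and Va are placeholders chosen only to show the shape of the curve, not the fitted values reported in the paper.

        # Evaluation of a growth-rate formula of the quoted form,
        #   gamma(k) = alpha * sqrt(k*g / (1 + eps*k*Lm)) - beta * k * Va,
        # with placeholder coefficients; the alpha/beta values fitted in the
        # paper are not reproduced here.
        import math

        alpha, beta, eps = 0.9, 1.7, 1.0          # dimensionless, illustrative
        g   = 1.0e16          # target acceleration (cm/s^2), illustrative
        Lm  = 1.0e-4          # minimum density scale length (cm), ~1 um
        Va  = 2.5e5           # ablation velocity (cm/s), illustrative

        def gamma(k):
            return alpha * math.sqrt(k * g / (1.0 + eps * k * Lm)) - beta * k * Va

        # Ablative stabilization (the -beta*k*Va term) cuts growth at short wavelengths.
        for wavelength_um in (100, 50, 20, 10, 5):
            k = 2.0 * math.pi / (wavelength_um * 1e-4)     # cm^-1
            print(f"lambda = {wavelength_um:3d} um: gamma = {gamma(k):.3e} 1/s")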

  10. Changing Climate Extremes in the Northeast: CMIP5 Simulations and Projections

    NASA Astrophysics Data System (ADS)

    Thibeault, J. M.; Seth, A.

    2013-12-01

    Extreme climate events are known to have severe impacts on human and natural systems. As greenhouse warming progresses, a major concern is the potential for an increase in the frequency and intensity of extreme events. The Northeast (defined as the Northeast US, southern Quebec, and southeastern Ontario) is sensitive to climate extremes. The region is prone to flooding and drought, which poses challenges for infrastructure and water resource management, and increases risks to agriculture and forests. Extreme heat can be dangerous to human health, especially in the large urban centers of the Northeast. Annual average temperatures have steadily increased since the 1970s, accompanied by more frequent extremely hot weather, a longer growing season, and fewer frost days. Heavy precipitation events have become more frequent in recent decades. This research examines multi-model projections of annual and monthly extreme indices for the Northeast, using extreme indices computed by the Expert Team on Climate Change Detection and Indices (ETCCDI) for twenty-three global climate models participating in the Coupled Model Intercomparison Project Phase 5 (CMIP5) for the 20th century historical and RCP8.5 experiments. Model simulations are compared to HadEX2 and ERA-interim gridded observations. CMIP5 simulations are consistent with observations - conditions in the Northeast are already becoming warmer and wetter. Projections indicate significant shifts toward warmer and wetter conditions by the middle century (2041-2070). Most indices are projected to be largely outside their late 20th century ranges by the late century (2071-2099). These results provide important information to stakeholders developing plans to lessen the adverse impacts of a warmer and wetter climate in the Northeast.
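
    Two ETCCDI-style indices of the kind referenced above are sketched below with synthetic daily data: frost days (FD) and the warm-day percentile index TX90p. The base-period bootstrapping prescribed in the official ETCCDI recipe is omitted for brevity.

        # Sketch of two ETCCDI-style indices: frost days (FD, daily minimum < 0 C)
        # and TX90p (share of days with daily maximum above the base-period 90th
        # percentile).  Daily data are synthetic.
        import numpy as np

        rng = np.random.default_rng(3)
        days_per_year, n_years = 365, 30
        doy = np.tile(np.arange(days_per_year), n_years)
        seasonal = 10 + 15 * np.sin(2 * np.pi * (doy - 100) / 365)
        tmax = seasonal + 5 + rng.normal(0, 4, doy.size)
        tmin = seasonal - 5 + rng.normal(0, 4, doy.size)

        years = np.repeat(np.arange(n_years), days_per_year)

        # Frost days per year
        fd = np.array([(tmin[years == y] < 0.0).sum() for y in range(n_years)])

        # TX90p: percentage of days exceeding the 90th percentile of the first
        # 20 "base period" years (computed here over all days, for simplicity)
        thresh = np.percentile(tmax[years < 20], 90)
        tx90p = np.array([100.0 * (tmax[years == y] > thresh).mean() for y in range(n_years)])

        print(f"mean frost days/year: {fd.mean():.0f}, mean TX90p: {tx90p.mean():.1f} %")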

  11. Design and simulation of a micromirror array for a projection TV

    NASA Astrophysics Data System (ADS)

    Choi, Bumkyoo; Lee, Junghoon; Jung, Kyuwon; Shin, Hyungjae

    1999-10-01

    The design of a micromirror for a projection TV is investigated. A static structural analysis is performed to give an optimal shape of the micromirror using the FEM commercial package ANSYS. A solid model is created, and mapped meshes are applied to it in order to satisfy a symmetry condition. A stress analysis shows that the maximum stress does not exceed the allowable stress, taken here as the yield strength. A modal analysis is also executed to find the approximate natural frequencies for different design parameters. The results can be used to determine which design parameters are dominant. The micromirror was fabricated by Samsung Electronics. Dynamic deflection experiments confirm the results of the simulation.
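
    A rough analytic cross-check of the torsional mode such a modal analysis would return is sketched below, using f = (1/2π)·sqrt(2k_t/I) for a plate suspended by two torsion hinges and a thin-rectangular-bar stiffness approximation; all dimensions and material properties are placeholders, not the design parameters of the fabricated mirror.

        # Rough analytic estimate of a hinged micromirror's first torsional
        # natural frequency, f = (1/(2*pi)) * sqrt(2*k_t / I), with two torsion
        # hinges and a thin-bar stiffness estimate k_t ~ (1/3)*G*w*t^3/L (valid
        # for w >> t).  All values below are placeholders.
        import math

        # Torsion hinge (polysilicon-like material), placeholder values
        G   = 69e9        # shear modulus (Pa)
        L   = 100e-6      # hinge length (m)
        w   = 1.5e-6      # hinge width (m)
        t   = 0.4e-6      # hinge thickness (m)
        k_t = G * w * t**3 / (3.0 * L)          # one hinge, thin-bar approximation

        # Mirror plate rotating about its central axis
        rho = 2330.0      # density (kg/m^3)
        a, b, tm = 100e-6, 100e-6, 1e-6          # plate size and thickness (m)
        m = rho * a * b * tm
        I = m * a**2 / 12.0                      # moment of inertia about the hinge axis

        f = math.sqrt(2.0 * k_t / I) / (2.0 * math.pi)
        print(f"estimated torsional natural frequency: {f/1e3:.1f} kHz")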

  12. World, We Have Problems: Simulation for Large Complex, Risky Projects, and Events

    NASA Technical Reports Server (NTRS)

    Elfrey, Priscilla

    2010-01-01

    Prior to a spacewalk during the NASA STS/129 mission in November 2009, Columbia Broadcasting System (CBS) correspondent William Harwood reported astronauts, "were awakened again", as they had been the day previously. Fearing something not properly connected was causing a leak, the crew, both on the ground and in space, stopped and checked everything. The alarm proved false. The crew did complete its work ahead of schedule, but the incident reminds us that correctly connecting hundreds and thousands of entities, subsystems and systems, finding leaks, loosening stuck valves, and adding replacements to very large complex systems over time does not occur magically. Everywhere, major projects present similar pressures. Lives are at risk. Responsibility is heavy. Large natural and human-created disasters introduce parallel difficulties as people work across the boundaries of their countries, disciplines, languages, and cultures, facing known immediate dangers as well as the unexpected. NASA has long accepted that when humans have to go where humans cannot go, simulation is the sole solution. The Agency uses simulation to achieve consensus, reduce ambiguity and uncertainty, understand problems, make decisions, support design, do planning and troubleshooting, as well as for operations, training, testing, and evaluation. Simulation is at the heart of all such complex systems, products, projects, programs, and events. Difficult, hazardous short- and, especially, long-term activities have a persistent need for simulation from the first insight into a possibly workable idea or answer until the final report, perhaps beyond our lifetime, is put in the archive. With simulation we create a common mental model, try out breakdowns of machinery or teamwork, and find opportunity for improvement. Lifecycle simulation proves to be increasingly important as risks and consequences intensify. Across the world, disasters are increasing. We anticipate more of them, as the results of global warming

  13. Prognostic simulation of reinjection-research project geothermal site Neustadt-Glewe/Germany

    SciTech Connect

    Poppei, J.

    1995-03-01

    For the first time after the political and economic changes in Germany, a hydrothermal site was put into operation in December 1994. Due to conditions extraordinary for Central Europe (reservoir temperature 99{degrees}C; 220 g/l salinity), the Neustadt-Glewe project is supported by a comprehensive research program. The wells concerned (a doublet with a spacing of 1,400 m) open the porous sandstone aquifer, with an average thickness of about 53 m, at a depth of 2,240 m. One point of interest was the pressure and temperature behavior over a period of 10 years, considering the fluid viscosity changes due to variable injection temperature. For reservoir simulation and for predicting the injection behavior, the simulator code TOUGH2 was used.

  14. MEchatronic REspiratory System SImulator for Neonatal Applications (MERESSINA) project: a novel bioengineering goal.

    PubMed

    Scaramuzzo, Rosa T; Ciantelli, Massimiliano; Baldoli, Ilaria; Bellanti, Lisa; Gentile, Marzia; Cecchi, Francesca; Sigali, Emilio; Tognarelli, Selene; Ghirri, Paolo; Mazzoleni, Stefano; Menciassi, Arianna; Cuttano, Armando; Boldrini, Antonio; Laschi, Cecilia; Dario, Paolo

    2013-01-01

    Respiratory function is mandatory for extrauterine life, but is sometimes impaired in newborns due to prematurity, congenital malformations, or acquired pathologies. Mechanical ventilation is standard care, but long-term complications, such as bronchopulmonary dysplasia, are still largely reported. Therefore, continuous medical education is mandatory to correctly manage devices for assistance. Commercially available breathing function simulators are rarely suitable for the anatomical and physiological realities. The aim of this study is to develop a high-fidelity mechatronic simulator of neonatal airways and lungs for staff training and mechanical ventilator testing. The project is divided into three different phases: (1) a review study on respiratory physiology and pathophysiology and on already available single and multi-compartment models; (2) the prototyping phase; and (3) the on-field system validation.

  15. MEchatronic REspiratory System SImulator for Neonatal Applications (MERESSINA) project: a novel bioengineering goal

    PubMed Central

    Scaramuzzo, Rosa T; Ciantelli, Massimiliano; Baldoli, Ilaria; Bellanti, Lisa; Gentile, Marzia; Cecchi, Francesca; Sigali, Emilio; Tognarelli, Selene; Ghirri, Paolo; Mazzoleni, Stefano; Menciassi, Arianna; Cuttano, Armando; Boldrini, Antonio; Laschi, Cecilia; Dario, Paolo

    2013-01-01

    Respiratory function is mandatory for extrauterine life, but is sometimes impaired in newborns due to prematurity, congenital malformations, or acquired pathologies. Mechanical ventilation is standard care, but long-term complications, such as bronchopulmonary dysplasia, are still largely reported. Therefore, continuous medical education is mandatory to correctly manage devices for assistance. Commercially available breathing function simulators are rarely suitable for the anatomical and physiological realities. The aim of this study is to develop a high-fidelity mechatronic simulator of neonatal airways and lungs for staff training and mechanical ventilator testing. The project is divided into three different phases: (1) a review study on respiratory physiology and pathophysiology and on already available single and multi-compartment models; (2) the prototyping phase; and (3) the on-field system validation. PMID:23966804

  16. Validation of Solar Sail Simulations for the NASA Solar Sail Demonstration Project

    NASA Technical Reports Server (NTRS)

    Braafladt, Alexander C.; Artusio-Glimpse, Alexandra B.; Heaton, Andrew F.

    2014-01-01

    NASA's Solar Sail Demonstration project partner L'Garde is currently assembling a flight-like sail assembly for a series of ground demonstration tests beginning in 2015. For future missions of this sail that might validate solar sail technology, it is necessary to have an accurate sail thrust model. One of the primary requirements of a proposed potential technology validation mission will be to demonstrate solar sail thrust over a set time period, which for this project is nominally 30 days. This requirement would be met by comparing a L'Garde-developed trajectory simulation to the as-flown trajectory. The current sail simulation baseline for L'Garde is a Systems Tool Kit (STK) plug-in that includes a custom-designed model of the L'Garde sail. The STK simulation has been verified for a flat plate model by comparing it to the NASA-developed Solar Sail Spaceflight Simulation Software (S5). S5 matched STK with a high degree of accuracy and the results of the validation indicate that the L'Garde STK model is accurate enough to meet the potential future mission requirements. Additionally, since the L'Garde sail deviates considerably from a flat plate, a force model for a non-flat sail provided by L'Garde sail was also tested and compared to a flat plate model in S5. This result will be used in the future as a basis of comparison to the non-flat sail model being developed for STK.

  17. A New Approach to Image Fusion Based on Cokriging

    NASA Technical Reports Server (NTRS)

    Memarsadeghi, Nargess; LeMoigne, Jacqueline; Mount, David M.; Morisette, Jeffrey T.

    2005-01-01

    We consider the image fusion problem involving remotely sensed data. We introduce cokriging as a method to perform fusion. We investigate the advantages of fusing Hyperion with ALI. The evaluation is performed by comparing the classification of the fused data with that of the input images and by calculating well-chosen quantitative fusion quality metrics. We consider the Invasive Species Forecasting System (ISFS) project as our fusion application. The fusion of ALI with Hyperion data is studied using PCA and wavelet-based fusion. We then propose utilizing a geostatistics-based interpolation method called cokriging as a new approach for image fusion.
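
    For readers unfamiliar with the technique, ordinary cokriging estimates the primary variable at an unsampled location s₀ as a weighted combination of primary (e.g., Hyperion) and secondary (e.g., ALI) observations; the expression below is the standard geostatistical form, with weights λ and μ obtained from the cokriging system of direct and cross variograms, and is given as general background rather than as the paper's specific formulation.

      \hat{Z}_{1}(s_{0}) = \sum_{i=1}^{n_{1}} \lambda_{i}\, Z_{1}(s_{i}) + \sum_{j=1}^{n_{2}} \mu_{j}\, Z_{2}(s_{j}), \qquad \sum_{i}\lambda_{i}=1,\ \ \sum_{j}\mu_{j}=0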

  18. Simulations of Plasma-Liner Formation and Implosion for the PLX-α Project

    NASA Astrophysics Data System (ADS)

    Samulyak, Roman; Cassibry, Jason; Schillo, Kevin; Shih, Wen; Yates, Kevin; Hsu, Scott; PLX-Alpha Collaboration

    2016-10-01

    Detailed numerical studies of the propagation and merger of high-Mach-number plasma jets and the formation and implosion of plasma liners have been performed using the FronTier and SPH codes enhanced with radiation, physical diffusion, and plasma-EOS models. These simulations support the Plasma Liner Experiment-ALPHA (PLX-α) project (see S. Hsu's talk in this session). Simulations predict properties of plasma liners, in particular 4π-averaged liner density, ram pressure, and Mach number, the degree of non-uniformity, strength of primary and secondary shock waves, and scalings with the number of plasma jets, initial jet parameters, and other input data. In addition to direct analysis of liner states, simulations also provide synthetic data for direct comparison to experimental data from a multi-chord interferometer and survey and high-resolution spectrometers. Code verification and comparisons as well as predictions for the first series of PLX-α experiments with 6 and 7 jets will be presented. Verified against experimental data, both codes will be used for predictive simulations of plasma liners for PLX-α experiments and potential scaled-up future experiments. Supported by the ARPA-E ALPHA program.

  19. A NATIONAL COLLABORATORY TO ADVANCE THE SCIENCE OF HIGH TEMPERATURE PLASMA PHYSICS FOR MAGNETIC FUSION

    SciTech Connect

    Allen R. Sanderson; Christopher R. Johnson

    2006-08-01

    This report summarizes the work of the University of Utah, which was a member of the National Fusion Collaboratory (NFC) Project funded by the United States Department of Energy (DOE) under the Scientific Discovery through Advanced Computing Program (SciDAC) to develop a persistent infrastructure to enable scientific collaboration for magnetic fusion research. A five-year project initiated in 2001, the NFC built on past collaborative work performed within the U.S. fusion community and added a component of computer science research done with the USDOE Office of Science, Office of Advanced Scientific Computing Research. The project was itself a collaboration, uniting fusion scientists from General Atomics, MIT, and PPPL with computer scientists from ANL, LBNL, Princeton University, and the University of Utah to form a coordinated team. The group leveraged existing computer science technology where possible and extended or created new capabilities where required. The complete final report is attached as an addendum. In the collaboration, the primary technical responsibility of the University of Utah was to develop and deploy an advanced scientific visualization service. To achieve this goal, the SCIRun Problem Solving Environment (PSE) is used on FusionGrid as an advanced scientific visualization service. SCIRun is open source software that gives the user the ability to create complex 3D visualizations and 2D graphics. This capability allows for the exploration of complex simulation results and the comparison of simulation and experimental data. SCIRun on FusionGrid gives the scientist a no-license-cost visualization capability that rivals present-day commercial visualization packages. To accelerate the usage of SCIRun within the fusion community, a stand-alone application built on top of SCIRun was developed and deployed. This application, FusionViewer, allows users who are unfamiliar with SCIRun to quickly create

  20. Evaluation of Tropospheric Water Vapor Simulations from the Atmospheric Model Intercomparison Project

    NASA Technical Reports Server (NTRS)

    Gaffen, Dian J.; Rosen, Richard D.; Salstein, David A.; Boyle, James S.

    1997-01-01

    Simulations of humidity from 28 general circulation models for the period 1979-88 from the Atmospheric Model Intercomparison Project are compared with observations from radiosondes over North America and the globe and with satellite microwave observations over the Pacific basin. The simulations of decadal mean values of precipitable water (W) integrated over each of these regions tend to be less moist than the real atmosphere in all three cases; the median model values are approximately 5% less than the observed values. The spread among the simulations is larger over regions of high terrain, which suggests that differences in methods of resolving topographic features are important. The mean elevation of the North American continent is substantially higher in the models than is observed, which may contribute to the overall dry bias of the models over that area. The authors do not find a clear association between the mean topography of a model and its mean W simulation, however, which suggests that the bias over land is not purely a matter of orography. The seasonal cycle of W is reasonably well simulated by the models, although over North America they have a tendency to become moister more quickly in the spring than is observed. The interannual component of the variability of W is not well captured by the models over North America. Globally, the simulated W values show a signal correlated with the Southern Oscillation index but the observations do not. This discrepancy may be related to deficiencies in the radiosonde network, which does not sample the tropical ocean regions well. Overall, the interannual variability of W, as well as its climatology and mean seasonal cycle, are better described by the median of the 28 simulations than by individual members of the ensemble. Tests to learn whether simulated precipitable water, evaporation, and precipitation values may be related to aspects of model formulation yield few clear signals, although the authors find, for
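
    A minimal sketch of the kind of ensemble-versus-observation comparison summarized above (the median of a multi-model set against an observed regional value); the numbers below are placeholders for illustration only, not the AMIP humidity data.

      import numpy as np

      # Hypothetical decadal-mean precipitable water (mm) for an ensemble of models
      # and a single observed regional value; values are illustrative only.
      model_w = np.array([22.1, 23.4, 21.8, 22.9, 23.0, 21.5, 22.6])
      observed_w = 23.7

      ensemble_median = np.median(model_w)
      bias_percent = 100.0 * (ensemble_median - observed_w) / observed_w

      print(f"ensemble median = {ensemble_median:.1f} mm")
      print(f"median bias relative to observations = {bias_percent:+.1f}%")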

  2. Integrated computer control system CORBA-based simulator FY98 LDRD project final summary report

    SciTech Connect

    Bryant, R M; Holloway, F W; Van Arsdall, P J

    1999-01-15

    The CORBA-based Simulator was a Laboratory Directed Research and Development (LDRD) project that applied simulation techniques to explore critical questions about distributed control architecture. The simulator project used a three-prong approach comprised of a study of object-oriented distribution tools, computer network modeling, and simulation of key control system scenarios. This summary report highlights the findings of the team and provides the architectural context of the study. For the last several years LLNL has been developing the Integrated Computer Control System (ICCS), which is an abstract object-oriented software framework for constructing distributed systems. The framework is capable of implementing large event-driven control systems for mission-critical facilities such as the National Ignition Facility (NIF). Tools developed in this project were applied to the NIF example architecture in order to gain experience with a complex system and derive immediate benefits from this LDRD. The ICCS integrates data acquisition and control hardware with a supervisory system, and reduces the amount of new coding and testing necessary by providing prebuilt components that can be reused and extended to accommodate specific additional requirements. The framework integrates control point hardware with a supervisory system by providing the services needed for distributed control such as database persistence, system start-up and configuration, graphical user interface, status monitoring, event logging, scripting language, alert management, and access control. The design is interoperable among computers of different kinds and provides plug-in software connections by leveraging a common object request brokering architecture (CORBA) to transparently distribute software objects across the network of computers. Because object broker distribution applied to control systems is relatively new and its inherent performance is roughly threefold less than traditional point

  3. Multidisciplinary In Situ Simulation-Based Training as a Postpartum Hemorrhage Quality Improvement Project.

    PubMed

    Lutgendorf, Monica A; Spalding, Carmen; Drake, Elizabeth; Spence, Dennis; Heaton, Jason O; Morocco, Kristina V

    2017-03-01

    Postpartum hemorrhage is a common obstetric emergency affecting 3 to 5% of deliveries, with significant maternal morbidity and mortality. Effective management of postpartum hemorrhage requires strong teamwork and collaboration. We completed a multidisciplinary in situ postpartum hemorrhage simulation training exercise with structured team debriefing to evaluate hospital protocols, team performance, operational readiness, and real-time identification of system improvements. Our objective was to assess participant comfort with managing obstetric hemorrhage following our multidisciplinary in situ simulation training exercise. This was a quality improvement project that utilized a comprehensive multidisciplinary in situ postpartum hemorrhage simulation exercise. Participants from the Departments of Obstetrics and Gynecology, Anesthesia, Nursing, Pediatrics, and Transfusion Services completed the training exercise in 16 scenarios run over 2 days. The intervention was a high fidelity, multidisciplinary in situ simulation training to evaluate hospital protocols, team performance, operational readiness, and system improvements. Structured debriefing was conducted with the participants to discuss communication and team functioning. Our main outcome measure was participant self-reported comfort levels for managing postpartum hemorrhage before and after simulation training. A 5-point Likert scale (1 being very uncomfortable and 5 being very comfortable) was used to measure participant comfort. A paired t test was used to assess differences in participant responses before and after the simulation exercise. We also measured the time to prepare simulated blood products and followed the number of postpartum hemorrhage cases before and after the simulation exercise. We trained 113 health care professionals including obstetricians, midwives, residents, anesthesiologists, nurse anesthetists, nurses, and medical assistants. Participants reported a higher comfort level in managing
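
    The before/after comparison described above amounts to a paired t test on Likert-scale self-ratings; a minimal sketch with made-up scores (not the study's data) is shown below.

      from scipy import stats

      # Illustrative pre/post-simulation comfort scores (1-5 Likert) for the same participants.
      pre = [2, 3, 2, 4, 3, 2, 3, 3]
      post = [4, 4, 3, 5, 4, 4, 4, 5]

      t_stat, p_value = stats.ttest_rel(post, pre)
      print(f"paired t = {t_stat:.2f}, p = {p_value:.4f}")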

  4. Security on the US Fusion Grid

    SciTech Connect

    Burruss, Justin R.; Fredian, Tom W.; Thompson, Mary R.

    2005-06-01

    The National Fusion Collaboratory project is developing and deploying new distributed computing and remote collaboration technologies with the goal of advancing magnetic fusion energy research. This work has led to the development of the US Fusion Grid (FusionGrid), a computational grid composed of collaborative, compute, and data resources from the three large US fusion research facilities and with users both in the US and in Europe. Critical to the development of FusionGrid was the creation and deployment of technologies to ensure security in a heterogeneous environment. These solutions to the problems of authentication, authorization, data transfer, and secure data storage, as well as the lessons learned during the development of these solutions, may be applied outside of FusionGrid and scale to future computing infrastructures such as those for next-generation devices like ITER.

  5. Data security on the national fusion grid

    SciTech Connect

    Burruss, Justine R.; Fredian, Tom W.; Thompson, Mary R.

    2005-06-01

    The National Fusion Collaboratory project is developing and deploying new distributed computing and remote collaboration technologies with the goal of advancing magnetic fusion energy research. This work has led to the development of the US Fusion Grid (FusionGrid), a computational grid composed of collaborative, compute, and data resources from the three large US fusion research facilities and with users both in the US and in Europe. Critical to the development of FusionGrid was the creation and deployment of technologies to ensure security in a heterogeneous environment. These solutions to the problems of authentication, authorization, data transfer, and secure data storage, as well as the lessons learned during the development of these solutions, may be applied outside of FusionGrid and scale to future computing infrastructures such as those for next-generation devices like ITER.

  6. Creation of a novel simulator for minimally invasive neurosurgery: fusion of 3D printing and special effects.

    PubMed

    Weinstock, Peter; Rehder, Roberta; Prabhu, Sanjay P; Forbes, Peter W; Roussin, Christopher J; Cohen, Alan R

    2017-07-01

    OBJECTIVE Recent advances in optics and miniaturization have enabled the development of a growing number of minimally invasive procedures, yet innovative training methods for the use of these techniques remain lacking. Conventional teaching models, including cadavers and physical trainers as well as virtual reality platforms, are often expensive and ineffective. Newly developed 3D printing technologies can recreate patient-specific anatomy, but the stiffness of the materials limits fidelity to real-life surgical situations. Hollywood special effects techniques can create ultrarealistic features, including lifelike tactile properties, to enhance accuracy and effectiveness of the surgical models. The authors created a highly realistic model of a pediatric patient with hydrocephalus via a unique combination of 3D printing and special effects techniques and validated the use of this model in training neurosurgery fellows and residents to perform endoscopic third ventriculostomy (ETV), an effective minimally invasive method increasingly used in treating hydrocephalus. METHODS A full-scale reproduction of the head of a 14-year-old adolescent patient with hydrocephalus, including external physical details and internal neuroanatomy, was developed via a unique collaboration of neurosurgeons, simulation engineers, and a group of special effects experts. The model contains "plug-and-play" replaceable components for repetitive practice. The appearance of the training model (face validity) and the reproducibility of the ETV training procedure (content validity) were assessed by neurosurgery fellows and residents of different experience levels based on a 14-item Likert-like questionnaire. The usefulness of the training model for evaluating the performance of the trainees at different levels of experience (construct validity) was measured by blinded observers using the Objective Structured Assessment of Technical Skills (OSATS) scale for the performance of ETV. RESULTS A

  7. Final Report on Project 01-ERD-017 ''Smart Nanostructures From Computer Simulations''

    SciTech Connect

    Grossman, J C; Williamson, A J

    2004-02-13

    This project had two main objectives. The first major goal was to develop new, powerful computational simulation capabilities. It was important that these tools have the combination of the accuracy needed to describe the quantum mechanical nature of nanoscale systems and the efficiency required to be applied to realistic, experimentally derived materials. The second major goal was to apply these computational methods to calculate and predict the properties of quantum dots--initially composed of silicon, but then of other elements--which could be used to build novel nanotechnology devices. The driving motivation has been that, through the development and successful application of these tools, we would generate a new capability at LLNL that could be used to make nanostructured materials "smarter", e.g., by selectively predicting how to engineer specific, desired properties. To carry out the necessary work to successfully complete this project and deliver on our goals, we established a two-pronged effort from the beginning: (1) to work on developing new, more efficient algorithms and quantum simulation tools, and (2) to solve problems and make predictions regarding properties of quantum dots which were being studied experimentally here at Livermore.

  8. Effects of baseline conditions on the simulated hydrologic response to projected climate change

    USGS Publications Warehouse

    Koczot, Kathryn M.; Markstrom, Steven L.; Hay, Lauren E.

    2011-01-01

    Changes in temperature and precipitation projected from five general circulation models, using one late-twentieth-century and three twenty-first-century emission scenarios, were downscaled to three different baseline conditions. Baseline conditions are periods of measured temperature and precipitation data selected to represent twentieth-century climate. The hydrologic effects of the climate projections are evaluated using the Precipitation-Runoff Modeling System (PRMS), which is a watershed hydrology simulation model. The Almanor Catchment in the North Fork of the Feather River basin, California, is used as a case study. Differences and similarities between PRMS simulations of hydrologic components (i.e., snowpack formation and melt, evapotranspiration, and streamflow) are examined, and results indicate that the selection of a specific time period used for baseline conditions has a substantial effect on some, but not all, hydrologic variables. This effect seems to be amplified in hydrologic variables, which accumulate over time, such as soil-moisture content. Results also indicate that uncertainty related to the selection of baseline conditions should be evaluated using a range of different baseline conditions. This is particularly important for studies in basins with highly variable climate, such as the Almanor Catchment.

  9. The Geoengineering Model Intercomparison Project Phase 6 (GeoMIP6): simulation design and preliminary results

    NASA Astrophysics Data System (ADS)

    Kravitz, B.; Robock, A.; Tilmes, S.; Boucher, O.; English, J. M.; Irvine, P. J.; Jones, A.; Lawrence, M. G.; MacCracken, M.; Muri, H.; Moore, J. C.; Niemeier, U.; Phipps, S. J.; Sillmann, J.; Storelvmo, T.; Wang, H.; Watanabe, S.

    2015-06-01

    We present a suite of new climate model experiment designs for the Geoengineering Model Intercomparison Project (GeoMIP). This set of experiments, named GeoMIP6 (to be consistent with the Coupled Model Intercomparison Project Phase 6), builds on the previous GeoMIP simulations, and has been expanded to address several further important topics, including key uncertainties in extreme events, the use of geoengineering as part of a portfolio of responses to climate change, and the relatively new idea of cirrus cloud thinning to allow more longwave radiation to escape to space. We discuss experiment designs, as well as the rationale for those designs, showing preliminary results from individual models when available. We also introduce a new feature, called the GeoMIP Testbed, which provides a platform for simulations that will be performed with a few models and subsequently assessed to determine whether the proposed experiment designs will be adopted as core (Tier 1) GeoMIP experiments. This is meant to encourage various stakeholders to propose new targeted experiments that address their key open science questions, with the goal of making GeoMIP more relevant to a broader set of communities.

  10. Cirrus Parcel Model Comparison Project. Phase 1: The Critical Components to Simulate Cirrus Initiation Explicitly.

    NASA Astrophysics Data System (ADS)

    Lin, Ruei-Fong; O'C. Starr, David; Demott, Paul J.; Cotton, Richard; Sassen, Kenneth; Jensen, Eric; Kärcher, Bernd; Liu, Xiaohong

    2002-08-01

    The Cirrus Parcel Model Comparison Project, a project of the GCSS [Global Energy and Water Cycle Experiment (GEWEX) Cloud System Studies] Working Group on Cirrus Cloud Systems, involves the systematic comparison of current models of ice crystal nucleation and growth for specified, typical, cirrus cloud environments. In Phase 1 of the project reported here, simulated cirrus cloud microphysical properties from seven models are compared for 'warm' (-40°C) and 'cold' (-60°C) cirrus, each subject to updrafts of 0.04, 0.2, and 1 m s⁻¹. The models employ explicit microphysical schemes wherein the size distribution of each class of particles (aerosols and ice crystals) is resolved into bins or the evolution of each individual particle is traced. Simulations are made including both homogeneous and heterogeneous ice nucleation mechanisms (all-mode simulations). A single initial aerosol population of sulfuric acid particles is prescribed for all simulations. Heterogeneous nucleation is disabled for a second parallel set of simulations in order to isolate the treatment of the homogeneous freezing (of haze droplets) nucleation process. Analysis of these latter simulations is the primary focus of this paper. Qualitative agreement is found for the homogeneous-nucleation-only simulations; for example, the number density of nucleated ice crystals increases with the strength of the prescribed updraft. However, significant quantitative differences are found. Detailed analysis reveals that the homogeneous nucleation rate, haze particle solution concentration, and water vapor uptake rate by ice crystal growth (particularly as controlled by the deposition coefficient) are critical components that lead to differences in the predicted microphysics. Systematic differences exist between results based on a modified classical theory approach and models using an effective freezing temperature approach to the treatment of nucleation. Each method is constrained by critical freezing data from
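
    One of the 'critical components' named above, the deposition coefficient, enters the depositional growth of an ice crystal through a kinetically corrected vapor diffusivity; in the standard capacitance model, sketched generically below, C is the crystal capacitance, ρ_v and ρ_{v,i} the ambient and ice-saturation vapor densities, and D* a diffusivity that approaches the bulk value for a deposition coefficient of unity and is reduced for smaller values, slowing vapor uptake. This is textbook background, not the exact growth scheme of any of the seven models.

      \frac{dm}{dt} = 4\pi\, C\, D^{*}\left(\rho_{v} - \rho_{v,i}\right)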

  11. Two-dimensional simulations of thermonuclear burn in ignition-scale inertial confinement fusion targets under compressed axial magnetic fields

    SciTech Connect

    Perkins, L. J.; Logan, B. G.; Zimmerman, G. B.; Werner, C. J.

    2013-07-15

    We report for the first time on full 2-D radiation-hydrodynamic implosion simulations that explore the impact of highly compressed imposed magnetic fields on the ignition and burn of perturbed spherical implosions of ignition-scale cryogenic capsules. Using perturbations that highly convolute the cold fuel boundary of the hotspot and prevent ignition without applied fields, we impose initial axial seed fields of 20–100 T (potentially attainable using present experimental methods) that compress to greater than 4 × 10⁴ T (400 MG) under implosion, thereby relaxing hotspot areal densities and pressures required for ignition and propagating burn by ∼50%. The compressed field is high enough to suppress transverse electron heat conduction, and to allow alphas to couple energy into the hotspot even when highly deformed by large low-mode amplitudes. This might permit the recovery of ignition, or at least significant alpha particle heating, in submarginal capsules that would otherwise fail because of adverse hydrodynamic instabilities.
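
    The quoted amplification is consistent with simple flux conservation in the imploding fuel: if the axial flux is frozen in, the field scales as the square of the convergence ratio. The arithmetic below is an order-of-magnitude check based on that assumption, not a result taken from the simulations.

      B_{f} \approx B_{0}\left(\frac{R_{0}}{R_{f}}\right)^{2}, \qquad B_{0}=100\ \mathrm{T},\ \ \frac{R_{0}}{R_{f}}\approx 20 \ \Rightarrow\ B_{f}\approx 100\times 20^{2} = 4\times 10^{4}\ \mathrm{T}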

  12. Review of alternative concepts for magnetic fusion

    SciTech Connect

    Krakowski, R.A.; Miller, R.L.; Hagenson, R.L.

    1980-01-01

    Although the Tokamak represents the mainstay of the world's quest for magnetic fusion power, with the tandem mirror serving as a primary backup concept in the US fusion program, a wide range of alternative fusion concepts (AFC's) have been and are being pursued. This review presents a summary of past and present reactor projections of a majority of AFC's. Whenever possible, quantitative results are given.

  13. Simulation to Seismic Fluid Substitution Modeling at the Illinois Basin - Decatur Project

    NASA Astrophysics Data System (ADS)

    Will, R. A.

    2015-12-01

    The Illinois Basin - Decatur Project (IBDP) is one of the most advanced US Department of Energy-funded carbon dioxide (CO2) sequestration projects. The goal of injecting 1 million tonnes of CO2 over a three-year period was reached in November 2014 and the project is now in the post-injection site closure (PISC) phase. A number of seismic methods are being utilized in the IBDP PISC plume monitoring program. These include time-lapse three-dimensional (3D) vertical seismic profile (VSP) surveys, time-lapse surface seismic surveys, and passive seismic monitoring. While each seismic monitoring method has inherent spatial resolution and imaging footprint characteristics, all fundamentally rely on variation of reservoir elastic properties in response to injection-induced changes in saturation and pressure conditions. These variations in elastic properties, and the resulting time-lapse seismic response, are often subtle and non-unique with respect to saturation and pressure effects. Elastic properties of saturated porous media may be estimated using rock physics theory and fluid substitution methods; however, the complexity of typical reservoir rock and fluid systems under injection conditions, and the subtlety of the resulting changes in elastic properties, dictate the need for representative estimates of the reservoir geologic framework, reservoir rock physics, and the anticipated plume geometry. At IBDP a "simulation-to-seismic" workflow has been used to develop accurate estimates of 3D time-lapse elastic property and seismic signal responses for CO2 plumes generated using a calibrated compositional flow simulation model. The anticipated time-lapse response for the IBDP surface and VSP time-lapse surveys has been estimated using ranges of rock physics parameters derived from geophysical logs. These investigations highlight the importance of geologic controls on plume geometry in monitoring program design as well as during model-based interpretation of time
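
    Fluid substitution of the kind described above is conventionally performed with Gassmann's relation, reproduced below in its standard form as background (the IBDP workflow details may differ); K_dry, K_sat, K_min, and K_fl are the dry-rock, saturated-rock, mineral, and fluid bulk moduli, φ is porosity, and the shear modulus is unchanged by the pore fluid.

      K_{\mathrm{sat}} = K_{\mathrm{dry}} + \frac{\left(1 - K_{\mathrm{dry}}/K_{\mathrm{min}}\right)^{2}}{\phi/K_{\mathrm{fl}} + (1-\phi)/K_{\mathrm{min}} - K_{\mathrm{dry}}/K_{\mathrm{min}}^{2}}, \qquad \mu_{\mathrm{sat}} = \mu_{\mathrm{dry}}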

  14. Massively parallel simulation with DOE's ASCI supercomputers : an overview of the Los Alamos Crestone project

    SciTech Connect

    Weaver, R. P.; Gittings, M. L.

    2004-01-01

    The Los Alamos Crestone Project is part of the Department of Energy's (DOE) Accelerated Strategic Computing Initiative, or ASCI Program. The main goal of this software development project is to investigate the use of continuous adaptive mesh refinement (CAMR) techniques for application to problems of interest to the Laboratory. There are many code development efforts in the Crestone Project, both unclassified and classified codes. In this overview I will discuss the unclassified SAGE and the RAGE codes. The SAGE (SAIC adaptive grid Eulerian) code is a one-, two-, and three-dimensional multimaterial Eulerian massively parallel hydrodynamics code for use in solving a variety of high-deformation flow problems. The RAGE CAMR code is built from the SAGE code by adding various radiation packages, improved setup utilities and graphics packages and is used for problems in which radiation transport of energy is important. The goal of these massively-parallel versions of the codes is to run extremely large problems in a reasonable amount of calendar time. Our target is scalable performance to ~10,000 processors on a 1 billion CAMR computational cell problem that requires hundreds of variables per cell, multiple physics packages (e.g. radiation and hydrodynamics), and implicit matrix solves for each cycle. A general description of the RAGE code has been published in [1], [2], [3] and [4]. Currently, the largest simulations we do are three-dimensional, using around 500 million computation cells and running for literally months of calendar time using ~2000 processors. Current ASCI platforms range from several 3-teraOPS supercomputers to one 12-teraOPS machine at Lawrence Livermore National Laboratory, the White machine, and one 20-teraOPS machine installed at Los Alamos, the Q machine. Each machine is a system comprised of many component parts that must perform in unity for the successful run of these simulations. Key features of any massively parallel system
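
    As a loose illustration of the cell-tagging step at the heart of adaptive mesh refinement (not the actual SAGE/RAGE algorithms, which are block-structured and far more involved), the sketch below flags cells across strong density jumps; all names and thresholds are hypothetical.

      import numpy as np

      def tag_cells_for_refinement(density, threshold=0.1):
          """Flag cells whose normalized density jump to any neighbor exceeds `threshold`.

          Illustrative 2-D gradient-based tagging only; real CAMR codes combine many
          indicators, enforce proper nesting, and work on hierarchical block structures.
          """
          tags = np.zeros_like(density, dtype=bool)
          scale = np.abs(density).max() + 1e-30
          for axis in (0, 1):
              jump = np.abs(np.diff(density, axis=axis)) / scale
              # A large jump tags the cells on both sides of the face.
              if axis == 0:
                  tags[:-1, :] |= jump > threshold
                  tags[1:, :] |= jump > threshold
              else:
                  tags[:, :-1] |= jump > threshold
                  tags[:, 1:] |= jump > threshold
          return tags

      # Example: a sharp interface in a toy density field gets tagged for refinement.
      rho = np.ones((64, 64))
      rho[:, 32:] = 10.0
      print(tag_cells_for_refinement(rho).sum(), "cells tagged")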

  15. Converting Snow Depth to SWE: The Fusion of Simulated Data with Remote Sensing Retrievals and the Airborne Snow Observatory

    NASA Astrophysics Data System (ADS)

    Bormann, K.; Marks, D. G.; Painter, T. H.; Hedrick, A. R.; Deems, J. S.

    2015-12-01

    Snow cover monitoring has greatly benefited from remote sensing technology but, despite their critical importance, spatially distributed measurements of snow water equivalent (SWE) in mountain terrain remain elusive. Current methods of monitoring SWE rely on point measurements and are insufficient for distributed snow science and effective management of water resources. Many studies have shown that the spatial variability in SWE is largely controlled by the spatial variability in snow depth. JPL's Airborne Snow Observatory mission (ASO) combines LiDAR and spectrometer instruments to retrieve accurate and very high-resolution snow depth measurements at the watershed scale, along with other products such as snow albedo. To make best use of these high-resolution snow depths, spatially distributed snow density data are required to derive SWE from the measured snow depths. Snow density is a spatially and temporally variable property that cannot yet be reliably extracted from remote sensing techniques, and is difficult to extrapolate to basin scales. However, some physically based snow models have shown skill in simulating bulk snow densities and therefore provide a pathway for snow depth to SWE conversion. Leveraging model ability where remote sensing options are non-existent, ASO employs a physically based snow model (iSnobal) to resolve distributed snow density dynamics across the basin. After an adjustment scheme guided by in-situ data, these density estimates are used to derive the elusive spatial distribution of SWE from the observed snow depth distributions from ASO. In this study, we describe how the process of fusing model data with remote sensing retrievals is undertaken in the context of ASO along with estimates of uncertainty in the final SWE volume products. This work will likely be of interest to those working in snow hydrology, water resource management and the broader remote sensing community.
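
    The depth-to-SWE conversion at the core of this fusion is simple once a bulk density field is available; the sketch below shows the per-cell conversion with illustrative array names, not the ASO processing chain itself.

      import numpy as np

      def swe_from_depth(snow_depth_m, snow_density_kg_m3, water_density_kg_m3=1000.0):
          """Convert snow depth (m) and bulk snow density (kg/m^3) to SWE in mm."""
          swe_m = snow_depth_m * snow_density_kg_m3 / water_density_kg_m3
          return swe_m * 1000.0  # metres of water equivalent -> millimetres

      # Example: 1.2 m of snow at a modeled bulk density of 350 kg/m^3 -> 420 mm SWE.
      depth = np.array([[1.2, 0.8], [0.0, 2.5]])
      density = np.array([[350.0, 300.0], [350.0, 400.0]])
      print(swe_from_depth(depth, density))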

  16. Simulation technology used for risk assessment in the deep exploration project in China

    NASA Astrophysics Data System (ADS)

    jiao, J.; Huang, D.; Liu, J.

    2013-12-01

    Deep exploration has been carried out in China for five years, employing various heavy-duty instruments and equipment for gravity, magnetic, seismic, and electromagnetic prospecting, as well as an ultra-deep drilling rig for obtaining deep samples. Deep exploration is a large and complex systems-engineering effort that crosses multiple disciplines and requires great investment. Advanced technical means are therefore necessary for the verification, appraisal, and optimization of geophysical prospecting equipment development. To reduce the risk of application and exploration, efficient and reliable management concepts and skills have to be strengthened, consolidating management measures and workflows to benefit this ambitious project. Evidence, prediction, evaluation, and related decision strategies therefore have to be considered together to meet practical scientific requirements, technical limits, and extendable attempts. Simulation is proposed as a tool for carrying out dynamic tests on actual or imagined systems. In practice, it is necessary to combine the simulation technique with the instruments and equipment to accomplish R&D tasks. In this paper, simulation techniques are introduced into the R&D process for heavy-duty equipment and high-end engineering project technology. Based on information recently provided by a drilling group, a digital model is constructed by combining geophysical data, 3D visualization, database management, and virtual reality technologies. The result is an R&D strategy in which data processing, instrument application, expected results and uncertainties, and even the operating workflow and environment are simulated systematically or simultaneously, in order to obtain an optimal outcome as well as an equipment-updating strategy. The simulation technology is able to adjust, verify, appraise and optimize the primary plan due to changing in

  17. Investigating the potential of the Pan-Planets project using Monte Carlo simulations

    NASA Astrophysics Data System (ADS)

    Koppenhoefer, J.; Afonso, C.; Saglia, R. P.; Henning, Th.

    2009-02-01

    Using Monte Carlo simulations we analyze the potential of the upcoming transit survey Pan-Planets. The analysis covers the simulation of realistic light curves (including the effects of ingress/egress and limb-darkening) with both correlated and uncorrelated noise as well as the application of a box-fitting-least-squares detection algorithm. In this work we show how simulations can be a powerful tool in defining and optimizing the survey strategy of a transiting planet survey. We find the Pan-Planets project to be competitive with all other existing and planned transit surveys with the main power being the large 7 square degree field of view. In the first year we expect to find up to 25 Jupiter-sized planets with periods below 5 days around stars brighter than V = 16.5 mag. The survey will also be sensitive to planets with longer periods and planets with smaller radii. After the second year of the survey, we expect to find up to 9 Warm Jupiters with periods between 5 and 9 days and 7 Very Hot Saturns around stars brighter than V = 16.5 mag as well as 9 Very Hot Neptunes with periods from 1 to 3 days around stars brighter than i' = 18.0 mag.
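
    A highly simplified sketch of the kind of Monte Carlo yield estimate described above, combining a geometric transit probability with a stand-in detection statistic; the occurrence rate, survey size, and thresholds below are placeholders, not Pan-Planets values.

      import numpy as np

      rng = np.random.default_rng(1)

      n_stars = 100_000          # hypothetical stars monitored
      occurrence_rate = 0.005    # assumed fraction of stars hosting a hot Jupiter
      r_star_au = 0.005          # roughly one solar radius in AU

      detected = 0
      for _ in range(int(n_stars * occurrence_rate)):
          period_days = rng.uniform(1.0, 5.0)
          a_au = (period_days / 365.25) ** (2.0 / 3.0)      # Kepler's third law, ~1 Msun host
          p_transit = min(1.0, r_star_au / a_au)            # geometric transit probability
          snr = rng.normal(loc=12.0, scale=4.0)             # stand-in for a BLS detection statistic
          if rng.random() < p_transit and snr > 7.0:
              detected += 1

      print(f"expected detections in this toy survey: {detected}")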

  18. TOUGH2 simulations of the TEVES Project including the behavior of a single-component NAPL

    SciTech Connect

    Webb, S.W.

    1996-05-01

    The TEVES (Thermal Enhanced Vapor Extraction System) Project is a demonstration of a process designed to extract solvents and chemicals contained in the Chemical Waste Landfill at Sandia National Laboratories. In this process, the ground is electrically heated, and borehole(s) within the heated zone are maintained at a vacuum to draw air and evaporated contaminants into the borehole and a subsequent treatment facility. TOUGH2 simulations have been performed to evaluate the fluid flow and heat transfer behavior of the system. The TOUGH2 version used in this study includes air, water, and a single-component non-aqueous phase liquid (NAPL). In the present simulations, an initial o-xylene inventory is assumed in the heated zone for illustration purposes. Variation in borehole (vapor extraction) vacuum, borehole location, and soil permeability were investigated. Simulations indicate that the temperatures in the soil are relatively insensitive to the magnitude of the borehole vacuum or the borehole locations. In contrast, however, the NAPL and liquid water saturation distributions are sensitive to these borehole parameters. As the borehole vacuum and air flow rate through the soil decrease, the possibility of contaminant (NAPL) migration from the heated zone into the surrounding unheated soil increases. The borehole location can also affect the likelihood of contaminant movement into the unheated soil.

  19. The LISA Pathfinder Simulator for the Science and Technology Operations Center: Simulator Reuse Across the Project Life-Cycle: Practical Experiences and Lessons Learned

    NASA Astrophysics Data System (ADS)

    van der Plas, Peter; Leorato, Christiano

    2010-08-01

    During the operational phase of the LISA Pathfinder (LPF) mission, the Science and Technology Operations Center (STOC) will be in charge of the operations of the LPF experiments. For the STOC to be able to perform its planning activities, an experiment simulator is required. The STOC simulator is based on the reuse of two simulators, which had originally been developed by EADS Astrium to support previous phases of the project life-cycle. This paper describes the STOC Simulator development approach, the technologies used, and the high-level design. It then focuses on the specific implications of reusing the existing simulators: relevant issues are highlighted, together with the adopted solutions. Finally, the paper reports the first feedback on the actual usage of the STOC Simulator and summarizes the lessons learned.

  20. Cold fusion, Alchemist's dream

    SciTech Connect

    Clayton, E.D.

    1989-09-01

    In this report the following topics relating to cold fusion are discussed: muon catalysed cold fusion; piezonuclear fusion; sundry explanations pertaining to cold fusion; cosmic ray muon catalysed cold fusion; vibrational mechanisms in excited states of D₂ molecules; barrier penetration probabilities within the hydrogenated metal lattice/piezonuclear fusion; branching ratios of D₂ fusion at low energies; fusion of deuterons into ⁴He; secondary D+T fusion within the hydrogenated metal lattice; the ³He to ⁴He ratio within the metal lattice; shock induced fusion; and anomalously high isotopic ratios of ³He/⁴He.

  1. Hardware-Accelerated Simulated Radiography

    SciTech Connect

    Laney, D; Callahan, S; Max, N; Silva, C; Langer, S; Frank, R

    2005-08-04

    We present the application of hardware accelerated volume rendering algorithms to the simulation of radiographs as an aid to scientists designing experiments, validating simulation codes, and understanding experimental data. The techniques presented take advantage of 32-bit floating point texture capabilities to obtain solutions to the radiative transport equation for X-rays. The hardware accelerated solutions are accurate enough to enable scientists to explore the experimental design space with greater efficiency than the methods currently in use. An unsorted hexahedron projection algorithm is presented for curvilinear hexahedral meshes that produces simulated radiographs in the absorption-only regime. A sorted tetrahedral projection algorithm is presented that simulates radiographs of emissive materials. We apply the tetrahedral projection algorithm to the simulation of experimental diagnostics for inertial confinement fusion experiments on a laser at the University of Rochester.
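
    In the absorption-only regime mentioned above, each detector pixel reduces to a line integral of the attenuation coefficient along the ray, i.e. the Beer-Lambert limit of the radiative transport equation shown below; this is generic background rather than the paper's specific GPU formulation.

      I_{\mathrm{pixel}} = I_{0}\,\exp\!\left(-\int_{\mathrm{ray}} \mu(\mathbf{x},E)\,ds\right)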

  2. Dynamical simulation of the fission process and anisotropy of the fission fragment angular distributions of excited nuclei produced in fusion reactions

    NASA Astrophysics Data System (ADS)

    Eslamizadeh, H.

    2016-10-01

    A stochastic approach based on four-dimensional Langevin equations was applied to calculate the anisotropy of fission fragment angular distributions, the average prescission neutron multiplicity, and the fission probability over a wide range of fissility parameters for the compound nuclei 197Tl, 225Pa, 248Cf, and 264Rf produced in fusion reactions. Three collective shape coordinates plus the projection K of the total spin of the compound nucleus onto the symmetry axis were considered in the four-dimensional dynamical model. In the dynamical calculations, nuclear dissipation was generated through the chaos-weighted wall and window friction formula. Furthermore, the dissipation coefficient of K, γ_K, was treated as a free parameter, and its magnitude was inferred by fitting measured data on the anisotropy of fission fragment angular distributions for the compound nuclei 197Tl, 225Pa, 248Cf, and 264Rf. Comparison of the calculated anisotropies of the fission fragment angular distributions with the experimental data showed that the calculations agree well with experiment for values of the dissipation coefficient of K equal to 0.185-0.205, 0.175-0.192, 0.077-0.090, and 0.075-0.085 (MeV zs)^(-1/2) for the compound nuclei 197Tl, 225Pa, 248Cf, and 264Rf, respectively. It was also shown that the influence of the dissipation coefficient of K on the calculated prescission neutron multiplicity and fission probability is small.
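
    For orientation, multidimensional Langevin treatments of fission typically evolve collective coordinates q_i and conjugate momenta p_i with equations of the generic form below (a common textbook notation, not necessarily the paper's exact discretization), where m_{ij} is the inertia tensor, γ_{ij} the friction tensor, F the free energy, Γ_j(t) normalized white noise, and g_{ij} its strength fixed by the fluctuation-dissipation theorem.

      \frac{dq_{i}}{dt} = (m^{-1})_{ij}\,p_{j}, \qquad \frac{dp_{i}}{dt} = -\frac{\partial F}{\partial q_{i}} - \frac{1}{2}\,\frac{\partial (m^{-1})_{jk}}{\partial q_{i}}\,p_{j}p_{k} - \gamma_{ij}(m^{-1})_{jk}\,p_{k} + g_{ij}\,\Gamma_{j}(t)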

  3. Data requirements for EOR surfactant-polymer process simulation and analysis of El Dorado pilot-project simulation, Butler County, Kansas. Volume II. Appendices

    SciTech Connect

    Claridge, E.L.; Lohse, A.

    1983-01-01

    The results of computer simulation of the El Dorado surfactant-polymer EOR pilot project, Butler County, Kansas indicated that conventional data from the project and other data in the public domain were not adequate for geologic, reservoir and process characterizations in a complex numerical simulation. As used by GURC in geologic characterization, and by INTERCOMP in process characterization and input into the CFTE simulator, the collective body of field and chemical data and related assumptions necessary for simulator input was not sufficient to predict how the chemical flood would behave in the Admire 650-foot sandstone reservoir. Based upon this study, a comprehensive body of data requirements for EOR simulation is defined in detail. Geologic characterization includes descriptors for rock, interwell and intrasystem correlations; reservoir characterization includes descriptors for fluid/rock, production, and flow rate properties; process characterization includes descriptors for chemical properties, interactions and functions. Reservoir heterogeneity is a principal problem in EOR simulation. It can be overcome within reasonable economic limits by successive orders of descriptors from: microscale (rock), achieved through borehole and core analyses; to macroscale (interwell), achieved through multiple borehole correlations; to megascale (intrasystem), achieved through extrapolation of rock and correlative well data into a generic depositional model that contains a description of internal mass properties within a given external morphology. Volume II contains appendices for: flow chart for surfactant-polymer process simulation; INTERCOMP reports to GURC describing the CFTE simulator program used in this study.

  4. Data requirements for EOR surfactant-polymer process simulation and analysis of El Dorado pilot-project simulation, Butler County, Kansas. Volume I. Technical report

    SciTech Connect

    Claridge, E.L.; Lohse, A.

    1983-01-01

    The results of computer simulation of the El Dorado surfactant-polymer EOR pilot project, Butler County, Kansas indicated that conventional data from the project and other data in the public domain were not adequate for geologic, reservoir and process characterizations in a complex numerical simulation. As used by GURC in geologic characterization, and by INTERCOMP in process characterization and input into the CFTE simulator, the collective body of field and chemical data and related assumptions necessary for simulator input was not sufficient to predict how the chemical flood would behave in the Admire 650-foot sandstone reservoir. Based upon this study, a comprehensive body of data requirements for EOR simulation is defined in detail. Geologic characterization includes descriptors for rock, interwell and intrasystem correlations; reservoir characterization includes descriptors for fluid/rock, production, and flow rate properties; process characterization includes descriptors for chemical properties, interactions and functions. Reservoir heterogeneity is a principal problem in EOR simulation. It can be overcome within reasonable economic limits by successive orders of descriptors from: microscale (rock), achieved through borehole and core analyses; to macroscale (interwell), achieved through multiple borehole correlations; to megascale (intrasystem), achieved through extrapolation of rock and correlative well data into a generic depositional model that contains a description of internal mass properties within a given external morphology. Volume II contains appendices for: flow chart for surfactant-polymer process simulation; INTERCOMP reports to GURC describing the CFTE simulator program used in this study.

  5. POST2 End-To-End Descent and Landing Simulation for the Autonomous Landing and Hazard Avoidance Technology Project

    NASA Technical Reports Server (NTRS)

    Fisher, Jody L.; Striepe, Scott A.

    2007-01-01

    The Program to Optimize Simulated Trajectories II (POST2) is used as a basis for an end-to-end descent and landing trajectory simulation that is essential in determining the design and performance capability of lunar descent and landing system models and lunar environment models for the Autonomous Landing and Hazard Avoidance Technology (ALHAT) project. This POST2-based ALHAT simulation provides descent and landing simulation capability by integrating lunar environment and lander system models (including terrain, sensor, guidance, navigation, and control models), along with the data necessary to design and operate a landing system for robotic, human, and cargo lunar-landing success. This paper presents the current and planned development and model validation of the POST2-based end-to-end trajectory simulation used for the testing, performance and evaluation of ALHAT project system and models.

  6. Simulation and Projection of Blocking Highs in Key Regions of the Eurasia by CMIP5 Models

    NASA Astrophysics Data System (ADS)

    Li, Y.

    2016-12-01

    Previous studies generally hold the viewpoint that CMIP5 models underestimate blocking frequency and predict a decreasing trend of blocking in the 21st century in the Northern Hemisphere (NH). However, regional blocking has its own features, which differ from those of blocking in the NH as a whole. Focusing on three key regions of Eurasia (Ural, Baikal, and Okhotsk) where blocking significantly influences the weather and climate of East Asia, historical simulations were analyzed to evaluate the performance of the CMIP5 models, and possible changes in the first half of the 21st century were then predicted using the RCP 4.5 and RCP 8.5 pathways. Comparison with NCEP/NCAR reanalysis (NNR) data revealed that instantaneous blocking frequencies are underestimated in the Ural and Baikal regions throughout the year and in Okhotsk in summer, but are overestimated in Okhotsk in winter. Overall, the CMIP5 models largely reproduce the character of instantaneous blocking frequency in Eurasia, with better performance in winter than in summer. Blocking-episode frequency in the Ural and Baikal regions is underestimated by most of the 13 CMIP5 models, especially for short-duration episodes, and is simulated better in winter than in summer. However, modeled blocking-episode frequency is close to the observed value in summer but overestimated in winter in Okhotsk. Model projections of instantaneous blocking frequency for the first half of the 21st century (2016-2065) show that the RCP 4.5 projection yields a significantly increasing frequency during January-May, a decreasing frequency during June-August, and a slightly increasing frequency during September-December. The RCP 8.5 projection presents a similar picture, but with a more pronounced decreasing trend. Blocking-episode frequency in the multi-model ensemble mean clearly decreases in the Ural and Baikal regions (especially for short-duration episodes) and increases slightly in Okhotsk in the first half of the 21st century. For blocking episodes with long duration

  7. To Create Space on Earth: The Space Environment Simulation Laboratory and Project Apollo

    NASA Technical Reports Server (NTRS)

    Walters, Lori C.

    2003-01-01

    Few undertakings in the history of humanity can compare to the great technological achievement known as Project Apollo. Among those who witnessed Armstrong's flickering television image were thousands of people who had directly contributed to this historic moment. Amongst those in this vast anonymous cadre were the personnel of the Space Environment Simulation Laboratory (SESL) at the Manned Spacecraft Center (MSC) in Houston, Texas. SESL houses two large thermal-vacuum chambers with solar simulation capabilities. At a time when NASA engineers had a limited understanding of the effects of extremes of space on hardware and crews, SESL was designed to literally create the conditions of space on Earth. With interior dimensions of 90 feet in height and a 55-foot diameter, Chamber A dwarfed the Apollo command/service module (CSM) it was constructed to test. The chamber's vacuum pumping capacity of 1 × 10⁻⁶ torr can simulate an altitude greater than 130 miles above the Earth. A "lunar plane" capable of rotating a 150,000-pound test vehicle 180 deg replicates the revolution of a craft in space. To reproduce the temperature extremes of space, interior chamber walls cool to -280°F as two banks of carbon arc modules simulate the unfiltered solar light/heat of the Sun. With capabilities similar to that of Chamber A, early Chamber B tests included the Gemini modular maneuvering unit, Apollo EVA mobility unit and the lunar module. Since Gemini astronaut Charles Bassett first ventured into the chamber in 1966, Chamber B has assisted astronauts in testing hardware and preparing them for work in the harsh extremes of space.

  8. Testbed for large volume surveillance through distributed fusion and resource management

    NASA Astrophysics Data System (ADS)

    Valin, Pierre; Guitouni, Adel; Bossé, Éloi; Wehn, Hans; Yates, Richard; Zwick, Harold

    2007-04-01

    DRDC Valcartier has initiated, through a PRECARN partnership project, the development of an advanced simulation testbed for the evaluation of the effectiveness of Network Enabled Operations in a coastal large volume surveillance situation. The main focus of this testbed is to study concepts such as distributed information fusion, dynamic resource and network configuration management, and self-synchronising units and agents. This article presents the requirements, design and first implementation builds, and reports on some preliminary results. The testbed makes it possible to model distributed nodes performing information fusion, dynamic resource management planning and scheduling, as well as configuration management, given multiple constraints on the resources and their communications networks. Two situations are simulated: cooperative and non-cooperative target search. A cooperative surface target behaves in ways to be detected (and rescued), while an elusive target attempts to avoid detection. The current simulation consists of a networked set of surveillance assets including aircraft (UAVs, helicopters, maritime patrol aircraft) and ships. These assets have electro-optical and infrared sensors, and scanning and imaging radar capabilities. Since full data sharing over datalinks is not feasible, own-platform data fusion must be simulated to evaluate implementation and performance of distributed information fusion. A special emphasis is put on higher-level fusion concepts using knowledge-based rules, with level 1 fusion already providing tracks. Surveillance platform behavior is also simulated in order to evaluate different dynamic resource management algorithms. Additionally, communication networks are modeled to simulate different information exchange concepts. The testbed allows the evaluation of a range of control strategies from independent platform search, through various levels of platform collaboration, up to a centralized control of search platforms.

  9. Molecular dynamics simulations of T-2410 and T-2429 HIV fusion inhibitors interacting with model membranes: Insight into peptide behavior, structure and dynamics.

    PubMed

    Mavioso, I C V C; de Andrade, V C R; Palace Carvalho, A J; Martins do Canto, A M T

    2017-09-01

    T-2410 and T-2429 are HIV fusion inhibitor peptides (FI) designed to present a higher efficiency even against HIV strains that developed resistance against other FIs. Similar peptides were shown to interact with model membranes both in the liquid disordered phase and in the liquid ordered state. Those results indicated that such interaction is important to function and could be correlated with their effectiveness. Extensive molecular dynamics simulations were carried out to investigate the interactions between both T-2410 and T-2429 with bilayers of pure 1-palmitoyl-2-oleoyl-phosphatidylcholine (POPC) and a mixture of POPC/cholesterol (Chol) (1:1). It was observed that both peptides interact strongly with both membrane systems, especially with the POPC/Chol systems, where these peptides show the highest number of H-bonds observed so far. T-2410 and T-2429 showed a higher extent of interaction with bilayers when compared to T-20 or T-1249 in previous studies. This is most notable in POPC/Chol membranes where, although able to form H-bonds with Chol, they do so to a lesser extent than T-1249 does, the latter being the only FI peptide so far that was observed to form H-bonds with Chol. This behavior suggests that interaction of FI peptides with rigid Chol-rich membranes may not be as dependent on peptide/Chol H-bond formation as previous results on T-1249 behavior suggested. As in other similar peptides, the higher ability to interact with membranes shown by T-2410 and T-2429 is probably correlated with their higher inhibitory efficiency. Copyright © 2017 Elsevier B.V. All rights reserved.
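
    Hydrogen-bond counts of the kind reported above are usually obtained with a geometric donor-acceptor criterion evaluated frame by frame; the sketch below implements one common distance/angle rule on placeholder coordinates and is not the specific analysis protocol of the study.

      import numpy as np

      def is_hydrogen_bond(donor, hydrogen, acceptor, d_cut=3.5, angle_cut_deg=150.0):
          """Geometric H-bond test: donor-acceptor distance and donor-H...acceptor angle.

          Coordinates are 3-vectors in Angstrom; the cutoffs are one common convention.
          """
          d_da = np.linalg.norm(acceptor - donor)
          v1 = donor - hydrogen
          v2 = acceptor - hydrogen
          cos_angle = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
          angle = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
          return d_da <= d_cut and angle >= angle_cut_deg

      # Toy example: a nearly linear arrangement within the distance cutoff counts as a bond.
      donor = np.array([0.0, 0.0, 0.0])
      hydrogen = np.array([1.0, 0.0, 0.0])
      acceptor = np.array([2.9, 0.1, 0.0])
      print(is_hydrogen_bond(donor, hydrogen, acceptor))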

  10. Sensor fusion for intelligent process control.

    SciTech Connect

    Connors, John J.; Hill, Kevin; Hanekamp, David; Haley, William F.; Gallagher, Robert J.; Gowin, Craig; Farrar, Arthur R.; Sheaffer, Donald A.; DeYoung, Mark A.; Bertram, Lee A.; Dodge, Craig; Binion, Bruce; Walsh, Peter M.; Houf, William G.; Desam, Padmabhushana R.; Tiwary, Rajiv; Stokes, Michael R.; Miller, Alan J.; Michael, Richard W.; Mayer, Raymond M.; Jiao, Yu; Smith, Philip J.; Arbab, Mehran; Hillaire, Robert G.

    2004-08-01

    An integrated system for the fusion of product and process sensors and controls for production of flat glass was envisioned, having as its objective the maximization of throughput and product quality subject to emission limits, furnace refractory wear, and other constraints. Although the project was prematurely terminated, stopping the work short of its goal, the tasks that were completed show the value of the approach and objectives. Though the demonstration was to have been done on a flat glass production line, the approach is applicable to control of production in the other sectors of the glass industry. Furthermore, the system architecture is also applicable in other industries utilizing processes in which product uniformity is determined by ability to control feed composition, mixing, heating and cooling, chemical reactions, and physical processes such as distillation, crystallization, drying, etc. The first phase of the project, with Visteon Automotive Systems as industrial partner, was focused on simulation and control of the glass annealing lehr. That work produced the analysis and computer code that provide the foundation for model-based control of annealing lehrs during steady state operation and through color and thickness changes. In the second phase of the work, with PPG Industries as the industrial partner, the emphasis was on control of temperature and combustion stoichiometry in the melting furnace, to provide a wider operating window, improve product yield, and increase energy efficiency. A program of experiments with the furnace, CFD modeling and simulation, flow measurements, and sensor fusion was undertaken to provide the experimental and theoretical basis for an integrated, model-based control system utilizing the new infrastructure installed at the demonstration site for the purpose. In spite of the fact that the project was terminated during the first year of the second phase of the work, the results of these first steps toward implementation

  11. Fusion reactor systems studies

    NASA Astrophysics Data System (ADS)

    1993-09-01

    Fusion Technology Institute personnel actively participated in the ARIES/PULSAR project during the present contract period. Numerous presentations were made at PULSAR project meetings, major contributions were written for the ARIES-2/4 Final Report presentations, and papers were given at technical conferences. Additionally, contributions were written for the ARIES Lessons Learned report, and a very large number of electronic-mail and regular-mail communications were sent. The remaining sections of this progress report will summarize the work accomplished and in progress for the PULSAR project during the contract period. The main areas of effort are as follows: PULSAR Research; ARIES-2/4 Report Contributions; ARIES Lessons Learned Report Contributions; and Stellarator Study.

  12. Simulation of extreme reservoir level distribution with the SCHADEX method (EXTRAFLO project)

    NASA Astrophysics Data System (ADS)

    Paquet, Emmanuel; Penot, David; Garavaglia, Federico

    2013-04-01

    -to-volume ratios and hydrographs applied to each simulated event. This allows accounting for different flood dynamics, depending on the season, the generating precipitation event, the soil saturation state, etc. In both cases, a hydraulic simulation of dam operation is performed in order to compute the distribution of maximum reservoir levels. Results are detailed for an extreme return level, showing that a 1000-year return level reservoir level can be reached during flood events whose components (peaks, volumes) are not necessarily associated with such a return level. The presentation will be illustrated by the example of a fictitious dam on the Tech River at Reynes (South of France, 477 km²). This study has been carried out within the EXTRAFLO project, Task 8 (https://extraflo.cemagref.fr/). References: Paquet, E., Gailhard, J. and Garçon, R. (2006), Evolution of the GRADEX method: improvement by atmospheric circulation classification and hydrological modeling, La Houille Blanche, 5, 80-90. doi:10.1051/lhb:2006091. Paquet, E., Garavaglia, F., Garçon, R. and Gailhard, J. (2012), The SCHADEX method: a semi-continuous rainfall-runoff simulation for extreme flood estimation, Journal of Hydrology, under revision.
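
    The distribution of maximum reservoir levels described above is typically summarized by empirical return periods assigned to the simulated maxima; a minimal sketch of that post-processing step on synthetic levels is given below (the plotting-position convention and all numbers are illustrative, not the SCHADEX implementation).

      import numpy as np

      rng = np.random.default_rng(0)

      # Synthetic maximum reservoir levels (m) for a large set of simulated "years".
      annual_max_level = rng.gumbel(loc=500.0, scale=1.5, size=20_000)

      # Weibull plotting positions: return period T = (n + 1) / rank of each sorted maximum.
      sorted_levels = np.sort(annual_max_level)[::-1]          # descending
      ranks = np.arange(1, sorted_levels.size + 1)
      return_periods = (sorted_levels.size + 1) / ranks

      # Level closest to the 1000-year return period.
      idx = np.argmin(np.abs(return_periods - 1000.0))
      print(f"~1000-year reservoir level: {sorted_levels[idx]:.2f} m")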

  13. Simulator Network Project Report: A tool for improvement of teaching materials and targeted resource usage in Skills Labs

    PubMed Central

    Damanakis, Alexander; Blaum, Wolf E.; Stosch, Christoph; Lauener, Hansjörg; Richter, Sabine; Schnabel, Kai P.

    2013-01-01

    During the last decade, medical education in the