Science.gov

Sample records for fusion simulation project

  1. AI/Simulation Fusion Project at Lawrence Livermore National Laboratory

    SciTech Connect

    Erickson, S.A.

    1984-04-25

    This presentation first discusses the motivation for the AI/Simulation Fusion project. After briefly discussing what expert systems and object-oriented languages are in general, along with some observed features of typical combat simulations, it explains why putting artificial intelligence and combat simulation together makes sense. We then describe the first demonstration goal for this fusion project.

  2. Integrated Simulation and Optimization of Fusion Systems: the Fusion Simulation Project

    NASA Astrophysics Data System (ADS)

    Batchelor, Donald B.

    2004-05-01

    Advanced experimental devices for fusion energy research are very large, in the $1B class, the next major step being construction of ITER, a tokamak device capable of producing several hundred megawatts of fusion power. The plasmas in such devices are extremely far from thermal equilibrium and support a vast number of physical processes that must be controlled and coordinated to successfully achieve the conditions required for fusion. Simulation is a key element in the research program needed to understand experimental results from devices and compare these results to theory, to plan and design experiments on the devices, and to invent and evaluate new, higher-performing confinement concepts. There are a number of fundamental computational challenges in such simulation: extreme range of time scales - wall equilibration time/electron cyclotron time O(10^14); extreme range of space scales - machine radius/electron gyroradius O(10^4); extreme plasma anisotropy - parallel/perpendicular mean free path in the magnetic field O(10^10); strong nonlinear coupling; sensitivity to geometric details; and high dimensionality. To deal with this challenge, several classes of fusion physics sub-disciplines and related simulation codes have been developed. There is not at present a single code, or code set, that integrates these sub-disciplines in their generality. The talk will describe the various approaches to fusion plasma simulation and progress toward bringing together the various models so as to treat the plasma more self-consistently. In particular, the fusion community is planning a comprehensive Fusion Simulation Project (FSP) whose ultimate goal (~15 years) is to predict reliably the behavior of plasma discharges in toroidal magnetic fusion devices on all relevant time and space scales.
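
    As a rough illustration of how the quoted scale separations arise, the Python sketch below evaluates the electron cyclotron period and thermal gyroradius for assumed ITER-like parameters (B = 5.3 T, Te = 10 keV, R = 6.2 m, and a nominal 100 s equilibration time; these values are illustrative and not taken from the abstract). The resulting ratios land within an order of magnitude of the figures quoted above, with the exact exponents depending on the parameters chosen.

      import math

      # Illustrative ITER-like parameters (assumed here, not from the abstract)
      B = 5.3                   # toroidal field [T]
      T_e = 10e3 * 1.602e-19    # electron temperature, 10 keV [J]
      R = 6.2                   # machine major radius [m]
      tau_wall = 100.0          # assumed wall/profile equilibration time [s]

      m_e, q_e = 9.109e-31, 1.602e-19   # electron mass [kg], charge [C]

      omega_ce = q_e * B / m_e                  # electron cyclotron frequency [rad/s]
      tau_ce = 2 * math.pi / omega_ce           # electron cyclotron period [s]
      rho_e = math.sqrt(T_e / m_e) / omega_ce   # thermal electron gyroradius [m]

      print(f"tau_wall / tau_ce ~ {tau_wall / tau_ce:.1e}")   # ~1e13
      print(f"R / rho_e         ~ {R / rho_e:.1e}")           # ~1e5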

  3. Report of the Fusion Simulation Project Steering Committee

    NASA Astrophysics Data System (ADS)

    Post, Douglass E.; Batchelor, Donald B.; Bramley, Randall B.; Cary, John R.; Cohen, Ronald H.; Colella, Phillip; Jardin, Steven C.

    2004-03-01

    The Fusion Simulation Project (FSP) is envisioned as a 15-year, $20M/year multi-institutional project to develop a comprehensive simulation capability for magnetic fusion experiments with a focus on the International Thermonuclear Experimental Reactor (ITER). The FSP would be able to contribute to design decisions, experimental planning and performance optimization for ITER, substantially increasing ITER's likelihood of success and its value to the US Fusion Program. The FSP would be jointly supported by the DOE Office of Fusion Energy Sciences and the DOE Office of Advanced Scientific Computing Research. The potential for developing this simulation capability rests on the exponential growth of computer power over the last 50 years, the progress in physics understanding developed by the international fusion program, and the continued progress in computational mathematics that enables the use of the new "ultra-scale" computers to solve difficult mathematical problems. The initial concept for the FSP was developed by the Fusion Energy Sciences Advisory Committee Integrated Simulation and Optimization of Fusion Systems Subcommittee (J. Dahlburg, J. Corones, et al., J. Fusion Energy 20(4), 135-196). The DOE asked the FSP Steering Committee to develop a project vision, a governance concept and a roadmap for the FSP. The Committee recommends that the FSP consist of three elements: a production component, a research and integration component, and a software infrastructure component. The key challenge is developing components that bridge the enormous distance and time scales involved with the disparate physics elements of tokamak performance. The Committee recommends that this challenge be met through "Focused Integration Initiatives" that would first seek to integrate different physics packages with disparate distance and time scales. An example is the integration of Radio Frequency (RF) Current Drive and Magnetohydrodynamics (MHD) components to produce an integrated

  4. Scientific and computational challenges of the fusion simulation project (FSP)

    NASA Astrophysics Data System (ADS)

    Tang, W. M.

    2008-07-01

    This paper highlights the scientific and computational challenges facing the Fusion Simulation Project (FSP). The primary objective is to develop advanced software designed to use leadership-class computers for carrying out multiscale physics simulations, providing information vital to delivering a realistic integrated fusion simulation model with unprecedented physics fidelity. This multiphysics capability will be unprecedented in that, in the current fusion energy sciences (FES) applications domain, the largest-scale codes carry out first-principles simulations of mostly individual phenomena in realistic 3D geometry, while the integrated models are much smaller-scale, lower-dimensionality codes with significant empirical elements used for modeling and designing experiments. The FSP is expected to be the most up-to-date embodiment of the theoretical and experimental understanding of magnetically confined thermonuclear plasmas and to provide a living framework for the simulation of such plasmas as the associated physics understanding continues to advance over the next several decades. Substantive progress on answering the outstanding scientific questions in the field will drive the FSP toward its ultimate goal of developing a reliable ability to predict the behavior of plasma discharges in toroidal magnetic fusion devices on all relevant time and space scales. From a computational perspective, the fusion energy science goal of producing high-fidelity, whole-device modeling capabilities will demand computing resources in the petascale range and beyond, together with the associated multicore algorithmic formulation needed to address burning plasma issues relevant to ITER, a multibillion-dollar collaborative device involving seven international partners representing over half the world's population. Even more powerful exascale platforms will be needed to meet the future challenges of designing a demonstration fusion reactor (DEMO). Analogous to other major applied

  5. Fusion Simulation Project. Workshop Sponsored by the U.S. Department of Energy, Rockville, MD, May 16-18, 2007

    SciTech Connect

    Kritz, A.; Keyes, D.

    2007-05-18

    The mission of the Fusion Simulation Project is to develop a predictive capability for the integrated modeling of magnetically confined plasmas. This FSP report adds to the previous activities that defined an approach to integrated modeling in magnetic fusion. These previous activities included a Fusion Energy Sciences Advisory Committee panel that was charged to study integrated simulation in 2002. The report of that panel [Journal of Fusion Energy 20, 135 (2001)] recommended the prompt initiation of a Fusion Simulation Project. In 2003, the Office of Fusion Energy Sciences formed a steering committee that developed a project vision, roadmap, and governance concepts [Journal of Fusion Energy 23, 1 (2004)]. The current FSP planning effort involved forty-six physicists, applied mathematicians, and computer scientists from twenty-one institutions, formed into four panels and a coordinating committee. These panels were constituted to consider: Status of Physics Components; Required Computational and Applied Mathematics Tools; Integration and Management of Code Components; and Project Structure and Management. The ideas reported here are the products of these panels, working together over several months and culminating in a three-day workshop in May 2007.

  6. COLLABORATIVE: FUSION SIMULATION PROGRAM

    SciTech Connect

    Chang, Choong Seock

    2012-06-05

    New York University, Courant Institute of Mathematical Sciences, participated in the “Fusion Simulation Program (FSP) Planning Activities” [http://www.pppl.gov/fsp], with C.S. Chang as the institutional PI. FSP’s mission was to enable scientific discovery of important new plasma phenomena, with associated understanding that emerges only upon integration. This requires developing a predictive integrated simulation capability for magnetically confined fusion plasmas, properly validated against experiments in regimes relevant to producing practical fusion energy. The specific institutional goal of New York University was to participate in the planning of the integrated edge simulation, with emphasis on the use of large-scale HPC systems, in connection with the SciDAC CPES project that the PI was leading. New York University successfully completed its mission by participating in the various planning activities, including the edge physics integration, the edge science drivers, and the mathematical verification. The activity resulted in the combined report that can be found at http://www.pppl.gov/fsp/Overview.html. Participation and presentations as part of this project are listed in a separate file.

  7. Lessons Learned from ASCI applied to the Fusion Simulation Project (FSP)

    NASA Astrophysics Data System (ADS)

    Post, Douglass

    2003-10-01

    The magnetic fusion program has proposed a $20M-per-year project to develop a computational predictive capability for magnetic fusion experiments. The DOE NNSA launched a program in 1996, the Accelerated Strategic Computing Initiative (ASCI), to achieve the same goal for nuclear weapons, allowing certification of the stockpile without testing. We present a "lessons learned" analysis of the $3B, 7-year ASCI program with the goal of improving the FSP to maximize the likelihood of success. The major lessons from ASCI are: 1. Build on your institution's successful history; 2. Teams are the key element; 3. Sound software project management is essential; 4. Requirements, schedule and resources must be consistent; 5. Practices, not processes, are important; 6. Minimize and mitigate risks; 7. Minimize the computer science research aspect and maximize the physics elements; and 8. Verification and validation are essential. We map this experience and these recommendations onto the FSP.

  8. Fusion processor simulation (FPSim)

    NASA Astrophysics Data System (ADS)

    Barnell, Mark D.; Wynne, Douglas G.; Rahn, Brian J.

    1998-07-01

    The Fusion Processor Simulation (FPSim) is being developed by Rome Laboratory to support the Discrimination Interceptor Technology Program (DITP) and the Advanced Sensor Technology Program (ASTP) of the Ballistic Missile Defense Organization. The purpose of the FPSim is to serve as a test bed and evaluation tool for establishing the feasibility of achieving threat engagement timelines. The FPSim supports the integration, evaluation, and demonstration of different strategies, system concepts, and Acquisition, Tracking and Pointing (ATP) subsystems and components. The environment comprises a simulation capability within which users can integrate and test their application software models, algorithms and databases. The FPSim must evolve as algorithm developments mature to support independent evaluation of contractor designs and the integration of a number of fusion processor subsystem technologies. To accomplish this, the simulation contains validated modules, databases, and simulations. It possesses standardized engagement scenarios, architectures and subsystem interfaces, and provides a hardware and software framework flexible enough to support growth, reconfiguration, and simulation component modification and insertion. Key user interaction features include: (1) visualization of platform status through displays of the surveillance scene as seen by imaging sensors; (2) user-selectable data analysis and graphics display during simulation execution as well as during post-simulation analysis; and (3) automated, graphical tools that permit the user to reconfigure the FPSim, i.e., to 'plug and play' various model/software modules. The FPSim is capable of hosting and executing users' image-processing and signal-processing algorithms, subsystem models, and functions for evaluation purposes.

  9. Advanced fusion concepts: project summaries

    SciTech Connect

    1980-12-01

    This report contains descriptions of the activities of all the projects supported by the Advanced Fusion Concepts Branch of the Office of Fusion Energy, US Department of Energy. These descriptions are project summaries of each of the individual projects and contain the following: title, principal investigators, funding levels, purpose, approach, progress, plans, milestones, graduate students, graduates, other professional staff, and recent publications. Information is given for each of the following programs: (1) reverse-field pinch, (2) compact toroid, (3) alternate fuel/multipoles, (4) stellarator/torsatron, (5) linear magnetic fusion, (6) liners, and (7) Tormac.

  10. Fusion Simulation Program

    SciTech Connect

    Project Staff

    2012-02-29

    Under this project, General Atomics (GA) was tasked to develop the experimental validation plans for two high-priority ISAs, Boundary and Pedestal and Whole Device Modeling, in collaboration with the theory, simulation and experimental communities. The following sections have been incorporated into the final FSP Program Plan (www.pppl.gov/fsp), which was delivered to the US Department of Energy (DOE). Additional deliverables by GA include guidance for validation, development of metrics to evaluate success, and procedures for collaboration with experiments. These are also part of the final report.

  11. Simulation Science for Fusion Plasmas

    NASA Astrophysics Data System (ADS)

    Skoric, M. M.; Sudo, S.

    2008-07-01

    The world fusion effort has recently entered a new age with the construction of ITER in Cadarache, France, which will be the first magnetic confinement fusion plasma experiment dominated by the self-heating of fusion reactions. In order to operate and control burning plasmas and future demo fusion reactors, an advanced capability for comprehensive computer simulations that are fully verified and validated against experimental data will be necessary. The ultimate goal is to develop the capability to predict reliably the behavior of plasmas in toroidal magnetic confinement devices on all relevant time and space scales. In addition to developing sophisticated integrated simulation codes, directed advanced research in fusion physics, applied mathematics and computer science is envisaged. In this talk we review the basic strategy and main research efforts at the Department of Simulation Science of the National Institute for Fusion Science (NIFS), which is the Inter-University Institute and the coordinating Center of Excellence for academic fusion research in Japan. We give an overview of simulation research at NIFS, in particular in relation to experiments in the Large Helical Device (LHD), the world's largest superconducting heliotron device, operated as a National Users' facility (see Motojima et al. 2003). Our main goal is understanding and systematizing the rich hierarchy of physical mechanisms in fusion plasmas, supported by exploring a basic science of the complexity of plasma as a highly nonlinear, non-equilibrium, open system. The aim is to establish simulation science as a new interdisciplinary field by fostering collaborative research utilizing large-scale supercomputer simulators. A concept of hierarchy-renormalized simulation modelling will be invoked en route toward the LHD numerical test reactor. Finally, a perspective is given on the ITER Broad Approach program at the Rokkasho Center, as an integrated part of the ITER and Development of Fusion Energy Agreement.

  12. Simulation science for fusion plasmas

    NASA Astrophysics Data System (ADS)

    Sudo, S.; Škorić, M. M.; Watanabe, T.-H.; Todo, Y.; Ishizawa, A.; Miura, H.; Ishizaki, R.; Ito, A.; Ohtani, H.; Usami, S.; Nakamura, H.; Ito, Atsushi; Ishiguro, S.; Tomita, Y.; Takayama, A.; Sato, M.; Yamamoto, T.; Den, M.; Sakagami, H.; Horiuchi, R.; Okamura, S.; Nakajima, N.

    2008-10-01

    The world fusion effort has embarked on a new age with the construction of ITER in Cadarache, France, which will be the first magnetic confinement fusion plasma experiment dominated by the self-heating of fusion reactions. In order to operate and control burning plasmas and next-generation demo fusion reactors, an advanced capability for comprehensive integrated computer simulations that are fully verified and validated against experimental data will be necessary. The ultimate goal is to predict reliably the behaviour of plasmas in toroidal magnetic confinement devices on all relevant scales, both in time and space. In addition to developing sophisticated integrated simulation codes, directed advanced research in fusion physics, applied mathematics, computer science and software is envisaged. In this paper we review the basic strategy and main research efforts at the Department of Simulation Science of the National Institute for Fusion Science (NIFS), which is the Inter-University Institute and the coordinating Center of Excellence for academic fusion research in Japan. We give an overview of simulation research at NIFS, in particular in relation to experiments in the Large Helical Device (LHD), the world's largest superconducting heliotron device, operated as a National Users' facility (see Motojima et al. [1]). Our main goal is understanding and systematizing the rich hierarchy of physical mechanisms in fusion plasmas, supported by exploring a basic science of the complexity of plasma as a highly nonlinear, non-equilibrium, open system. The aim is to establish simulation science as a new interdisciplinary field by fostering collaborative research utilizing large-scale supercomputer simulators. A concept of hierarchy-renormalized simulation modelling will be invoked en route toward the LHD numerical test reactor.

  13. Integrated simulation and modeling capability for alternate magnetic fusion concepts

    SciTech Connect

    Cohen, B. I.; Hooper, E.B.; Jarboe, T. R.; LoDestro, L. L.; Pearlstein, L. D.; Prager, S. C.; Sarff, J. S.

    1998-11-03

    This document summarizes a strategic study addressing the development of a comprehensive modeling and simulation capability for magnetic fusion experiments, with particular emphasis on devices that are alternatives to the mainline tokamak. A code development project in this area supports two defined strategic thrust areas in the Magnetic Fusion Energy Program: (1) comprehensive simulation and modeling of magnetic fusion experiments and (2) development, operation, and modeling of magnetic fusion alternate-concept experiments.

  14. Simulation of Fusion Plasmas

    ScienceCinema

    Holland, Chris [UC San Diego, San Diego, California, United States]

    2010-01-08

    The upcoming ITER experiment (www.iter.org) represents the next major milestone in realizing the promise of using nuclear fusion as a commercial energy source, by moving into the "burning plasma" regime where the dominant heat source is the internal fusion reactions. As part of its support for the ITER mission, the US fusion community is actively developing validated predictive models of the behavior of magnetically confined plasmas. In this talk, I will describe how the plasma community is using the latest high performance computing facilities to develop and refine our models of the nonlinear, multiscale plasma dynamics, and how recent advances in experimental diagnostics are allowing us to directly test and validate these models at an unprecedented level.

  15. Fusion Simulation Program Definition. Final report

    SciTech Connect

    Cary, John R.

    2012-09-05

    We have completed our contributions to the Fusion Simulation Program Definition Project. Our contributions were to the overall planning, with concentration on defining the area of Software Integration and Support. We contributed to the planning of multiple meetings and to multiple planning documents.

  16. Fusion Plasma Theory project summaries

    SciTech Connect

    Not Available

    1993-10-01

    This Project Summary book is a published compilation of short descriptions of each project supported by the Fusion Plasma Theory and Computing Group of the Advanced Physics and Technology Division of the Department of Energy, Office of Fusion Energy. The summaries contained in this volume were written by the individual contractors with minimal editing by the Office of Fusion Energy. Previous summaries were published in February 1982 and December 1987. The Plasma Theory program is responsible for the development of concepts and models that describe and predict the behavior of a magnetically confined plasma. Emphasis is given to modelling and understanding the processes controlling transport of energy and particles in a toroidal plasma, and to supporting the design of the International Thermonuclear Experimental Reactor (ITER). A tokamak transport initiative was begun in 1989 to improve understanding of how energy and particles are lost from the plasma by mechanisms that transport them across field lines. The Plasma Theory program has actively participated in this initiative. Recently, increased attention has been given to issues of importance to the proposed Tokamak Physics Experiment (TPX). Particular attention has been paid to containment and thermalization of fast alpha particles produced in a burning fusion plasma, as well as control of sawteeth, current drive, impurity control, and design of improved auxiliary heating. In addition, general models of plasma behavior are developed from physics features common to different confinement geometries. This work uses both analytical and numerical techniques. The Fusion Theory program supports research projects at US government laboratories, universities and industrial contractors. Its support of theoretical work at universities contributes to the Office of Fusion Energy mission of training scientific manpower for the US Fusion Energy Program.

  17. SECAD-- a Schema-based Environment for Configuring, Analyzing and Documenting Integrated Fusion Simulations. Final report

    SciTech Connect

    Shasharina, Svetlana

    2012-05-23

    SECAD is a project that developed a GUI for running integrated fusion simulations as implemented in the FACETS and SWIM SciDAC projects. Using the GUI, users can submit simulations locally and remotely and visualize the simulation results.

  18. Project Icarus: Nuclear Fusion Propulsion Concept Comparison

    NASA Astrophysics Data System (ADS)

    Stanic, M.

    Project Icarus will use nuclear fusion as the primary propulsion, since achieving breakeven is anticipated within the next decade; fusion technology therefore offers confidence in further development and fairly high technological maturity by the time the Icarus mission would be plausible. Currently there are numerous (over two dozen) different fusion approaches being developed simultaneously around the world, and it is difficult to predict which of the concepts will be the most successful. This study tried to estimate the current technological maturity and the possible technological extrapolation of the fusion approaches for which appropriate data could be found. Figures of merit that were assessed include: current technological state, mass and volume estimates, possible gain values, and the main advantages and disadvantages of each concept, together with an attempt to extrapolate the current technological state over the next decade or two. The analysis suggests that Magnetic Confinement Fusion (MCF) concepts are not likely to deliver sufficient performance due to the size, mass, gain, and large technological barriers of the concept. However, Inertial Confinement Fusion (ICF) and Plasma Jet driven Magneto-Inertial Fusion (PJMIF) did show potential for delivering the necessary performance, assuming appropriate technological advances. This paper is a submission of the Project Icarus Study Group.

  19. Plasma simulation and fusion calculation

    NASA Astrophysics Data System (ADS)

    Buzbee, B. L.

    Particle-in-cell (PIC) models are widely used in fusion studies associated with energy research and in certain fluid dynamical studies. Parallel computation is relevant to them because (1) PIC models are not amenable to extensive vectorization - about 50% of the total computation is vectorized in the average model; (2) the volume of data processed by PIC models typically necessitates use of secondary storage, with an attendant requirement for high-speed I/O; and (3) PIC models exist today whose implementation requires a computer 10 to 100 times faster than the Cray-1. Parallel formulation of PIC models for master/slave architectures and ring architectures is discussed. Because interprocessor communication is a decisive factor in the overall efficiency of a parallel system, the division of these models into large granules that can be executed in parallel with relatively little need for communication is shown. Measurements of speedup obtained from experiments on the UNIVAC 1100/84 and the Denelcor HEP are also reported.
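
    The quoted ~50% vectorizable fraction is precisely where Amdahl's law bites. A quick illustrative evaluation (ours, not the paper's) shows that even an arbitrarily fast vector unit cannot deliver a 2x overall speedup at that fraction, which is why parallel formulations of PIC codes were attractive:

      def amdahl_speedup(f, s):
          """Overall speedup when a fraction f of the work is sped up by factor s."""
          return 1.0 / ((1.0 - f) + f / s)

      f = 0.5  # vectorizable fraction quoted in the abstract
      for s in (2, 10, 100, 1e9):
          print(f"vector speedup {s:>12}: overall {amdahl_speedup(f, s):.2f}x")
      # The result approaches 2x as s grows: the serial half dominates, so
      # additional processors (parallelism), not faster vectors, are needed.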

  1. Plasma simulation and fusion calculation

    SciTech Connect

    Buzbee, B.L.

    1983-01-01

    Particle-in-cell (PIC) models are widely used in fusion studies associated with energy research. They are also used in certain fluid dynamical studies. Parallel computation is relevant to them because (1) PIC models are not amenable to extensive vectorization - about 50% of the total computation can be vectorized in the average model; (2) the volume of data processed by PIC models typically necessitates use of secondary storage, with an attendant requirement for high-speed I/O; and (3) PIC models exist today whose implementation requires a computer 10 to 100 times faster than the Cray-1. This paper discusses the parallel formulation of PIC models for master/slave architectures and ring architectures. Because interprocessor communication can be a decisive factor in the overall efficiency of a parallel system, we show how to divide these models into large granules that can be executed in parallel with relatively little need for communication. We also report measurements of speedup obtained from experiments on the UNIVAC 1100/84 and the Denelcor HEP.
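
    The "large granule" idea can be sketched in a few lines. The toy below is an illustration under stated assumptions, not the paper's actual formulation (the field solve and charge deposition are omitted): a 1-D periodic domain is split into one spatial slab per worker, particles are pushed locally, and only the particles that cross slab boundaries are communicated.

      import random

      DOMAIN, N_GRANULES, DT = 1.0, 4, 1e-3
      width = DOMAIN / N_GRANULES

      # One particle list ("granule") per spatial slab
      granules = [[] for _ in range(N_GRANULES)]
      for _ in range(10_000):
          x, v = random.random(), random.gauss(0.0, 1.0)
          granules[int(x / width)].append((x, v))

      def step(granules):
          """Push particles locally, then exchange only the boundary crossers."""
          outgoing = [[] for _ in range(N_GRANULES)]
          for g, parts in enumerate(granules):
              kept = []
              for x, v in parts:
                  x = (x + v * DT) % DOMAIN        # free-streaming push
                  dest = int(x / width)
                  (kept if dest == g else outgoing[dest]).append((x, v))
              granules[g] = kept
          for g in range(N_GRANULES):              # the only communication step
              granules[g].extend(outgoing[g])

      step(granules)
      print([len(p) for p in granules])            # loads stay roughly balanced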

  2. Scientific and Computational Challenges of the Fusion Simulation Program (FSP)

    SciTech Connect

    William M. Tang

    2011-02-09

    This paper highlights the scientific and computational challenges facing the Fusion Simulation Program (FSP), a major national initiative in the United States whose primary objective is to enable scientific discovery of important new plasma phenomena with associated understanding that emerges only upon integration. This requires developing a predictive integrated simulation capability for magnetically confined fusion plasmas that is properly validated against experiments in regimes relevant for producing practical fusion energy. It is expected to provide a suite of advanced modeling tools for reliably predicting fusion device behavior with comprehensive and targeted science-based simulations of nonlinearly coupled phenomena in the core plasma, edge plasma, and wall region on time and space scales required for fusion energy production. As such, it will strive to embody the most current theoretical and experimental understanding of magnetic fusion plasmas and to provide a living framework for the simulation of such plasmas as the associated physics understanding continues to advance over the next several decades. Substantive progress on answering the outstanding scientific questions in the field will drive the FSP toward its ultimate goal of developing the ability to predict the behavior of plasma discharges in toroidal magnetic fusion devices with high physics fidelity on all relevant time and space scales. From a computational perspective, this will demand computing resources in the petascale range and beyond, together with the associated multi-core algorithmic formulation needed to address burning plasma issues relevant to ITER, a multibillion-dollar collaborative experiment involving seven international partners representing over half the world's population. Even more powerful exascale platforms will be needed to meet the future challenges of designing a demonstration fusion reactor (DEMO). Analogous to other major applied physics modeling projects

  3. Particle simulation of transport in fusion devices

    SciTech Connect

    Procassini, R.J.; Birdsall, C.K.; Morse, E.C. (Electronics Research Lab.); Cohen, B.I.

    1989-10-17

    Our research in the area of transport processes in fusion devices has recently been centered on the development of particle simulation models of transport in the scrape-off layer (SOL) of a diverted tokamak. As part of this research, we have been involved in the development of a suitable boundary condition for the plasma current at a floating plate that allows use of long time- and space-scale implicit simulation techniques. We have also been involved in a comparison of results from our particle-in-cell (PIC) code and a bounce-averaged Fokker-Planck (FP) code for the study of particle confinement in an auxiliary-heated mirror plasma.

  4. Purdue Contribution of Fusion Simulation Program

    SciTech Connect

    Jeffrey Brooks

    2011-09-30

    The overall science goal of the FSP is to develop predictive simulation capability for magnetically confined fusion plasmas at an unprecedented level of integration and fidelity. This will directly support and enable effective U.S. participation in research related to the International Thermonuclear Experimental Reactor (ITER) and the overall mission of delivering practical fusion energy. The FSP will address a rich set of scientific issues together with experimental programs, producing validated integrated physics results. This is very well aligned with the mission of the ITER Organization to coordinate with its members the integrated modeling and control of fusion plasmas, including benchmarking and validation activities [1]. Initial FSP research will focus on two critical areas: (1) the plasma edge and (2) whole device modeling, including disruption avoidance. The first of these problems involves the narrow plasma boundary layer and its complex interactions with the plasma core and the surrounding material wall. The second requires development of a computationally tractable but comprehensive model that describes all equilibrium and dynamic processes at a sufficient level of detail to provide useful prediction of the temporal evolution of fusion plasma experiments. The initial driver for the whole device model (WDM) will be prediction and avoidance of discharge-terminating disruptions, especially at high performance, which are a critical impediment to successful operation of machines like ITER. If disruptions cannot be avoided, their associated dynamics and effects will be addressed in the next phase of the FSP. The FSP plan targets the needed modeling capabilities by developing Integrated Science Applications (ISAs) specific to those needs. The Pedestal-Boundary model will include boundary magnetic topology, cross-field transport of multi-species plasmas, parallel plasma transport, neutral transport, atomic physics and interactions with the plasma wall

  5. EFFIS: an End-to-end Framework for Fusion Integrated Simulation

    SciTech Connect

    Cummings, Julian; Schwan, Karsten; Sim, Alexander S; Shoshani, Arie; Docan, Ciprian; Parashar, Manish; Klasky, Scott A; Podhorszki, Norbert

    2010-01-01

    The purpose of the Fusion Simulation Project is to develop a predictive capability for integrated modeling of magnetically confined burning plasmas. In support of this mission, the Center for Plasma Edge Simulation has developed an End-to-end Framework for Fusion Integrated Simulation (EFFIS) that combines critical computer science technologies in an effective manner to support leadership class computing and the coupling of complex plasma physics models. We describe here the main components of EFFIS and how they are being utilized to address our goal of integrated predictive plasma edge simulation.

  6. Stochastic Fusion Simulations and Experiments Suggest Passive and Active Roles of Hemagglutinin during Membrane Fusion

    PubMed Central

    Lee, Donald W.; Thapar, Vikram; Clancy, Paulette; Daniel, Susan

    2014-01-01

    Influenza enters the host cell cytoplasm by fusing the viral and host membrane together. Fusion is mediated by hemagglutinin (HA) trimers that undergo conformational change when acidified in the endosome. It is currently debated how many HA trimers, w, and how many conformationally changed HA trimers, q, are minimally required for fusion. Conclusions vary because there are three common approaches for determining w and q from fusion data. One approach correlates the fusion rate with the fraction of fusogenic HA trimers and leads to the conclusion that one HA trimer is required for fusion. A second approach correlates the fusion rate with the total concentration of fusogenic HA trimers and indicates that more than one HA trimer is required. A third approach applies statistical models to fusion rate data obtained at a single HA density to establish w or q and suggests that more than one HA trimer is required. In this work, all three approaches are investigated through stochastic fusion simulations and experiments to elucidate the roles of HA and its ability to bend the target membrane during fusion. We find that the apparent discrepancies among the results from the various approaches may be resolved if nonfusogenic HA participates in fusion through interactions with a fusogenic HA. Our results, based on H3 and H1 serotypes, suggest that three adjacent HA trimers and one conformationally changed HA trimer are minimally required to induce membrane fusion (w = 3 and q = 1). PMID:24559987
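
    To make the statistical reasoning concrete, the sketch below simulates the waiting time until q of the w HA trimers at a contact site have undergone their conformational change, assuming independent exponential change times. The rate and the (w, q) pairs are illustrative placeholders, not the paper's fitted values; the point is only how strongly the delay statistics depend on w and q.

      import random
      import statistics

      def fusion_delay(w, q, rate=1.0):
          """Time until q of w trimers have changed conformation
          (the q-th order statistic of w independent exponentials)."""
          times = sorted(random.expovariate(rate) for _ in range(w))
          return times[q - 1]

      for w, q in [(1, 1), (3, 1), (3, 3), (6, 3)]:
          sample = [fusion_delay(w, q) for _ in range(100_000)]
          print(f"w={w}, q={q}: mean delay {statistics.mean(sample):.3f}")
      # More available trimers (larger w) shorten the delay for fixed q;
      # requiring more changed trimers (larger q) lengthens it.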

  7. Research on data fusion in ballistic warning simulation system

    NASA Astrophysics Data System (ADS)

    Cai, Zhihao; Zheng, Hongtao; Peng, Xiaoyuan

    2006-11-01

    Different kinds of sensors distributed in space, in the air, on the ground, and at sea in a ballistic missile warning system are modeled using an object-oriented method. Since each kind of sensor has distinct merits and demerits, detection precision can be greatly enhanced using proper data fusion methods. A battlefield simulation environment is constructed based on HLA/RTI, which provides a flexible interface to evaluate diverse sensors (mainly infrared sensors and phased-array radar) and different data fusion algorithms for ballistic missile defense. The data fusion simulation system can also be reused in a computer-generated force system to perform larger-scale campaign simulation.
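
    A minimal illustration of why fusing sensors with distinct merits helps (a generic textbook combination, not the paper's algorithm): inverse-variance weighting of independent unbiased estimates always yields a fused variance smaller than that of the best single sensor.

      import numpy as np

      def fuse(estimates, variances):
          """Inverse-variance weighted fusion of independent unbiased estimates."""
          w = 1.0 / np.asarray(variances, dtype=float)
          return np.sum(w * np.asarray(estimates)) / np.sum(w), 1.0 / np.sum(w)

      # Illustrative numbers: a noisy IR-derived position estimate and a more
      # precise phased-array radar estimate of the same quantity.
      est, var = fuse([102.3, 100.1], [4.0, 1.0])
      print(f"fused estimate {est:.2f}, variance {var:.2f}")   # variance 0.80 < 1.0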

  8. Quality assurance in the Antares Laser Fusion Construction Project

    NASA Astrophysics Data System (ADS)

    Reichelt, W. H.

    The Antares CO2 laser facility came on line in November 1983 as an experimental physics facility; it is the world's largest CO2 laser fusion system. Antares is a major component of the Department of Energy's Inertial Confinement Fusion Program. Antares is a one-of-a-kind laser system used in an experimental environment. Given limited project funds and tight schedules, the quality assurance program was tailored to achieve project goals without imposing oppressive constraints. The discussion reviews the Antares quality assurance program and the utility of its various portions to the completion of the project.

  9. Progress and Future Directions in Confined Magnetic Fusion Simulation

    NASA Astrophysics Data System (ADS)

    Chan, V. S.

    2004-05-01

    The complexity of fusion plasmas makes the goal of integrated predictive simulation for optimization of fusion systems extremely challenging. Sophisticated computational models are under development for individual features of magnetically confined plasmas, enabled by increased scientific understanding and improvements in computer technology. Simulation codes, particle- and continuum-based, are being developed to elucidate the ability of fusion devices to contain mass, heat and momentum. Rigorous benchmarking among different codes has resulted in increased confidence in the predictive capability. Advances made in extended MHD simulations of actual experiments have led to deeper understanding of the nonlinear evolution of MHD instabilities that set the pressure limit of fusion devices. Simulation of the plasma edge, which controls the overall fusion performance, is especially difficult due to the wide range of spatial and temporal scales involved, as well as the need for a physics model that accurately describes collisionless and collisional plasma. We highlight encouraging progress in plasma microturbulence and extended MHD and a new challenge in simulation of the plasma edge.

  10. Basic plasma and fusion theory and computer simulations survey

    SciTech Connect

    Kawakami, I.; Nishikawa, K.

    1983-12-01

    The College of Science and Technology at Nihon University and the Institute for Fusion Theory at Hiroshima University discuss the history of the role of theory and simulation in fusion-oriented research. Recent activities include a one-dimensional tokamak transport code at Nagoya University and three-dimensional resistive MHD simulation studies of spheromaks. Other recent activities discussed include the tokamak computer code system TRITON, transport flux in currentless ECH-produced plasma in Heliotron-E, and thermal electron transport in the presence of a steep temperature gradient. The Japan-U.S. Joint Institute for Fusion Theory's present activities are discussed, including subject areas in three-dimensional simulation studies, nonequilibrium statistical physics, anomalous transport and drift wave turbulence, and hot-electron physics.

  11. Dynamics of cell aggregates fusion: Experiments and simulations

    NASA Astrophysics Data System (ADS)

    Thomas, Gilberto L.; Mironov, Vladimir; Nagy-Mehez, Agnes; Mombach, José C. M.

    2014-02-01

    Fusion of cell tissues is a ubiquitous phenomenon and has important technological applications, including tissue biofabrication. In this work we present experimental results on aggregate fusion using adipose-derived stem cells (ADSC) and a three-dimensional computer simulation of the process using the cellular Potts model, with aggregates reaching 10,000 cells. We consider fusion of round aggregates and monitor the dimensionless neck area of contact between the two aggregates to characterize the process, as done for the coalescence of liquid droplets and polymers. Both experiments and simulations show that the evolution of this quantity obeys a power law in time. We also quantitatively study individual cell motion in the simulation and find that it corresponds to anomalous diffusion.
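
    The power-law check described above is typically done as a linear fit in log-log space. The sketch below shows the idea on synthetic placeholder data (the exponent, prefactor, and time points are invented, not the paper's measurements):

      import numpy as np

      rng = np.random.default_rng(0)
      t = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])               # time (arbitrary units)
      neck = 0.4 * t**0.55 * np.exp(rng.normal(0, 0.02, t.size))  # synthetic neck area

      # Fit neck(t) ~ A * t**alpha via linear regression on log-log data
      alpha, logA = np.polyfit(np.log(t), np.log(neck), 1)
      print(f"fitted exponent {alpha:.2f}, prefactor {np.exp(logA):.2f}")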

  12. Projective simulation for artificial intelligence

    PubMed Central

    Briegel, Hans J.; De las Cuevas, Gemma

    2012-01-01

    We propose a model of a learning agent whose interaction with the environment is governed by a simulation-based projection, which allows the agent to project itself into future situations before it takes real action. Projective simulation is based on a random walk through a network of clips, which are elementary patches of episodic memory. The network of clips changes dynamically, both due to new perceptual input and due to certain compositional principles of the simulation process. During simulation, the clips are screened for specific features which trigger factual action of the agent. The scheme is different from other, computational, notions of simulation, and it provides a new element in an embodied cognitive science approach to intelligent action and learning. Our model provides a natural route for generalization to quantum-mechanical operation and connects the fields of reinforcement learning and quantum computation. PMID:22590690

  13. Projective simulation for artificial intelligence

    NASA Astrophysics Data System (ADS)

    Briegel, Hans J.; de Las Cuevas, Gemma

    2012-05-01

    We propose a model of a learning agent whose interaction with the environment is governed by a simulation-based projection, which allows the agent to project itself into future situations before it takes real action. Projective simulation is based on a random walk through a network of clips, which are elementary patches of episodic memory. The network of clips changes dynamically, both due to new perceptual input and due to certain compositional principles of the simulation process. During simulation, the clips are screened for specific features which trigger factual action of the agent. The scheme is different from other, computational, notions of simulation, and it provides a new element in an embodied cognitive science approach to intelligent action and learning. Our model provides a natural route for generalization to quantum-mechanical operation and connects the fields of reinforcement learning and quantum computation.
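
    A minimal sketch of a projective-simulation agent in the spirit of the clip-network model described above, with percept clips connected directly to action clips by weights h, a one-step random walk for action selection, and reward-driven reinforcement with damping. The toy task, parameters, and update details are simplified assumptions, not the paper's exact scheme.

      import random

      percepts, actions = ["P0", "P1"], ["A0", "A1"]
      h = {(p, a): 1.0 for p in percepts for a in actions}   # clip-to-clip weights
      GAMMA, REWARD = 0.01, 1.0                              # damping, reward size

      def act(percept):
          """One-step random walk from a percept clip to an action clip."""
          weights = [h[(percept, a)] for a in actions]
          return random.choices(actions, weights=weights)[0]

      def learn(percept, action, rewarded):
          for edge in h:                          # damp all weights toward 1
              h[edge] -= GAMMA * (h[edge] - 1.0)
          if rewarded:                            # reinforce the traversed edge
              h[(percept, action)] += REWARD

      # Toy task: the correct action index matches the percept index.
      for _ in range(2000):
          p = random.choice(percepts)
          a = act(p)
          learn(p, a, rewarded=(p[1] == a[1]))

      print({edge: round(v, 1) for edge, v in h.items()})    # matching edges dominate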

  14. Spherically symmetric simulation of plasma liner driven magnetoinertial fusion

    SciTech Connect

    Samulyak, Roman; Parks, Paul; Wu Lingling

    2010-09-15

    Spherically symmetric simulations of the implosion of plasma liners and compression of plasma targets in the plasma jet driven magnetoinertial fusion concept have been performed using the method of front tracking. The cases of single deuterium and xenon liners and double-layer deuterium-xenon liners compressing various deuterium-tritium targets have been investigated, optimized for maximum fusion energy gains, and compared with theoretical predictions and scaling laws of [P. Parks, Phys. Plasmas 15, 062506 (2008)]. In agreement with the theory, the fusion gain was significantly below unity for deuterium-tritium targets compressed by Mach 60 deuterium liners. The optimal setup for a given chamber size contained a target with an initial radius of 20 cm compressed by a 10 cm thick, Mach 60 xenon liner, achieving a fusion energy gain of 10 with a 10 GJ fusion yield. Simulations also showed that composite deuterium-xenon liners reduce the energy gain due to lower target compression rates. The effect of heating of targets by alpha particles on the fusion energy gain has also been investigated.
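
    Back-of-envelope arithmetic implied by the quoted optimum (our illustration, taking gain as fusion yield over liner input energy):

      yield_J = 10e9   # 10 GJ fusion yield, quoted above
      gain = 10        # fusion energy gain, quoted above
      print(f"implied liner kinetic energy: {yield_J / gain / 1e9:.0f} GJ")   # 1 GJ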

  15. Secondary fusion coupled deuteron/triton transport simulation and thermal-to-fusion neutron convertor measurement

    SciTech Connect

    Wang, G. B.; Wang, K.; Liu, H. G.; Li, R. D.

    2013-07-01

    A Monte Carlo tool, RSMC (Reaction Sequence Monte Carlo), was developed to simulate coupled deuteron/triton transport and reaction problems. The 'forced particle production' variance reduction technique was used to improve the simulation speed, allowing the secondary products to play a major role. A mono-energetic 14 MeV fusion neutron source was employed as a validation. The thermal-to-fusion neutron convertor was then studied with our tool. Moreover, an in-core conversion efficiency measurement experiment was performed with ⁶LiD and ⁶LiH converters. Threshold activation foils were used to indicate the fast and fusion neutron flux, and two other pivotal parameters were calculated theoretically. Finally, the conversion efficiency of ⁶LiD is obtained as 1.97×10⁻⁴, which matches well with the theoretical result.

  16. Opacity project - Astrophysical and fusion applications

    NASA Technical Reports Server (NTRS)

    Pradhan, A. K.

    1987-01-01

    An overview is presented of a project to calculate large quantities of accurate atomic data for radiative processes of importance in the precise determination of opacities in stellar atmospheres, and for astrophysical and laboratory applications in general. Work is in progress on the oscillator strengths, photoionization cross sections, damping constants, etc., for all atoms and ions in hydrogen through neon isoelectronic sequences going up to iron.

  17. Tempest Neoclassical Simulation of Fusion Edge Plasmas

    NASA Astrophysics Data System (ADS)

    Xu, X. Q.; Xiong, Z.; Cohen, B. I.; Cohen, R. H.; Dorr, M.; Hittinger, J.; Kerbel, G. D.; Nevins, W. M.; Rognlien, T. D.

    2006-04-01

    We are developing a continuum gyrokinetic full-F code, TEMPEST, to simulate edge plasmas. The geometry is that of a fully diverted tokamak and so includes boundary conditions for both closed magnetic flux surfaces and open field lines. The code, presently 4-dimensional (2D2V), includes kinetic ions and electrons, a gyrokinetic Poisson solver for the electric field, and the nonlinear Fokker-Planck collision operator. Here we present simulation results for neoclassical transport with Boltzmann electrons. In a large-aspect-ratio circular geometry, excellent agreement is found for the neoclassical equilibrium with parallel flows in the banana regime without a temperature gradient. In divertor geometry, it is found that the end loss of particles and energy induces pedestal-like density and temperature profiles inside the magnetic separatrix, and parallel flow stronger than the neoclassical predictions in the SOL. The impact of the X-point divertor geometry on the self-consistent electric field and geodesic acoustic oscillations will be reported. We will also discuss the status of extending TEMPEST into a 5-D code.

  18. Colorado School of Mines fusion gamma ray diagnostic project

    SciTech Connect

    Cecil, F.E.

    1992-02-14

    This report summarizes the 1991 calendar year activities of the fusion gamma ray diagnostics project in the Physics Department at the Colorado School of Mines. Considerable progress has been realized in the fusion gamma ray diagnostic project in the last year. Specifically, we have achieved the two major goals of the project as outlined in last year's proposed work statement to the Office of Applied Plasma Physics in the DOE Division of Magnetic Fusion Energy. The two major goals were: (1) solution of the severe interference problem encountered during operation of the gamma ray spectrometer concurrent with high power levels of the neutral beam injectors (NBI) and the ICRH antennae; and (2) experimental determination of the absolute detection efficiency of the gamma ray spectrometer. This detection efficiency will allow the measured yields of the gamma rays to be converted to a total reaction rate. In addition to these two major accomplishments, we have continued, as permitted by the TFTR operating schedule, the observation of high energy gamma rays from the ³He(d,γ)⁵Li reaction during deuterium NBI heating of ³He plasmas.

  19. Simulation of Chamber Transport for Heavy-Ion-Fusion Drivers

    SciTech Connect

    Sharp, W M; Callahan, D A; Tabak, M; Yu, S S; Peterson, P F; Rose, D V; Welch, D R

    2003-09-25

    The heavy-ion fusion (HIF) community recently developed a power-plant design that meets the various requirements of accelerators, final focus, chamber transport, and targets. The point design is intended to minimize physics risk and is certainly not optimal with respect to the cost of electricity. Recent chamber-transport simulations, however, indicate that changes in the beam ion species, the convergence angle, and the emittance might allow more economical designs.

  1. A Motion Tracking and Sensor Fusion Module for Medical Simulation.

    PubMed

    Shen, Yunhe; Wu, Fan; Tseng, Kuo-Shih; Ye, Ding; Raymond, John; Konety, Badrinath; Sweet, Robert

    2016-01-01

    Here we introduce a motion tracking or navigation module for medical simulation systems. Our main contribution is a sensor fusion method for proximity or distance sensors integrated with inertial measurement unit (IMU). Since IMU rotation tracking has been widely studied, we focus on the position or trajectory tracking of the instrument moving freely within a given boundary. In our experiments, we have found that this module reliably tracks instrument motion. PMID:27046606
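
    As a generic illustration of the IMU-plus-distance-sensor fusion idea (a sketch under assumed noise levels, not the module's actual algorithm), the snippet below runs one step of a 1-D constant-velocity Kalman filter in which IMU acceleration drives the prediction and a distance measurement drives the update:

      import numpy as np

      dt = 0.01
      F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition for [pos, vel]
      B = np.array([0.5 * dt**2, dt])         # control input from IMU acceleration
      H = np.array([[1.0, 0.0]])              # the distance sensor observes position
      Q = np.diag([1e-6, 1e-4])               # process noise (assumed)
      R = np.array([[4e-4]])                  # distance-sensor noise (assumed)

      def kf_step(x, P, accel, range_meas):
          x = F @ x + B * accel               # predict using the IMU
          P = F @ P @ F.T + Q
          y = range_meas - H @ x              # innovation from the distance sensor
          S = H @ P @ H.T + R
          K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
          x = x + K @ y
          P = (np.eye(2) - K @ H) @ P
          return x, P

      x, P = kf_step(np.zeros(2), np.eye(2), accel=0.2, range_meas=0.001)
      print(x)   # fused [position, velocity] estimate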

  2. Simulated disparity and peripheral blur interact during binocular fusion.

    PubMed

    Maiello, Guido; Chessa, Manuela; Solari, Fabio; Bex, Peter J

    2014-01-01

    We have developed a low-cost, practical gaze-contingent display in which natural images are presented to the observer with dioptric blur and stereoscopic disparity that are dependent on the three-dimensional structure of natural scenes. Our system simulates a distribution of retinal blur and depth similar to that experienced in real-world viewing conditions by emmetropic observers. We implemented the system using light-field photographs taken with a plenoptic camera which supports digital refocusing anywhere in the images. We coupled this capability with an eye-tracking system and stereoscopic rendering. With this display, we examine how the time course of binocular fusion depends on depth cues from blur and stereoscopic disparity in naturalistic images. Our results show that disparity and peripheral blur interact to modify eye-movement behavior and facilitate binocular fusion, and the greatest benefit was gained by observers who struggled most to achieve fusion. Even though plenoptic images do not replicate an individual’s aberrations, the results demonstrate that a naturalistic distribution of depth-dependent blur may improve 3-D virtual reality, and that interruptions of this pattern (e.g., with intraocular lenses) which flatten the distribution of retinal blur may adversely affect binocular fusion. PMID:25034260

  3. Bohunice Simulator Data Collection Project

    SciTech Connect

    Cillik, Ivan; Prochaska, Jan

    2002-07-01

    The paper describes the approach and results of the human reliability data analysis performed as part of the Bohunice Simulator Data Collection Project (BSDCP), which was carried out by VUJE Trnava, Inc. with funding support from the U.S. DOE, National Nuclear Security Administration. The goal of the project was to create a methodology for simulator data collection and analysis to support activities in probabilistic safety assessment (PSA) and human reliability assessment for the Jaslovske Bohunice nuclear power plant, which consists of two sets of twin units: two VVER 440/V-230 (V1) and two VVER 440/V-213 (V2) reactors. During the project, training of V2 control room crews was performed at the VUJE Trnava simulator; the simulator training and the data collection were done in parallel. The main goal of BSDCP was to collect suitable data on human errors under simulated conditions requiring the use of symptom-based emergency operating procedures (SBEOPs). The subjects of the data collection were scenario progress time data, operator errors, and real-time technological parameters. The paper contains three main parts. The first part presents preparatory work and semi-automatic computer-based methods used to collect data and to check technological parameters in order to find hidden operator errors, to retrace the course of each scenario for purposes of further analysis, and to document the whole training process. The first part also gives an overview of the scope of the collected data, the human error taxonomy, and the state classifications for SBEOP instruction coding. The second part describes analytical work undertaken to characterize the time distributions for execution of various kinds of instructions performed by operators, according to the classification used for coding SBEOP instructions. It also presents the methods used to determine probability distributions for different operator errors. Results from the data evaluation are presented in the last part of the paper. An overview of
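
    As an illustration of the kind of human-reliability quantity such simulator data support (our sketch, not the project's method; the counts are invented placeholders), a human error probability can be estimated from error/opportunity counts with a Jeffreys prior:

      from scipy import stats

      k, n = 3, 120                                  # hypothetical errors and opportunities
      posterior = stats.beta(0.5 + k, 0.5 + n - k)   # Jeffreys Beta posterior
      lo, hi = posterior.interval(0.90)
      print(f"HEP ~ {posterior.mean():.3f} (90% credible interval {lo:.4f}-{hi:.4f})")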

  4. KULL: LLNL's ASCI Inertial Confinement Fusion Simulation Code

    SciTech Connect

    Rathkopf, J. A.; Miller, D. S.; Owen, J. M.; Zike, M. R.; Eltgroth, P. G.; Madsen, N. K.; McCandless, K. P.; Nowak, P. F.; Nemanic, M. K.; Gentile, N. A.; Stuart, L. M.; Keen, N. D.; Palmer, T. S.

    2000-01-10

    KULL is a three-dimensional, time-dependent radiation hydrodynamics simulation code under development at Lawrence Livermore National Laboratory. Part of the U.S. Department of Energy's Accelerated Strategic Computing Initiative (ASCI), KULL's purpose is to simulate the physical processes in Inertial Confinement Fusion (ICF) targets. The National Ignition Facility, where ICF experiments will be conducted, and ASCI are part of the experimental and computational components of DOE's Stockpile Stewardship Program. This paper provides an overview of ASCI and describes KULL, its hydrodynamic simulation capability, and its three methods of simulating radiative transfer. Particular emphasis is given to the parallelization techniques essential to obtaining the performance required by the Stockpile Stewardship Program and to exploiting the massively parallel processor machines that ASCI is procuring.

  5. National Fusion Collaboratory: Grid Computing for Simulations and Experiments

    NASA Astrophysics Data System (ADS)

    Greenwald, Martin

    2004-05-01

    The National Fusion Collaboratory Project is creating a computational grid designed to advance scientific understanding and innovation in magnetic fusion research by facilitating collaborations, enabling more effective integration of experiments, theory and modeling and allowing more efficient use of experimental facilities. The philosophy of FusionGrid is that data, codes, analysis routines, visualization tools, and communication tools should be thought of as network available services, easily used by the fusion scientist. In such an environment, access to services is stressed rather than portability. By building on a foundation of established computer science toolkits, deployment time can be minimized. These services all share the same basic infrastructure that allows for secure authentication and resource authorization which allows stakeholders to control their own resources such as computers, data and experiments. Code developers can control intellectual property, and fair use of shared resources can be demonstrated and controlled. A key goal is to shield scientific users from the implementation details such that transparency and ease-of-use are maximized. The first FusionGrid service deployed was the TRANSP code, a widely used tool for transport analysis. Tools for run preparation, submission, monitoring and management have been developed and shared among a wide user base. This approach saves user sites from the laborious effort of maintaining such a large and complex code while at the same time reducing the burden on the development team by avoiding the need to support a large number of heterogeneous installations. Shared visualization and A/V tools are being developed and deployed to enhance long-distance collaborations. These include desktop versions of the Access Grid, a highly capable multi-point remote conferencing tool and capabilities for sharing displays and analysis tools over local and wide-area networks.

  6. Simulation of polyethylene glycol and calcium-mediated membrane fusion

    SciTech Connect

    Pannuzzo, Martina; De Jong, Djurre H.; Marrink, Siewert J.; Raudino, Antonio

    2014-03-28

    We report on the mechanism of membrane fusion mediated by polyethylene glycol (PEG) and Ca{sup 2+} by means of a coarse-grained molecular dynamics simulation approach. Our data provide a detailed view on the role of cations and polymer in modulating the interaction between negatively charged apposed membranes. The PEG chains reduce the inter-lamellar distance and increase the concentration of divalent cations. When thermally driven fluctuations bring the membranes into close contact, a switch from cis to trans Ca{sup 2+}-lipid complexes stabilizes a focal contact acting as a nucleation site for further expansion of the adhesion region. Flipping of lipid tails induces subsequent stalk formation. Together, our results provide a molecular explanation for the synergistic effect of Ca{sup 2+} and PEG on membrane fusion.

  7. Radiation Hydrodynamic Simulations of an Inertial Fusion Energy Reactor Chamber

    NASA Astrophysics Data System (ADS)

    Sacks, Ryan Foster

    Inertial fusion energy reactors present great promise for the future as they are capable of providing baseline power with no carbon footprint. Simulation work regarding the chamber response and first wall insult is carried out using the 1-D BUCKY radiation hydrodynamics code for a variety of differing chamber fills, radii, chamber obstructions and first wall materials. Discussion of the first wall temperature rise, x-ray spectrum incident on the wall, shock timing and maximum overpressure is presented. An additional discussion of the impact of different gas opacities and their effect on overall chamber dynamics, including the formation of two shock fronts, is also presented. This work was performed in collaboration with Lawrence Livermore National Laboratory at the University of Wisconsin-Madison's Fusion Technology Institute.

  8. Simulation of chamber transport for heavy-ion fusion

    SciTech Connect

    Sharp, W.M.; Callahan, D.A.; Tabak, M.A.; Yu, S.S.; Peterson, P.F.; Rose, D.V.; Welch, D.R.; Davidson, R.C.; Kaganovich, I.D.; Startsev, E.; Olson, C.L.

    2002-10-04

    Beams for heavy-ion fusion (HIF) are expected to require substantial neutralization in a target chamber. Present targets call for higher beam currents and smaller focal spots than most earlier designs, leading to high space-charge fields. Collisional stripping by the background gas expected in the chamber further increases the beam charge. Simulations with no electron sources other than beam stripping and background-gas ionization show an acceptable focal spot only for high ion energies or for currents far below the values assumed in recent HIF power-plant scenarios. Much recent research has, therefore, focused on beam neutralization by electron sources that were neglected in earlier simulations, including emission from walls and the target, photoionization by radiation from the target, and pre-neutralization by a plasma generated along the beam path. The simulations summarized here indicate that these effects can significantly reduce the beam focal-spot size.

  9. Simulation of Chamber Transport for Heavy-Ion Fusion

    SciTech Connect

    Sharp, W M; Callahan Miller, D A; Tabak, M; Yu, S S; Peterson, P F; Rose, D V; Welch, D R; Davidson, R C; Kaganovich, I D; Startsev, E; Olson, C L

    2002-10-14

    Beams for heavy-ion fusion (HIF) are expected to require substantial neutralization in a target chamber. Present targets call for higher beam currents and smaller focal spots than most earlier designs, leading to high space-charge fields. Collisional stripping by the background gas expected in the chamber further increases the beam charge. Simulations with no electron sources other than beam stripping and background-gas ionization show an acceptable focal spot only for high ion energies or for currents far below the values assumed in recent HIF power-plant scenarios. Much recent research has, therefore, focused on beam neutralization by electron sources that were neglected in earlier simulations, including emission from walls and the target, photoionization by radiation from the target, and pre-neutralization by a plasma generated along the beam path. The simulations summarized here indicate that these effects can significantly reduce the beam focal-spot size.

  10. Simulation of Carbon Production from Material Surfaces in Fusion Devices

    NASA Astrophysics Data System (ADS)

    Marian, J.; Verboncoeur, J.

    2005-10-01

    Impurity production at carbon surfaces by plasma bombardment is a key issue for fusion devices as modest amounts can lead to excessive radiative power loss and/or hydrogenic D-T fuel dilution. Here results of molecular dynamics (MD) simulations of physical and chemical sputtering of hydrocarbons are presented for models of graphite and amorphous carbon, the latter formed by continuous D-T impingement in conditions that mimic fusion devices. The results represent more extensive simulations than we reported last year, including incident energies in the 30-300 eV range for a variety of incident angles that yield a number of different hydrocarbon molecules. The calculated low-energy yields clarify the uncertainty in the complex chemical sputtering rate since chemical bonding and hard-core repulsion are both included in the interatomic potential. Also modeled is hydrocarbon break-up by electron-impact collisions and transport near the surface. Finally, edge transport simulations illustrate the sensitivity of the edge plasma properties arising from moderate changes in the carbon content. The models will provide the impurity background for the TEMPEST kinetic edge code.

  11. The Mars Gravity Simulation Project

    NASA Technical Reports Server (NTRS)

    Korienek, Gene

    1998-01-01

    Human beings who make abrupt transitions between one gravitational environment and another undergo severe disruptions of their visual perception and visual-motor coordination, frequently accompanied by "space sickness." Clearly, such immediate effects of exposure to a novel gravitational condition have significant implications for human performance. For example, when astronauts first arrive in Earth orbit their attempts to move about in the spacecraft and to perform their duties are uncoordinated, inaccurate, and inefficient. Other inter-gravitational transitions for which these difficulties can be expected include going from the 0 g of the spacecraft to the 0.16 g of the Moon, from 0 g to the 0.38 g of Mars, and from 0 g back to the 1.0 g of Earth. However, after astronauts have actively interacted with their new gravitational environment for several days, these problems tend to disappear, evidence that some sort of adaptive process has taken place. It would be advantageous, therefore, if there were some way to minimize or perhaps even eliminate this potentially hazardous adaptive transition period by allowing astronauts to adapt to the altered gravitational conditions before actually entering them. Simultaneous adaptation to both the altered and the normal gravitational environments can develop as a result of repeatedly adapting to one and readapting to the other, a phenomenon known as dual adaptation. The objective of the Mars Gravity Simulator (MGS) Project is to construct a simulation of the visual and bodily effects of altered gravity. This perceptual-motor simulation is created through the use of: 1) differential body pressure to produce simulated hypo-gravity and 2) treadmill-controlled virtual reality to create a corresponding visual effect. It is expected that this combination will produce sensory-motor perturbations in the subjects. Both the immediate and adaptive behavioral (postural and ambulatory) responses to these sensory perturbations will be assessed.

  12. The Mars Gravity Simulation Project

    NASA Astrophysics Data System (ADS)

    Korienek, Gene

    1998-10-01

    Human beings who make abrupt transitions between one gravitational environment and another undergo severe disruptions of their visual perception and visual-motor coordination, frequently accompanied by "space sickness." Clearly, such immediate effects of exposure to a novel gravitational condition have significant implications for human performance. For example, when astronauts first arrive in Earth orbit their attempts to move about in the spacecraft and to perform their duties are uncoordinated, inaccurate, and inefficient. Other inter-gravitational transitions for which these difficulties can be expected include going from the 0 g of the spacecraft to the 0.16 g of the Moon, from 0 g to the 0.38 g of Mars, and from 0 g back to the 1.0 g of Earth. However, after astronauts have actively interacted with their new gravitational environment for several days, these problems tend to disappear, evidence that some sort of adaptive process has taken place. It would be advantageous, therefore, if there were some way to minimize or perhaps even eliminate this potentially hazardous adaptive transition period by allowing astronauts to adapt to the altered gravitational conditions before actually entering them. Simultaneous adaptation to both the altered and the normal gravitational environments can develop as a result of repeatedly adapting to one and readapting to the other, a phenomenon known as dual adaptation. The objective of the Mars Gravity Simulator (MGS) Project is to construct a simulation of the visual and bodily effects of altered gravity. This perceptual-motor simulation is created through the use of: 1) differential body pressure to produce simulated hypo-gravity and 2) treadmill-controlled virtual reality to create a corresponding visual effect. It is expected that this combination will produce sensory-motor perturbations in the subjects. Both the immediate and adaptive behavioral (postural and ambulatory) responses to these sensory perturbations will be assessed.

  13. Multisource report-level simulator for fusion research

    NASA Astrophysics Data System (ADS)

    Carlotto, Mark J.; Kadar, Ivan

    2003-08-01

    The Multi-source Report-level Simulator (MRS) is a tool developed by Veridian Systems as part of its Model-adaptive Multi-source Track Fusion (MMTF) effort under DARPA's DTT program. MRS generates simulated multisensor contact reports for GMTI, HUMINT, IMINT, SIGINT, UGS, and video. It contains a spatial editor for creating ground tracks along which vehicles move over the terrain. Vehicles can start, stop, speed up, or slow down. The spatial editor is also used to define the locations of fixed sensors such as UGS and HUMINT observers on the ground, and the flight paths of GMTI, IMINT, SIGINT, and video sensors in the air. Observation models characterize each sensor at the report level in terms of its operating characteristics (revisit rate, resolution, etc.), measurement errors, and detection/classification performance (i.e., Pd, Nfa, Pcc, and Pid). Contact reports are linked to ground truth data to facilitate the testing of track/fusion algorithms and the validation of associated performance models.
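
    A report-level observation model of the kind described can be sketched in a few lines; the parameter values and report fields below are hypothetical, not the actual MRS interface:

        import numpy as np

        rng = np.random.default_rng(0)
        CLASSES = ["truck", "tank", "APC"]

        def contact_report(truth_xy, truth_class, pd=0.85, sigma=30.0, pcc=0.7):
            """One sensor look at one vehicle: detect with probability Pd;
            if detected, blur the position with Gaussian error (sigma, in
            metres) and report the true class with probability Pcc."""
            if rng.random() > pd:
                return None                      # missed detection
            meas = rng.normal(truth_xy, sigma)   # measurement error
            label = truth_class if rng.random() < pcc else \
                str(rng.choice([c for c in CLASSES if c != truth_class]))
            return {"xy": meas.tolist(), "class": label}

        def false_alarms(nfa_mean, xmax, ymax):
            """Poisson-distributed false alarms, uniform over the area."""
            return [{"xy": [rng.uniform(0, xmax), rng.uniform(0, ymax)],
                     "class": str(rng.choice(CLASSES))}
                    for _ in range(rng.poisson(nfa_mean))]

        reports = [r for r in [contact_report((500.0, 800.0), "tank")] if r]
        reports += false_alarms(2.0, 1000.0, 1000.0)
        print(reports)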

  14. Computer modeling and simulation in inertial confinement fusion

    NASA Astrophysics Data System (ADS)

    McCrory, R. L.; Verdon, C. P.

    1989-03-01

    The complex hydrodynamic and transport processes associated with the implosion of an inertial confinement fusion (ICF) pellet place considerable demands on numerical simulation programs. Processes associated with implosion can usually be described using relatively simple models, but their complex interplay requires that programs model most of the relevant physical phenomena accurately. Most hydrodynamic codes used in ICF incorporate a one-fluid, two-temperature model. Electrons and ions are assumed to flow as one fluid (no charge separation). Due to the relatively weak coupling between the ions and electrons, each species is treated separately in terms of its temperature. In this paper we describe some of the major components associated with an ICF hydrodynamics simulation code. To serve as an example we draw heavily on a two-dimensional Lagrangian hydrodynamic code (ORCHID) written at the University of Rochester's Laboratory for Laser Energetics.
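
    In its simplest form, the two-temperature treatment reduces to relaxation of the electron and ion temperatures toward each other on a collisional equilibration time. A toy sketch of that idea (illustrative values; real codes compute the equilibration time from local density and temperature):

        # One-fluid, two-temperature relaxation: electrons and ions share a
        # velocity field (not modeled here) but carry separate temperatures.
        Te, Ti = 5.0, 1.0      # keV, hypothetical initial temperatures
        tau_ei = 0.5           # ns, hypothetical equilibration time
        dt, t_end = 0.01, 3.0  # ns

        t = 0.0
        while t < t_end:
            dT = (Te - Ti) / tau_ei * dt
            Te -= dT           # electrons lose energy to ions...
            Ti += dT           # ...assuming equal heat capacities
            t += dt
        print(f"after {t_end} ns: Te = {Te:.2f} keV, Ti = {Ti:.2f} keV")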

  15. Computer modeling and simulation in inertial confinement fusion

    SciTech Connect

    McCrory, R.L.; Verdon, C.P.

    1989-03-01

    The complex hydrodynamic and transport processes associated with the implosion of an inertial confinement fusion (ICF) pellet place considerable demands on numerical simulation programs. Processes associated with implosion can usually be described using relatively simple models, but their complex interplay requires that programs model most of the relevant physical phenomena accurately. Most hydrodynamic codes used in ICF incorporate a one-fluid, two-temperature model. Electrons and ions are assumed to flow as one fluid (no charge separation). Due to the relatively weak coupling between the ions and electrons, each species is treated separately in terms of its temperature. In this paper we describe some of the major components associated with an ICF hydrodynamics simulation code. To serve as an example we draw heavily on a two-dimensional Lagrangian hydrodynamic code (ORCHID) written at the University of Rochester's Laboratory for Laser Energetics. 46 refs., 19 figs., 1 tab.

  16. The Progress of Research Project for Magnetized Target Fusion in China

    NASA Astrophysics Data System (ADS)

    Yang, Xian-Jun

    2015-11-01

    The fusion of magnetized plasma, called Magnetized Target Fusion (MTF), has recently become an active research area; it may significantly reduce the cost and size of a fusion system, and great progress has been achieved around the world in past decades. Five years ago China initiated an MTF project, which has made progress as follows: 1. Verifying the feasibility of MTF ignition by means of first-principles and MHD simulation; 2. Generating magnetic fields over 1400 Tesla, which can suppress heat conduction by charged particles, deposit the energy of alpha particles to promote the ignition process, and produce the stable magnetized plasma needed for the ignition target; 3. Commissioning the FP-1 implosion facility, which can deliver several megajoules of energy to a solid liner of about ten grams within a microsecond-range rise time, while a simulation tool has been developed for design and analysis of the process; 4. Generating FRC targets with the YG-1 facility, for which simulation tools have also been developed. Over the next five years, the above theoretical work and the MTF experiments may be integrated and stepped up into a national project, in which our team is expected to play a leading role and to achieve further progress in China. Supported by the National Natural Science Foundation of China under Grant No 11175028.

  17. Physics Basis and Simulation of Burning Plasma Physics for the Fusion Ignition Research Experiment (FIRE)

    SciTech Connect

    C.E. Kessel; D. Meade; S.C. Jardin

    2002-01-18

    The FIRE [Fusion Ignition Research Experiment] design for a burning plasma experiment is described in terms of its physics basis and engineering features. Systems analysis indicates that the device has a wide operating space to accomplish its mission, both for the ELMing H-mode reference and the high bootstrap current/high beta advanced tokamak regimes. Simulations with 1.5D transport codes reported here both confirm and constrain the systems projections. Experimental and theoretical results are used to establish the basis for successful burning plasma experiments in FIRE.

  18. Contribution to fusion research from IAEA coordinated research projects and joint experiments

    NASA Astrophysics Data System (ADS)

    Gryaznevich, M.; Van Oost, G.; Stöckel, J.; Kamendje, R.; Kuteev, B. N.; Melnikov, A.; Popov, T.; Svoboda, V.; The IAEA CRP Teams

    2015-10-01

    The paper presents objectives and activities of the IAEA Coordinated Research Projects (CRPs) ‘Conceptual development of steady-state compact fusion neutron sources’ and ‘Utilisation of a network of small magnetic confinement fusion devices for mainstream fusion research’. The background and main projects of the CRP on fusion neutron sources (FNS) are described in detail, as this is a new activity at the IAEA. Recent activities of the second CRP, which continues the activities of previous CRPs, are overviewed.

  19. SIMULATION OF INTENSE BEAMS FOR HEAVY ION FUSION

    SciTech Connect

    Friedman, A

    2004-06-10

    Computer simulations of intense ion beams play a key role in the Heavy Ion Fusion research program. Along with analytic theory, they are used to develop future experiments, guide ongoing experiments, and aid in the analysis and interpretation of experimental results. They also afford access to regimes not yet accessible in the experimental program. The U.S. Heavy Ion Fusion Virtual National Laboratory and its collaborators have developed state-of-the-art computational tools, related both to codes used for stationary plasmas and to codes used for traditional accelerator applications, but necessarily differing from each in important respects. These tools model beams in varying levels of detail and at widely varying computational cost. They include moment models (envelope equations and fluid descriptions), particle-in-cell methods (electrostatic and electromagnetic), nonlinear-perturbative descriptions (''{delta}f''), and continuum Vlasov methods. Increasingly, it is becoming clear that it is necessary to simulate not just the beams themselves, but also the environment in which they exist, be it an intentionally-created plasma or an unwanted cloud of electrons and gas. In this paper, examples of the application of simulation tools to intense ion beam physics are presented, including support of present-day experiments, fundamental beam physics studies, and the development of future experiments. Throughout, new computational models are described and their utility explained. These include Mesh Refinement (and its dynamic variant, Adaptive Mesh Refinement); improved electron cloud and gas models, and an electron advance scheme that allows use of larger time steps; and moving-mesh and adaptive-mesh Vlasov methods.
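
    The simplest of the model classes named above, a moment (envelope) description, fits in a few lines. This sketch integrates the standard round-beam envelope equation in a uniform focusing channel; the parameter values are illustrative, not taken from any HIF design:

        import numpy as np

        # Envelope equation for the edge radius R(s) of a round beam:
        #   R'' = -k0**2 * R + K / R + eps**2 / R**3
        # k0: focusing strength, K: space-charge perveance, eps: emittance.
        k0, K, eps = 2 * np.pi, 5e-6, 1e-5   # 1/m, dimensionless, m*rad

        # Matched radius: the root of k0^2 R = K/R + eps^2/R^3.
        R_match = np.sqrt((K + np.sqrt(K**2 + 4 * k0**2 * eps**2))
                          / (2 * k0**2))

        def envelope(R, Rp, s_end=5.0, ds=1e-4):
            """Integrate the envelope equation with simple leapfrog steps."""
            for _ in range(int(s_end / ds)):
                Rpp = -k0**2 * R + K / R + eps**2 / R**3
                Rp += Rpp * ds
                R += Rp * ds
            return R, Rp

        # A slightly mismatched beam oscillates about the matched radius.
        print(f"matched radius = {R_match * 1e3:.2f} mm")
        print(envelope(R=1.1 * R_match, Rp=0.0))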

  20. Report of the Fusion Energy Sciences Advisory Committee. Panel on Integrated Simulation and Optimization of Magnetic Fusion Systems

    SciTech Connect

    Dahlburg, Jill; Corones, James; Batchelor, Donald; Bramley, Randall; Greenwald, Martin; Jardin, Stephen; Krasheninnikov, Sergei; Laub, Alan; Leboeuf, Jean-Noel; Lindl, John; Lokke, William; Rosenbluth, Marshall; Ross, David; Schnack, Dalton

    2002-11-01

    Fusion is potentially an inexhaustible energy source whose exploitation requires a basic understanding of high-temperature plasmas. The development of a science-based predictive capability for fusion-relevant plasmas is a challenge central to fusion energy science, in which numerical modeling has played a vital role for more than four decades. A combination of the very wide range in temporal and spatial scales, extreme anisotropy, the importance of geometric detail, and the requirement of causality which makes it impossible to parallelize over time, makes this problem one of the most challenging in computational physics. Sophisticated computational models are under development for many individual features of magnetically confined plasmas and increases in the scope and reliability of feasible simulations have been enabled by increased scientific understanding and improvements in computer technology. However, full predictive modeling of fusion plasmas will require qualitative improvements and innovations to enable cross coupling of a wider variety of physical processes and to allow solution over a larger range of space and time scales. The exponential growth of computer speed, coupled with the high cost of large-scale experimental facilities, makes an integrated fusion simulation initiative a timely and cost-effective opportunity. Worldwide progress in laboratory fusion experiments provides the basis for a recent FESAC recommendation to proceed with a burning plasma experiment (see FESAC Review of Burning Plasma Physics Report, September 2001). Such an experiment, at the frontier of the physics of complex systems, would be a huge step in establishing the potential of magnetic fusion energy to contribute to the world’s energy security. An integrated simulation capability would dramatically enhance the utilization of such a facility and lead to optimization of toroidal fusion plasmas in general. This science-based predictive capability, which was cited in the FESAC

  1. Deuterium-Tritium Simulations of the Enhanced Reversed Shear Mode in the Tokamak Fusion Test Reactor

    SciTech Connect

    Mikkelsen, D.R.; Manickam, J.; Scott, S.D.; Zarnstorff

    1997-04-01

    The potential performance, in deuterium-tritium plasmas, of a new enhanced confinement regime with reversed magnetic shear (ERS mode) is assessed. The equilibrium conditions for an ERS mode plasma are estimated by solving the plasma transport equations using the thermal and particle diffusivities measured in a short duration ERS mode discharge in the Tokamak Fusion Test Reactor [F. M. Levinton, et al., Phys. Rev. Letters, 75, 4417 (1995)]. The plasma performance depends strongly on Zeff and neutral beam penetration to the core. The steady state projections typically have a central electron density of {approx}2.5x10{sup 20} m{sup -3} and nearly equal central electron and ion temperatures of {approx}10 keV. In time dependent simulations the peak fusion power, {approx} 25 MW, is twice the steady state level. Peak performance occurs during the density rise when the central ion temperature is close to the optimal value of {approx} 15 keV. The simulated pressure profiles can be stable to ideal MHD instabilities with toroidal mode number n = 1, 2, 3, 4 and {infinity} for {beta}{sub norm} up to 2.5; the simulations have {beta}{sub norm} {le} 2.1. The enhanced reversed shear mode may thus provide an opportunity to conduct alpha physics experiments in conditions similar to those proposed for advanced tokamak reactors.

  2. Improved computational methods for simulating inertial confinement fusion

    NASA Astrophysics Data System (ADS)

    Fatenejad, Milad

    This dissertation describes the development of two multidimensional Lagrangian codes for simulating inertial confinement fusion (ICF) on structured meshes. The first is DRACO, a production code primarily developed by the Laboratory for Laser Energetics. Several significant new capabilities were implemented, including the ability to model radiative transfer using Implicit Monte Carlo [Fleck et al., JCP 8, 313 (1971)]. DRACO was also extended to operate in 3D Cartesian geometry on hexahedral meshes; originally the code was used only in 2D cylindrical geometry. This extension included implementing thermal conduction and a flux-limited multigroup diffusion model for radiative transfer. Diffusion equations are solved by extending the 2D Kershaw method [Kershaw, JCP 39, 375 (1981)] to three dimensions. The second radiation-hydrodynamics code developed as part of this thesis is Cooper, a new 3D code which operates on structured hexahedral meshes. Cooper supports the compatible hydrodynamics framework [Caramana et al., JCP 146, 227 (1998)] to keep global energy conservation at round-off error levels. This level of energy conservation is maintained even when two-temperature thermal conduction, ion/electron equilibration, and multigroup diffusion based radiative transfer are active. Cooper is parallelized using domain decomposition and photon energy group decomposition. The Mesh Oriented datABase (MOAB) computational library is used to exchange information between processes when domain decomposition is used. Cooper's performance is analyzed through direct comparisons with DRACO. Cooper also contains a method for preserving spherical symmetry during target implosions [Caramana et al., JCP 157, 89 (1999)]. Several deceleration-phase implosion simulations were used to compare instability growth using traditional hydrodynamics and compatible hydrodynamics with/without symmetry modification. These simulations demonstrate increased symmetry preservation errors when traditional hydrodynamics

  3. On a Primal Coarse Projective Integration Method for Multiscale Simulations

    NASA Astrophysics Data System (ADS)

    Skoric, Milos; Ishiguro, Seiji; Maluckov, Sandra

    2006-10-01

    A novel simulation framework called Equation-Free Projective Integration (EFPI) was recently applied to nonlinear plasmas by M. Shay [1] to study propagation and steepening of a 1D ion sound (IS) wave with a PIC code as the microscopic simulator. To initialize, macro plasma variables are ``lifted'' to a fine micro-representation. The PIC code is stepped forward for a short time, and the results are ``restricted'' or smoothed back to macro space. By extrapolation, the time derivative is estimated and projected forward with a large step; the process is then repeated. As a simple alternative, we propose a sort of primal EFPI scheme to simulate nonlinear plasmas including kinetic effects. The micro-simulator is a standard 1D ES PIC code. Ions are assumed inherently coarse-grained or ``smoothed'' and are tracked to extrapolate in time and project. The potential is averaged over the electron plasma period to extrapolate and project. No adiabatic approximation for electrons is used [2]; instead, the non-uniform electron distribution is found self-consistently from the Poisson equation and the ion density. Preliminary results for nonlinear IS waves as well as for the IS double layer paradigm are presented, and some limitations of the EFPI are discussed. [1] M. Shay, J. Drake, W. Dorland, J. of Comp. Phys (APS DPP 2005) [2] G. Stanchev, A. Maluckov et al., in EPS Fusion (Rome, 2006).
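
    The lift-evolve-restrict-project cycle can be summarized compactly. The sketch below substitutes a trivial noisy relaxation for the PIC micro-simulator, purely to show the structure of the scheme:

        import numpy as np

        rng = np.random.default_rng(1)

        def micro_step(state, dt):
            """Stand-in micro-simulator (a PIC code in the papers above):
            noisy relaxation toward zero, just to illustrate the scheme."""
            return state + dt * (-state) + 0.01 * rng.normal(size=state.shape)

        def restrict(history):
            """Smooth recent micro-history back to macro variables."""
            return np.mean(history, axis=0)

        def projective_step(macro, dt_micro=0.01, n_micro=20, dt_project=0.5):
            state = macro.copy()          # "lift" (trivial here)
            history = []
            for _ in range(n_micro):      # short burst of micro-steps
                state = micro_step(state, dt_micro)
                history.append(state)
            # Restrict over two windows to estimate the macro derivative...
            m0 = restrict(history[: n_micro // 2])
            m1 = restrict(history[n_micro // 2:])
            dmdt = (m1 - m0) / (dt_micro * n_micro / 2)
            # ...then project forward with a step much larger than dt_micro.
            return m1 + dt_project * dmdt

        macro = np.array([1.0, 0.5])
        for _ in range(10):
            macro = projective_step(macro)
        print(macro)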

  4. Neoclassical Simulations of Fusion Alpha Particles in Pellet Charge Exchange Experiments on the Tokamak Fusion Test Reactor

    SciTech Connect

    Batha, S.H.; Budny, R.V.; Darrow, D.S.; Levinton, F.M.; Redi, M.H.; et al

    1999-02-01

    Neoclassical simulations of alpha particle density profiles in high fusion power plasmas on the Tokamak Fusion Test Reactor (TFTR) [Phys. Plasmas 5 (1998) 1577] are found to be in good agreement with measurements of the alpha distribution function made with a sensitive active neutral particle diagnostic. The calculations are carried out in Hamiltonian magnetic coordinates with a fast, particle-following Monte Carlo code which includes the neoclassical transport processes, a recent first-principles model for stochastic ripple loss and collisional effects. New global loss and confinement domain calculations allow an estimate of the actual alpha particle densities measured with the pellet charge exchange diagnostic.

  5. Evaluation of performance of select fusion experiments and projected reactors

    NASA Technical Reports Server (NTRS)

    Miley, G. H.

    1978-01-01

    The performance of NASA Lewis fusion experiments (SUMMA and Bumpy Torus) is compared with other experiments and that necessary for a power reactor. Key parameters cited are gain (fusion power/input power) and the time average fusion power, both of which may be more significant for real fusion reactors than the commonly used Lawson parameter. The NASA devices are over 10 orders of magnitude below the required powerplant values in both gain and time average power. The best experiments elsewhere are also as much as 4 to 5 orders of magnitude low. However, the NASA experiments compare favorably with other alternate approaches that have received less funding than the mainline experiments. The steady-state character and efficiency of plasma heating are strong advantages of the NASA approach. The problem, though, is to move ahead to experiments of sufficient size to advance in gain and average power parameters.

  6. CORSICA: A comprehensive simulation of toroidal magnetic-fusion devices. Final report to the LDRD Program

    SciTech Connect

    Crotinger, J.A.; LoDestro, L.; Pearlstein, L.D.; Tarditi, A.; Casper, T.A.; Hooper, E.B.

    1997-03-21

    In 1992, our group began exploring the requirements for a comprehensive simulation code for toroidal magnetic fusion experiments. There were several motivations for taking this step. First, the new machines being designed were much larger and more expensive than current experiments. Second, these new designs called for much more sophisticated control of the plasma shape and position, as well as the distributions of energy, mass, and current within the plasma. These factors alone made it clear that a comprehensive simulation capability would be an extremely valuable tool for machine design. The final motivating factor was that the national Numerical Tokamak Project (NTP) had recently received High Performance Computing and Communications (HPCC) Grand Challenge funding to model turbulent transport in tokamaks, raising the possibility that first-principles simulations of this process might be practical in the near future. We felt that the best way to capitalize on this development was to integrate the resulting turbulence simulation codes into a comprehensive simulation. Such simulations must include the effects of many microscopic length- and time-scales. In order to do a comprehensive simulation efficiently, the length- and time-scale disparities must be exploited. We proposed to do this by coupling the average or quasistatic effects from the fast time-scales to a slow-time-scale transport code for the macroscopic plasma evolution. In FY93-FY96 we received funding to investigate algorithms for computationally coupling such disparate-scale simulations and to implement these algorithms in a prototype simulation code, dubbed CORSICA. Work on algorithms and test cases proceeded in parallel, with the algorithms being incorporated into CORSICA as they became mature. In this report we discuss the methods and algorithms, the CORSICA code, its applications, and our plans for the future.

  7. Advanced simulation of electron heat transport in fusion plasmas

    SciTech Connect

    Lin, Zhihong; Xiao, Y.; Klasky, Scott A; Lofstead, J.

    2009-01-01

    Electron transport in burning plasmas is more important since fusion products first heat the electrons. First-principles simulations of electron turbulence are much more challenging due to the multi-scale dynamics of the electron turbulence, and have been made possible by close collaborations between plasma physicists and computational scientists. The GTC simulations of collisionless trapped electron mode (CTEM) turbulence show that the electron heat transport exhibits a gradual transition from Bohm to gyroBohm scaling when the device size is increased. The deviation from the gyroBohm scaling can be induced by large turbulence eddies, turbulence spreading, and non-diffusive transport processes. Analysis of the radial correlation function shows that CTEM turbulence eddies are predominantly microscopic but with a significant tail in the mesoscale. A comprehensive analysis of kinetic and fluid time scales shows that zonal flow shearing is the dominant decorrelation mechanism. The mesoscale eddies result from a dynamical process of linear streamers breaking by zonal flows and merging of microscopic eddies. The radial profile of the electron heat conductivity only follows the profile of fluctuation intensity on a global scale, whereas the ion transport tracks more sensitively the local fluctuation intensity. This suggests the existence of a nondiffusive component in the electron heat flux, which arises from the ballistic radial E x B drift of trapped electrons due to a combination of the presence of mesoscale eddies and the weak de-tuning of the toroidal precessional resonance that drives the CTEM instability. On the other hand, the ion radial excursion is not affected by the mesoscale eddies due to a parallel decorrelation, which is not operational for the trapped electrons because of a bounce-averaging process associated with the electrons' fast motion along magnetic field lines. The presence of the nondiffusive component raises questions about the applicability of the usual

  8. Advanced Simulation of Electron Heat Transport in Fusion Plasmas

    SciTech Connect

    Lin, Z.; Xiao, Y.; Holod, I.; Zhang, W. L.; Deng, Wenjun; Klasky, Scott A; Lofstead, J.; Kamath, Chandrika; Wichmann, Nathan

    2009-01-01

    Electron transport in burning plasmas is more important since fusion products first heat the electrons. First-principles simulations of electron turbulence are much more challenging due to the multi-scale dynamics of the electron turbulence, and have been made possible by close collaborations between plasma physicists and computational scientists. The GTC simulations of collisionless trapped electron mode (CTEM) turbulence show that the electron heat transport exhibits a gradual transition from Bohm to gyroBohm scaling when the device size is increased. The deviation from the gyroBohm scaling can be induced by large turbulence eddies, turbulence spreading, and non-diffusive transport processes. Analysis of the radial correlation function shows that CTEM turbulence eddies are predominantly microscopic but with a significant tail in the mesoscale. A comprehensive analysis of kinetic and fluid time scales shows that zonal flow shearing is the dominant decorrelation mechanism. The mesoscale eddies result from a dynamical process of linear streamers breaking by zonal flows and merging of microscopic eddies. The radial profile of the electron heat conductivity only follows the profile of fluctuation intensity on a global scale, whereas the ion transport tracks more sensitively the local fluctuation intensity. This suggests the existence of a nondiffusive component in the electron heat flux, which arises from the ballistic radial E x B drift of trapped electrons due to a combination of the presence of mesoscale eddies and the weak de-tuning of the toroidal precessional resonance that drives the CTEM instability. On the other hand, the ion radial excursion is not affected by the mesoscale eddies due to a parallel decorrelation, which is not operational for the trapped electrons because of a bounce-averaging process associated with the electrons' fast motion along magnetic field lines. The presence of the nondiffusive component raises questions about the applicability of the usual

  9. SimFuse: A Novel Fusion Simulator for RNA Sequencing (RNA-Seq) Data.

    PubMed

    Tan, Yuxiang; Tambouret, Yann; Monti, Stefano

    2015-01-01

    The performance evaluation of fusion detection algorithms from high-throughput sequencing data crucially relies on the availability of data with known positive and negative cases of gene rearrangements. The use of simulated data circumvents some shortcomings of real data by generation of an unlimited number of true and false positive events, and the consequent robust estimation of accuracy measures, such as precision and recall. Although a few simulated fusion datasets from RNA Sequencing (RNA-Seq) are available, they are of limited sample size. This makes it difficult to systematically evaluate the performance of RNA-Seq based fusion-detection algorithms. Here, we present SimFuse to address this problem. SimFuse utilizes real sequencing data as the fusions' background to closely approximate the distribution of reads from a real sequencing library and uses a reference genome as the template from which to simulate fusions' supporting reads. To assess the supporting read-specific performance, SimFuse generates multiple datasets with various numbers of fusion supporting reads. Compared to an extant simulated dataset, SimFuse gives users control over the supporting read features and the sample size of the simulated library, based on which the performance metrics needed for the validation and comparison of alternative fusion-detection algorithms can be rigorously estimated. PMID:26839886
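
    The core simulation idea can be sketched as follows; this toy version generates random sequence, whereas SimFuse itself draws supporting reads from a reference genome and matches the read distribution of a real library:

        import random

        random.seed(7)
        BASES = "ACGT"

        def simulate_fusion_reads(gene_a, gene_b, n_support=50, read_len=20):
            """Join the 3' end of gene A to the 5' end of gene B and sample
            reads straddling the junction, so every read carries bases from
            both partners ("fusion-supporting" reads)."""
            junction = gene_a + gene_b
            bp = len(gene_a)   # breakpoint position in the fused transcript
            reads = []
            for _ in range(n_support):
                # Force each read to overlap the breakpoint by >= 3 bases.
                start = random.randint(bp - read_len + 3, bp - 3)
                reads.append(junction[start:start + read_len])
            return reads

        gene_a = "".join(random.choice(BASES) for _ in range(60))
        gene_b = "".join(random.choice(BASES) for _ in range(60))
        print(simulate_fusion_reads(gene_a, gene_b, n_support=3))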

  10. Simulating weld-fusion boundary microstructures in aluminum alloys

    NASA Astrophysics Data System (ADS)

    Kostrivas, Anastasios D.; Lippold, John C.

    2004-02-01

    A fundamental study of weld-fusion boundary microstructure evolution in aluminum alloys was conducted in an effort to understand equiaxed grain zone formation and fusion boundary nucleation and growth phenomena. In addition to commercial aluminum alloys, experimental Mg-bearing alloys with Zr and Sc additions were studied along with the widely used Cu- and Li-containing alloy 2195-T8. This article describes work conducted to clarify the interrelation among composition, base metal substrate, and temperature as they relate to nucleation and growth phenomena at the fusion boundary.

  11. Psychology on Computers: Simulations, Experiments and Projects.

    ERIC Educational Resources Information Center

    Belcher, Duane M.; Smith, Stephen D.

    PSYCOM is a unique mixed media package which combines high interest projects on the computer with a written text of expository material. It goes beyond most computer-assisted instruction which emphasizes drill and practice and testing of knowledge. A project might consist of a simulation or an actual experiment, or it might be a demonstration, a…

  12. Web Interface Connecting Gyrokinetic Turbulence Simulations with Tokamak Fusion Data

    NASA Astrophysics Data System (ADS)

    Suarez, A.; Ernst, D. R.

    2005-10-01

    We are developing a comprehensive interface to connect plasma microturbulence simulation codes with experimental data in the U.S. and abroad. This website automates the preparation and launch of gyrokinetic simulations utilizing plasma profile and magnetic equilibrium data. The functionality of existing standalone interfaces, such as GS2/PREP [D. R. Ernst et al., Phys. Plasmas 11(5) 2637 (2004)], in use for several years for the GS2 code [W. Dorland et al., Phys. Rev. Lett. 85(26) 5579 (2000)], will be extended to other codes, including GYRO [J. Candy / R.E. Waltz, J. Comput. Phys. 186 (2003) 545]. Data is read from mdsplus and TRANSP [http://w3.pppl.gov/transp] and can be viewed using a java plotter, Webgraph, developed for this project by previous students Geoffrey Catto and Bo Feng. User sessions are tracked and saved to allow users to access their previous simulations, which can be used as templates for future work.

  13. Fusion

    NASA Astrophysics Data System (ADS)

    Herman, Robin

    1990-10-01

    The book abounds with fascinating anecdotes about fusion's rocky path: the spurious claim by Argentine dictator Juan Peron in 1951 that his country had built a working fusion reactor; the rush by the United States to drop secrecy and publicize its fusion work as a propaganda offensive after the Russian success with Sputnik; the fortune Penthouse magazine publisher Bob Guccione sank into an unconventional fusion device; the skepticism that met an assertion by two University of Utah chemists in 1989 that they had created "cold fusion" in a bottle. Aimed at a general audience, the book describes the scientific basis of controlled fusion--the fusing of atomic nuclei, under conditions hotter than the sun, to release energy. Using personal recollections of scientists involved, it traces the history of this little-known international race that began during the Cold War in secret laboratories in the United States, Great Britain and the Soviet Union, and evolved into an astonishingly open collaboration between East and West.

  14. Humanoid Flight Metabolic Simulator Project

    NASA Technical Reports Server (NTRS)

    Ross, Stuart

    2015-01-01

    NASA's Evolvable Mars Campaign (EMC) has identified several areas of technology that will require significant improvements in terms of performance, capacity, and efficiency, in order to make a manned mission to Mars possible. These include crew vehicle Environmental Control and Life Support System (ECLSS), EVA suit Portable Life Support System (PLSS) and Information Systems, autonomous environmental monitoring, radiation exposure monitoring and protection, and vehicle thermal control systems (TCS). (MADMACS) in a Suit can be configured to simulate human metabolism, consuming crew resources (oxygen) in the process. In addition to providing support for testing Life Support on unmanned flights, MADMACS will also support testing of suit thermal controls, and monitor radiation exposure, body zone temperatures, moisture, and loads.

  15. Networking Industry and Academia: Evidence from FUSION Projects in Ireland

    ERIC Educational Resources Information Center

    Stephens, Simon; Onofrei, George

    2009-01-01

    Graduate development programmes such as FUSION continue to be seen by policy makers, higher education institutions and small and medium-sized enterprises (SMEs) as primary means of strengthening higher education-business links and in turn improving the match between graduate output and the needs of industry. This paper provides evidence from case…

  16. Dynamic system simulation of small satellite projects

    NASA Astrophysics Data System (ADS)

    Raif, Matthias; Walter, Ulrich; Bouwmeester, Jasper

    2010-11-01

    A prerequisite for accomplishing a system simulation is a system model holding all necessary project information in a centralized repository that can be accessed and edited by all parties involved. At the Institute of Astronautics of the Technische Universitaet Muenchen, a modular approach for modeling and dynamic simulation of satellite systems, called dynamic system simulation (DySyS), has been developed. DySyS is based on the platform-independent description language SysML and models a small satellite project with respect to system composition and dynamic behavior. A library of specific building blocks and of the possible relations between these blocks has been developed; from this library a system model of the satellite of interest can be created. A mapping into a C++ simulation allows the creation of an executable system model with which simulations are performed to observe the dynamic behavior of the satellite. In this paper DySyS is used to model and simulate the dynamic behavior of small satellites, because small satellite projects are less complex than large-scale satellite projects and can therefore act as precursors to demonstrate the feasibility of a system model.

  17. Internet and web projects for fusion plasma science and education. Final technical report

    SciTech Connect

    Eastman, Timothy E.

    1999-08-30

    The plasma web site at http://www.plasmas.org provides comprehensive coverage of all plasma science and technology, with site links worldwide. Prepared to serve the general public, students, educators, researchers, and decision-makers, the site covers basic plasma physics, fusion energy, magnetic confinement fusion, high energy density physics including ICF, space physics and astrophysics, pulsed power, lighting, waste treatment, plasma technology, plasma theory, and simulations and modeling.

  18. Size limitations for microwave cavity to simulate heating of blanket material in fusion reactor

    SciTech Connect

    Wolf, D.

    1987-01-01

    The power profile in the blanket material of a nuclear fusion reactor can be simulated by using microwaves at 200 MHz. Using these microwaves, ceramic breeder materials can be thermally tested to determine their acceptability as blanket materials without entering a nuclear fusion environment. A resonating cavity design is employed which can achieve uniform cross sectional heating in the plane transverse to the neutron flux. As the sample size increases in height and width, higher order modes, above the dominant mode, are propagated and destroy the approximation to the heating produced in a fusion reactor. The limits at which these modes develop are determined in the paper.
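
    The limits follow from the standard rectangular-waveguide cutoff formula; a quick calculation with illustrative (not the paper's) cross-section dimensions shows which modes can propagate at a 200 MHz drive:

        import math

        c = 2.998e8  # speed of light, m/s

        def te_cutoff(m, n, a, b):
            """Cutoff frequency (Hz) of the TE_mn mode in a rectangular
            guide of cross-section a x b (metres); modes with cutoff below
            the drive frequency propagate and spoil heating uniformity."""
            return (c / 2.0) * math.sqrt((m / a) ** 2 + (n / b) ** 2)

        a, b = 1.0, 0.6  # hypothetical cross-section, metres
        for m, n in [(1, 0), (2, 0), (0, 1), (1, 1)]:
            print(f"TE{m}{n}: f_c = {te_cutoff(m, n, a, b) / 1e6:6.1f} MHz")
        # At 200 MHz only TE10 (f_c ~ 150 MHz here) propagates; enlarging
        # the cross-section lowers the cutoffs and admits more modes.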

  19. Neoclassical simulations of fusion alpha particles in pellet charge exchange experiments on the Tokamak Fusion Test Reactor

    SciTech Connect

    Redi, M.H.; Batha, S.H.; Budny, R.V.; Darrow, D.S.; Levinton, F.M.; McCune, D.C.; Medley, S.S.; Petrov, M.P.; von Goeler, S.; White, R.B.; Zarnstorff, M.C.; Zweben, S.J.; TFTR Team

    1999-07-01

    Neoclassical simulations of alpha particle density profiles in high fusion power plasmas on the Tokamak Fusion Test Reactor [Phys. Plasmas 5, 1577 (1998)] are found to be in good agreement with measurements of the alpha distribution function made with a sensitive active neutral particle diagnostic. The calculations are carried out in Hamiltonian magnetic coordinates with a fast, particle-following Monte Carlo code which includes the neoclassical transport processes, a recent first-principles model for stochastic ripple loss and collisional effects. New calculations show that, in monotonic shear, alpha particles are virtually unaffected by toroidal field ripple. The calculations show that in reversed shear the confinement domain is not empty for trapped alphas at birth, and they allow an estimate of the actual alpha particle densities measured with the pellet charge exchange diagnostic.

  20. Programmable AC power supply for simulating power transient expected in fusion reactor

    SciTech Connect

    Halimi, B.; Suh, K. Y.

    2012-07-01

    This paper focuses on the control engineering of a programmable AC power source capable of simulating the power transients expected in a fusion reactor. To generate the programmable power source, an AC-AC power electronics converter is adopted to control the power of a set of heaters representing the transient phenomena of the heat exchangers or heat sources of a fusion reactor. The International Thermonuclear Experimental Reactor (ITER) plasma operation scenario is used as the basic reference for producing this transient power source. (authors)

  1. Simulation of RF-fields in a fusion device

    SciTech Connect

    De Witte, Dieter; Bogaert, Ignace; De Zutter, Daniel; Van Oost, Guido; Van Eester, Dirk

    2009-11-26

    In this paper the problem of scattering off a fusion plasma is approached from the point of view of integral equations. Using the volume equivalence principle, an integral equation is derived which describes the electromagnetic fields in the plasma. The equation is discretized with the method of moments (MoM) using conforming basis functions. This reduces the problem to solving a dense matrix equation, which can be done iteratively, and each iteration can be sped up using FFTs.
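
    The FFT speed-up rests on translation invariance of the background Green's function: applying the dense matrix to a vector is a convolution, costing O(N log N) rather than O(N^2). A one-dimensional sketch with a made-up kernel, using a plain fixed-point iteration where a production code would use a Krylov solver:

        import numpy as np

        rng = np.random.default_rng(3)
        n = 1024

        # Made-up translation-invariant kernel standing in for a discretized
        # Green's function; its dense convolution matrix is never formed.
        g = 0.05 * np.exp(-np.arange(n) / 8.0)  # scaled so iteration converges
        G = np.fft.fft(g)

        def matvec(x):
            """Apply A = I + Conv(g) in O(n log n) via the FFT."""
            return x + np.real(np.fft.ifft(G * np.fft.fft(x)))

        b = rng.normal(size=n)
        x = np.zeros(n)
        for _ in range(50):                 # fixed-point: x <- b - Conv(g) x
            x = b - (matvec(x) - x)
        print(f"residual: {np.linalg.norm(matvec(x) - b):.2e}")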

  2. Image Fusion Software in the Clearpem-Sonic Project

    NASA Astrophysics Data System (ADS)

    Pizzichemi, M.; di Vara, N.; Cucciati, G.; Ghezzi, A.; Paganoni, M.; Farina, F.; Frisch, B.; Bugalho, R.

    2012-08-01

    ClearPEM-Sonic is a mammography scanner that combines Positron Emission Tomography with 3D ultrasound echographic and elastographic imaging. It has been developed to improve early stage detection of breast cancer by combining metabolic and anatomical information. The PET system has been developed by the Crystal Clear Collaboration, while the 3D ultrasound probe has been provided by SuperSonic Imagine. In this framework, the visualization and fusion software is an essential tool for the radiologists in the diagnostic process. This contribution discusses the design choices, the issues faced during the implementation, and the commissioning of the software tools developed for ClearPEM-Sonic.

  3. One-dimensional particle simulations of Knudsen-layer effects on D-T fusion

    SciTech Connect

    Cohen, Bruce I.; Dimits, Andris M.; Zimmerman, George B.; Wilks, Scott C.

    2014-12-15

    Particle simulations are used to solve the fully nonlinear, collisional kinetic equation describing the interaction of a high-temperature, high-density, deuterium-tritium plasma with absorbing boundaries and a plasma source, and the influence of kinetic effects on fusion reaction rates. Both hydrodynamic and kinetic effects influence the end losses, and the simulations show departures of the ion velocity distributions from Maxwellian due to the reduction of the population of the highest-energy ions (Knudsen-layer effects). The particle simulations show that the interplay between sources, plasma dynamics, and end losses results in temperature anisotropy, plasma cooling, and concomitant reductions in the fusion reaction rates. However, for the model problems and parameters considered, the particle simulations show that Knudsen-layer modifications do not significantly affect the velocity distribution function at the velocities most important in determining the fusion reaction rates, i.e., the thermal fusion reaction rates using the local densities and bulk temperatures give good estimates of the kinetic fusion reaction rates.
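
    The role of the tail can be explored with a simple Monte Carlo estimate: compare a pair-averaged reactivity from full Maxwellian samples against one computed after discarding the fastest ions, a crude stand-in for Knudsen-layer depletion. The cross-section below is a toy function, not the real D-T reactivity:

        import numpy as np

        rng = np.random.default_rng(5)
        N = 200_000

        def sample(vcut=None):
            """3D Maxwellian velocities in thermal-speed units; optionally
            discard ions faster than vcut to mimic tail depletion."""
            v = rng.normal(size=(N, 3))
            return v if vcut is None else v[np.linalg.norm(v, axis=1) < vcut]

        def reactivity(vd, vt):
            """Pair-averaged sigma*v with a toy cross-section that, like
            D-T fusion, is dominated by the fastest relative velocities."""
            n = min(len(vd), len(vt))
            vrel = np.linalg.norm(vd[:n] - vt[:n], axis=1)
            return np.mean(vrel * np.exp(-8.0 / np.maximum(vrel, 1e-9)))

        full = reactivity(sample(), sample())
        depleted = reactivity(sample(vcut=2.5), sample(vcut=2.5))
        print(f"tail-depleted / Maxwellian reactivity = {depleted / full:.2f}")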

  4. Simulation of transition dynamics to high confinement in fusion plasmas

    NASA Astrophysics Data System (ADS)

    Nielsen, A. H.; Xu, G. S.; Madsen, J.; Naulin, V.; Juul Rasmussen, J.; Wan, B. N.

    2015-12-01

    The transition dynamics from the low (L) to the high (H) confinement mode in magnetically confined plasmas is investigated using a first-principles four-field fluid model. Numerical results are in agreement with measurements from the Experimental Advanced Superconducting Tokamak - EAST. Particularly, the slow transition with an intermediate dithering phase is well reproduced at proper parameters. The model recovers the power threshold for the L-H transition as well as the decrease in power threshold switching from single to double null configuration observed experimentally. The results are highly relevant for developing predictive models of the transition, essential for understanding and optimizing future fusion power reactors.

  5. Overview of Theory and Simulations in the Heavy Ion Fusion ScienceVirtual National Laboratory

    SciTech Connect

    Friedman, Alex

    2006-07-09

    The Heavy Ion Fusion Science Virtual National Laboratory (HIFS-VNL) is a collaboration of Lawrence Berkeley National Laboratory, Lawrence Livermore National Laboratory, and Princeton Plasma Physics Laboratory. These laboratories, in cooperation with researchers at other institutions, are carrying out a coordinated effort to apply intense ion beams as drivers for studies of the physics of matter at extreme conditions, and ultimately for inertial fusion energy. Progress on this endeavor depends upon coordinated application of experiments, theory, and simulations. This paper describes the state of the art, with an emphasis on the coordination of modeling and experiment; developments in the simulation tools, and in the methods that underlie them, are also treated.

  6. Overview of Theory and Simulations in the Heavy Ion Fusion Science Virtual National Laboratory

    SciTech Connect

    Friedman, A

    2006-07-03

    The Heavy Ion Fusion Science Virtual National Laboratory (HIFS-VNL) is a collaboration of Lawrence Berkeley National Laboratory, Lawrence Livermore National Laboratory, and Princeton Plasma Physics Laboratory. These laboratories, in cooperation with researchers at other institutions, are carrying out a coordinated effort to apply intense ion beams as drivers for studies of the physics of matter at extreme conditions, and ultimately for inertial fusion energy. Progress on this endeavor depends upon coordinated application of experiments, theory, and simulations. This paper describes the state of the art, with an emphasis on the coordination of modeling and experiment; developments in the simulation tools, and in the methods that underlie them, are also treated.

  7. Numerical analysis corresponding with experiment in compact beam simulator for heavy ion inertial fusion driver

    NASA Astrophysics Data System (ADS)

    Kikuchi, T.; Sakai, Y.; Komori, T.; Sato, T.; Hasegawa, J.; Horioka, K.; Takahashi, K.; Sasaki, T.; Harada, Nob

    2016-05-01

    Tune depression in a compact beam apparatus is estimated, and numerical simulation results are compared with experimental results for the compact beam simulator for a driver of heavy ion inertial fusion. The numerical simulation with multi-particle tracking is carried out under conditions corresponding to the experiment, and the result is discussed in comparison with the experimental one. The numerical simulation developed in this paper is expected to be a useful tool for investigating beam dynamics in experiments with the compact beam simulator.

  8. Agent-Based Simulations for Project Management

    NASA Technical Reports Server (NTRS)

    White, J. Chris; Sholtes, Robert M.

    2011-01-01

    Currently, the most common approach used in project planning tools is the Critical Path Method (CPM). While this method was a great improvement over the basic Gantt chart technique being used at the time, it now suffers from three primary flaws: (1) task duration is an input, (2) productivity impacts are not considered, and (3) management corrective actions are not included. Today, computers have exceptional computational power to handle complex simulations of task execution and project management activities (e.g., dynamically changing the number of resources assigned to a task when it is behind schedule). Through research under a Department of Defense contract, the author and the ViaSim team have developed a project simulation tool that enables more realistic cost and schedule estimates by using a resource-based model that literally turns the current duration-based CPM approach "on its head." The approach represents a fundamental paradigm shift in estimating projects, managing schedules, and reducing risk through innovative predictive techniques.
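
    The contrast with CPM can be made concrete in a few lines: duration is an output of the simulation rather than an input, productivity reacts to staffing, and a mid-task staffing change stands in for a management corrective action. All numbers are hypothetical:

        def simulate_task(scope, workers, productivity, dt=1.0):
            """Step the task day by day; duration emerges from resources."""
            done, t = 0.0, 0.0
            while done < scope:
                if t > 20:        # corrective action: add staff when late
                    workers = 6
                # Coordination overhead erodes per-worker productivity.
                crowding = 1.0 / (1.0 + 0.05 * workers)
                done += workers * productivity * crowding * dt
                t += dt
            return t

        d = simulate_task(scope=200.0, workers=4, productivity=2.0)
        print(f"simulated duration: {d:.0f} days")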

  9. Graduate Training: Evidence from FUSION Projects in Ireland

    ERIC Educational Resources Information Center

    Hegarty, Cecilia; Johnston, Janet

    2008-01-01

    Purpose: This paper aims to explore graduate training through SME-based project work. The views and behaviours of graduates are examined along with the perceptions of the SMEs and academic partner institutions charged with training graduates. Design/methodology/approach: The data are largely qualitative and derived from the experiences of…

  10. The UPSCALE project: a large simulation campaign

    NASA Astrophysics Data System (ADS)

    Mizielinski, Matthew; Roberts, Malcolm; Vidale, Pier Luigi; Schiemann, Reinhard; Demory, Marie-Estelle; Strachan, Jane

    2014-05-01

    The development of a traceable hierarchy of HadGEM3 global climate models, based upon the Met Office Unified Model, at resolutions from 135 km to 25 km, now allows the impact of resolution on the mean state, variability and extremes of climate to be studied in a robust fashion. In 2011 we successfully obtained a single-year grant of 144 million core hours of supercomputing time from the PRACE organization to run ensembles of 27-year atmosphere-only (HadGEM3-A GA3.0) climate simulations at 25 km resolution, as used in present global weather forecasting, on HERMIT at HLRS. Through 2012 the UPSCALE project (UK on PRACE: weather-resolving Simulations of Climate for globAL Environmental risk) ran over 650 years of simulation at resolutions of 25 km (N512), 60 km (N216) and 135 km (N96) to look at the value of high-resolution climate models in the study of both present climate and a potential future climate scenario based on RCP8.5. Over 400 TB of data was produced using HERMIT, with additional simulations run on HECToR (UK supercomputer) and MONSooN (Met Office NERC Supercomputing Node). The data generated was transferred to the JASMIN super-data cluster, hosted by STFC CEDA in the UK, where analysis facilities are allowing rapid scientific exploitation of the data set. Many groups across the UK and Europe are already taking advantage of these facilities and we welcome approaches from other interested scientists. This presentation will briefly cover the following points: purpose and requirements of the UPSCALE project and facilities used; technical implementation and hurdles (model porting and optimisation, automation, numerical failures, data transfer); ensemble specification; and current analysis projects and access to the data set. A full description of UPSCALE and the data set generated has been submitted to Geoscientific Model Development, with overview information available from http://proj.badc.rl.ac.uk/upscale.

  11. Simulating Halos with the Caterpillar Project

    NASA Astrophysics Data System (ADS)

    Kohler, Susanna

    2016-04-01

    The Caterpillar Project is a beautiful series of high-resolution cosmological simulations. The goal of this project is to examine the evolution of dark-matter halos like the Milky Way's, to learn about how galaxies like ours formed. This immense computational project is still in progress, but the Caterpillar team is already providing a look at some of its first results. Lessons from dark-matter halos: Why simulate the dark-matter halos of galaxies? Observationally, the formation history of our galaxy is encoded in galactic fossil-record clues, like the tidal debris from disrupted satellite galaxies in the outer reaches of our galaxy, or chemical abundance patterns throughout our galactic disk and stellar halo. But to interpret this information in a way that lets us learn about our galaxy's history, we need to first test galaxy formation and evolution scenarios via cosmological simulations. Then we can compare the end result of these simulations to what we observe today. [Figure caption: This figure illustrates the difference that mass resolution makes. In the left panel the mass resolution is 1.5x10^7 solar masses per particle; in the right panel it is 3x10^4 solar masses per particle (Griffen et al. 2016).] A computational challenge: Due to how computationally expensive such simulations are, previous N-body simulations of the growth of Milky-Way-like halos have consisted of only one or a few halos each. But in order to establish a statistical understanding of how galaxy halos form, and to find out whether the Milky Way's halo is typical or unusual, it is necessary to simulate a larger number of halos. In addition, in order to accurately follow the formation and evolution of substructure within the dark-matter halos, these simulations must be able to resolve the smallest dwarf galaxies, which are around a million solar masses. This requires an extremely high mass resolution, which adds to the computational expense of the simulation. First outcomes: These are the challenges faced by

  12. Atmospheric model intercomparison project: Monsoon simulations

    SciTech Connect

    Sperber, K.R.; Palmer, T.N.

    1994-06-01

    The simulation of monsoons, in particular the Indian summer monsoon, has proven to be a critical test of a general circulation model's ability to simulate tropical climate and variability. The Monsoon Numerical Experimentation Group has begun to address questions regarding the predictability of monsoon extremes, in particular conditions associated with El Niño and La Niña, which tend to be associated with drought and flood conditions over the Indian subcontinent, through a series of seasonal integrations using analyzed initial conditions from successive days in 1987 and 1988. In this paper the authors present an analysis of simulations associated with the Atmospheric Model Intercomparison Project (AMIP), a coordinated effort to simulate the 1979-1988 decade using standardized boundary conditions with approximately 30 atmospheric general circulation models. The 13 models analyzed to date are listed. Using monthly mean data from these simulations they have calculated indices of precipitation and wind shear in an effort to assess the performance of the models over the course of the AMIP decade.
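
    The indices mentioned are typically simple area averages of monthly fields. As a concrete illustration, here is a minimal Python sketch of a Webster-Yang-type wind shear index (box limits follow the classic definition; this is our illustration, not necessarily the exact index used in the paper, and area weighting is omitted for brevity):

    import numpy as np

    def wind_shear_index(u850, u200, lat, lon,
                         lat_box=(0.0, 20.0), lon_box=(40.0, 110.0)):
        """Area-mean U850 - U200 [m/s] over a monsoon box.

        u850, u200 : 2D (lat, lon) arrays of monthly-mean zonal wind.
        """
        box = ((lat[:, None] >= lat_box[0]) & (lat[:, None] <= lat_box[1]) &
               (lon[None, :] >= lon_box[0]) & (lon[None, :] <= lon_box[1]))
        return (u850 - u200)[box].mean()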

  13. Kinetic simulation of edge instability in fusion plasmas

    NASA Astrophysics Data System (ADS)

    Fulton, Daniel Patrick

    In this work, gyrokinetic simulations in edge plasmas of both tokamaks and field-reversed configurations (FRCs) have been carried out using the Gyrokinetic Toroidal Code (GTC), and A New Code (ANC) has been formulated for cross-separatrix FRC simulation. In the tokamak edge, turbulent transport in the pedestal of an H-mode DIII-D plasma is studied via simulations of electrostatic driftwaves. Annulus geometry is used and simulations focus on two radial locations corresponding to the pedestal top with mild pressure gradient and steep pressure gradient. A reactive trapped electron instability with typical ballooning mode structure is excited in the pedestal top. At the steep gradient, the electrostatic instability exhibits unusual mode structure, peaking at poloidal angles θ = ±π/2. Simulations find this unusual mode structure is due to steep pressure gradients in the pedestal but not due to the particular DIII-D magnetic geometry. Realistic DIII-D geometry has a stabilizing effect compared to a simple circular tokamak geometry. Driftwave instability in the FRC is studied for the first time using gyrokinetic simulation. GTC is upgraded to treat realistic equilibria calculated by an MHD equilibrium code. Electrostatic local simulations in outer closed flux surfaces find ion-scale modes are stable due to the large ion gyroradius, and that electron drift-interchange modes are excited by the electron temperature gradient and bad magnetic curvature. In the scrape-off layer (SOL), ion-scale modes are excited by the density gradient and bad curvature. Collisions have weak effects on instabilities both in the core and SOL. Simulation results are consistent with density fluctuation measurements in the C-2 experiment using Doppler backscattering (DBS). The critical density gradients measured by the DBS qualitatively agree with the linear instability threshold calculated by GTC simulations. One outstanding critical issue in the FRC is the interplay between turbulence in the FRC core

  14. Simulations of the performance of the Fusion-FEM, for an increased e-beam emittance

    SciTech Connect

    Tulupov, A.V.; Urbanus, W.H.; Caplan, M.

    1995-12-31

    The original design of the Fusion-FEM, which is under construction at the FOM-Institute for Plasma Physics, was based on an electron beam emittance of 50 π mm mrad. Recent measurements of the emittance of the beam emitted by the electron gun showed that the actual emittance is 80 π mm mrad. This results in a 2.5 times lower beam current density inside the undulator. As a result, it changes the linear gain, the start-up time, the saturation level and the frequency spectrum. The main goal of the FEM project is to demonstrate a stable microwave output power of at least 1 MW. The decrease of the electron beam current density has to be compensated by variations of the other FEM parameters, such as the reflection (feedback) coefficient of the microwave cavity and the length of the drift gap between the two sections of the step-tapered undulator. All basic dependencies of the linear and nonlinear gain, and of the output power, on the main FEM parameters have been simulated numerically with the CRMFEL code. Regimes of stable operation of the FEM with the increased emittance have been found. These regimes could be found because of the original flexibility of the FEM design.

  15. Sensitivity of mix in Inertial Confinement Fusion simulations to diffusion processes

    NASA Astrophysics Data System (ADS)

    Melvin, Jeremy; Cheng, Baolian; Rana, Verinder; Lim, Hyunkyung; Glimm, James; Sharp, David H.

    2015-11-01

    We explore two themes related to the simulation of mix within an Inertial Confinement Fusion (ICF) implosion, the role of diffusion (viscosity, mass diffusion and thermal conduction) processes and the impact of front tracking on the growth of the hydrodynamic instabilities. Using the University of Chicago HEDP code FLASH, we study the sensitivity of post-shot simulations of a NIC cryogenic shot to the diffusion models and front tracking of the material interfaces. Results of 1D and 2D simulations are compared to experimental quantities and an analysis of the current state of fully integrated ICF simulations is presented.

  16. Developing models for simulation of pinched-beam dynamics in heavy ion fusion. Revision 1

    SciTech Connect

    Boyd, J.K.; Mark, J.W.K.; Sharp, W.M.; Yu, S.S.

    1984-02-22

    For heavy-ion fusion energy applications, Mark and Yu have derived hydrodynamic models for numerical simulation of energetic pinched-beams including self-pinches and external-current pinches. These pinched-beams are applicable to beam propagation in fusion chambers and to the US High Temperature Experiment. The closure of the Mark-Yu model is obtained with adiabatic assumptions mathematically analogous to those of Chew, Goldberger, and Low for MHD. Features of this hydrodynamic beam model are compared with a kinetic treatment.

  17. A hybrid model for coupling kinetic corrections of fusion reactivity to hydrodynamic implosion simulations

    NASA Astrophysics Data System (ADS)

    Tang, Xian-Zhu; McDevitt, C. J.; Guo, Zehua; Berk, H. L.

    2014-03-01

    Inertial confinement fusion requires an imploded target in which a central hot spot is surrounded by a cold and dense pusher. The hot spot/pusher interface can take a complicated shape in three dimensions due to hydrodynamic mix. It is also a transition region where Knudsen and inverse Knudsen layer effects can significantly modify the fusion reactivity in comparison with the commonly used value evaluated with background Maxwellians. Here, we describe a hybrid model that couples the kinetic correction of fusion reactivity to global hydrodynamic implosion simulations. The key ingredient is a non-perturbative treatment of the tail ions in the interface region where the Gamow ion Knudsen number approaches or surpasses order unity. The accuracy of the coupling scheme is controlled by the precise criteria for matching the non-perturbative kinetic model to perturbative solutions in both configuration space and velocity space.
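
    The switch between the perturbative and non-perturbative treatments is governed by the Gamow ion Knudsen number. A minimal Python sketch of that dimensionless control parameter (the threshold value here is illustrative; the paper's matching criteria in configuration and velocity space are more detailed):

    def gamow_knudsen_number(lambda_gamow, scale_length):
        """N_K: mean free path of Gamow-peak ions divided by the
        hot spot/pusher interface gradient scale length (same units)."""
        return lambda_gamow / scale_length

    def use_kinetic_tail_model(n_k, threshold=0.1):
        # Hand the region to the non-perturbative tail-ion model once
        # N_K approaches order unity (threshold chosen for illustration).
        return n_k >= threshold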

  18. Colorado School of Mines fusion gamma ray diagnostic project. Technical progress report

    SciTech Connect

    Cecil, F.E.

    1992-02-14

    This report summarizes the 1991 calendar year activities of the fusion gamma ray diagnostics project in the Physics Department at the Colorado School of Mines. Considerable progress has been realized in the fusion gamma ray diagnostic project in the last year. Specifically, we have achieved the two major goals of the project as outlined in last year's proposed work statement to the Office of Applied Plasma Physics in the DOE Division of Magnetic Fusion Energy. The two major goals were: (1) solution of the severe interference problem encountered during operation of the gamma ray spectrometer concurrent with high power levels of the neutral beam injectors (NBI) and the ICRH antennae; (2) experimental determination of the absolute detection efficiency of the gamma ray spectrometer. This detection efficiency will allow the measured yields of the gamma rays to be converted to a total reaction rate. In addition to these two major accomplishments, we have continued, as permitted by the TFTR operating schedule, the observation of high energy gamma rays from the 3He(D,γ)5Li reaction during deuterium NBI heating of 3He plasmas.
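
    The absolute detection efficiency is what turns a measured gamma yield into a total reaction rate. A minimal Python sketch of that conversion (our illustration, assuming isotropic emission, not the actual TFTR analysis chain; the numbers are hypothetical):

    def total_reaction_rate(counts, live_time_s, efficiency):
        """Reactions per second from detected gamma counts.

        efficiency : absolute detection efficiency (detected counts per
        emitted gamma), folding in solid angle and detector response.
        """
        return counts / (live_time_s * efficiency)

    # Hypothetical numbers: 1200 counts in 0.5 s at an efficiency of 3e-9
    # imply a total reaction rate of 8e11 reactions per second.
    print(total_reaction_rate(1200, 0.5, 3e-9))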

  19. Comparison between initial Magnetized Liner Inertial Fusion experiments and integrated simulations

    NASA Astrophysics Data System (ADS)

    Sefkow, A. B.; Gomez, M. R.; Geissel, M.; Hahn, K. D.; Hansen, S. B.; Harding, E. C.; Peterson, K. J.; Slutz, S. A.; Koning, J. M.; Marinak, M. M.

    2014-10-01

    The Magnetized Liner Inertial Fusion (MagLIF) approach to ICF has obtained thermonuclear fusion yields using the Z facility. Integrated magnetohydrodynamic simulations provided the design for the first neutron-producing experiments using capabilities that presently exist, and the initial experiments measured stagnation radii r_stag < 75 μm, temperatures around 3 keV, and isotropic neutron yields up to Y_nDD = 2×10^12 from imploded liners reaching peak velocities around 70 km/s over an implosion time of about 60 ns. We present comparisons between the experimental observables and post-shot degraded integrated simulations. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the National Nuclear Security Administration under Contract DE-AC04-94AL85000.

  20. Online Simulation of Radiation Track Structure Project

    NASA Technical Reports Server (NTRS)

    Plante, Ianik

    2015-01-01

    Space radiation comprises protons, helium, and high charge and energy (HZE) particles. High-energy particles are a concern for human space flight, because there are no known options for shielding astronauts from them. When these ions interact with matter, they damage molecules and create radiolytic species. The pattern of energy deposition and positions of the radiolytic species, called the radiation track structure, is highly dependent on the charge and energy of the ion. The radiolytic species damage biological molecules, which may lead to several long-term health effects such as cancer. Because of the importance of heavy ions, the radiation community is very interested in the interaction of HZE particles with DNA, notably with regard to the track structure. A desktop program named RITRACKS was developed to simulate radiation track structure. The goal of this project is to create a web interface to allow registered internal users to use RITRACKS remotely.

  1. Three-dimensional simulations of the implosion of inertial confinement fusion targets

    SciTech Connect

    Town, R.P.J.; Bell, A.R.

    1991-09-30

    The viability of inertial confinement fusion depends crucially on implosion symmetry. A spherical three-dimensional hydrocode called PLATO has been developed to model the growth in asymmetries during an implosion. Results are presented in the deceleration phase which show indistinguishable linear growth rates, but greater nonlinear growth of the Rayleigh-Taylor instability than is found in two-dimensional cylindrical simulations. The three-dimensional enhancement of the nonlinear growth is much smaller than that found by Sakagami and Nishihara.

  2. Study of Plasma Liner Driven Magnetized Target Fusion Via Advanced Simulations

    SciTech Connect

    Samulyak, Roman V.; Parks, Paul

    2013-08-31

    The feasibility of plasma liner driven Magnetized Target Fusion (MTF) via terascale numerical simulations will be assessed. In the MTF concept, a plasma liner, formed by the merging of a number (60 or more) of radial, highly supersonic plasma jets, implodes on a target in the form of two compact plasma toroids, and compresses it to fusion ignition conditions. By avoiding major difficulties associated with both traditional laser driven inertial confinement fusion and solid liner driven MTF, plasma liner driven MTF potentially provides a low-cost and fast R&D path towards the demonstration of practical fusion energy. High fidelity numerical simulations of full nonlinear models associated with plasma liner MTF using state-of-the-art numerical algorithms and terascale computing are necessary in order to resolve uncertainties and provide guidance for future experiments. At Stony Brook University, we have developed unique computational capabilities that ideally suit the MTF problem. The FronTier code, developed in collaboration with BNL and LANL under DOE funding, including SciDAC, for the simulation of 3D multi-material hydro and MHD flows, has been benchmarked and used for fundamental and engineering problems in energy science applications. We have performed 3D simulations of converging supersonic plasma jets, their merger and the formation of the plasma liner, and a study of the corresponding oblique shock problem. We have studied the implosion of the plasma liner on the magnetized plasma target by resolving Rayleigh-Taylor instabilities in 2D and 3D and other relevant physics, and estimated the thermodynamic conditions of the target at the moment of maximum compression and the hydrodynamic efficiency of the method.

  3. Apparatus and method for simulating material damage from a fusion reactor

    DOEpatents

    Smith, D.L.; Greenwood, L.R.; Loomis, B.A.

    1988-05-20

    This paper discusses an apparatus and method for simulating a fusion environment on a first wall or blanket structure. A material test specimen is contained in a capsule made of a material having a low hydrogen solubility and permeability. The capsule is partially filled with a lithium solution, such that the test specimen is encapsulated by the lithium. The capsule is irradiated by a fast fission neutron source.

  4. Apparatus and method for simulating material damage from a fusion reactor

    DOEpatents

    Smith, Dale L.; Greenwood, Lawrence R.; Loomis, Benny A.

    1989-03-07

    An apparatus and method for simulating a fusion environment on a first wall or blanket structure. A material test specimen is contained in a capsule made of a material having a low hydrogen solubility and permeability. The capsule is partially filled with a lithium solution, such that the test specimen is encapsulated by the lithium. The capsule is irradiated by a fast fission neutron source.

  5. Apparatus and method for simulating material damage from a fusion reactor

    DOEpatents

    Smith, Dale L.; Greenwood, Lawrence R.; Loomis, Benny A.

    1989-01-01

    An apparatus and method for simulating a fusion environment on a first wall or blanket structure. A material test specimen is contained in a capsule made of a material having a low hydrogen solubility and permeability. The capsule is partially filled with a lithium solution, such that the test specimen is encapsulated by the lithium. The capsule is irradiated by a fast fission neutron source.

  6. Neutral Buoyancy Simulator - EASE Project (NB32)

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Once the United States' space program had progressed from Earth's orbit into outer space, the prospect of building and maintaining a permanent presence in space was realized. To accomplish this feat, NASA launched a temporary workstation, Skylab, to discover the effects of low gravity and weightlessness on the human body, and also to develop tools and equipment that would be needed in the future to build and maintain a more permanent space station. The structures, techniques, and work schedules had to be carefully designed to fit this unique construction site. The components had to be lightweight for transport into orbit, yet durable. The station also had to be made with removable parts for easy servicing and repairs by astronauts. All of the tools necessary for service and repairs had to be designed for easy manipulation by a suited astronaut. Construction methods had to be efficient due to the limited time the astronauts could remain outside their controlled environment. In light of all the specific needs of this project, an environment on Earth had to be developed that could simulate a low gravity atmosphere. A Neutral Buoyancy Simulator (NBS) was constructed by NASA's Marshall Space Flight Center (MSFC) in 1968. Since then, NASA scientists have used this facility to understand how humans work best in low gravity and also to provide information about the different kinds of structures that can be built. Pictured is a Massachusetts Institute of Technology (MIT) student working in a spacesuit on the Experimental Assembly of Structures in Extravehicular Activity (EASE) project, which was developed as a joint effort between MSFC and MIT. The EASE experiment required that crew members assemble small components to form larger components, working from the payload bay of the space shuttle. The MIT student in this photo is assembling two six-beam tetrahedrons.

  7. High-level multifunction radar simulation for studying the performance of multisensor data fusion systems

    NASA Astrophysics Data System (ADS)

    Huizing, Albert G.; Bosse, Eloi

    1998-07-01

    This paper presents the basic requirements for a simulation of the main capabilities of a shipborne MultiFunction Radar (MFR) that can be used in conjunction with other sensor simulations in scenarios for studying Multi Sensor Data Fusion (MSDF) systems. This simulation is being used to support an ongoing joint effort (Canada - The Netherlands) in the development of MSDF testbeds. This joint effort is referred to as Joint-FACET (Fusion Algorithms & Concepts Exploration Testbed), a highly modular and flexible series of applications that is capable of processing both real and synthetic input data. The question raised here is: how realistic should the sensor simulations be for the MSDF performance assessment to be trusted? A partial answer to this question is that, at the least, the dominant perturbing effects on sensor detection (true or false) must be sufficiently represented. Following this philosophy, the MFR model presented here takes into account the sensor's design parameters and external environmental effects such as clutter, propagation and jamming. Previous radar simulations capture most of these dominant effects. In this paper the emphasis is on an MFR scheduler, which is the key element that needs to be added to the previous simulations to represent the MFR's capability to search and track a large number of targets and at the same time support a large number of (semi-active) surface-to-air missiles (SAM) for the engagement of multiple hostile targets.
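
    The essence of such a scheduler is servicing the most urgent radar task first within a fixed time budget, with search absorbing whatever remains. A toy Python sketch of that priority ordering (task classes, priorities and the frame budget are our assumptions, not the Joint-FACET design):

    import heapq

    PRIORITY = {"missile_support": 0, "track_update": 1, "search": 2}

    def schedule(tasks, time_budget_ms):
        """tasks: list of (kind, dwell_ms); returns the dwells executed
        this frame, most urgent first, within the time budget."""
        queue = [(PRIORITY[kind], i, kind, dwell)
                 for i, (kind, dwell) in enumerate(tasks)]
        heapq.heapify(queue)
        executed, used = [], 0.0
        while queue:
            _, _, kind, dwell = heapq.heappop(queue)
            if used + dwell <= time_budget_ms:
                used += dwell
                executed.append(kind)
        return executed

    A real MFR scheduler must also honor deadlines, revisit intervals and waveform constraints; the sketch captures only the priority-driven ordering.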

  8. Lipid droplets fusion in adipocyte differentiated 3T3-L1 cells: A Monte Carlo simulation

    SciTech Connect

    Boschi, Federico; Rizzatti, Vanni; Zamboni, Mauro; Sbarbati, Andrea

    2014-02-15

    Several human worldwide diseases like obesity, type 2 diabetes, hepatic steatosis, atherosclerosis and other metabolic pathologies are related to the excessive accumulation of lipids in cells. Lipids accumulate in spherical cellular inclusions called lipid droplets (LDs), whose sizes range from fractions of a micrometer to one hundred micrometers in adipocytes. It has been suggested that LDs can grow in size due to a fusion process by which a larger LD is obtained with spherical shape and volume equal to the sum of the progenitors' volumes. In this study, the size distribution of two populations of LDs was analyzed in immature and mature (5-days differentiated) 3T3-L1 adipocytes (first and second populations, respectively) after Oil Red O staining. A Monte Carlo simulation of interaction between LDs has been developed in order to quantify the size distribution and the number of fusion events needed to obtain the distribution of the second population size starting from the first one. Four models are presented here based on different kinds of interaction: a surface-weighted interaction (R2 Model), a volume-weighted interaction (R3 Model), a random interaction (Random Model) and an interaction related to the place where the LDs are born (Nearest Model). The last two models mimic quite well the behavior found in the experimental data. This work represents a first step in developing numerical simulations of the LD growth process. Due to the complex phenomena involving LDs (absorption, growth through additional neutral lipid deposition in existing droplets, de novo formation and catabolism), the study focuses on the fusion process. The results suggest that, to obtain the observed size distribution, a number of fusion events comparable with the number of LDs themselves is needed. Moreover, the MC approach proves to be a powerful tool for investigating the LD growth process. Highlights: • We evaluated the role of the fusion process in the synthesis of the lipid droplets. • We compared the
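
    The four models differ only in how the two fusing droplets are drawn. A minimal Python sketch of one fusion event under the volume-weighted kernel (our toy version of the R3 Model; the Random Model would use equal weights, and the R2 Model weights of r**2):

    import random

    def fuse_step_r3(radii, rng=random):
        """One Monte Carlo fusion event: pick two droplets with probability
        proportional to r**3, replace them by one droplet of the same total
        volume (r_new**3 = r1**3 + r2**3)."""
        weights = [r ** 3 for r in radii]
        i, j = rng.choices(range(len(radii)), weights=weights, k=2)
        while j == i:  # the two partners must be distinct droplets
            j = rng.choices(range(len(radii)), weights=weights, k=1)[0]
        r_new = (radii[i] ** 3 + radii[j] ** 3) ** (1.0 / 3.0)
        return [r for k, r in enumerate(radii) if k not in (i, j)] + [r_new]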

  9. ION BEAM HEATED TARGET SIMULATIONS FOR WARM DENSE MATTER PHYSICS AND INERTIAL FUSION ENERGY

    SciTech Connect

    Barnard, J.J.; Armijo, J.; Bailey, D.S.; Friedman, A.; Bieniosek, F.M.; Henestroza, E.; Kaganovich, I.; Leung, P.T.; Logan, B.G.; Marinak, M.M.; More, R.M.; Ng, S.F.; Penn, G.E.; Perkins, L.J.; Veitzer, S.; Wurtele, J.S.; Yu, S.S.; Zylstra, A.B.

    2008-08-01

    Hydrodynamic simulations have been carried out using the multi-physics radiation hydrodynamics code HYDRA and the simplified one-dimensional hydrodynamics code DISH. We simulate possible targets for a near-term experiment at LBNL (the Neutralized Drift Compression Experiment, NDCX) and possible later experiments on a proposed facility (NDCX-II) for studies of warm dense matter and inertial fusion energy related beam-target coupling. Simulations of various target materials (including solids and foams) are presented. Experimental configurations include single pulse planar metallic solid and foam foils. Concepts for double-pulsed and ramped-energy pulses on cryogenic targets and foams have been simulated for exploring direct drive beam target coupling, and concepts and simulations for collapsing cylindrical and spherical bubbles to enhance temperature and pressure for warm dense matter studies are described.

  10. Ion Beam Heated Target Simulations for Warm Dense Matter Physics and Inertial Fusion Energy

    SciTech Connect

    Barnard, J J; Armijo, J; Bailey, D S; Friedman, A; Bieniosek, F M; Henestroza, E; Kaganovich, I; Leung, P T; Logan, B G; Marinak, M M; More, R M; Ng, S F; Penn, G E; Perkins, L J; Veitzer, S; Wurtele, J S; Yu, S S; Zylstra, A B

    2008-08-12

    Hydrodynamic simulations have been carried out using the multi-physics radiation hydrodynamics code HYDRA and the simplified one-dimensional hydrodynamics code DISH. We simulate possible targets for a near-term experiment at LBNL (the Neutralized Drift Compression Experiment, NDCX) and possible later experiments on a proposed facility (NDCX-II) for studies of warm dense matter and inertial fusion energy related beam-target coupling. Simulations of various target materials (including solids and foams) are presented. Experimental configurations include single pulse planar metallic solid and foam foils. Concepts for double-pulsed and ramped-energy pulses on cryogenic targets and foams have been simulated for exploring direct drive beam target coupling, and concepts and simulations for collapsing cylindrical and spherical bubbles to enhance temperature and pressure for warm dense matter studies are described.

  11. Ion beam heated target simulations for warm dense matter physics and inertial fusion energy

    NASA Astrophysics Data System (ADS)

    Barnard, J. J.; Armijo, J.; Bailey, D. S.; Friedman, A.; Bieniosek, F. M.; Henestroza, E.; Kaganovich, I.; Leung, P. T.; Logan, B. G.; Marinak, M. M.; More, R. M.; Ng, S. F.; Penn, G. E.; Perkins, L. J.; Veitzer, S.; Wurtele, J. S.; Yu, S. S.; Zylstra, A. B.

    2009-07-01

    Hydrodynamic simulations have been carried out using the multi-physics radiation hydrodynamics code HYDRA and the simplified one-dimensional hydrodynamics code DISH. We simulate possible targets for a near-term experiment at LBNL (the Neutralized Drift Compression Experiment, NDCX) and possible later experiments on a proposed facility (NDCX-II) for studies of warm dense matter and inertial fusion energy-related beam-target coupling. Simulations of various target materials (including solids and foams) are presented. Experimental configurations include single-pulse planar metallic solid and foam foils. Concepts for double-pulsed and ramped-energy pulses on cryogenic targets and foams have been simulated for exploring direct drive beam-target coupling, and concepts and simulations for collapsing cylindrical and spherical bubbles to enhance temperature and pressure for warm dense matter studies are described.

  12. Simulations of mixing in Inertial Confinement Fusion with front tracking and sub-grid scale models

    NASA Astrophysics Data System (ADS)

    Rana, Verinder; Lim, Hyunkyung; Melvin, Jeremy; Cheng, Baolian; Glimm, James; Sharp, David

    2015-11-01

    We present two related results. The first discusses the Richtmyer-Meshkov (RMI) and Rayleigh-Taylor (RTI) instabilities and their evolution in Inertial Confinement Fusion simulations. We show the evolution of the RMI to the late-time RTI under transport effects and tracking. The sub-grid scale models help capture the interaction of turbulence with diffusive processes. The second assesses the effects of concentration on the physics model and examines the mixing properties in the low Reynolds number hot spot. We discuss the effect of concentration on the Schmidt number. The simulation results are produced using the University of Chicago code FLASH and Stony Brook University's front tracking algorithm.

  13. Nonlinear kinetic simulations of ion cyclotron emission from fusion products in large tokamak plasmas

    NASA Astrophysics Data System (ADS)

    Dendy, Richard; Cook, James; Chapman, Sandra

    2012-10-01

    Ion cyclotron emission (ICE) was the only collective radiative instability, driven by fusion-born ions, observed from deuterium-tritium plasmas in both JET and TFTR (R O Dendy et al., Nucl. Fusion 35, 1733 (1995)). Suprathermal emission, peaked at sequential ion cyclotron harmonics at the outer mid-plane edge, was detected using heating antennas as receivers on JET and using probes in TFTR. The intensity of ICE spectral peaks scaled linearly with fusion reactivity. The underlying emission mechanism appears to be the magnetoacoustic cyclotron instability (MCI), which involves resonance between: the fast Alfvén wave; cyclotron harmonic waves supported by the energetic ions and by the background thermal plasma; and a set of centrally born fusion products, lying on barely trapped orbits, which undergo large drift excursions. Analytical studies show that the linear growth rate of the MCI corresponds well with certain observational features of ICE, including ones where a nonlinear treatment might be thought essential. To help explain this, we have carried out direct numerical simulations using a particle-in-cell (PIC) code. We focus on the results of extending MCI theory from the linear into the nonlinear regime for large tokamak parameters.

  14. A Simulation Environment for Benchmarking Sensor Fusion-Based Pose Estimators

    PubMed Central

    Ligorio, Gabriele; Sabatini, Angelo Maria

    2015-01-01

    In-depth analysis and performance evaluation of sensor fusion-based estimators may be critical when performed using real-world sensor data. For this reason, simulation is widely recognized as one of the most powerful tools for algorithm benchmarking. In this paper, we present a simulation framework suitable for assessing the performance of sensor fusion-based pose estimators. The systems used for implementing the framework were magnetic/inertial measurement units (MIMUs) and a camera, although the addition of further sensing modalities is straightforward. Typical nuisance factors were also included for each sensor. The proposed simulation environment was validated using real-life sensor data employed for motion tracking. The largest mismatch between real and simulated sensors was about 5% of the measured quantity (for the camera simulation), whereas the lowest correlation was found for one axis of the gyroscope (0.90). In addition, a real benchmarking example of an extended Kalman filter for pose estimation from MIMU and camera data is presented. PMID:26703603
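
    To give a flavor of the sensor models such a framework rests on, here is a minimal Python sketch of a simulated gyroscope channel with two typical nuisance factors, a constant bias and additive white noise (parameter values are illustrative, not the paper's calibrated ones):

    import numpy as np

    def simulate_gyro(true_rate, bias=0.01, noise_std=0.005, seed=0):
        """Corrupt a ground-truth angular rate [rad/s] with bias + noise."""
        rng = np.random.default_rng(seed)
        return true_rate + bias + rng.normal(0.0, noise_std, true_rate.shape)

    # Benchmarking then compares estimates against the known ground truth,
    # e.g. via the correlation coefficient reported in the abstract:
    t = np.arange(0.0, 10.0, 0.01)
    truth = 0.5 * np.sin(2.0 * np.pi * 0.2 * t)
    print(np.corrcoef(truth, simulate_gyro(truth))[0, 1])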

  15. A Simulation Environment for Benchmarking Sensor Fusion-Based Pose Estimators.

    PubMed

    Ligorio, Gabriele; Sabatini, Angelo Maria

    2015-01-01

    In-depth analysis and performance evaluation of sensor fusion-based estimators may be critical when performed using real-world sensor data. For this reason, simulation is widely recognized as one of the most powerful tools for algorithm benchmarking. In this paper, we present a simulation framework suitable for assessing the performance of sensor fusion-based pose estimators. The systems used for implementing the framework were magnetic/inertial measurement units (MIMUs) and a camera, although the addition of further sensing modalities is straightforward. Typical nuisance factors were also included for each sensor. The proposed simulation environment was validated using real-life sensor data employed for motion tracking. The largest mismatch between real and simulated sensors was about 5% of the measured quantity (for the camera simulation), whereas the lowest correlation was found for one axis of the gyroscope (0.90). In addition, a real benchmarking example of an extended Kalman filter for pose estimation from MIMU and camera data is presented. PMID:26703603

  16. Detector Simulations for the COREA Project

    NASA Astrophysics Data System (ADS)

    Lee, Sungwon; Kang, Hyesung

    2006-12-01

    The COREA (COsmic ray Research and Education Array in Korea) project aims to build a ground array of particle detectors distributed over the Korean Peninsula, through collaborations of high school students, educators, and university researchers, in order to study the origin of ultra high energy cosmic rays. The COREA array will consist of about 2000 detector stations covering an area of several hundred km² at its final configuration and will detect electrons and muons in extensive air showers triggered by high energy particles. During the initial phase, the COREA array will start with a small number of detector stations in Seoul area schools. In this paper, we have studied by Monte Carlo simulations how to select detector sites for optimal detection efficiency for proton-triggered air showers. We considered several model clusters with up to 30 detector stations and calculated the effective number of air-shower events that can be detected per year for each cluster. The greatest detection efficiency is achieved when the mean distance between detector stations of a cluster is comparable to the effective radius of the air shower of a given proton energy. We find the detection efficiency of a cluster with randomly selected detector sites is comparable to that of clusters with uniform detector spacing. We also considered a hybrid cluster with 60 detector stations that combines a small cluster with Δl ≈ 100 m and a large cluster with Δl ≈ 1 km. We suggest that it can be an ideal configuration for the initial phase study of the COREA project, since it can measure cosmic rays over a wide energy range, i.e., 10^16 eV ≤ E ≤ 10^19 eV, with a reasonable detection rate.
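
    The site-selection study reduces to a Monte Carlo over shower core positions. A toy Python sketch of such an efficiency estimate (the trigger rule, throw area and numbers are our assumptions, not the COREA simulation):

    import random

    def detection_efficiency(stations, r_shower, n_trials=10000,
                             half_size=2000.0, min_hits=3, seed=1):
        """Fraction of randomly placed shower cores detected by the array.

        stations : list of (x, y) positions in metres; a shower counts as
        detected if at least min_hits stations lie within r_shower of its
        core, which is thrown uniformly over a square of side 2*half_size.
        """
        rng = random.Random(seed)
        detected = 0
        for _ in range(n_trials):
            cx = rng.uniform(-half_size, half_size)
            cy = rng.uniform(-half_size, half_size)
            hits = sum((x - cx) ** 2 + (y - cy) ** 2 <= r_shower ** 2
                       for x, y in stations)
            detected += hits >= min_hits
        return detected / n_trials

    Sweeping r_shower (i.e., proton energy) for a fixed station layout reproduces the qualitative result above: efficiency peaks when the mean station spacing is comparable to the shower radius.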

  17. Radiation damage in ferritic/martensitic steels for fusion reactors: a simulation point of view

    NASA Astrophysics Data System (ADS)

    Schäublin, R.; Baluc, N.

    2007-12-01

    Low activation ferritic/martensitic steels are good candidates for future fusion reactors owing to their lower damage accumulation and moderate swelling, relative to austenitic steels, under irradiation by the 14 MeV neutrons produced by the fusion reaction. Irradiation of these steels, e.g. EUROFER97, is known to produce hardening, loss of ductility, a shift in the ductile to brittle transition temperature, and a reduction of fracture toughness and creep resistance starting at the lowest doses. Helium, produced by transmutation by the 14 MeV neutrons, is known to impact mechanical properties, but its effect at the microstructure level is still unclear. The mechanisms underlying the degradation of mechanical properties are not well understood, despite numerous studies on the evolution of the microstructure under irradiation. This impedes our ability to predict materials' behaviour at the higher doses expected in future fusion reactors. Simulations of these effects are now essential. An overview is presented of molecular dynamics simulations of the primary state of damage in iron and of the mobility of a dislocation, the vector of plasticity, in the presence of a defect.

  18. Characterizing the Conformational Landscape of Flavivirus Fusion Peptides via Simulation and Experiment

    NASA Astrophysics Data System (ADS)

    Marzinek, Jan K.; Lakshminarayanan, Rajamani; Goh, Eunice; Huber, Roland G.; Panzade, Sadhana; Verma, Chandra; Bond, Peter J.

    2016-01-01

    Conformational changes in the envelope proteins of flaviviruses help to expose the highly conserved fusion peptide (FP), a region which is critical to membrane fusion and host cell infection, and which represents a significant target for antiviral drugs and antibodies. In principle, extended timescale atomic-resolution simulations may be used to characterize the dynamics of such peptides. However, the resultant accuracy is critically dependent upon both the underlying force field and sufficient conformational sampling. In the present study, we report a comprehensive comparison of three simulation methods and four force fields comprising a total of more than 40 μs of sampling. Additionally, we describe the conformational landscape of the FP fold across all flavivirus family members. All investigated methods sampled conformations close to available X-ray structures, but exhibited differently populated ensembles. The best force field / sampling combination was sufficiently accurate to predict that the solvated peptide fold is less ordered than in the crystallographic state, which was subsequently confirmed via circular dichroism and spectrofluorometric measurements. Finally, the conformational landscape of a mutant incapable of membrane fusion was significantly shallower than wild-type variants, suggesting that dynamics should be considered when therapeutically targeting FP epitopes.

  19. Characterizing the Conformational Landscape of Flavivirus Fusion Peptides via Simulation and Experiment

    PubMed Central

    Marzinek, Jan K.; Lakshminarayanan, Rajamani; Goh, Eunice; Huber, Roland G.; Panzade, Sadhana; Verma, Chandra; Bond, Peter J.

    2016-01-01

    Conformational changes in the envelope proteins of flaviviruses help to expose the highly conserved fusion peptide (FP), a region which is critical to membrane fusion and host cell infection, and which represents a significant target for antiviral drugs and antibodies. In principle, extended timescale atomic-resolution simulations may be used to characterize the dynamics of such peptides. However, the resultant accuracy is critically dependent upon both the underlying force field and sufficient conformational sampling. In the present study, we report a comprehensive comparison of three simulation methods and four force fields comprising a total of more than 40 μs of sampling. Additionally, we describe the conformational landscape of the FP fold across all flavivirus family members. All investigated methods sampled conformations close to available X-ray structures, but exhibited differently populated ensembles. The best force field / sampling combination was sufficiently accurate to predict that the solvated peptide fold is less ordered than in the crystallographic state, which was subsequently confirmed via circular dichroism and spectrofluorometric measurements. Finally, the conformational landscape of a mutant incapable of membrane fusion was significantly shallower than wild-type variants, suggesting that dynamics should be considered when therapeutically targeting FP epitopes. PMID:26785994

  20. Simulation of Neural Firing Dynamics: A Student Project.

    ERIC Educational Resources Information Center

    Kletsky, E. J.

    This paper describes a student project in digital simulation techniques that is part of a graduate systems analysis course entitled Biosimulation. The students chose different simulation techniques to solve a problem related to the neuron model. (MLH)

  1. Project Icarus: Analysis of Plasma jet driven Magneto-Inertial Fusion as potential primary propulsion driver for the Icarus probe

    NASA Astrophysics Data System (ADS)

    Stanic, M.; Cassibry, J. T.; Adams, R. B.

    2013-05-01

    Hopes of sending probes to a star other than the Sun are currently limited by the maturity of advanced propulsion technologies. One of the few candidate propulsion systems for providing interstellar flight capabilities is nuclear fusion. In the past, many fusion propulsion concepts have been proposed and some of them have even been explored in detail, Project Daedalus for example. However, as scientific progress in this field has advanced, new fusion concepts have emerged that merit evaluation as potential drivers for interstellar missions. Plasma jet driven Magneto-Inertial Fusion (PJMIF) is one of those concepts. PJMIF involves a salvo of converging plasma jets that form a uniform liner, which compresses a magnetized target to fusion conditions. It is an Inertial Confinement Fusion (ICF)-Magnetic Confinement Fusion (MCF) hybrid approach that has the potential for a multitude of benefits over both ICF and MCF, such as lower system mass and significantly lower cost. This paper concentrates on a thermodynamic assessment of the basic performance parameters necessary for utilization of PJMIF as a candidate propulsion system for the Project Icarus mission. These parameters include: specific impulse, thrust, exhaust velocity, mass of the engine system, mass of the fuel required, etc. This is a submission of the Project Icarus Study Group.
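
    The listed performance parameters are tied together by the standard rocket relations Isp = v_e / g0 and F = mdot * v_e. A minimal Python sketch with purely illustrative numbers (not results from the paper):

    G0 = 9.80665  # standard gravity, m/s^2

    def specific_impulse(v_exhaust):
        """Specific impulse [s] from effective exhaust velocity [m/s]."""
        return v_exhaust / G0

    def thrust(mdot, v_exhaust):
        """Thrust [N] from propellant mass flow [kg/s] and v_e [m/s]."""
        return mdot * v_exhaust

    # Hypothetical fusion exhaust at 1e6 m/s with 1 g/s average mass flow:
    # Isp ~ 1e5 s and thrust ~ 1 kN.
    print(specific_impulse(1.0e6), thrust(1.0e-3, 1.0e6))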

  2. SciDAC Fusiongrid Project--A National Collaboratory to Advance the Science of High Temperature Plasma Physics for Magnetic Fusion

    SciTech Connect

    SCHISSEL, D.P.; ABLA, G.; BURRUSS, J.R.; FEIBUSH, E.; FREDIAN, T.W.; GOODE, M.M.; GREENWALD, M.J.; KEAHEY, K.; LEGGETT, T.; LI, K.; McCUNE, D.C.; PAPKA, M.E.; RANDERSON, L.; SANDERSON, A.; STILLERMAN, J.; THOMPSON, M.R.; URAM, T.; WALLACE, G.

    2006-08-31

    This report summarizes the work of the National Fusion Collaboratory (NFC) Project funded by the United States Department of Energy (DOE) under the Scientific Discovery through Advanced Computing Program (SciDAC) to develop a persistent infrastructure to enable scientific collaboration for magnetic fusion research. A five-year project initiated in 2001, it built on the past collaborative work performed within the U.S. fusion community and added the component of computer science research done with the USDOE Office of Science, Office of Advanced Scientific Computing Research. The project was a collaboration itself, uniting fusion scientists from General Atomics, MIT, and PPPL and computer scientists from ANL, LBNL, Princeton University, and the University of Utah to form a coordinated team. The group leveraged existing computer science technology where possible and extended or created new capabilities where required. Developing a reliable energy system that is economically and environmentally sustainable is the long-term goal of Fusion Energy Science (FES) research. In the U.S., FES experimental research is centered at three large facilities with a replacement value of over $1B. As these experiments have increased in size and complexity, there has been a concurrent growth in the number and importance of collaborations among large groups at the experimental sites and smaller groups located nationwide. Teaming with the experimental community is a theoretical and simulation community whose efforts range from applied analysis of experimental data to fundamental theory (e.g., realistic nonlinear 3D plasma models) that run on massively parallel computers. Looking toward the future, the large-scale experiments needed for FES research are staffed by correspondingly large, globally dispersed teams. The fusion program will be increasingly oriented toward the International Thermonuclear Experimental Reactor (ITER) where even now, a decade before operation begins, a large

  3. Assembly of Influenza Hemagglutinin Fusion Peptides in a Phospholipid Bilayer by Coarse-grained Computer Simulations

    PubMed Central

    Collu, Francesca; Spiga, Enrico; Lorenz, Christian D.; Fraternali, Franca

    2015-01-01

    Membrane fusion is critical to eukaryotic cellular function and crucial to the entry of enveloped viruses such as influenza and human immunodeficiency virus. Influenza viral entry in the host cell is mediated by a 20–23 amino acid long sequence, called the fusion peptide (FP). Recently, possible structures for the fusion peptide (ranging from an inverted V shaped α-helical structure to an α-helical hairpin, or to a complete α-helix) and their implication in the membrane fusion initiation have been proposed. Despite the large number of studies devoted to the structure of the FP, the mechanism of action of this peptide remains unclear, with several mechanisms having been suggested, including the induction of local disorder, promoting membrane curvature, and/or altering local membrane composition. In recent years, several research groups have employed atomistic and/or coarse-grained molecular dynamics (MD) simulations to investigate the matter. In all previous works, the behavior of a single FP monomer was studied, while in this manuscript, we use a simplified model of a tripeptide (TP) monomer of FP (TFP) instead of a single FP monomer because each Influenza Hemagglutinin contains three FP molecules in the biological system. In this manuscript we report findings targeted at understanding the fusogenic properties and the collective behavior of these trimers of FP peptides on a 1-palmitoyl-2-oleoyl-sn-glycero-3-phosphocholine model membrane. Here we show how the TFP monomers self-assemble into differently sized oligomers in the presence of the membrane. We measure the perturbation to the structure of the phospholipid membrane caused by the presence of these TFP oligomers. Our work (i) shows how self-assembly of TFP in the presence of the membrane induces non-negligible deformation to the membrane and (ii) could be a useful starting point to stimulate discussion and further work targeted at fusion pore formation. PMID:26636093

  4. Quasi-spherical direct drive fusion simulations for the Z machine and future accelerators.

    SciTech Connect

    VanDevender, J. Pace; McDaniel, Dillon Heirman; Roderick, Norman Frederick; Nash, Thomas J.

    2007-11-01

    We explored the potential of Quasi-Spherical Direct Drive (QSDD) to reduce the cost and risk of a future fusion driver for Inertial Confinement Fusion (ICF) and to produce megajoule thermonuclear yield on the renovated Z Machine with a pulse-shortening Magnetically Insulated Current Amplifier (MICA). Analytic relationships for constant implosion velocity and constant pusher stability have been derived and show that the required current scales as the implosion time. Therefore, a MICA is necessary to drive QSDD capsules with hot-spot ignition on Z. We have optimized the LASNEX parameters for QSDD with realistic walls and mitigated many of the risks. Although the mix-degraded 1D yield is computed to be ~30 MJ on Z, unmitigated wall expansion under the >100 gigabar pressure just before burn prevents ignition in the 2D simulations. A squeezer system of adjacent implosions may mitigate the wall expansion and permit the plasma to burn.

  5. Equations of State for Ablator Materials in Inertial Confinement Fusion Simulations

    NASA Astrophysics Data System (ADS)

    Sterne, P. A.; Benedict, L. X.; Hamel, S.; Correa, A. A.; Milovich, J. L.; Marinak, M. M.; Celliers, P. M.; Fratanduono, D. E.

    2016-05-01

    We discuss the development of the tabular equation of state (EOS) models for ablator materials in current use at Lawrence Livermore National Laboratory in simulations of inertial confinement fusion (ICF) experiments at the National Ignition Facility. We illustrate the methods with a review of current models for ablator materials and discuss some of the challenges in performing hydrocode simulations with high-fidelity multiphase models. We stress the importance of experimental data, as well as the utility of ab initio electronic structure calculations, in regions where data are not currently available. We also show why Hugoniot data alone are not sufficient to constrain the EOS models. These cases illustrate the importance of experimental EOS data in multi-megabar regimes, and the vital role they play in the development and validation of EOS models for ICF simulations.
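
    In a hydrocode, a tabular EOS is exercised as an interpolation on a (density, temperature) grid at every zone and cycle. A minimal bilinear-lookup sketch in Python (our illustration of the mechanism, not the LLNL table machinery):

    import numpy as np

    def eos_pressure(rho, T, rho_grid, T_grid, P_table):
        """Bilinear interpolation of P(rho, T) from a tabulated EOS.

        rho_grid, T_grid : 1D ascending arrays; P_table : 2D array
        indexed as [i_rho, i_T].
        """
        i = np.clip(np.searchsorted(rho_grid, rho) - 1, 0, len(rho_grid) - 2)
        j = np.clip(np.searchsorted(T_grid, T) - 1, 0, len(T_grid) - 2)
        fr = (rho - rho_grid[i]) / (rho_grid[i + 1] - rho_grid[i])
        ft = (T - T_grid[j]) / (T_grid[j + 1] - T_grid[j])
        return ((1 - fr) * (1 - ft) * P_table[i, j]
                + fr * (1 - ft) * P_table[i + 1, j]
                + (1 - fr) * ft * P_table[i, j + 1]
                + fr * ft * P_table[i + 1, j + 1])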

  6. Improving Project Management with Simulation and Completion Distribution Functions

    NASA Technical Reports Server (NTRS)

    Cates, Grant R.

    2004-01-01

    Despite the critical importance of project completion timeliness, management practices in place today remain inadequate for addressing the persistent problem of project completion tardiness. A major culprit in late projects is uncertainty, to which most, if not all, projects are inherently subject. This uncertainty resides in the estimates for activity durations, the occurrence of unplanned and unforeseen events, and the availability of critical resources. In response to this problem, this research developed a comprehensive simulation-based methodology for conducting quantitative project completion time risk analysis. It is called the Project Assessment by Simulation Technique (PAST). This new tool enables project stakeholders to visualize uncertainty or risk, i.e. the likelihood of their project completing late and the magnitude of the lateness, by providing them with a completion time distribution function of their projects. Discrete event simulation is used within PAST to determine the completion distribution function for the project of interest. The simulation is populated with both deterministic and stochastic elements. The deterministic inputs include planned project activities, precedence requirements, and resource requirements. The stochastic inputs include activity duration growth distributions, probabilities for events that can impact the project, and other dynamic constraints that may be placed upon project activities and milestones. These stochastic inputs are based upon past data from similar projects. The time for an entity to complete the simulation network, subject to both the deterministic and stochastic factors, represents the time to complete the project. Repeating the simulation hundreds or thousands of times allows one to create the project completion distribution function. The Project Assessment by Simulation Technique was demonstrated to be effective for the ongoing NASA project to assemble the International Space Station. Approximately $500
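
    The core loop of such a method is short: sample stochastic activity durations, total them through the network, and repeat until a completion-time distribution emerges. A toy Python sketch for a purely serial activity chain (PAST itself also models precedence logic, discrete events and resource constraints; all numbers below are hypothetical):

    import random

    def completion_samples(activities, n_runs=10000, seed=42):
        """activities: list of (planned_days, max_growth_factor); each run
        draws a uniform duration growth per activity. Returns sorted
        completion times for reading off percentiles."""
        rng = random.Random(seed)
        return sorted(
            sum(d * rng.uniform(1.0, g) for d, g in activities)
            for _ in range(n_runs))

    runs = completion_samples([(10, 1.5), (20, 1.3), (5, 2.0)])
    p80 = runs[int(0.8 * len(runs))]  # 80th-percentile completion time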

  7. Final Report for Project "Framework Application for Core-Edge Transport Simulations (FACETS)"

    SciTech Connect

    Estep, Donald

    2014-01-17

    This is the final report for the Colorado State University component of the FACETS project. FACETS was focused on the development of a multiphysics, parallel framework application that could provide the capability to enable whole-device fusion reactor modeling and, in the process, the development of the modeling infrastructure and computational understanding needed for ITER. It was intended that FACETS be highly flexible, through the use of modern computational methods, including component technology and object-oriented design, to facilitate switching from one model to another for a given aspect of the physics, and to make it possible to use simplified models for rapid turnaround or high-fidelity models that will take advantage of the largest supercomputer hardware. FACETS was designed in a heterogeneous parallel context, where different parts of the application can exploit parallelism based on task farming, domain decomposition, and/or pipelining as needed and applicable. As with all fusion simulations, an integral part of the FACETS project was treatment of the coupling of different physical processes at different scales interacting closely. A primary example for the FACETS project is the coupling of existing core and edge simulations, with the transport and wall interactions described by reduced models. However, core and edge simulations themselves involve significant coupling of different processes with large scale differences. Numerical treatment of coupling is impacted by a number of factors including scale differences, the form of information transferred between processes, the implementation of solvers for different codes, and high performance computing concerns. Operator decomposition, involving the computation of the individual processes using appropriate simulation codes and then linking/synchronizing the component simulations at regular points in space and time, is the de facto approach to high performance simulation of multiphysics
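
    A minimal Python sketch of the operator-decomposition pattern described above: each component advances independently over a synchronization interval, then interface data are exchanged (the component functions and state keys are hypothetical, not the FACETS API):

    def couple(core_step, edge_step, state, t_end, dt_sync):
        """Advance core and edge components over each sync interval,
        then swap boundary data between them."""
        t = 0.0
        while t < t_end:
            state["core"] = core_step(state["core"], state["edge_bc"], dt_sync)
            state["edge"] = edge_step(state["edge"], state["core_bc"], dt_sync)
            # synchronize: each side's boundary condition is refreshed
            # from the other side's new state
            state["core_bc"] = state["core"]
            state["edge_bc"] = state["edge"]
            t += dt_sync
        return state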

  8. Simulations of longitudinal beam dynamics of space-charge dominated beams for heavy ion fusion

    SciTech Connect

    Miller, D.A.C.

    1994-12-01

    The longitudinal instability has potentially disastrous effects on the ion beams used for heavy ion driven inertial confinement fusion. This instability is a "resistive wall" instability, with the impedance coming from the induction modules in the accelerator used as a driver. This instability can greatly amplify perturbations launched from the beam head and can prevent focusing of the beam onto the small spot necessary for fusion. This instability has been studied using the WARPrz particle-in-cell code. WARPrz is a 2 1/2 dimensional electrostatic axisymmetric code. This code includes a model for the impedance of the induction modules. Simulations with resistances similar to those expected in a driver show moderate amounts of growth from the instability as a perturbation travels from beam head to tail, as predicted by cold beam fluid theory. The perturbation reflects off the beam tail and decays as it travels toward the beam head. Nonlinear effects cause the perturbation to steepen during reflection. Including the capacitive component of the module impedance has a partially stabilizing effect on the longitudinal instability. This reduction in the growth rate is seen in both cold beam fluid theory and in simulations with WARPrz. Instability growth rates for warm beams measured from WARPrz are lower than cold beam fluid theory predicts. Longitudinal thermal spread cannot account for this decrease in the growth rate. A mechanism for coupling the transverse thermal spread to decay of the longitudinal waves is presented. The longitudinal instability is no longer a threat to the heavy ion fusion program. The simulations in this thesis have shown that the growth rate for this instability will not be as large as earlier calculations predicted.

  9. Adjoint Monte Carlo simulation of fusion product activation probe experiment in ASDEX Upgrade tokamak

    NASA Astrophysics Data System (ADS)

    Äkäslompolo, S.; Bonheure, G.; Tardini, G.; Kurki-Suonio, T.; The ASDEX Upgrade Team

    2015-10-01

    The activation probe is a robust tool to measure the flux of fusion products from a magnetically confined plasma. A carefully chosen solid sample is exposed to the flux, and the impinging ions transmute the material, making it radioactive. Ultra-low level gamma-ray spectroscopy is used post mortem to measure the activity and, thus, the number of fusion products. This contribution presents the numerical analysis of the first measurement in the ASDEX Upgrade tokamak, which was also the first experiment to measure a single discharge. The ASCOT suite of codes was used to perform adjoint/reverse Monte Carlo calculations of the fusion products. The analysis facilitates, for the first time, a comparison of numerical and experimental values for an absolutely calibrated flux. The results agree to within a factor of about two, which can be considered quite a good result given that not all features of the plasma can be accounted for in the simulations. Also, an alternative to the present probe orientation was studied. The results suggest that a better optimized orientation could measure the flux from a significantly larger part of the plasma. A shorter version of this contribution is due to be published in PoS at: 1st EPS conference on Plasma Diagnostics.

  10. Computational Plasma Physics at the Bleeding Edge: Simulating Kinetic Turbulence Dynamics in Fusion Energy Sciences

    NASA Astrophysics Data System (ADS)

    Tang, William

    2013-04-01

    Advanced computing is generally recognized to be an increasingly vital tool for accelerating progress in scientific research in the 21st Century. The imperative is to translate the combination of the rapid advances in super-computing power together with the emergence of effective new algorithms and computational methodologies to help enable corresponding increases in the physics fidelity and the performance of the scientific codes used to model complex physical systems. If properly validated against experimental measurements and verified with mathematical tests and computational benchmarks, these codes can provide more reliable predictive capability for the behavior of complex systems, including fusion energy relevant high temperature plasmas. The magnetic fusion energy research community has made excellent progress in developing advanced codes for which computer run-time and problem size scale very well with the number of processors on massively parallel supercomputers. A good example is the effective usage of the full power of modern leadership class computational platforms from the terascale to the petascale and beyond to produce nonlinear particle-in-cell simulations which have accelerated progress in understanding the nature of plasma turbulence in magnetically-confined high temperature plasmas. Illustrative results provide great encouragement for being able to include increasingly realistic dynamics in extreme-scale computing campaigns to enable predictive simulations with unprecedented physics fidelity. Some illustrative examples will be presented of the algorithmic progress from the magnetic fusion energy sciences area in dealing with low memory per core extreme scale computing challenges for the current top 3 supercomputers worldwide. These include advanced CPU systems (such as the IBM-Blue-Gene-Q system and the Fujitsu K Machine) as well as the GPU-CPU hybrid system (Titan).

  11. 3D and r,z particle simulations of heavy ion fusion beams

    NASA Astrophysics Data System (ADS)

    Friedman, A.; Grote, D. P.; Callahan, D. A.; Langdon, A. B.; Haber, I.

    1992-08-01

    The space-charge-dominated beams in a heavy ion beam driven inertial fusion (HIF) accelerator must be focused onto small (few mm) spots at the fusion target, and so preservation of a small emittance is crucial. The nonlinear beam self-fields can lead to emittance growth; thus, a self-consistent field description is necessary. We have developed a multi-dimensional time-dependent discrete particle simulation code, WARP, and are using it to study the behavior of HIF beams. The code's 3d package combines features of an accelerator code and a particle-in-cell (PIC) plasma simulation. Novel techniques allow it to follow beams through many accelerator elements over long distances and around bends. We have used the code to understand the emittance growth observed in the MBE4 experiment at Lawrence Berkeley Laboratory (LBL) under conditions of aggressive drift-compression. We are currently applying it to LBL's planned ILSE experiments, and (most recently) to an ESQ injector option being evaluated for ILSE. The code's r, z package is being used to study the axial confinement afforded by the shaped ends of the accelerating pulses, and to study longitudinal instability induced by induction module impedance.

  12. Verification of particle simulation of radio frequency waves in fusion plasmas

    SciTech Connect

    Kuley, Animesh; Lin, Z.; Wang, Z. X.; Wessel, F.

    2013-10-15

    Radio frequency (RF) waves can provide heating, current and flow drive, as well as instability control for steady state operations of fusion experiments. A particle simulation model has been developed in this work to provide a first-principles tool for studying the RF nonlinear interactions with plasmas. In this model, ions are considered as fully kinetic particles using the Vlasov equation and electrons are treated as guiding centers using the drift kinetic equation. This model has been implemented in a global gyrokinetic toroidal code using real electron-to-ion mass ratio. To verify the model, linear simulations of ion plasma oscillation, ion Bernstein wave, and lower hybrid wave are carried out in cylindrical geometry and found to agree well with analytic predictions.
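
    For context, the analytic benchmark behind the ion plasma oscillation test is the cold-ion plasma frequency; the sketch below evaluates it for an assumed deuterium density (a generic verification target, not the paper's actual parameters).

    ```python
    import math

    EPS0 = 8.8541878128e-12   # vacuum permittivity (F/m)
    QE   = 1.602176634e-19    # elementary charge (C)
    MP   = 1.67262192369e-27  # proton mass (kg)

    def ion_plasma_frequency(density_m3, charge_number=1, mass_amu=2.0):
        """Cold-ion plasma oscillation frequency (rad/s) used as an analytic benchmark."""
        q = charge_number * QE
        m = mass_amu * MP
        return math.sqrt(density_m3 * q * q / (EPS0 * m))

    # Hypothetical deuterium density, for illustration only
    w_pi = ion_plasma_frequency(1.0e19)
    print(f"f_pi = {w_pi / (2.0 * math.pi):.3e} Hz")
    ```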

  13. Using Geostatistical Data Fusion Techniques and MODIS Data to Upscale Simulated Wheat Yield

    NASA Astrophysics Data System (ADS)

    Castrignano, A.; Buttafuoco, G.; Matese, A.; Toscano, P.

    2014-12-01

    Population growth increases food demand. Assessing food demand and predicting the actual supply for a given location are critical components of strategic food security planning at the regional scale. Crop yield can be simulated using crop models because it is site-specific and determined by weather, management, length of growing season and soil properties. Crop models require reliable location-specific data that are not generally available. Obtaining these data at a large number of locations is time-consuming, costly and sometimes simply not feasible. An upscaling method is required to extend coverage of sparse estimates of crop yield to an appropriate extrapolation domain. This work investigates the applicability of a geostatistical data fusion approach for merging remote sensing data with the predictions of a simulation model of wheat growth and production using ground-based data. The study area is the Capitanata plain (4000 km2) located in Apulia Region, mostly cropped with durum wheat. The MODIS EVI/NDVI data products for the Capitanata plain were downloaded from the Land Processes Distributed Active Archive Center (LPDAAC) for the whole crop cycle of durum wheat. Phenological development, biomass growth and grain quantity of durum wheat were simulated by the Delphi system, based on a crop simulation model linked to a database including soil properties, agronomical and meteorological data. Multicollocated cokriging was used to integrate secondary exhaustive information (multi-spectral MODIS data) with the primary variable (sparsely distributed biomass/yield model predictions of durum wheat). The model estimates were strongly spatially correlated with the radiance data (red and NIR bands), and the data fusion approach proved to be quite suitable and flexible for integrating data of different types and supports.
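
    Multicollocated cokriging itself is involved; as a rough stand-in, the sketch below fuses a sparse primary variable (simulated yield) with an exhaustive covariate (an NDVI-like field) by regression-kriging: a linear trend on the covariate plus a Gaussian-process model of the spatial residuals. All data and names are synthetic assumptions, and scikit-learn is assumed to be available.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(0)
    xy_sparse = rng.uniform(0, 60, size=(40, 2))             # km, sparse model-prediction sites
    ndvi_sparse = np.sin(xy_sparse[:, 0] / 20) * 0.3 + 0.5   # synthetic covariate values
    yield_sparse = 2.0 + 3.0 * ndvi_sparse + rng.normal(0, 0.1, 40)  # t/ha, synthetic

    # Trend on the covariate, then krige the residuals spatially
    trend = LinearRegression().fit(ndvi_sparse[:, None], yield_sparse)
    residuals = yield_sparse - trend.predict(ndvi_sparse[:, None])
    gp = GaussianProcessRegressor(kernel=RBF(10.0) + WhiteKernel(0.01), normalize_y=True)
    gp.fit(xy_sparse, residuals)

    # Upscale to a dense grid where only the covariate is known exhaustively
    xg, yg = np.meshgrid(np.linspace(0, 60, 50), np.linspace(0, 60, 50))
    xy_dense = np.column_stack([xg.ravel(), yg.ravel()])
    ndvi_dense = np.sin(xy_dense[:, 0] / 20) * 0.3 + 0.5
    yield_dense = trend.predict(ndvi_dense[:, None]) + gp.predict(xy_dense)
    print(f"mean upscaled yield: {yield_dense.mean():.2f} t/ha")
    ```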

  14. Detailed high-resolution three-dimensional simulations of OMEGA separated reactants inertial confinement fusion experiments

    NASA Astrophysics Data System (ADS)

    Haines, Brian M.; Grim, Gary P.; Fincke, James R.; Shah, Rahul C.; Forrest, Chad J.; Silverstein, Kevin; Marshall, Frederic J.; Boswell, Melissa; Fowler, Malcolm M.; Gore, Robert A.; Hayes-Sterbenz, Anna C.; Jungman, Gerard; Klein, Andreas; Rundberg, Robert S.; Steinkamp, Michael J.; Wilhelmy, Jerry B.

    2016-07-01

    We present results from the comparison of high-resolution three-dimensional (3D) simulations with data from the implosions of inertial confinement fusion capsules with separated reactants performed on the OMEGA laser facility. Each capsule, referred to as a "CD Mixcap," is filled with tritium and has a polystyrene (CH) shell with a deuterated polystyrene (CD) layer whose burial depth is varied. In these implosions, fusion reactions between deuterium and tritium ions can occur only in the presence of atomic mix between the gas fill and shell material. The simulations feature accurate models for all known experimental asymmetries and do not employ any adjustable parameters to improve agreement with experimental data. Simulations are performed with the RAGE radiation-hydrodynamics code using an Implicit Large Eddy Simulation (ILES) strategy for the hydrodynamics. We obtain good agreement with the experimental data, including the DT/TT neutron yield ratios used to diagnose mix, for all burial depths of the deuterated shell layer. Additionally, simulations demonstrate good agreement with converged simulations employing explicit models for plasma diffusion and viscosity, suggesting that the implicit sub-grid model used in ILES is sufficient to model these processes in these experiments. In our simulations, mixing is driven by short-wavelength asymmetries and longer-wavelength features are responsible for developing flows that transport mixed material towards the center of the hot spot. Mix material transported by this process is responsible for most of the mix (DT) yield even for the capsule with a CD layer adjacent to the tritium fuel. Consistent with our previous results, mix does not play a significant role in TT neutron yield degradation; instead, this is dominated by the displacement of fuel from the center of the implosion due to the development of turbulent instabilities seeded by long-wavelength asymmetries. Through these processes, the long

  15. Three-dimensional particle simulation of heavy-ion fusion beams*

    NASA Astrophysics Data System (ADS)

    Friedman, Alex; Grote, David P.; Haber, Irving

    1992-07-01

    The beams in a heavy-ion-beam-driven inertial fusion (HIF) accelerator are collisionless, nonneutral plasmas, confined by applied magnetic and electric fields. These space-charge-dominated beams must be focused onto small (few mm) spots at the fusion target, and so preservation of a small emittance is crucial. The nonlinear beam self-fields can lead to emittance growth, and so a self-consistent field description is needed. To this end, a multidimensional particle simulation code, warp [Friedman et al., Part. Accel. 37-38, 131 (1992)], has been developed and is being used to study the transport of HIF beams. The code's three-dimensional (3-D) package combines features of an accelerator code and a particle-in-cell plasma simulation. Novel techniques allow it to follow beams through many accelerator elements over long distances and around bends. This paper first outlines the algorithms employed in warp. A number of applications and corresponding results are then presented. These applications include studies of: beam drift-compression in a misaligned lattice of quadrupole focusing magnets; beam equilibria, and the approach to equilibrium; and the MBE-4 experiment [AIP Conference Proceedings 152 (AIP, New York, 1986), p. 145] recently concluded at Lawrence Berkeley Laboratory (LBL). Finally, 3-D simulations of bent-beam dynamics relevant to the planned Induction Linac Systems Experiments (ILSE) [Fessenden, Nucl. Instrum. Methods Plasma Res. A 278, 13 (1989)] at LBL are described. Axially cold beams are observed to exhibit little or no root-mean-square emittance growth at midpulse in transiting a (sharp) bend. Axially hot beams, in contrast, do exhibit some emittance growth.

  16. Special Education Simulation and Consultation Project: Special Training Project. Final Report. Part I: Results and Learnings.

    ERIC Educational Resources Information Center

    Batten, Murray O.; Burello, Leonard C.

    Presented is the final report of the Special Education Simulation and Consultation (SECAC) Project designed to provide simulation-based inservice training to Michigan building principals. Part I reviews project goals, objectives, procedures, results, and learnings. It is explained that the training employed the Special Education Administrators…

  17. In-Service Design & Performance Prediction of Advanced Fusion Material Systems by Computational Modeling and Simulation

    SciTech Connect

    G. R. Odette; G. E. Lucas

    2005-11-15

    This final report on "In-Service Design & Performance Prediction of Advanced Fusion Material Systems by Computational Modeling and Simulation" (DE-FG03-01ER54632) consists of a series of summaries of work that has been published, or presented at meetings, or both. It briefly describes results on the following topics: 1) A Transport and Fate Model for Helium and Helium Management; 2) Atomistic Studies of Point Defect Energetics, Dynamics and Interactions; 3) Multiscale Modeling of Fracture consisting of: 3a) A Micromechanical Model of the Master Curve (MC) Universal Fracture Toughness-Temperature Curve Relation, KJc(T - To), 3b) An Embrittlement DTo Prediction Model for the Irradiation Hardening Dominated Regime, 3c) Non-hardening Irradiation Assisted Thermal and Helium Embrittlement of 8Cr Tempered Martensitic Steels: Compilation and Analysis of Existing Data, 3d) A Model for the KJc(T) of a High Strength NFA MA957, 3e) Cracked Body Size and Geometry Effects of Measured and Effective Fracture Toughness-Model Based MC and To Evaluations of F82H and Eurofer 97, 3f) Size and Geometry Effects on the Effective Toughness of Cracked Fusion Structures; 4) Modeling the Multiscale Mechanics of Flow Localization-Ductility Loss in Irradiation Damaged BCC Alloys; and 5) A Universal Relation Between Indentation Hardness and True Stress-Strain Constitutive Behavior. Further details can be found in the cited references or presentations that generally can be accessed on the internet, or provided upon request to the authors. Finally, it is noted that this effort was integrated with our base program in fusion materials, also funded by the DOE OFES.

  18. Angular radiation temperature simulation for time-dependent capsule drive prediction in inertial confinement fusion

    SciTech Connect

    Jing, Longfei; Yang, Dong; Li, Hang; Zhang, Lu; Lin, Zhiwei; Li, Liling; Kuang, Longyu; Jiang, Shaoen; Ding, Yongkun; Huang, Yunbao

    2015-02-15

    The x-ray drive on a capsule in an inertial confinement fusion setup is crucial for ignition. Unfortunately, a direct measurement has not been possible so far. We propose an angular radiation temperature simulation to predict the time-dependent drive on the capsule. A simple model, based on the view-factor method for the simulation of the radiation temperature, is presented and compared with the experimental data obtained using the OMEGA laser facility and the simulation results acquired with VISRAD code. We found a good agreement between the time-dependent measurements and the simulation results obtained using this model. The validated model was then used to analyze the experimental results from the Shenguang-III prototype laser facility. More specifically, the variations of the peak radiation temperatures at different view angles with the albedo of the hohlraum, the motion of the laser spots, the closure of the laser entrance holes, and the deviation of the laser power were investigated. Furthermore, the time-dependent radiation temperature at different orientations and the drive history on the capsule were calculated. The results indicate that the radiation temperature from “U20W112” (named according to the diagnostic hole ID on the target chamber) can be used to approximately predict the drive temperature on the capsule. In addition, the influence of the capsule on the peak radiation temperature is also presented.

  19. A new paradigm for variable-fidelity stochastic simulation and information fusion in fluid mechanics

    NASA Astrophysics Data System (ADS)

    Venturi, Daniele; Parussini, Lucia; Perdikaris, Paris; Karniadakis, George

    2015-11-01

    Predicting the statistical properties of fluid systems based on stochastic simulations and experimental data is a problem of major interest across many disciplines. Even with recent theoretical and computational advances, no broadly applicable techniques exist that can deal effectively with uncertainty propagation and model inadequacy in high dimensions. To address these problems, we propose a new paradigm for variable-fidelity stochastic modeling, simulation and information fusion in fluid mechanics. The key idea relies on employing recursive Bayesian networks and multi-fidelity information sources (e.g., stochastic simulations at different resolutions) to construct optimal predictors for quantities of interest, e.g., the random temperature field in stochastic Rayleigh-Bénard convection. The object of inference is the quantity of interest at the highest possible level of fidelity, for which we can usually afford only a few simulations. To compute the optimal predictors, we developed a multivariate recursive co-kriging approach that simultaneously takes into account variable fidelity in the space of models (e.g., DNS vs. potential flow solvers), as well as variable fidelity in probability space. Numerical applications are presented and discussed. This research was supported by AFOSR and DARPA.

  20. Simulation of plume dispersion from single release in Fusion Field Trial-07 experiment

    NASA Astrophysics Data System (ADS)

    Singh, Sarvesh Kumar; Sharan, Maithili

    2013-12-01

    Accurate description of the source-receptor relationship is required for efficient source reconstruction. This is examined by simulating the dispersion of plumes resulting from the ten available trials of single releases conducted at the Fusion Field Trials, Dugway Proving Ground, Utah. The simulation is addressed with the earlier developed IIT (Indian Institute of Technology) dispersion model, using dispersion parameters derived from measurements of turbulent velocity fluctuations. The simulation is described separately for stable and unstable conditions, characterizing the peak as well as the overall observed concentration distribution. Simulated results are compared with those obtained using AERMOD. With the IIT model, peak concentrations are predicted within a factor of two in all the trials. The higher concentrations (>5 × 10-4 g m-3) are well predicted in stable conditions and under-predicted (within a factor of two) in unstable conditions, whereas relatively smaller concentrations (<5 × 10-4 g m-3) are severely under-predicted in stable conditions and over-predicted in unstable conditions. AERMOD exhibits predictions similar to those of the IIT model in most of the trials. Overall, both models predict 70-80% of concentrations in stable conditions and 85-95% in unstable conditions within a factor of six. The statistical measures for both models are in good agreement with the observations.
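
    For orientation, a minimal Gaussian plume formula and the factor-of-two score used in such comparisons can be sketched as follows; the dispersion-parameter forms and all numbers are assumptions, not the IIT model's turbulence-based parameterization.

    ```python
    import numpy as np

    def gaussian_plume(q, u, y, z, h, sy, sz):
        # Concentration (g/m^3) from a continuous point source, with ground reflection
        lateral = np.exp(-0.5 * (y / sy) ** 2)
        vertical = np.exp(-0.5 * ((z - h) / sz) ** 2) + np.exp(-0.5 * ((z + h) / sz) ** 2)
        return q / (2.0 * np.pi * u * sy * sz) * lateral * vertical

    def fac2(pred, obs):
        # Fraction of predictions within a factor of two of observations
        r = pred / obs
        return np.mean((r >= 0.5) & (r <= 2.0))

    x = np.array([100.0, 200.0, 400.0])      # downwind receptor distances (m)
    sy, sz = 0.08 * x, 0.06 * x              # assumed dispersion parameters (m)
    pred = gaussian_plume(q=1.0, u=3.0, y=0.0, z=1.5, h=2.0, sy=sy, sz=sz)
    obs = pred * np.array([0.7, 1.4, 2.5])   # synthetic "observations"
    print(pred, f"FAC2 = {fac2(pred, obs):.2f}")
    ```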

  1. An Overview of the Distributed Space Exploration Simulation (DSES) Project

    NASA Technical Reports Server (NTRS)

    Crues, Edwin Z.; Chung, Victoria I.; Blum, Michael G.; Bowman, James D.

    2007-01-01

    This paper describes the Distributed Space Exploration Simulation (DSES) Project, a research and development collaboration between NASA centers which investigates technologies and processes related to integrated, distributed simulation of complex space systems in support of NASA's Exploration Initiative. In particular, it describes the three major components of DSES: network infrastructure, software infrastructure and simulation development. With regard to network infrastructure, DSES is developing a Distributed Simulation Network for use by all NASA centers. With regard to software, DSES is developing software models, tools and procedures that streamline distributed simulation development and provide an interoperable infrastructure for agency-wide integrated simulation. Finally, with regard to simulation development, DSES is developing an integrated end-to-end simulation capability to support NASA development of new exploration spacecraft and missions. This paper presents the current status and plans for these three areas, including examples of specific simulations.

  2. Multimode guidance project low frequency ECM simulator: Hardware description

    NASA Astrophysics Data System (ADS)

    Kaye, H. M.

    1982-10-01

    The Multimode Guidance (MMG) Project, part of the Army/Navy Area Defense SAM Technology Prototyping Program, was established to conduct a feasibility demonstration of multimode guidance concepts. Prototype guidance units for advanced, long range missiles are being built and tested under MMG Project sponsorship. The Johns Hopkins University Applied Physics Laboratory has been designated as Government Agent for countermeasures for this project. In support of this effort, a family of computer-controlled ECM simulators is being developed for validation of contractors' multimode guidance prototype designs. The design of the Low Frequency ECM Simulator is documented in two volumes. This report, Volume A, describes the hardware design of the simulator; Volume B describes the software design. This computer-controlled simulator can simulate up to six surveillance frequency jammers in B through F bands and will be used to evaluate the performance of home-on-jamming guidance modes in multiple jammer environments.

  3. Application of proton boron fusion reaction to radiation therapy: A Monte Carlo simulation study

    NASA Astrophysics Data System (ADS)

    Yoon, Do-Kun; Jung, Joo-Young; Suh, Tae Suk

    2014-12-01

    Three alpha particles are emitted from the point of reaction between a proton and boron. The alpha particles are effective in inducing the death of a tumor cell. After boron is accumulated in the tumor region, a proton delivered from outside the body can react with the boron in the tumor region. The boron induces an increase of the proton's maximum dose level, and only the tumor cell is damaged more critically. In addition, a prompt gamma ray is emitted from the proton-boron reaction point. Here, we show that the effectiveness of proton boron fusion therapy was verified using Monte Carlo simulations. We found that a dramatic increase of more than half in the proton's maximum dose level was induced by the boron in the tumor region. This increase occurred only when the proton's maximum dose point was located within the boron uptake region. In addition, the 719 keV prompt gamma ray peak produced by the proton boron fusion reaction was positively detected. This therapy method offers advantages such as the application of the Bragg peak to therapy, accurate targeting of the tumor, improved therapeutic effect, and monitoring of the treated region during treatment.

  4. Application of proton boron fusion reaction to radiation therapy: A Monte Carlo simulation study

    SciTech Connect

    Yoon, Do-Kun; Jung, Joo-Young; Suh, Tae Suk

    2014-12-01

    Three alpha particles are emitted from the point of reaction between a proton and boron. The alpha particles are effective in inducing the death of a tumor cell. After boron is accumulated in the tumor region, a proton delivered from outside the body can react with the boron in the tumor region. The boron induces an increase of the proton's maximum dose level, and only the tumor cell is damaged more critically. In addition, a prompt gamma ray is emitted from the proton-boron reaction point. Here, we show that the effectiveness of proton boron fusion therapy was verified using Monte Carlo simulations. We found that a dramatic increase of more than half in the proton's maximum dose level was induced by the boron in the tumor region. This increase occurred only when the proton's maximum dose point was located within the boron uptake region. In addition, the 719 keV prompt gamma ray peak produced by the proton boron fusion reaction was positively detected. This therapy method offers advantages such as the application of the Bragg peak to therapy, accurate targeting of the tumor, improved therapeutic effect, and monitoring of the treated region during treatment.
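
    To illustrate the reported dose-enhancement effect qualitatively, the toy curve below applies an assumed local enhancement factor to a schematic Bragg-peak depth-dose profile; it is not a Monte Carlo transport calculation, and none of the numbers come from the paper.

    ```python
    import numpy as np

    depth = np.linspace(0.0, 120.0, 1201)                          # depth in tissue (mm)
    pristine = 0.25 + np.exp(-0.5 * ((depth - 100.0) / 3.0) ** 2)  # schematic Bragg curve
    boron_region = (depth > 95.0) & (depth < 105.0)                # assumed uptake region (mm)
    dose = pristine * np.where(boron_region, 1.5, 1.0)             # assumed local enhancement

    # The gain appears only because the assumed uptake region overlaps the peak,
    # mirroring the paper's observation about the maximum-dose point location.
    print(f"toy peak dose gain from boron: {dose.max() / pristine.max():.2f}x")
    ```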

  5. Three-Dimensional Simulations of the Deceleration Phase of Inertial Fusion Implosions

    NASA Astrophysics Data System (ADS)

    Woo, K. M.; Betti, R.; Bose, A.; Epstein, R.; Delettrez, J. A.; Anderson, K. S.; Yan, R.; Chang, P.-Y.; Jonathan, D.; Charissis, M.

    2015-11-01

    The three-dimensional radiation-hydrodynamics code DEC3D has been developed to model the deceleration phase of direct-drive inertial confinement fusion implosions. The code uses an approximate Riemann solver on a moving mesh to achieve high resolution near discontinuities. A domain-decomposition parallelization strategy is implemented through the message passing interface to maintain high computational efficiency for the 3-D calculation. The implicit thermal diffusion is solved by parallel successive-over-relaxation iteration. Results from 3-D simulations of low-mode Rayleigh-Taylor instability are presented and compared with 2-D results. A systematic comparison of yields, pressures, temperatures, and areal densities between 2-D and 3-D is carried out to determine the additional degradation in target performance caused by the three-dimensionality of the nonuniformities. This material is based upon work supported by the Department of Energy National Nuclear Security Administration under Award Numbers DE-NA0001944 and DE-FC02-04ER54789 (Fusion Science Center).

  6. A fusion feature and its improvement based on locality preserving projections for rolling element bearing fault classification

    NASA Astrophysics Data System (ADS)

    Ding, Xiaoxi; He, Qingbo; Luo, Nianwu

    2015-01-01

    Sensitive feature extraction from vibration signals is still a great challenge for effective fault classification of rolling element bearings. Current fault classification generally depends on feature pattern differences between fault classes. This paper explores the active role of the healthy pattern in fault classification and proposes a new fusion feature extraction method based on locality preserving projections (LPP). The study intends to discover the local feature pattern difference between each bearing status and the healthy condition in order to characterize and discriminate different bearing statuses. Specifically, the proposed fusion feature is achieved in two main steps. In the first step, a two-class model is constructed for each class by using that class of signals together with healthy-condition signals. A fusion mapping is then generated by mathematically combining the mappings of the LPP, or its improvement, for all two-class models. In the second step, the LPP is further applied to reduce the fusion mapping dimension, in order to find more sensitive low-dimensional information hidden in the high-dimensional fusion feature structure. The final fusion feature enhances the discrimination between all classes by improving the between-class and within-class scatter for fault classification. Experimental results using different bearing fault types and severities under different loads show that the proposed method is well suited and effective for bearing fault classification.
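
    For readers unfamiliar with LPP, a minimal implementation of the classical mapping step is sketched below (heat-kernel k-nearest-neighbour graph, generalized eigenproblem); the data are random stand-ins for vibration features, and the paper's two-class fusion construction is not reproduced.

    ```python
    import numpy as np
    from scipy.linalg import eigh
    from scipy.spatial.distance import cdist

    def lpp(X, n_components=2, k=5, t=1.0):
        """Projection matrix of classical LPP for row-wise data X (n x d)."""
        d2 = cdist(X, X, "sqeuclidean")
        # k-nearest-neighbour adjacency with heat-kernel weights (column 0 is self)
        W = np.zeros_like(d2)
        idx = np.argsort(d2, axis=1)[:, 1:k + 1]
        for i, nbrs in enumerate(idx):
            W[i, nbrs] = np.exp(-d2[i, nbrs] / t)
        W = np.maximum(W, W.T)                       # symmetrize the graph
        D = np.diag(W.sum(axis=1))
        L = D - W                                    # graph Laplacian
        A = X.T @ L @ X
        B = X.T @ D @ X + 1e-9 * np.eye(X.shape[1])  # small ridge for stability
        vals, vecs = eigh(A, B)                      # generalized eigenproblem A v = l B v
        return vecs[:, :n_components]                # smallest eigenvalues preserve locality

    X = np.random.default_rng(1).normal(size=(100, 10))  # stand-in feature vectors
    Y = X @ lpp(X)                                       # low-dimensional embedding
    print(Y.shape)
    ```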

  7. Project ITCH: Interactive Digital Simulation in Electrical Engineering Education.

    ERIC Educational Resources Information Center

    Bailey, F. N.; Kain, R. Y.

    A two-stage project is investigating the educational potential of a low-cost time-sharing system used as a simulation tool in Electrical Engineering (EE) education. Phase I involves a pilot study and Phase II a full integration. The system employs interactive computer simulation to teach engineering concepts which are not well handled by…

  8. Transition from Beam-Target to Thermonuclear Fusion in High-Current Deuterium Z-Pinch Simulations

    NASA Astrophysics Data System (ADS)

    Offermann, Dustin T.; Welch, Dale R.; Rose, Dave V.; Thoma, Carsten; Clark, Robert E.; Mostrom, Chris B.; Schmidt, Andrea E. W.; Link, Anthony J.

    2016-05-01

    Fusion yields from dense Z-pinch plasmas are known to scale with the drive current, which is favorable for many potential applications. Decades of experimental studies, however, show an unexplained drop in yield for currents above a few mega-amperes (MA). In this work, simulations of DD Z-pinch plasmas have been performed in 1D and 2D for a constant pinch time and initial radius using the code Lsp, and observations of a shift in scaling are presented. The results show that yields below 3 MA are enhanced relative to pure thermonuclear scaling by beamlike particles accelerated in the Rayleigh-Taylor induced electric fields, while yields above 3 MA are reduced because of energy lost through the instability and the inability of the beamlike ions to enter the pinch region.

  9. The OOPIC simulation project: Progress and validation

    SciTech Connect

    Gladd, N.T.; Verboncoeur, J.P.; Birdsall, C.K.; Cartwright, K.; Mardahl, P.; Peter, W.

    1994-12-31

    The OOPIC (Object-Oriented Particle-In-Cell) project is a three year, multi-institutional effort centering on the use of advanced computational methods to develop a 2½-D, relativistic, electromagnetic PIC code for application to vacuum electronic design. OOPIC is formulated with object-oriented concepts, implemented in C++, has a sophisticated graphical user interface, and operates on PCs as well as workstations. Specific software engineering protocols are being followed to ensure that OOPIC is easy to modify and that its components will be reusable within other PIC projects. OOPIC is also notable for its use of Langdon's integral formulation of Maxwell's equations for general quadrilateral grids. The authors report on the progress of the OOPIC project and, in particular, discuss the various tests developed to validate the code. Since OOPIC is intended for the public domain as a general tool for vacuum electronic design, the authors are developing an extensive suite of tests against analytical and numerical theory.

  10. Advances in HYDRA and its application to simulations of Inertial Confinement Fusion targets

    NASA Astrophysics Data System (ADS)

    Marinak, M. M.; Kerbel, G. D.; Koning, J. M.; Patel, M. V.; Sepke, S. M.; Brown, P. N.; Chang, B.; Procassini, R.; Veitzer, S. A.

    2008-11-01

    We will outline new capabilities added to the HYDRA 2D/3D multiphysics ICF simulation code. These include a new SN multigroup radiation transport package (1D), constitutive models for elastic-plastic (strength) effects, and a mix model. A Monte Carlo burn package is being incorporated to model diagnostic signatures of neutrons, gamma rays and charged particles. A 3D MHD package that treats resistive MHD is available. Improvements to HYDRA's implicit Monte Carlo photonics package, including the addition of angular biasing, now enable integrated hohlraum simulations to complete in substantially shorter times. The heavy ion beam deposition package now includes a new model for ion stopping power developed by the Tech-X Corporation, with improved accuracy below the Bragg peak. Examples will illustrate HYDRA's enhanced capabilities to simulate various aspects of inertial confinement fusion targets. This work was performed under the auspices of the Lawrence Livermore National Security, LLC (LLNS) under Contract No. DE-AC52-07NA27344. The work of Tech-X personnel was funded by the Department of Energy under Small Business Innovation Research Contract No. DE-FG02-03ER83797.

  11. The PLX-α project: Radiation-MHD Simulations of Imploding Plasma Liners Using USim

    NASA Astrophysics Data System (ADS)

    Beckwith, Kristian; Stoltz, Peter; Kundrapu, Madhusudhan; Hsu, Scott; PLX-α Team

    2015-11-01

    USim is a tool for modeling high energy density plasmas with multi-fluid models coupled to electromagnetics through fully implicit iterative solvers, combined with finite volume discretizations on unstructured meshes. Prior work demonstrated the application of USim models and algorithms to the simulation of supersonic plasma jets relevant to the Plasma Liner Experiment (PLX) and compared synthetic interferometry to data gathered from the experiment. Here, we give an overview of the models and algorithms included in USim; review results from prior modeling campaigns for the PLX; and describe plans for radiation magnetohydrodynamic (MHD) simulation efforts focusing on integrated plasma-liner implosion and target compression in a fusion-relevant regime using USim for the PLX-α project. Supported by ARPA-E's ALPHA program. Original PLX construction supported by OFES. USim development supported in part by the Air Force Office of Scientific Research.

  12. Three dimensional simulations of space charge dominated heavy ion beams with applications to inertial fusion energy

    SciTech Connect

    Grote, D.P.

    1994-11-01

    Heavy ion fusion requires injection, transport and acceleration of high current beams. Detailed simulation of such beams requires fully self-consistent space charge fields and three dimensions. WARP3D, developed for this purpose, is a particle-in-cell plasma simulation code optimized to work within the framework of an accelerator's lattice of accelerating, focusing, and bending elements. The code has been used to study several test problems and for simulations and design of experiments. Two applications are drift compression experiments on the MBE-4 facility at LBL and the design of the electrostatic quadrupole injector for the proposed ILSE facility. With aggressive drift compression on MBE-4, anomalous emittance growth was observed. Simulations carried out to examine possible causes showed that essentially all of the emittance growth is the result of external forces on the beam and not of internal beam space-charge fields. The dominant external forces are the dodecapole component of the focusing fields, the image forces from the surrounding pipe and conductors, and the octopole fields that result from the structure of the quadrupole focusing elements. The goal of the electrostatic quadrupole injector design is to produce a beam of as low an emittance as possible. The simulations show that the dominant effects that increase the emittance are the nonlinear octopole fields and the energy effect (fields in the axial direction that are off-axis). Injectors were designed that minimize the beam envelope in order to reduce the effect of the nonlinear fields. Alterations to the quadrupole structure that further reduce the nonlinear fields were examined. Comparisons with a scaled experiment showed very good agreement.

  13. A Project Management Approach to Using Simulation for Cost Estimation on Large, Complex Software Development Projects

    NASA Technical Reports Server (NTRS)

    Mizell, Carolyn; Malone, Linda

    2007-01-01

    It is very difficult for project managers to develop accurate cost and schedule estimates for large, complex software development projects. None of the approaches or tools available today can estimate the true cost of software with any high degree of accuracy early in a project. This paper provides an approach that utilizes a software development process simulation model that considers and conveys the level of uncertainty that exists when developing an initial estimate. A NASA project will be analyzed using simulation and data from the Software Engineering Laboratory to show the benefits of such an approach.
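
    A minimal sketch of the core idea of conveying estimate uncertainty through simulation is given below, assuming triangular effort distributions per task; the tasks and numbers are invented, and this is not the paper's SEL-calibrated process model.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    tasks = {                 # person-months: (low, most likely, high) -- illustrative
        "requirements": (3, 5, 10),
        "design":       (6, 9, 16),
        "coding":       (10, 15, 28),
        "testing":      (8, 12, 24),
    }
    n = 100_000
    # Sample every task independently and sum to get a distribution of total effort
    total = sum(rng.triangular(lo, mode, hi, n) for lo, mode, hi in tasks.values())
    p10, p50, p90 = np.percentile(total, [10, 50, 90])
    print(f"effort (person-months): P10={p10:.1f}  P50={p50:.1f}  P90={p90:.1f}")
    ```

    Reporting percentiles instead of a single point estimate is precisely how such a model "considers and conveys the level of uncertainty" early in a project.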

  14. FY2014 FES (Fusion Energy Sciences) Theory & Simulation Performance Target, Final Report

    SciTech Connect

    Fu, Guoyong; Budny, Robert; Gorelenkov, Nikolai; Poli, Francesca; Chen, Yang; McClenaghan, Joseph; Lin, Zhihong; Spong, Don; Bass, Eric; Waltz, Ron

    2014-10-14

    We report here the work done for the FY14 OFES Theory Performance Target, as given below: "Understanding alpha particle confinement in ITER, the world's first burning plasma experiment, is a key priority for the fusion program. In FY 2014, determine linear instability trends and thresholds of energetic particle-driven shear Alfven eigenmodes in ITER for a range of parameters and profiles using a set of complementary simulation models (gyrokinetic, hybrid, and gyrofluid). Carry out initial nonlinear simulations to assess the effects of the unstable modes on energetic particle transport." In the past year (FY14), a systematic study of the alpha-driven Alfven modes in ITER was carried out jointly by researchers from six institutions involving seven codes: the transport simulation code TRANSP (R. Budny and F. Poli, PPPL), three gyrokinetic codes, GEM (Y. Chen, Univ. of Colorado), GTC (J. McClenaghan, Z. Lin, UCI), and GYRO (E. Bass, R. Waltz, UCSD/GA), the hybrid code M3D-K (G. Y. Fu, PPPL), the gyro-fluid code TAEFL (D. Spong, ORNL), and the linear kinetic stability code NOVA-K (N. Gorelenkov, PPPL). A range of ITER parameters and profiles was specified by TRANSP simulation of a hybrid scenario case and a steady state scenario case. Based on the specified ITER equilibria, linear stability calculations were done to determine the stability boundary of alpha-driven high-n TAEs using the five initial value codes (GEM, GTC, GYRO, M3D-K, and TAEFL) and the kinetic stability code (NOVA-K). The effects of both alpha particles and beam ions have been considered. Finally, the effects of the unstable modes on energetic particle transport have been explored using GEM and M3D-K.
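
    As a rough guide to the quantities behind such stability scans, the sketch below evaluates the Alfvén speed and the standard TAE gap-frequency estimate f ≈ v_A/(4πqR); the field, density, safety factor, and major radius are ITER-like assumptions for illustration, not the TRANSP-specified profiles.

    ```python
    import math

    MU0 = 4.0e-7 * math.pi            # vacuum permeability (H/m)
    MP = 1.67262192369e-27            # proton mass (kg)

    def alfven_speed(B, n_i, m_i):
        return B / math.sqrt(MU0 * n_i * m_i)

    # Rough ITER-like numbers, assumptions for illustration only
    B, n_i, m_dt = 5.3, 1.0e20, 2.5 * MP   # field (T), ion density (m^-3), mean DT ion mass
    q, R = 1.5, 6.2                        # local safety factor, major radius (m)
    vA = alfven_speed(B, n_i, m_dt)
    f_tae = vA / (4.0 * math.pi * q * R)   # TAE gap-frequency estimate
    print(f"v_A = {vA:.2e} m/s, f_TAE ~ {f_tae / 1e3:.0f} kHz")
    ```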

  15. Simulation of plume dispersion of multiple releases in Fusion Field Trial-07 experiment

    NASA Astrophysics Data System (ADS)

    Pandey, Gavendra; Sharan, Maithili

    2015-12-01

    For efficient source term estimation, it is important to use an accurate dispersion model with appropriate dispersion parameters. This is examined by simulating the dispersion of plumes resulting from the available multiple releases conducted at the Fusion Field Trials, Dugway Proving Ground, Utah. The simulation is carried out with the earlier developed IIT (Indian Institute of Technology) dispersion model, using dispersion parameters derived from measurements of turbulent velocity fluctuations. The simulation is discussed separately for stable and unstable conditions in light of (i) the plume behavior of observed and predicted concentrations in the form of isopleths, (ii) peak/maximum concentrations and (iii) the overall concentration distribution. Simulated results from the IIT model are compared with those obtained using AERMOD. Both the IIT model and AERMOD predicted peak concentrations within a factor of two in all the releases, and tracer transport is mostly along the mean wind direction. With the IIT model, the higher concentrations are predicted close to observations in all the trials in stable conditions and within a factor of two in the trials in unstable conditions. However, the relatively smaller concentrations are severely under-predicted in stable conditions and over-predicted in unstable conditions. AERMOD exhibits predictions similar to those of the IIT model, except for slight over-prediction in stable conditions and under-prediction in unstable conditions. The statistical measures for both models are in good agreement with the observations, and a quantitative analysis based on an F-test shows that the performance of the two models is similar at the 5% significance level.

  16. Final Report for the "Fusion Application for Core-Edge Transport Simulations (FACETS)"

    SciTech Connect

    Cary, John R; Kruger, Scott

    2014-10-02

    Over its lifetime, the FACETS project developed the first self-consistent core-edge coupled simulation capability, a new solver for modeling transport in tokamak cores, and a new code for modeling wall physics over long time scales, and it significantly improved the capabilities and performance of the legacy components UEDGE, NUBEAM, GLF23, GYRO, and BOUT++. These improved capabilities leveraged the team's expertise in applied mathematics (solvers and algorithms) and computer science (performance improvements and language interoperability). The project pioneered new methods for tackling the complexity of simulating tokamak experiments.

  17. Revealing Surface Waters on an Antifreeze Protein by Fusion Protein Crystallography Combined with Molecular Dynamic Simulations.

    PubMed

    Sun, Tianjun; Gauthier, Sherry Y; Campbell, Robert L; Davies, Peter L

    2015-10-01

    Antifreeze proteins (AFPs) adsorb to ice through an extensive, flat, relatively hydrophobic surface. It has been suggested that this ice-binding site (IBS) organizes surface waters into an ice-like clathrate arrangement that matches and fuses to the quasi-liquid layer on the ice surface. On cooling, these waters join the ice lattice and freeze the AFP to its ligand. Evidence for the generality of this binding mechanism is limited because AFPs tend to crystallize with their IBS as a preferred protein-protein contact surface, which displaces some bound waters. Type III AFP is a 7 kDa globular protein with an IBS made up of two adjacent surfaces. In the crystal structure of the most active isoform (QAE1), the part of the IBS that docks to the primary prism plane of ice is partially exposed to solvent and has clathrate waters present that match this plane of ice. The adjacent IBS, which matches the pyramidal plane of ice, is involved in protein-protein crystal contacts with few surface waters. Here we have changed the protein-protein contacts in the ice-binding region by crystallizing a fusion of QAE1 to maltose-binding protein. In this 1.9 Å structure, the IBS that fits the pyramidal plane of ice is exposed to solvent. By combining crystallography data with MD simulations, the surface waters on both sides of the IBS were revealed and match well with the target ice planes. The waters on the pyramidal plane IBS were loosely constrained, which might explain why other isoforms of type III AFP that lack the prism plane IBS are less active than QAE1. The AFP fusion crystallization method can potentially be used to force the exposure to solvent of the IBS on other AFPs to reveal the locations of key surface waters. PMID:26371748

  18. Multilevel fusion exploitation

    NASA Astrophysics Data System (ADS)

    Lindberg, Perry C.; Dasarathy, Belur V.; McCullough, Claire L.

    1996-06-01

    This paper describes a project that was sponsored by the U.S. Army Space and Strategic Defense Command (USASSDC) to develop, test, and demonstrate sensor fusion algorithms for target recognition. The purpose of the project was to exploit the use of sensor fusion at all levels (signal, feature, and decision levels) and all combinations to improve target recognition capability against tactical ballistic missile (TBM) targets. These algorithms were trained with simulated radar signatures to accurately recognize selected TBM targets. The simulated signatures represent measurements made by two radars (S-band and X- band) with the targets at a variety of aspect and roll angles. Two tests were conducted: one with simulated signatures collected at angles different from those in the training database and one using actual test data. The test results demonstrate a high degree of recognition accuracy. This paper describes the training and testing techniques used; shows the fusion strategy employed; and illustrates the advantages of exploiting multi-level fusion.

  19. Using a Scientific Process for Curriculum Development and Formative Evaluation: Project FUSION

    ERIC Educational Resources Information Center

    Doabler, Christian; Cary, Mari Strand; Clarke, Benjamin; Fien, Hank; Baker, Scott; Jungjohann, Kathy

    2011-01-01

    Given the vital importance of using a scientific approach for curriculum development, the authors employed a design experiment methodology (Brown, 1992; Shavelson et al., 2003) to develop and evaluate FUSION, a first-grade mathematics intervention intended for students with, or at risk for, mathematics disabilities. FUSION, funded through IES…

  20. Perceptually aligning apical frequency regions leads to more binaural fusion of speech in a cochlear implant simulation.

    PubMed

    Staisloff, Hannah E; Lee, Daniel H; Aronoff, Justin M

    2016-07-01

    For bilateral cochlear implant users, the left and right arrays are typically not physically aligned, resulting in a degradation of binaural fusion, which can be detrimental to binaural abilities. Perceptually aligning the two arrays can be accomplished by disabling electrodes in one ear that do not have a perceptually corresponding electrode on the other side. However, disabling electrodes at the edges of the array will cause compression of the input frequency range into a smaller cochlear extent, which may result in reduced spectral resolution. An alternative approach to overcome this mismatch would be to align only one edge of the array. By aligning either only the apical or the basal end of the arrays, fewer electrodes would be disabled, potentially causing less reduction in spectral resolution. The goal of this study was to determine the relative effect of aligning either the basal or apical end of the arrays with regard to binaural fusion. A vocoder was used to simulate cochlear implant listening conditions in normal hearing listeners. Speech signals were vocoded such that the two ears were either predominantly aligned at only the basal or apical end of the simulated arrays. The experiment was then repeated with a spectrally inverted vocoder to determine whether the detrimental effects on fusion were related to the spectral-temporal characteristics of the stimuli or to the location in the cochlea where the misalignment occurred. In Experiment 1, aligning the basal portion of the simulated arrays led to significantly less binaural fusion than aligning the apical portions of the simulated array. However, when the input was spectrally inverted, aligning the apical portion of the simulated array led to significantly less binaural fusion than aligning the basal portions of the simulated arrays. These results suggest that, for speech, with its predominantly low frequency spectral-temporal modulations, it is more important to perceptually align the apical portion of

  1. M3D project for simulation studies of plasmas

    SciTech Connect

    Park, W.; Belova, E.V.; Fu, G.Y.; Strauss, H.R.; Sugiyama, L.E.

    1998-12-31

    The M3D (Multi-level 3D) project carries out simulation studies of plasmas of various regimes using multiple levels of physics, geometry, and mesh schemes in one code package. This paper, and papers by Strauss, Sugiyama, and Belova in this workshop, describe the project and present examples of current applications. The currently available physics models of the M3D project are MHD, two-fluid, gyrokinetic hot particle/MHD hybrid, and gyrokinetic particle ion/two-fluid hybrid models. The code can be run with both structured and unstructured meshes.

  2. SIMRAND I- SIMULATION OF RESEARCH AND DEVELOPMENT PROJECTS

    NASA Technical Reports Server (NTRS)

    Miles, R. F.

    1994-01-01

    The Simulation of Research and Development Projects program (SIMRAND) aids in the optimal allocation of R&D resources needed to achieve project goals. SIMRAND models the system subsets or project tasks as various network paths to a final goal. Each path is described in terms of task variables such as cost per hour, cost per unit, availability of resources, etc. Uncertainty is incorporated by treating task variables as random variables. SIMRAND calculates the measure of preference for each alternative network. The networks yielding the highest utility function (or certainty equivalence) are then ranked as the optimal network paths. SIMRAND has been used in several economic potential studies at NASA's Jet Propulsion Laboratory involving solar dish power systems and photovoltaic array construction. However, any project having tasks which can be reduced to equations and related by measures of preference can be modeled. SIMRAND analysis consists of three phases: reduction, simulation, and evaluation. In the reduction phase, analytical techniques from probability theory and simulation techniques are used to reduce the complexity of the alternative networks. In the simulation phase, a Monte Carlo simulation is used to derive statistics on the variables of interest for each alternative network path. In the evaluation phase, the simulation statistics are compared and the networks are ranked in preference by a selected decision rule. The user must supply project subsystems in terms of equations based on variables (for example, parallel and series assembly line tasks in terms of number of items, cost factors, time limits, etc). The associated cumulative distribution functions and utility functions for each variable must also be provided (allowable upper and lower limits, group decision factors, etc). SIMRAND is written in Microsoft FORTRAN 77 for batch execution and has been implemented on an IBM PC series computer operating under DOS.
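
    A toy version of the simulation and evaluation phases, under assumed task cost models, is sketched below: sample the total cost of each alternative network with Monte Carlo, then rank by expected utility. The networks, cost distributions, and exponential (risk-averse) utility form are all illustrative assumptions, not SIMRAND's actual inputs.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def simulate_network(task_params, n=50_000):
        """Total cost samples for a serial network; (mean, sigma) per task assumed."""
        return sum(rng.normal(mu, sigma, n) for mu, sigma in task_params)

    def expected_utility(cost, risk_aversion=1e-2):
        # Exponential utility: lower cost -> utility closer to zero (i.e., higher)
        return np.mean(-np.exp(risk_aversion * cost))

    networks = {
        "A": [(10, 2), (20, 5), (15, 3)],   # illustrative task cost models ($K)
        "B": [(12, 1), (18, 2), (16, 2)],
    }
    ranked = sorted(networks,
                    key=lambda k: expected_utility(simulate_network(networks[k])),
                    reverse=True)
    print("preferred network:", ranked[0])
    ```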

  3. Simulation of normal and pathological gaits using a fusion knowledge strategy

    PubMed Central

    2013-01-01

    Gait distortion is the first clinical manifestation of many pathological disorders. Traditionally, the gait laboratory has been the only available tool for supporting both diagnosis and prognosis, with the limitation that any clinical interpretation depends completely on the physician's expertise. This work presents a novel human gait model which fuses two important gait information sources: an estimated Center of Gravity (CoG) trajectory and learned heel paths, thereby allowing it to reproduce normal and pathological kinematic patterns. The CoG trajectory is approximated with a physical compass-pendulum representation that has been extended by introducing energy accumulator elements between the pendulum ends, thereby emulating the role of the leg joints and obtaining a complete global gait description. Likewise, heel paths captured from actual data are learned to improve the performance of the physical model, while the most relevant joint trajectories are estimated using a classical inverse kinematic rule. The model is compared with standard gait patterns, obtaining a correlation coefficient of 0.96. Additionally, the model simulates neuromuscular diseases such as Parkinson's (stages 2, 3 and 4) and clinical signs such as crouch gait, in which case the average correlation coefficient is 0.92. PMID:23844901
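
    To make the pendulum analogy concrete, a minimal single-support compass-pendulum integration is sketched below; the leg length, initial conditions, and simple Euler scheme are assumptions, and the model's energy accumulators and learned heel paths are omitted.

    ```python
    import math

    g, L = 9.81, 1.0                     # gravity (m/s^2), assumed leg length (m)
    theta, omega, dt = -0.2, 0.9, 1e-3   # stance-leg angle (rad), rate (rad/s), step (s)
    path = []
    while theta < 0.2:                   # single-support phase, symmetric exit angle
        omega += (g / L) * math.sin(theta) * dt   # inverted-pendulum dynamics
        theta += omega * dt
        path.append((L * math.sin(theta), L * math.cos(theta)))  # CoG (x, z)
    xs, zs = zip(*path)
    print(f"step length ~ {xs[-1] - xs[0]:.2f} m, "
          f"CoG vertical excursion ~ {max(zs) - min(zs):.3f} m")
    ```

    The characteristic CoG arc (highest at mid-stance) falls out of the inverted-pendulum dynamics alone; the paper's accumulator elements then smooth the unrealistically sharp exchanges at step transitions.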

  4. Exploring International Investment through a Classroom Portfolio Simulation Project

    ERIC Educational Resources Information Center

    Chen, Xiaoying; Yur-Austin, Jasmine

    2013-01-01

    A rapid integration of financial markets has prevailed during the last three decades. Investors are able to diversify investment beyond national markets to mitigate return volatility of a "pure domestic portfolio." This article discusses a simulation project through which students learn the role of international investment by managing…

  5. NASA/Haughton-Mars Project 2006 Lunar Medical Contingency Simulation

    NASA Technical Reports Server (NTRS)

    Scheuring, Richard A.; Jones, J. A.; Lee, P.; Comtois, J. M.; Chappell, S.; Rafiq, A.; Braham, S.

    2007-01-01

    A viewgraph presentation describing NASA's Haughton-Mars Project (HMP) medical requirements and lunar surface operations is shown. The topics include: 1) Mission Purpose/Overview; 2) HMP as a Moon/Mars Analog; 3) Simulation objectives; 4) Discussion; and 5) Forward work.

  6. Vectorised simulation of the response of a time projection chamber

    NASA Astrophysics Data System (ADS)

    Georgiopoulos, C. H.; Mermikides, M. E.

    1989-12-01

    A Monte Carlo code used for the detailed simulation of the response of the ALEPH time projection chamber has been successfully restructured to exploit the vector architectures of the CDC CYBER-205, ETA10 and CRAY X-MP supercomputers. Some aspects of the vector implementation are discussed and the performance on the various processors is compared.

  7. Modeling and simulation support for ICRF heating of fusion plasmas. Annual report, 1990

    SciTech Connect

    1990-03-15

    Recent experimental, theoretical and computational results have shown the need for and usefulness of a combined approach to the design, analysis and evaluation of ICH antenna configurations. The work at the University of Wisconsin (UW) in particular has shown that much-needed information on the vacuum operation of ICH antennas can be obtained with a modest experimental and computational effort. These model experiments at UW and the SAIC simulations have dramatically shown the potential for positive impact on the ICRF program. Results of the UW-SAIC joint ICRF antenna analysis effort have been presented at several international meetings and numerous meetings in the United States. The PPPL bay M antenna has been modeled using the ARGUS code; the results of this effort are shown in Appendix C. SAIC has recently begun a collaboration with the ICRF antenna design and analysis group at ORNL. At present there are two separate projects underway. The first concerns simulating and determining the effect of adding slots in the antenna septum and side walls. The second concerns the modeling and simulation of the ORNL folded waveguide (FWG) concept.

  8. Fast discontinuous Galerkin lattice-Boltzmann simulations on GPUs via maximal kernel fusion

    NASA Astrophysics Data System (ADS)

    Mazzeo, Marco D.

    2013-03-01

    A GPU implementation of the discontinuous Galerkin lattice-Boltzmann method with square spectral elements, highly optimised for speed and precision of calculations, is presented. An extensive analysis of the numerous variants of the fluid solver reveals that the best performance is obtained by maximising CUDA kernel fusion and by arranging the resulting kernel tasks so as to trigger memory-coherent and scattered loads in a specific manner, albeit at the cost of introducing cross-thread load unbalancing. Surprisingly, any attempt to eliminate this, to maximise thread occupancy, or to adopt conventional work tiling or distinct custom kernels highly tuned via ad hoc data and computation layouts invariably deteriorates performance. As such, this work sheds light on the possibility of hiding the fetch latencies of workloads involving heterogeneous loads in a way that is more effective than what is achieved with frequently suggested techniques. When simulating the lid-driven cavity on a NVIDIA GeForce GTX 480 via a 5-stage 4th-order Runge-Kutta (RK) scheme, the first four digits or more of the obtained centreline velocity values converge to those of the state-of-the-art literature data at a simulation speed of 7.0G primitive-variable updates per second during the collision stage and 4.4G during each RK step of the advection, employing double-precision arithmetic (DPA) and a computational grid of only 64² 4×4-point elements. The new programming engine delivers about 2× the performance of the best programming guidelines in the field. The new fluid solver on the above GPU is also 20-30 times faster than a highly optimised version running on a single core of an Intel Xeon X5650 2.66 GHz.

  9. Modelling neutral beams in fusion devices: Beamlet-based model for fast particle simulations

    NASA Astrophysics Data System (ADS)

    Asunta, O.; Govenius, J.; Budny, R.; Gorelenkova, M.; Tardini, G.; Kurki-Suonio, T.; Salmi, A.; Sipilä, S.

    2015-03-01

    Neutral beam injection (NBI) will be one of the main sources of heating and non-inductive current drive in ITER. Due to the high level of injected power, the beam-induced heat loads present a potential threat to the integrity of the first wall of the device, particularly in the presence of non-axisymmetric perturbations of the magnetic field. Neutral beam injection can also destabilize Alfvén eigenmodes and energetic particle modes, and act as a source of plasma rotation. Therefore, reliable and accurate simulation of NBI is important for making predictions for ITER, as well as for any other current or future fusion device. This paper introduces a new beamlet-based neutral beam ionization model called BBNBI. It takes into account the fine structure of the injector, follows the injected neutrals until ionization, and generates a source ensemble of ionized NBI test particles for slowing-down calculations. BBNBI can be used as a stand-alone model, but together with the particle-following code ASCOT it forms a complete and sophisticated tool for simulating neutral beam injection. The test particle ensembles from BBNBI are found to agree well with those produced by PENCIL for JET, and with those produced by NUBEAM for both JET and ASDEX Upgrade plasmas. The first comprehensive comparisons of beam slowing-down profiles of interest from BBNBI + ASCOT with results from PENCIL and NUBEAM/TRANSP, for both JET and AUG, are presented. It is shown that, for an axisymmetric plasma, BBNBI + ASCOT and NUBEAM agree remarkably well. Together with earlier 3D studies, these results further validate using BBNBI + ASCOT for studying phenomena that require particle following in a truly three-dimensional geometry.
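
    A toy version of the ionization step in such a beamlet-based model is sketched below: march neutrals along a beamlet and sample the ionization point from the local mean free path. The density profile and effective cross section are invented; BBNBI's actual rates, beamlet geometry, and atomic physics are far more detailed.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n, ds, s_max = 10_000, 0.01, 2.0     # neutrals, step (m), path length (m)
    sigma = 3e-20                        # assumed effective stopping cross section (m^2)
    s_ion = np.full(n, np.nan)           # ionization distance; NaN = shine-through
    alive = np.arange(n)
    for step in range(int(s_max / ds)):
        s = step * ds
        ne = 1e19 * (1.0 + 4.0 * s)      # assumed density ramp along the beamlet (m^-3)
        p = 1.0 - np.exp(-ne * sigma * ds)   # ionization probability over one step
        hit = rng.random(alive.size) < p
        s_ion[alive[hit]] = s            # record birth point of the fast ion
        alive = alive[~hit]
    print(f"mean ionization depth: {np.nanmean(s_ion):.2f} m, "
          f"shine-through fraction: {alive.size / n:.2%}")
    ```

    The surviving fraction is the shine-through that wall-load estimates care about, while the recorded birth points are what a follower code like ASCOT would take as its fast-ion source ensemble.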

  10. The SIMRAND methodology - Simulation of Research and Development Projects

    NASA Technical Reports Server (NTRS)

    Miles, R. F., Jr.

    1984-01-01

    In research and development projects, a commonly occurring management decision is concerned with the optimum allocation of resources to achieve the project goals. Because of resource constraints, management has to make a decision regarding the set of proposed systems or tasks which should be undertaken. SIMRAND (Simulation of Research and Development Projects) is a methodology which was developed for aiding management in this decision. Attention is given to a problem description, aspects of model formulation, the reduction phase of the model solution, the simulation phase, and the evaluation phase. The implementation of the considered approach is illustrated with the aid of an example which involves a simplified network of the type used to determine the price of silicon solar cells.

  11. Particle-in-cell simulations of the magnetoacoustic cyclotron instability of fusion-born alpha-particles in tokamak plasmas

    NASA Astrophysics Data System (ADS)

    Cook, J. W. S.; Dendy, R. O.; Chapman, S. C.

    2013-06-01

    Ion cyclotron emission (ICE) is the only collective radiative instability, driven by confined fusion-born alpha-particles, observed from deuterium-tritium (DT) plasmas in both JET and TFTR. Using first principles particle-in-cell simulations of the magnetoacoustic cyclotron instability (MCI), we elucidate some of the fully kinetic nonlinear processes that may underlie observations of ICE from fusion products in these large tokamaks. We find that the MCI is intrinsically self-limiting on very fast timescales, which may help explain the observed correlation between linear theory and observed ICE intensity. The simulations elaborate the nature of the excited electric and magnetic fluctuations, from first principles, confirming the dominant role of fast Alfvénic and electrostatic components which is assumed ab initio in analytical treatments.

  12. Fusion studies with low-intensity radioactive ion beams using an active-target time projection chamber

    NASA Astrophysics Data System (ADS)

    Kolata, J. J.; Howard, A. M.; Mittig, W.; Ahn, T.; Bazin, D.; Becchetti, F. D.; Beceiro-Novo, S.; Chajecki, Z.; Febbrarro, M.; Fritsch, A.; Lynch, W. G.; Roberts, A.; Shore, A.; Torres-Isea, R. O.

    2016-09-01

    The total fusion excitation function for 10Be+40Ar has been measured over the center-of-momentum (c.m.) energy range from 12 to 24 MeV using a time-projection chamber (TPC). The main purpose of this experiment, which was carried out in a single run of duration 90 h using a ≈100 particle per second (pps) 10Be beam, was to demonstrate the capability of an active-target TPC to determine fusion excitation functions for extremely weak radioactive ion beams. Cross sections as low as 12 mb were measured with acceptable (50%) statistical accuracy. It also proved to be possible to separate events in which charged particles were emitted from the fusion residue from those in which only neutrons were evaporated. The method permits simultaneous measurement of incomplete fusion, break-up, scattering, and transfer reactions, and therefore fully exploits the opportunities presented by the very exotic beams that will be available from the new generation of radioactive beam facilities.
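
    For scale, the sketch below shows the thin-target arithmetic connecting event counts to a cross section, sigma = N_events / (N_beam · n_target · Δx). Only the ≈100 pps rate and 90 h duration come from the text; the target density, effective path length, and event count are illustrative assumptions.

    ```python
    def cross_section_mb(n_events, n_beam, target_density_cm3, path_cm):
        # Thin-target approximation; 1 mb = 1e-27 cm^2
        sigma_cm2 = n_events / (n_beam * target_density_cm3 * path_cm)
        return sigma_cm2 / 1e-27

    n_beam = 100 * 90 * 3600           # 100 pps for 90 h (from the abstract)
    sigma = cross_section_mb(n_events=50,
                             n_beam=n_beam,
                             target_density_cm3=2.4e19,   # assumed gas density
                             path_cm=5.0)                 # assumed active length
    print(f"sigma ~ {sigma:.1f} mb")
    ```

    With these assumed numbers the estimate lands at roughly 13 mb, consistent with the ~12 mb sensitivity floor quoted above; the active-target geometry effectively makes the whole gas volume the target, which is what makes such weak-beam measurements feasible.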

  13. The GeantV project: Preparing the future of simulation

    SciTech Connect

    Amadio, G.; Apostolakis, J.; Bandieramonte, M.; Bhattacharyya, A.; Bianchini, C.; Brun, R.; Canal, Ph; Carminati, F.; Duhem, L.; Elvira, D.; de Fine Licht, J.; Gheata, A.; Iope, R. L.; Lima, G.; Mohanty, A.; Nikitina, T.; Novak, M.; Pokorski, W.; Seghal, R.; Shadura, O.; Vallecorsa, S.; Wenzel, S.

    2015-12-23

    Detector simulation is consuming at least half of the HEP computing cycles, and even so, experiments have to take hard decisions on what to simulate, as their needs greatly surpass the availability of computing resources. New experiments still in the design phase such as FCC, CLIC and ILC as well as upgraded versions of the existing LHC detectors will push further the simulation requirements. Since the increase in computing resources is not likely to keep pace with our needs, it is therefore necessary to explore innovative ways of speeding up simulation in order to sustain the progress of High Energy Physics. The GeantV project aims at developing a high performance detector simulation system integrating fast and full simulation that can be ported on different computing architectures, including CPU accelerators. After more than two years of R&D the project has produced a prototype capable of transporting particles in complex geometries exploiting micro-parallelism, SIMD and multithreading. Portability is obtained via C++ template techniques that allow the development of machine-independent computational kernels. Furthermore, a set of tables derived from Geant4 for cross sections and final states provides a realistic shower development and, having been ported into a Geant4 physics list, can be used as a basis for a direct performance comparison.

  14. The GeantV project: Preparing the future of simulation

    DOE PAGESBeta

    Amadio, G.; Apostolakis, J.; Bandieramonte, M.; Bhattacharyya, A.; Bianchini, C.; Brun, R.; Canal, Ph; Carminati, F.; Duhem, L.; Elvira, D.; et al

    2015-12-23

    Detector simulation is consuming at least half of the HEP computing cycles, and even so, experiments have to take hard decisions on what to simulate, as their needs greatly surpass the availability of computing resources. New experiments still in the design phase such as FCC, CLIC and ILC as well as upgraded versions of the existing LHC detectors will push further the simulation requirements. Since the increase in computing resources is not likely to keep pace with our needs, it is therefore necessary to explore innovative ways of speeding up simulation in order to sustain the progress of High Energy Physics. The GeantV project aims at developing a high performance detector simulation system integrating fast and full simulation that can be ported on different computing architectures, including CPU accelerators. After more than two years of R&D the project has produced a prototype capable of transporting particles in complex geometries exploiting micro-parallelism, SIMD and multithreading. Portability is obtained via C++ template techniques that allow the development of machine-independent computational kernels. Furthermore, a set of tables derived from Geant4 for cross sections and final states provides a realistic shower development and, having been ported into a Geant4 physics list, can be used as a basis for a direct performance comparison.

  15. The GeantV project: preparing the future of simulation

    NASA Astrophysics Data System (ADS)

    Amadio, G.; Apostolakis, J.; Bandieramonte, M.; Bhattacharyya, A.; Bianchini, C.; Brun, R.; Canal, Ph; Carminati, F.; Duhem, L.; Elvira, D.; de Fine Licht, J.; Gheata, A.; Iope, R. L.; Lima, G.; Mohanty, A.; Nikitina, T.; Novak, M.; Pokorski, W.; Seghal, R.; Shadura, O.; Vallecorsa, S.; Wenzel, S.

    2015-12-01

    Detector simulation is consuming at least half of the HEP computing cycles, and even so, experiments have to take hard decisions on what to simulate, as their needs greatly surpass the availability of computing resources. New experiments still in the design phase such as FCC, CLIC and ILC as well as upgraded versions of the existing LHC detectors will push further the simulation requirements. Since the increase in computing resources is not likely to keep pace with our needs, it is therefore necessary to explore innovative ways of speeding up simulation in order to sustain the progress of High Energy Physics. The GeantV project aims at developing a high performance detector simulation system integrating fast and full simulation that can be ported on different computing architectures, including CPU accelerators. After more than two years of R&D the project has produced a prototype capable of transporting particles in complex geometries exploiting micro-parallelism, SIMD and multithreading. Portability is obtained via C++ template techniques that allow the development of machine-independent computational kernels. A set of tables derived from Geant4 for cross sections and final states provides a realistic shower development and, having been ported into a Geant4 physics list, can be used as a basis for a direct performance comparison.

  16. Mechanisms of Plastic and Fracture Instabilities for Alloy Development of Fusion Materials. Final Project Report for period July 15, 1998 - July 14, 2003

    SciTech Connect

    Ghoniem, N. M.

    2003-07-14

    The main objective of this research was to develop new computational tools for the simulation and analysis of plasticity and fracture mechanisms of fusion materials, and to assist in planning and assessment of corresponding radiation experiments.

  17. Projected profile similarity in gyrokinetic simulations of Bohm and gyro-Bohm scaled DIII-D L and H modes

    SciTech Connect

    Waltz, R. E.; Candy, J.; Petty, C. C.

    2006-07-15

    Global gyrokinetic simulations of DIII-D [M. A. Mahdavi and J. L. Luxon, in 'DIII-D Tokamak Special Issue', Fusion Sci. Technol. 48, 2 (2005)] L- and H-mode dimensionally similar discharge pairs are treated in detail. The simulations confirm the Bohm scaling of the well-matched L-mode pair. The paradoxical but experimentally apparent gyro-Bohm scaling of the H-mode pair at larger relative gyroradius (ρ*) and lower transport levels is due to poor profile similarity. Simulations of projected experimental plasma profiles with perfect similarity show both the L- and H-mode pairs to have Bohm scaling. A ρ* stabilization rule for predicting the breakdown of gyro-Bohm scaling from simulations of a single discharge is presented.

  18. The simulation model of teleradiology in telemedicine project.

    PubMed

    Goodini, Azadeh; Torabi, Mashallah; Goodarzi, Maryam; Safdari, Reza; Darayi, Mohamad; Tavassoli, Mahdieh; Shabani, MohammadMehdi

    2015-01-01

    Telemedicine projects are aimed at offering medical services to people who do not have access to direct diagnosis and treatment services. As a powerful tool for analyzing the performance of complex systems and taking probable events into consideration, systemic simulation can facilitate the analysis of implementation processes of telemedicine projects in real-life-like situations. The aim of the present study was to propose a model for planning resource capacities and allocating human and operational resources to promote the efficiency of a telemedicine project by investigating the process of teleradiology. In this article, after verification of the conceptual model by experts in the field, a computerized simulation model was developed using the simulation software Arena. After specifying the required data, different improvement scenarios were run with the computerized model by feeding the data into the software and verifying and validating the model. With the system's input data held fixed (the number of patients, their waiting times, and the process time of each function, such as magnetic resonance imaging or a scan), the teleradiology process was compared with the current radiology process. Implementing the teleradiology model reduced the time patients spend in the system (current: 1.84 ± 0.00, tele: 0.81 ± 0.00). Furthermore, the process allows fewer resources to be allocated while staff functions are performed better. The use of computerized simulation is essential for designing processes, optimal allocation of resources, planning, and making appropriate decisions for providing timely services to patients. PMID:25627857
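
    Arena is a commercial discrete-event package, so as a rough illustration of the kind of model described here, the following self-contained Python sketch simulates a single-server radiology reporting queue. All rates are hypothetical and not taken from the paper; the "tele" case simply assumes a shorter mean reporting time.

```python
# Minimal discrete-event sketch of a single-server radiology queue, assuming
# exponential interarrival and service times (hypothetical parameters only).
import random

def mean_time_in_system(mean_interarrival, mean_service, n_patients=20000, seed=1):
    rng = random.Random(seed)
    clock = 0.0          # arrival clock
    server_free = 0.0    # time the single reporting resource becomes free
    total = 0.0
    for _ in range(n_patients):
        clock += rng.expovariate(1.0 / mean_interarrival)   # next arrival
        start = max(clock, server_free)                     # wait if busy
        service = rng.expovariate(1.0 / mean_service)
        server_free = start + service
        total += server_free - clock                        # time in system
    return total / n_patients

print("onsite:", mean_time_in_system(mean_interarrival=1.0, mean_service=0.8))
print("tele:  ", mean_time_in_system(mean_interarrival=1.0, mean_service=0.4))
```

    With these assumed rates the "tele" configuration roughly halves the mean time in system, qualitatively mirroring the reduction reported in the abstract.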

  19. Response to FESAC survey, non-fusion connections to Fusion Energy Sciences. Applications of the FES-supported beam and plasma simulation code, Warp

    SciTech Connect

    Friedman, A.; Grote, D. P.; Vay, J. L.

    2015-05-29

    The Fusion Energy Sciences Advisory Committee’s subcommittee on non-fusion applications (FESAC NFA) is conducting a survey to obtain information from the fusion community about non-fusion work that has resulted from their DOE-funded fusion research. The subcommittee has requested that members of the community describe recent developments connected to the activities of the DOE Office of Fusion Energy Sciences. Two questions in particular were posed by the subcommittee. This document contains the authors’ responses to those questions.

  20. Introduction to SIMRAND: Simulation of research and development project

    NASA Technical Reports Server (NTRS)

    Miles, R. F., Jr.

    1982-01-01

    SIMRAND: SIMulation of Research ANd Development Projects is a methodology developed to aid the engineering and management decision process in the selection of the optimal set of systems or tasks to be funded on a research and development project. A project may have a set of systems or tasks under consideration for which the total cost exceeds the allocated budget. Other factors such as personnel and facilities may also enter as constraints. Thus the project's management must select, from among the complete set of systems or tasks under consideration, a partial set that satisfies all project constraints. The SIMRAND methodology uses analytical techniques from probability theory, decision analysis from management science, and computer simulation in the selection of this optimal partial set. The SIMRAND methodology is truly a management tool. It initially specifies the information that must be generated by the engineers, thus giving management a basis for directing the engineering effort, and it ranks the alternatives according to the preferences of the decision makers.
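
    As a toy illustration of the SIMRAND idea (not the original code), the sketch below enumerates candidate task subsets, samples uncertain costs and payoffs by Monte Carlo, scores a draw as zero whenever it violates the budget constraint, and ranks the subsets by expected payoff. All task data, the budget, and the scoring rule are invented for illustration.

```python
# Toy Monte Carlo portfolio selection in the spirit of SIMRAND.
from itertools import combinations
import random

tasks = {                     # name: (mean_cost, cost_sd, mean_payoff, payoff_sd)
    "A": (4.0, 1.0, 10.0, 3.0),
    "B": (3.0, 0.5, 6.0, 2.0),
    "C": (5.0, 1.5, 12.0, 5.0),
    "D": (2.0, 0.5, 4.0, 1.0),
}
BUDGET = 9.0
rng = random.Random(0)

def expected_utility(subset, trials=5000):
    """Average payoff over trials, counting zero when the subset overruns the budget."""
    total = 0.0
    for _ in range(trials):
        cost = sum(rng.gauss(tasks[t][0], tasks[t][1]) for t in subset)
        payoff = sum(rng.gauss(tasks[t][2], tasks[t][3]) for t in subset)
        total += payoff if cost <= BUDGET else 0.0
    return total / trials

candidates = [s for r in range(1, len(tasks) + 1) for s in combinations(tasks, r)]
for subset in sorted(candidates, key=expected_utility, reverse=True)[:3]:
    print(subset, round(expected_utility(subset), 2))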

  1. Tooth model reconstruction based upon data fusion for orthodontic treatment simulation.

    PubMed

    Yau, Hong-Tzong; Yang, Tsan-Jui; Chen, Yi-Chen

    2014-05-01

    This paper proposes a full tooth reconstruction method by integrating 3D scanner data and computed tomography (CT) image sets. In traditional dental treatment, plaster models are used to record the patient's oral information and assist dentists in diagnosis. However, plaster models only save surface information, and are therefore unable to provide further information for clinical treatment. With the rapid development of medical imaging technology, computed tomography images have become very popular in dental treatment. Computed tomography images with complete internal information can assist the clinical diagnosis for dental implants or orthodontic treatment, and a digital dental model can be used to simulate and predict results before treatment. However, a method of producing a high quality and precise dental model has yet to be developed. To this end, this paper presents a tooth reconstruction method based on the data fusion concept via integrating external scanned data and CT-based medical images. First, a plaster model is digitized with a 3D scanner. Then, each crown can be separated from the base according to the characteristics of the tooth. CT images must be processed for feature enhancement and noise reduction, and to define the tooth axis direction which will be used for root slicing. The outline of each slice of the dental root can then be determined by the level set algorithm, and converted to point cloud data. Finally, the crown and root data can be registered by the iterative closest point (ICP) algorithm. With this information, a complete digital dental model can be reconstructed by the Delaunay-based region-growing (DBRG) algorithm. The main contribution of this paper is to reconstruct a high quality customized dental model with root information that can offer significant help to the planning of dental implant and orthodontic treatment. PMID:24631784
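
    The crown-root registration step relies on the standard iterative closest point algorithm. A minimal generic ICP in Python/numpy is sketched below (brute-force nearest neighbours plus the Kabsch SVD solution for the rigid transform); it is a textbook version under simplifying assumptions, not the authors' implementation.

```python
# Bare-bones rigid ICP (generic textbook version, not the paper's code).
import numpy as np

def icp(source, target, iterations=20):
    """Align 'source' (N,3) to 'target' (M,3); returns the transformed source."""
    src = source.copy()
    for _ in range(iterations):
        # 1. Nearest-neighbour correspondences (brute force for clarity).
        d2 = ((src[:, None, :] - target[None, :, :]) ** 2).sum(-1)
        matched = target[d2.argmin(axis=1)]
        # 2. Best rigid transform via SVD (Kabsch).
        mu_s, mu_t = src.mean(0), matched.mean(0)
        H = (src - mu_s).T @ (matched - mu_t)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:           # avoid reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mu_t - R @ mu_s
        src = src @ R.T + t
    return src

# Smoke test: recover a known rotation and translation.
rng = np.random.default_rng(0)
target = rng.normal(size=(200, 3))
a = 0.3
Rz = np.array([[np.cos(a), -np.sin(a), 0],
               [np.sin(a),  np.cos(a), 0],
               [0, 0, 1]])
source = target @ Rz.T + np.array([0.5, -0.2, 0.1])
aligned = icp(source, target)
print("RMS after ICP:", np.sqrt(((aligned - target) ** 2).sum(1)).mean())
```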

  2. Adjacent segment disc pressures following two-level cervical disc replacement versus simulated anterior cervical fusion.

    PubMed

    Laxer, Eric B; Darden, Bruce V; Murrey, Daniel B; Milam, R Alden; Rhyne, Alfred L; Claytor, Brian; Nussman, Donna S; Powers, Timothy W; Davies, Matthew A; Bryant, S Chad; Larsen, Scott P; Bhatt, Meghal; Brodziak, John; Polic, Jelena

    2006-01-01

    Anterior cervical fusion (ACF) has been shown to alter the biomechanics of adjacent segments of the cervical spine. The goal of total disc replacement is to address pathology at a given disc with minimal disruption of the operated or adjacent segments. This study compares the pressure within discs adjacent to either a two-level simulated ACF (SACF) or a two-level total disc replacement with the ProDisc-C. A special automated motion testing apparatus was constructed. Four fresh cadaveric cervical spine specimens were affixed to the test stand and tested in flexion and extension under specific loads. Intradiscal, miniature strain-gauge-based transducers were placed in the discs above and below the "treated" levels. The specimens were then tested in flexion and extension. Pressure and overall angular displacement were measured. In the most extreme and highest quality specimen, the difference at C3/C4 registered 800 kPa and the difference at C6/C7 registered 50 kPa. This same specimen treated with the ProDisc reached a flexion angle at much lower moments, 24.3 degrees at 5 N-m, compared with the SACF at 12.2 degrees at 8.6 N-m. Therefore, the moment needed to achieve 15 degrees of flexion was 5.5 N-m with the SACF treatment and only 2.9 N-m with the ProDisc treatment. These initial data indicate that adjacent level discs experience substantially lower pressure after two-level disc replacement than after two-level SACF. Additional testing to further support these observations is ongoing. PMID:17108473

  3. Community Petascale Project for Accelerator Science and Simulation

    SciTech Connect

    Warren B. Mori

    2013-02-01

    The UCLA Plasma Simulation Group is a major partner of the "Community Petascale Project for Accelerator Science and Simulation." This is the final technical report. We include an overall summary, a list of publications, and individual progress reports for each year. During the past five years we have made tremendous progress in enhancing the capabilities of OSIRIS and QuickPIC, in developing new algorithms and data structures for PIC codes to run on GPUs and many future core architectures, and in using these codes to model experiments and make new scientific discoveries. Here we summarize some highlights for which SciDAC was a major contributor.

  4. Data management and mission simulation for Spacelab projects

    NASA Astrophysics Data System (ADS)

    Mueller-Breitkreutz, M.; Panitz, H. J.

    1982-03-01

    It is pointed out that the data handling concept in the Spacelab mission SL-1 is based on the centralized Command and Data Management System (CDMS). All the experiments are controlled by the experiment computer under the supervision of the Experiment Computer Operating System (ECOS) and application software. The decentralized data management system in the German Spacelab Mission D1 is described and compared with the ESA/NASA mission SL-1. Simulation techniques used at DFVLR for mission simulation and crew training in the D1 project are described.

  5. Adaptive quantum computation in changing environments using projective simulation

    PubMed Central

    Tiersch, M.; Ganahl, E. J.; Briegel, H. J.

    2015-01-01

    Quantum information processing devices need to be robust and stable against external noise and internal imperfections to ensure correct operation. In a setting of measurement-based quantum computation, we explore how an intelligent agent endowed with a projective simulator can act as controller to adapt measurement directions to an external stray field of unknown magnitude in a fixed direction. We assess the agent’s learning behavior in static and time-varying fields and explore composition strategies in the projective simulator to improve the agent’s performance. We demonstrate the applicability by correcting for stray fields in a measurement-based algorithm for Grover’s search. Thereby, we lay out a path for adaptive controllers based on intelligent agents for quantum information tasks. PMID:26260263
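
    For readers unfamiliar with projective simulation, the sketch below implements a minimal two-layer PS agent on an invented two-percept task, assuming the standard PS update dynamics (hop probabilities proportional to edge h-values, damping toward h = 1, reward added to the traversed edge). It is a simplified illustration, not the controller used in the paper.

```python
# Minimal two-layer projective-simulation agent; task and parameters invented.
import random

class PSAgent:
    def __init__(self, percepts, actions, gamma=0.01, reward=1.0):
        self.h = {(s, a): 1.0 for s in percepts for a in actions}
        self.actions, self.gamma, self.reward = list(actions), gamma, reward

    def act(self, percept, rng):
        weights = [self.h[(percept, a)] for a in self.actions]
        return rng.choices(self.actions, weights=weights)[0]

    def learn(self, percept, action, rewarded):
        for key in self.h:                       # damping toward h = 1
            self.h[key] -= self.gamma * (self.h[key] - 1.0)
        if rewarded:                             # reinforce the traversed edge
            self.h[(percept, action)] += self.reward

# Toy task: the percept names the correct action; the agent learns the mapping.
rng = random.Random(0)
agent = PSAgent(percepts=["up", "down"], actions=["up", "down"])
correct = 0
for trial in range(2000):
    percept = rng.choice(["up", "down"])
    action = agent.act(percept, rng)
    agent.learn(percept, action, rewarded=(action == percept))
    correct += (action == percept)
print("success rate:", correct / 2000)
```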

  6. Preliminary Results from SCEC Earthquake Simulator Comparison Project

    NASA Astrophysics Data System (ADS)

    Tullis, T. E.; Barall, M.; Richards-Dinger, K. B.; Ward, S. N.; Heien, E.; Zielke, O.; Pollitz, F. F.; Dieterich, J. H.; Rundle, J. B.; Yikilmaz, M. B.; Turcotte, D. L.; Kellogg, L. H.; Field, E. H.

    2010-12-01

    Earthquake simulators are computer programs that simulate long sequences of earthquakes. If such simulators could be shown to produce synthetic earthquake histories that are good approximations to actual earthquake histories, they could be of great value in helping to anticipate the probabilities of future earthquakes, and so could play an important role in helping to make public policy decisions. Consequently, it is important to discover how realistic the earthquake histories produced by these simulators are. One way to do this is to compare their behavior with the limited knowledge we have from the instrumental, historic, and paleoseismic records of past earthquakes. Another way, though a slow one for large events, is to use them to make predictions about future earthquake occurrence and to evaluate how well the predictions match what occurs. A final approach is to compare the results of many varied earthquake simulators to determine the extent to which the results depend on the details of the approaches and assumptions made by each simulator. Five independently developed simulators, capable of running simulations on complicated geometries containing multiple faults, are in use by some of the authors of this abstract. Although similar in their overall purpose and design, these simulators differ widely in many important details. They require as input for each fault element a value for the average slip rate as well as a value for friction parameters or stress reduction due to slip. They share the use of the boundary element method to compute stress transfer between elements. None use dynamic stress transfer by seismic waves. A notable difference is the assumption different simulators make about the constitutive properties of the faults. The earthquake simulator comparison project is designed to allow comparisons among the simulators and between the simulators and past earthquake history. The project uses sets of increasingly detailed

  7. The AGORA High-resolution Galaxy Simulations Comparison Project

    NASA Astrophysics Data System (ADS)

    Kim, Ji-hoon; Abel, Tom; Agertz, Oscar; Bryan, Greg L.; Ceverino, Daniel; Christensen, Charlotte; Conroy, Charlie; Dekel, Avishai; Gnedin, Nickolay Y.; Goldbaum, Nathan J.; Guedes, Javiera; Hahn, Oliver; Hobbs, Alexander; Hopkins, Philip F.; Hummels, Cameron B.; Iannuzzi, Francesca; Keres, Dusan; Klypin, Anatoly; Kravtsov, Andrey V.; Krumholz, Mark R.; Kuhlen, Michael; Leitner, Samuel N.; Madau, Piero; Mayer, Lucio; Moody, Christopher E.; Nagamine, Kentaro; Norman, Michael L.; Onorbe, Jose; O'Shea, Brian W.; Pillepich, Annalisa; Primack, Joel R.; Quinn, Thomas; Read, Justin I.; Robertson, Brant E.; Rocha, Miguel; Rudd, Douglas H.; Shen, Sijing; Smith, Britton D.; Szalay, Alexander S.; Teyssier, Romain; Thompson, Robert; Todoroki, Keita; Turk, Matthew J.; Wadsley, James W.; Wise, John H.; Zolotov, Adi; the AGORA Collaboration

    2014-01-01

    We introduce the Assembling Galaxies Of Resolved Anatomy (AGORA) project, a comprehensive numerical study of well-resolved galaxies within the ΛCDM cosmology. Cosmological hydrodynamic simulations with force resolutions of ~100 proper pc or better will be run with a variety of code platforms to follow the hierarchical growth, star formation history, morphological transformation, and the cycle of baryons in and out of eight galaxies with halo masses M_vir ≈ 10^10, 10^11, 10^12, and 10^13 M_⊙ at z = 0 and two different ("violent" and "quiescent") assembly histories. The numerical techniques and implementations used in this project include the smoothed particle hydrodynamics codes GADGET and GASOLINE, and the adaptive mesh refinement codes ART, ENZO, and RAMSES. The codes share common initial conditions and common astrophysics packages including UV background, metal-dependent radiative cooling, metal and energy yields of supernovae, and stellar initial mass function. These are described in detail in the present paper. Subgrid star formation and feedback prescriptions will be tuned to provide a realistic interstellar and circumgalactic medium using a non-cosmological disk galaxy simulation. Cosmological runs will be systematically compared with each other using a common analysis toolkit and validated against observations to verify that the solutions are robust—i.e., that the astrophysical assumptions are responsible for any success, rather than artifacts of particular implementations. The goals of the AGORA project are, broadly speaking, to raise the realism and predictive power of galaxy simulations and the understanding of the feedback processes that regulate galaxy "metabolism." The initial conditions for the AGORA galaxies as well as simulation outputs at various epochs will be made publicly available to the community. The proof-of-concept dark-matter-only test of the formation of a galactic halo with a z = 0 mass of M_vir ≈ 1.7 × 10^11 M_⊙ by nine different

  8. Progress of the NASA/USGS Lunar Regolith Simulant Project

    NASA Technical Reports Server (NTRS)

    Rickman, Douglas; McLemore, C.; Stoeser, D.; Schrader, C.; Fikes, J.; Street, K.

    2009-01-01

    Beginning in 2004, personnel at MSFC began serious efforts to develop a new generation of lunar simulants. The first two products were a replication of the previous JSC-1 simulant under a contract to Orbitec and a major workshop in 2005 on future simulant development. It was recognized in early 2006 that there were serious limitations with the standard approach of simply taking a single terrestrial rock and grinding it. To a geologist, even a cursory examination of the Lunar Sourcebook shows that matching lunar heterogeneity, crystal size, relative mineral abundances, lack of H2O, plagioclase chemistry, and glass abundance simply cannot be done with any simple combination of terrestrial rocks. Thus the project refocused its efforts and approached simulant development in a new and more comprehensive manner, examining new approaches in simulant development and ways to more accurately compare simulants to actual lunar materials. This led to a multi-year effort with five major tasks running in parallel. The five tasks are Requirements, Lunar Analysis, Process Development, Feed Stocks, and Standards.

  9. SciDAC - Center for Plasma Edge Simulation - Project Summary

    SciTech Connect

    Parker, Scott

    2014-11-03

    Final Technical Report: Center for Plasma Edge Simulation (CPES). Principal Investigator: Scott Parker, University of Colorado, Boulder. Description/Abstract: First-principles simulations of edge pedestal micro-turbulence are performed with the global gyrokinetic turbulence code GEM for both low- and high-confinement tokamak plasmas. The high-confinement plasmas show a larger growth rate, but nonlinearly a lower particle and heat flux. Numerical profiles are obtained from the XGC0 neoclassical code. XGC0/GEM code coupling is implemented under the EFFIS ("End-to-end Framework for Fusion Integrated Simulation") framework. Investigations are underway to clearly identify the micro-instabilities in the edge pedestal using global and flux-tube gyrokinetic simulation with realistic experimental high-confinement profiles. We use both experimental profiles and those obtained using the EFFIS XGC0/GEM coupled code framework. We find there are three types of instabilities at the edge: a low-n, high-frequency electron mode; a high-n, low-frequency ion mode; and possibly an ion mode like the kinetic ballooning mode (KBM). Investigations of the effects of the radial electric field are also underway. Finally, we have been investigating how, in plasmas dominated by ion-temperature-gradient (ITG) driven turbulence, cold deuterium and tritium ions near the edge naturally pinch radially inward toward the core. We call this mechanism "natural fueling." It is due to the quasi-neutral, heat-flux-dominated nature of the turbulence and still applies when trapped and passing kinetic electron effects are included. To understand this mechanism, consider the situation where the electrons are adiabatic and there is an ion heat flux. In such a case, lower energy particles move inward and higher energy particles move outward. If a trace amount of cold particles is added, they will move inward.

  10. The HiPER project for inertial confinement fusion and some experimental results on advanced ignition schemes

    NASA Astrophysics Data System (ADS)

    Batani, D.; Koenig, M.; Baton, S.; Perez, F.; Gizzi, L. A.; Koester, P.; Labate, L.; Honrubia, J.; Antonelli, L.; Morace, A.; Volpe, L.; Santos, J.; Schurtz, G.; Hulin, S.; Ribeyre, X.; Fourment, C.; Nicolai, P.; Vauzour, B.; Gremillet, L.; Nazarov, W.; Pasley, J.; Richetta, M.; Lancaster, K.; Spindloe, Ch; Tolley, M.; Neely, D.; Kozlová, M.; Nejdl, J.; Rus, B.; Wolowski, J.; Badziak, J.; Dorchies, F.

    2011-12-01

    This paper presents the goals and some of the results of experiments conducted within the Working Package 10 (Fusion Experimental Programme) of the HiPER Project. These experiments concern the study of the physics connected to 'advanced ignition schemes', i.e. the fast ignition and the shock ignition approaches to inertial fusion. Such schemes are aimed at achieving a higher gain, as compared with the classical approach used in NIF, as required for future reactors, and at making fusion possible with smaller facilities. In particular, a series of experiments related to fast ignition were performed at the RAL (UK) and LULI (France) laboratories and studied the propagation of fast electrons (created by a short-pulse ultra-high-intensity beam) in compressed matter, created either by cylindrical implosions or by compression of planar targets by (planar) laser-driven shock waves. A more recent experiment was performed at PALS and investigated the laser-plasma coupling in the 10^16 W cm^-2 intensity regime of interest for shock ignition.

  11. Magnetohydrodynamic Vapor Explosions: A Study with Potential Interest to the Safety of Fusion Reactor Project

    NASA Astrophysics Data System (ADS)

    Arias, F. J.

    2010-04-01

    In this paper, the possibility of vapor explosions in superheated liquids that undergo a sudden variation of an applied magnetic field is discussed. This possible phenomenon may play a very important role in the blanket design of future fusion reactors, where magnetic field transients on liquid metals could be a potential hazard for safety.

  12. Unilateral spectral and temporal compression reduces binaural fusion for normal hearing listeners with cochlear implant simulations

    PubMed Central

    Aronoff, Justin M.; Shayman, Corey; Prasad, Akila; Suneel, Deepa; Stelmach, Julia

    2015-01-01

    Patients with single sided deafness have recently begun receiving cochlear implants in their deaf ear. These patients gain a significant benefit from having a cochlear implant. However, despite this benefit, they are considerably slower to develop binaural abilities such as summation compared to bilateral cochlear implant patients. This suggests that these patients have difficulty fusing electric and acoustic signals. Although this may reflect inherent differences between electric and acoustic stimulation, it may also reflect properties of the processor and fitting system, which result in spectral and temporal compression. To examine the possibility that unilateral spectral and temporal compression can adversely affect binaural fusion, this study tested normal hearing listeners’ binaural fusion through the use of vocoded speech with unilateral spectral and temporal compression. The results indicate that unilateral spectral and temporal compression can hinder binaural fusion and thus may adversely affect binaural abilities in patients with single sided deafness who use a cochlear implant in their deaf ear. PMID:25549574

  13. Progress of the NASA/USGS Lunar Regolith Simulant Project

    NASA Technical Reports Server (NTRS)

    Rickman, Doug; McLemore, Carole; Wilson, Steve; Stoeser, Doug; Schrader, Christian; Fikes, John; Street, Kenneth

    2009-01-01

    Beginning in 2004, personnel at MSFC began serious efforts to develop a new generation of lunar simulants. The first two products were a replication of the previous JSC-1 simulant under a contract to Orbitec and a major workshop in 2005 on future simulant development. Beginning in 2006 the project refocused its efforts and approached simulant development in a new and more comprehensive manner, examining new approaches in simulant development and ways to more accurately compare simulants to actual lunar materials. This led to a multi-year effort with five major tasks running in parallel. The five tasks are Requirements, Lunar Analysis, Process Development, Feed Stocks, and Standards. Major progress has been made in all five areas. A substantial draft of a formal requirements document now exists and has been largely stable since 2007; it does evolve as specific details of the Standards and Lunar Analysis efforts proceed. Lunar Analysis has turned out to be vastly more difficult than anticipated. After great effort to mine the existing published and gray literature, the team realized the necessity of making new measurements of the Apollo samples, an effort that is currently in progress. Process Development is substantially ahead of the expectations of 2006. It is now practical to synthesize glasses of appropriate composition and purity. It is also possible to make agglutinate particles in significant quantities. A series of minerals commonly found on the Moon has been synthesized. Separation of mineral constituents from starting rock material is also proceeding. Customized grinding and mixing processes have been developed and tested, and are now being documented. Identification and development of appropriate feedstocks has been both easier and more difficult than anticipated. The Stillwater Mining Company, operating in the Stillwater layered mafic intrusive complex of Montana, has been an amazing resource for the project, but finding adequate sources for some of the components

  14. Fusing simulation and experiment: The effect of mutations on the structure and activity of the influenza fusion peptide.

    PubMed

    Lousa, Diana; Pinto, Antónia R T; Victor, Bruno L; Laio, Alessandro; Veiga, Ana S; Castanho, Miguel A R B; Soares, Cláudio M

    2016-01-01

    During the infection process, the influenza fusion peptide (FP) inserts into the host membrane, playing a crucial role in the fusion process between the viral and host membranes. In this work we used a combination of simulation and experimental techniques to analyse the molecular details of this process, which are largely unknown. Although the FP structure has been obtained by NMR in detergent micelles, there is no atomic structure information in membranes. To answer this question, we performed bias-exchange metadynamics (BE-META) simulations, which showed that the lowest energy states of the membrane-inserted FP correspond to helical-hairpin conformations similar to that observed in micelles. BE-META simulations of the G1V, W14A, G12A/G13A and G4A/G8A/G16A/G20A mutants revealed that all the mutations affect the peptide's free energy landscape. A FRET-based analysis showed that all the mutants had a reduced fusogenic activity relative to the WT, in particular the mutants G12A/G13A and G4A/G8A/G16A/G20A. According to our results, one of the major causes of the lower activity of these mutants is their lower membrane affinity, which results in a lower concentration of peptide in the bilayer. These findings contribute to a better understanding of the influenza fusion process and open new routes for future studies. PMID:27302370

  15. Fusing simulation and experiment: The effect of mutations on the structure and activity of the influenza fusion peptide

    PubMed Central

    Lousa, Diana; Pinto, Antónia R. T.; Victor, Bruno L.; Laio, Alessandro; Veiga, Ana S.; Castanho, Miguel A. R. B.; Soares, Cláudio M.

    2016-01-01

    During the infection process, the influenza fusion peptide (FP) inserts into the host membrane, playing a crucial role in the fusion process between the viral and host membranes. In this work we used a combination of simulation and experimental techniques to analyse the molecular details of this process, which are largely unknown. Although the FP structure has been obtained by NMR in detergent micelles, there is no atomic structure information in membranes. To answer this question, we performed bias-exchange metadynamics (BE-META) simulations, which showed that the lowest energy states of the membrane-inserted FP correspond to helical-hairpin conformations similar to that observed in micelles. BE-META simulations of the G1V, W14A, G12A/G13A and G4A/G8A/G16A/G20A mutants revealed that all the mutations affect the peptide’s free energy landscape. A FRET-based analysis showed that all the mutants had a reduced fusogenic activity relative to the WT, in particular the mutants G12A/G13A and G4A/G8A/G16A/G20A. According to our results, one of the major causes of the lower activity of these mutants is their lower membrane affinity, which results in a lower concentration of peptide in the bilayer. These findings contribute to a better understanding of the influenza fusion process and open new routes for future studies. PMID:27302370

  16. A Particle-in-Cell Simulation for the Traveling Wave Direct Energy Converter (TWDEC) for Fusion Propulsion

    NASA Technical Reports Server (NTRS)

    Chap, Andrew; Tarditi, Alfonso G.; Scott, John H.

    2013-01-01

    A particle-in-cell simulation model has been developed to study the physics of the Traveling Wave Direct Energy Converter (TWDEC) applied to the conversion of charged fusion products into electricity. In this model the availability of a beam of collimated fusion products is assumed; the simulation is focused on the conversion of the beam kinetic energy into alternating current (AC) electric power. The model is electrostatic, as the electrodynamics of the relatively slow ions can be treated in the quasistatic approximation. A two-dimensional, axisymmetric (radial-axial coordinates) geometry is considered. Ion beam particles are injected on one end and travel along the axis through ring-shaped electrodes with externally applied time-varying voltages, thus modulating the beam by forming a sinusoidal pattern in the beam density. Further downstream, the modulated beam passes through another set of ring electrodes, now electrically floating. The modulated beam induces a time-alternating potential difference between adjacent electrodes. Power can be drawn from the electrodes by connecting a resistive load. As energy is dissipated in the load, a corresponding drop in beam energy is measured. The simulation encapsulates the TWDEC process by reproducing the time-dependent transfer of energy and the particle deceleration due to the electric field phase time variations.
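
    The bunching mechanism at the heart of the TWDEC can be illustrated without a full PIC code. The toy 1D Python sketch below velocity-modulates injected ions and lets them drift ballistically, so a sinusoidal density pattern forms downstream. It ignores space charge and the 2D axisymmetric electrode physics of the actual model, and all numbers are arbitrary illustrative units.

```python
# Toy 1D beam-bunching illustration (ballistic drift, no space charge).
import math

N = 200000
v0 = 1.0              # mean beam velocity (arbitrary units)
dv = 0.02             # velocity-modulation amplitude
omega = 2 * math.pi   # modulation angular frequency (period = 1)
L = 8.0               # drift distance at which the beam is sampled

counts = [0] * 50     # histogram of arrival phase at z = L
for i in range(N):
    t0 = 3.0 * i / N                     # injection time over three RF periods
    v = v0 + dv * math.sin(omega * t0)   # velocity modulation at injection
    t_arr = t0 + L / v                   # ballistic arrival time at z = L
    phase = (omega * t_arr / (2 * math.pi)) % 1.0
    counts[int(phase * 50)] += 1

peak, trough = max(counts), min(counts)
print("bunching depth (peak/trough):", round(peak / trough, 2))
```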

  17. Verification of gyrokinetic particle simulation of current-driven instability in fusion plasmas. III. Collisionless tearing mode

    NASA Astrophysics Data System (ADS)

    Liu, Dongjian; Bao, Jian; Han, Tao; Wang, Jiaqi; Lin, Zhihong

    2016-02-01

    A finite-mass electron fluid model for low frequency electromagnetic fluctuations, particularly the collisionless tearing mode, has been implemented in the gyrokinetic toroidal code. Using this fluid model, linear properties of the collisionless tearing mode have been verified. Simulations verify that the linear growth rate of the single collisionless tearing mode is proportional to De^2, where De is the electron skin depth. On the other hand, the growth rate of a double tearing mode is proportional to De in the parameter regime of fusion plasmas.
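
    Assuming the standard definition of the electron skin depth, the verified scalings can be written compactly as:

```latex
\[
  D_e = \frac{c}{\omega_{pe}}, \qquad
  \gamma_{\text{single}} \propto D_e^{2}, \qquad
  \gamma_{\text{double}} \propto D_e .
\]
```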

  18. Three-dimensional simulation strategy to determine the effects of turbulent mixing on inertial-confinement-fusion capsule performance.

    PubMed

    Haines, Brian M; Grinstein, Fernando F; Fincke, James R

    2014-05-01

    In this paper, we present and justify an effective strategy for performing three-dimensional (3D) inertial-confinement-fusion (ICF) capsule simulations. We have evaluated a frequently used strategy in which two-dimensional (2D) simulations are rotated to 3D once sufficient relevant 2D flow physics has been captured and fine resolution requirements can be restricted to relatively small regions. This addresses situations typical of ICF capsules which are otherwise prohibitively intensive computationally. We tested this approach for our previously reported fully 3D simulations of laser-driven reshock experiments where we can use the available 3D data as reference. Our studies indicate that simulations that begin as purely 2D lead to significant underprediction of mixing and turbulent kinetic energy production at later time when compared to the fully 3D simulations. If, however, additional suitable nonuniform perturbations are applied at the time of rotation to 3D, we show that one can obtain good agreement with the purely 3D simulation data, as measured by vorticity distributions as well as integrated mixing and turbulent kinetic energy measurements. Next, we present results of simulations of a simple OMEGA-type ICF capsule using the developed strategy. These simulations are in good agreement with available experimental data and suggest that the dominant mechanism for yield degradation in ICF implosions is hydrodynamic instability growth seeded by long-wavelength surface defects. This effect is compounded by drive asymmetries and amplified by repeated shock interactions with an increasingly distorted shell, which results in further yield reduction. Our simulations are performed with and without drive asymmetries in order to compare the importance of these effects to those of surface defects; our simulations indicate that long-wavelength surface defects degrade yield by approximately 60% and short-wavelength drive asymmetry degrades yield by a further 30%.

  19. The Jefferson Project: Large-eddy simulations of a watershed

    NASA Astrophysics Data System (ADS)

    Watson, C.; Cipriani, J.; Praino, A. P.; Treinish, L. A.; Tewari, M.; Kolar, H.

    2015-12-01

    The Jefferson Project is a new endeavor at Lake George, NY by IBM Research, Rensselaer Polytechnic Institute (RPI), and The Fund for Lake George. Lake George is an oligotrophic lake - one of low nutrients - and a 30-year study recently published by RPI's Darrin Fresh Water Institute highlighted that the renowned water quality is declining from the injection of salt (from runoff), algae, and invasive species. In response, the Jefferson Project is developing a system to provide extensive data on relevant physical, chemical, and biological parameters that drive ecosystem function. The system will be capable of real-time observations and interactive modeling of the atmosphere, watershed hydrology, lake circulation, and food web dynamics. In this presentation, we describe the development of the operational forecast system used to simulate the atmosphere in the model stack, Deep Thunder™ (a configuration of the ARW-WRF model). The model performs 48-hr forecasts twice daily in a nested configuration, and in this study we present results from ongoing tests where the innermost domains are dx = 333 m and 111 m. We discuss the model's ability to simulate boundary layer processes, lake surface conditions (an input into the lake model), and precipitation (an input into the hydrology model) during different weather regimes, and the challenges of data assimilation and validation at this scale. We also explore the potential for additional nests over select regions of the watershed to better capture turbulent boundary layer motions.

  20. Integrated fusion simulation with self-consistent core-pedestal coupling

    NASA Astrophysics Data System (ADS)

    Meneghini, O.; Snyder, P. B.; Smith, S. P.; Candy, J.; Staebler, G. M.; Belli, E. A.; Lao, L. L.; Park, J. M.; Green, D. L.; Elwasif, W.; Grierson, B. A.; Holland, C.

    2016-04-01

    Accurate prediction of fusion performance in present and future tokamaks requires taking into account the strong interplay between core transport, pedestal structure, current profile, and plasma equilibrium. An integrated modeling workflow capable of calculating the steady-state self-consistent solution to this strongly coupled problem has been developed. The workflow leverages state-of-the-art components for collisional and turbulent core transport, equilibrium, and pedestal stability. Testing against a DIII-D discharge shows that the workflow is capable of robustly predicting the kinetic profiles (electron and ion temperature and electron density) from the axis to the separatrix in good agreement with the experiments. An example application is presented, showing self-consistent optimization of the fusion performance of the 15 MA D-T ITER baseline scenario as functions of the pedestal density and ion effective charge Zeff.
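
    Schematically, such a workflow solves a fixed-point problem: the pedestal sets the core boundary condition while the core solution feeds back into the pedestal model. The Python sketch below shows the structure of a self-consistency loop of this kind, with invented toy functions standing in for the core-transport and pedestal-stability components of the real workflow.

```python
# Schematic core-pedestal self-consistency loop with toy stand-in models.
def pedestal_model(core_pressure):
    # Toy stand-in: pedestal height grows weakly with core pressure.
    return 1.0 + 0.1 * core_pressure

def core_transport_model(pedestal_height):
    # Toy stand-in: core pressure stiffly tied to the boundary condition.
    return 3.0 * pedestal_height

p_core = 1.0
for iteration in range(100):
    p_ped = pedestal_model(p_core)
    p_new = core_transport_model(p_ped)
    if abs(p_new - p_core) < 1e-10:      # converged self-consistent solution
        break
    p_core = 0.5 * p_core + 0.5 * p_new  # under-relaxation for robustness
print(f"converged after {iteration} iterations: "
      f"p_core={p_core:.4f}, p_ped={p_ped:.4f}")
```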

  1. Simulating the magnetized liner inertial fusion plasma confinement with smaller-scale experiments

    SciTech Connect

    Ryutov, D. D.; Cuneo, M. E.; Herrmann, M. C.; Sinars, D. B.; Slutz, S. A.

    2012-06-15

    The recently proposed magnetized liner inertial fusion approach to Z-pinch driven fusion [Slutz et al., Phys. Plasmas 17, 056303 (2010)] is based on the use of an axial magnetic field to provide plasma thermal insulation from the walls of the imploding liner. The characteristic plasma transport regimes in the proposed approach cover parameter domains that have not yet been studied in either magnetic confinement or inertial confinement experiments. In this article, an analysis is presented of the scalability of the key physical processes that determine the plasma confinement. The dimensionless scaling parameters are identified, and the conclusion is drawn that the plasma behavior in scaled-down experiments can correctly represent the full-scale plasma, provided these parameters are approximately the same in the two systems. This observation is important in that smaller-scale experiments typically have better diagnostic access and allow more experiments per year.

  2. Integrated fusion simulation with self-consistent core-pedestal coupling

    DOE PAGESBeta

    Meneghini, O.; Snyder, P. B.; Smith, S. P.; Candy, J.; Staebler, G. M.; Belli, E. A.; Lao, L. L.; Park, J. M.; Green, D. L.; Elwasif, W.; et al

    2016-04-20

    In this study, accurate prediction of fusion performance in present and future tokamaks requires taking into account the strong interplay between core transport, pedestal structure, current profile, and plasma equilibrium. An integrated modeling workflow capable of calculating the steady-state self-consistent solution to this strongly coupled problem has been developed. The workflow leverages state-of-the-art components for collisional and turbulent core transport, equilibrium, and pedestal stability. Validation against DIII-D discharges shows that the workflow is capable of robustly predicting the kinetic profiles (electron and ion temperature and electron density) from the axis to the separatrix in good agreement with the experiments. An example application is presented, showing self-consistent optimization for the fusion performance of the 15 MA D-T ITER baseline scenario as functions of the pedestal density and ion effective charge Zeff.

  3. Integrated fusion simulation with self-consistent core-pedestal coupling

    SciTech Connect

    Meneghini, Orso; Snyder, P. B.; Smith, S. P.; Candy, J.; Staebler, G. M.; Belli, E. A.; Lao, L. L.; Park, J. M.; Green, David L; Elwasif, Wael R; Grierson, Brian A.; Holland, C.

    2016-01-01

    Accurate prediction of fusion performance in present and future tokamaks requires taking into account the strong interplay between core transport, pedestal structure, current profile, and plasma equilibrium. An integrated modeling workflow capable of calculating the steady-state self-consistent solution to this strongly coupled problem has been developed. The workflow leverages state-of-the-art components for collisional and turbulent core transport, equilibrium, and pedestal stability. Validation against DIII-D discharges shows that the workflow is capable of robustly predicting the kinetic profiles (electron and ion temperature and electron density) from the axis to the separatrix in good agreement with the experiments. An example application is presented, showing self-consistent optimization for the fusion performance of the 15 MA D-T ITER baseline scenario as functions of the pedestal density and ion effective charge Zeff.

  4. Code OK1—Simulation of multi-beam irradiation on a spherical target in heavy ion fusion

    NASA Astrophysics Data System (ADS)

    Ogoyski, A. I.; Someya, T.; Kawata, S.

    2004-02-01

    Code OK1 is a fast and precise three-dimensional computer program designed for simulations of heavy ion beam (HIB) irradiation on a direct-driven spherical fuel pellet in heavy ion fusion (HIF). OK1 provides computational capabilities of a three-dimensional energy deposition profile on a spherical fuel pellet and the HIB irradiation non-uniformity evaluation, which are valuable for optimization of the beam parameters and the fuel pellet structure, as well as for further HIF experiment design. The code is open and complete, and can be easily modified or adapted for users' purposes in this field.
    Program summary
    Title of program: OK1
    Catalogue identifier: ADST
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADST
    Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
    Computer: PC (Pentium 4, ~1 GHz or more recommended)
    Operating system: Windows or UNIX
    Program language used: C++
    Memory required to execute with typical data: 911 MB
    No. of bits in a word: 32
    No. of processors used: 1 CPU
    Has the code been vectorized or parallelized: No
    No. of bytes in distributed program, including test data: 16 557
    Distribution format: tar gzip file
    Keywords: heavy ion beam, inertial confinement fusion, energy deposition, fuel pellet
    Nature of physical problem: Nuclear fusion energy may have attractive features as one of our human energy resources. In this paper we focus on heavy ion inertial confinement fusion (HIF). Due to the favorable energy deposition behavior of heavy ions in matter [J.J. Barnard et al., UCRL-LR-108095, 1991; C. Deutsch et al., J. Plasma Fusion Res. 77 (2001) 33; T. Someya et al., Fusion Sci. Tech. (2003), submitted], it is expected that a heavy ion beam (HIB) would be one of the energy driver candidates to operate a future inertial confinement fusion power plant. For successful fuel ignition and fusion energy release, a stringent requirement is imposed on the HIB irradiation non-uniformity, which should be less than a few percent.
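
    As a rough illustration of the non-uniformity figure OK1 evaluates, the numpy sketch below sums an assumed Gaussian angular deposition from a hypothetical 12-beam (icosahedral) arrangement over a spherical grid and reports the area-weighted RMS deviation. OK1 itself computes full 3D energy deposition from ion stopping, so this is only a schematic of the evaluation step; the beam directions and spot model are assumptions.

```python
# Schematic RMS non-uniformity of multi-beam deposition on a spherical target.
import numpy as np

# Grid of surface points on the unit sphere.
theta = np.linspace(0.01, np.pi - 0.01, 90)
phi = np.linspace(0, 2 * np.pi, 180, endpoint=False)
T, P = np.meshgrid(theta, phi, indexing="ij")
pts = np.stack([np.sin(T) * np.cos(P), np.sin(T) * np.sin(P), np.cos(T)], -1)

# Assumed 12-beam arrangement: the vertices of an icosahedron.
g = (1 + 5 ** 0.5) / 2
verts = np.array([[0, 1, g], [0, 1, -g], [0, -1, g], [0, -1, -g],
                  [1, g, 0], [1, -g, 0], [-1, g, 0], [-1, -g, 0],
                  [g, 0, 1], [-g, 0, 1], [g, 0, -1], [-g, 0, -1]], float)
beams = verts / np.linalg.norm(verts, axis=1, keepdims=True)

# Gaussian angular deposition from each beam (illustrative spot width, radians).
width = 0.7
dose = np.zeros(T.shape)
for b in beams:
    ang = np.arccos(np.clip(pts @ b, -1, 1))   # angle from beam axis
    dose += np.exp(-(ang / width) ** 2)

# Area-weighted RMS deviation from the mean deposition.
w = np.sin(T)
mean = (dose * w).sum() / w.sum()
rms = np.sqrt((((dose - mean) ** 2) * w).sum() / w.sum()) / mean
print(f"RMS non-uniformity: {100 * rms:.2f}%")
```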

  5. NASA GRC UAS Project: Communications Modeling and Simulation Status

    NASA Technical Reports Server (NTRS)

    Kubat, Greg

    2013-01-01

    The integration of Unmanned Aircraft Systems (UAS) in the National Airspace represents new operational concepts required in civil aviation. These new concepts are evolving as the nation moves toward the Next Generation Air Transportation System (NextGen) under the leadership of the Joint Planning and Development Office (JPDO), and through ongoing work by the Federal Aviation Administration (FAA). The desire and ability to fly UAS in the National Air Space (NAS) in the near term has increased dramatically, and this multi-agency effort to develop and implement a national plan to successfully address the challenges of UAS access to the NAS in a safe and timely manner is well underway. As part of the effort to integrate UAS in the National Airspace, NASA Glenn Research Center is currently involved with providing research into Communications systems and Communication system operations in order to assist with developing requirements for this implementation. In order to provide data and information regarding communication systems performance that will be necessary, NASA GRC is tasked with developing and executing plans for simulations of candidate future UAS command and control communications, in line with architectures and communications technologies being developed and/or proposed by NASA and relevant aviation organizations (in particular, RTCA SC-203). The simulations and related analyses will provide insight into the ability of proposed communications technologies and system architectures to enable safe operation of UAS, meeting UAS in the NAS project goals (including performance requirements, scalability, and interoperability), and ultimately leading to a determination of the ability of NextGen communication systems to accommodate UAS. This presentation, compiled by the NASA GRC team, will provide a view of the overall planned simulation effort and objectives, a description of the simulation concept and status of the design and development that has occurred to date.

  6. NASA GRC UAS Project - Communications Modeling and Simulation Development Status

    NASA Technical Reports Server (NTRS)

    Apaza, Rafael; Bretmersky, Steven; Dailey, Justin; Satapathy, Goutam; Ditzenberger, David; Ye, Chris; Kubat, Greg; Chevalier, Christine; Nguyen, Thanh

    2014-01-01

    The integration of Unmanned Aircraft Systems (UAS) in the National Airspace represents new operational concepts required in civil aviation. These new concepts are evolving as the nation moves toward the Next Generation Air Transportation System (NextGen) under the leadership of the Joint Planning and Development Office (JPDO), and through ongoing work by the Federal Aviation Administration (FAA). The desire and ability to fly UAS in the National Air Space (NAS) in the near term has increased dramatically, and this multi-agency effort to develop and implement a national plan to successfully address the challenges of UAS access to the NAS in a safe and timely manner is well underway. As part of the effort to integrate UAS in the National Airspace, NASA Glenn Research Center is currently involved with providing research into Communications systems and Communication system operations in order to assist with developing requirements for this implementation. In order to provide data and information regarding communication systems performance that will be necessary, NASA GRC is tasked with developing and executing plans for simulations of candidate future UAS command and control communications, in line with architectures and communications technologies being developed and/or proposed by NASA and relevant aviation organizations (in particular, RTCA SC-203). The simulations and related analyses will provide insight into the ability of proposed communications technologies and system architectures to enable safe operation of UAS, meeting UAS in the NAS project goals (including performance requirements, scalability, and interoperability), and ultimately leading to a determination of the ability of NextGen communication systems to accommodate UAS. This presentation, compiled by the NASA GRC Modeling and Simulation team, will provide an update to this ongoing effort at NASA GRC as follow-up to the overview of the planned simulation effort presented at ICNS in 2013. The objective

  7. Neutral Buoyancy Simulator-EASE Project (NB32)

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Once the United States' space program had progressed from Earth's orbit into outer space, the prospect of building and maintaining a permanent presence in space was realized. To accomplish this feat, NASA launched a temporary workstation, Skylab, to discover the effects of low gravity and weightlessness on the human body, and also to develop tools and equipment that would be needed in the future to build and maintain a more permanent space station. The structures, techniques, and work schedules had to be carefully designed to fit this unique construction site. The components had to be lightweight for transport into orbit, yet durable. The station also had to be made with removable parts for easy servicing and repairs by astronauts. All of the tools necessary for service and repairs had to be designed for easy manipulation by a suited astronaut. Construction methods had to be efficient due to the limited time the astronauts could remain outside their controlled environment. Given all the specific needs of this project, an environment had to be developed on Earth that could simulate a low-gravity atmosphere. A Neutral Buoyancy Simulator (NBS) was constructed by NASA's Marshall Space Flight Center (MSFC) in 1968. Since then, NASA scientists have used this facility to understand how humans work best in low gravity and also to provide information about the different kinds of structures that can be built. Pictured is a Massachusetts Institute of Technology (MIT) student working in a spacesuit on the Experimental Assembly of Structures in Extravehicular Activity (EASE) project, which was developed as a joint effort between MSFC and MIT. The EASE experiment required that crew members assemble small components to form larger components, working from the payload bay of the space shuttle. The MIT student in this photo is assembling two six-beam tetrahedrons.

  8. Computer simulations for minds-on learning with ``Project Spectra!''

    NASA Astrophysics Data System (ADS)

    Wood, E. L.; Renfrow, S.; Marks, N.; Christofferson, R.

    2010-12-01

    How do we gain information about the Sun? How do we know Mars has CO2 or that Titan has a nitrogen-rich atmosphere? How do we use light in astronomy? These concepts are something education professionals generally struggle with because they are abstract. Making use of visualizations and presenting material so it can be manipulated is the easiest way to conquer abstractions to bring them home to students. Using simulations and computer interactives (games) where students experience and manipulate the information makes concepts accessible. “Project Spectra!” is a science and engineering program that uses computer-based Flash interactives to expose students to astronomical spectroscopy and actual data in a way that is not possible with traditional in-class activities. Visualizing lessons with multi-media is a way to solidify understanding and retention of knowledge and is completely unlike its paper-and-pencil counterpart. To engage students in “Project Spectra!”, students are given a mission, which connects them with the research at hand. Missions range from exploring remote planetary atmospheres and surfaces, experimenting with the Sun using different filters, and comparing spectroscopic atmospheric features between different bodies. Additionally, students have an opportunity to learn about NASA missions, view movies, and see images connected with their mission. In the end, students are asked critical thinking questions and conduct web-based research. These interactives complement the in-class activities where students engineer spectrographs and explore the electromagnetic spectrum.

  9. Susitna Hydroelectric Project: terrestrial environmental workshop and preliminary simulation model

    USGS Publications Warehouse

    Everitt, Robert R.; Sonntag, Nicholas C.; Auble, Gregory T.; Roelle, James E.; Gazey, William

    1982-01-01

    The technical feasibility, economic viability, and environmental impacts of a hydroelectric development project in the Susitna River Basin are being studied by Acres American, Inc. on behalf of the Alaska Power Authority. As part of these studies, Acres American recently contracted LGL Alaska Research Associates, Inc. to coordinate the terrestrial environmental studies being performed by the Alaska Department of Fish and Game and, as subcontractors to LGL, several University of Alaska research groups. LGL is responsible for further quantifying the potential impacts of the project on terrestrial wildlife and vegetation, and for developing a plan to mitigate adverse impacts on the terrestrial environment. The impact assessment and mitigation plan will be included as part of a license application to the Federal Energy Regulatory Commission (FERC) scheduled for the first quarter of 1983. The quantification of impacts, mitigation planning, and design of future research is being organized using a computer simulation modelling approach. Through a series of workshops attended by researchers, resource managers, and policy-makers, a computer model is being developed and refined for use in the quantification of impacts on terrestrial wildlife and vegetation, and for evaluating different mitigation measures such as habitat enhancement and the designation of replacement lands to be managed as wildlife habitat. This report describes the preliminary model developed at the first workshop held August 23-27, 1982, in Anchorage.

  10. Spinal fusion

    MedlinePlus

    ... Anterior spinal fusion; Spine surgery - spinal fusion; Low back pain - fusion; Herniated disk - fusion ... If you had chronic back pain before surgery, you will likely still have some pain afterward. Spinal fusion is unlikely to take away all your pain ...

  11. Data fusion through simulated annealing of registered range and reflectance images

    SciTech Connect

    Beckerman, M.; Sweeney, F.J.

    1993-06-01

    In this paper we present results of a study of registered range and reflectance images acquired using a prototype amplitude-modulated CW laser radar. Ranging devices such as laser radars represent new technologies which are being applied in aerospace, nuclear, and other hazardous environments where remote inspections, 3D identifications, and measurements are required. However, data acquired using devices of this type may contain non-stationary, signal-dependent noise, range-reflectance crosstalk, and low-reflectance range artifacts. Low-level fusion algorithms play an essential role in achieving reliable performance by handling the complex noise, systematic errors, and artifacts. The objective of our study is the development of a stochastic fusion algorithm which takes as its input the registered image pair and produces as its output a reliable description of the underlying physical scene in terms of locally smooth surfaces separated by well-defined depth discontinuities. To construct the algorithm we model each image as a set of coupled Markov random fields representing pixel and several orders of line processes. Within this framework we (i) impose local smoothness constraints, introducing a simple linearity property in place of the usual sums over clique potentials; (ii) fuse the range and reflectance images through line process couplings; and (iii) use non-stationary, signal-dependent variances, adaptive thresholding, and a form of Markov natural selection. We show that the resulting algorithm yields reliable results even in worst-case scenarios.
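
    As a rough illustration of the coupled-MRF annealing idea (not the authors' algorithm: the 1D line process, the energy weights, and the range-reflectance coupling below are all simplifying assumptions), a minimal sketch in Python:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def energy(z, lines, range_img, refl_img, lam=1.0, mu=2.0, alpha=1.0):
        e = lam * np.sum((z - range_img) ** 2)        # fidelity to measured range
        dz = np.diff(z, axis=1)
        e += mu * np.sum((1 - lines) * dz ** 2)       # smoothness, off across lines
        dr = np.abs(np.diff(refl_img, axis=1))
        e += alpha * np.sum(lines * np.exp(-dr))      # line cost, cheaper where the
        return e                                      # reflectance also jumps

    def anneal(range_img, refl_img, steps=50_000, t0=1.0):
        z = range_img.copy()
        lines = np.zeros((z.shape[0], z.shape[1] - 1), dtype=int)
        e = energy(z, lines, range_img, refl_img)
        for k in range(steps):
            t = t0 * 0.9995 ** k                      # geometric cooling schedule
            zc, lc = z.copy(), lines.copy()
            if rng.random() < 0.5:                    # perturb one depth pixel...
                zc[rng.integers(z.shape[0]), rng.integers(z.shape[1])] += rng.normal(0.0, 0.1)
            else:                                     # ...or toggle one line site
                lc[rng.integers(z.shape[0]), rng.integers(z.shape[1] - 1)] ^= 1
            e_new = energy(zc, lc, range_img, refl_img)
            if e_new < e or rng.random() < np.exp((e - e_new) / t):  # Metropolis test
                z, lines, e = zc, lc, e_new
        return z, lines
    ```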

  12. Data fusion through simulated annealing of registered range and reflectance images

    SciTech Connect

    Beckerman, M.; Sweeney, F.J.

    1993-01-01

    In this paper we present results of a study of registered range and reflectance images acquired using a prototype amplitude-modulated CW laser radar. Ranging devices such as laser radars represent new technologies which are being applied in aerospace, nuclear, and other hazardous environments where remote inspections, 3D identifications, and measurements are required. However, data acquired using devices of this type may contain non-stationary, signal-dependent noise, range-reflectance crosstalk, and low-reflectance range artifacts. Low-level fusion algorithms play an essential role in achieving reliable performance by handling the complex noise, systematic errors, and artifacts. The objective of our study is the development of a stochastic fusion algorithm which takes as its input the registered image pair and produces as its output a reliable description of the underlying physical scene in terms of locally smooth surfaces separated by well-defined depth discontinuities. To construct the algorithm we model each image as a set of coupled Markov random fields representing pixel and several orders of line processes. Within this framework we (i) impose local smoothness constraints, introducing a simple linearity property in place of the usual sums over clique potentials; (ii) fuse the range and reflectance images through line process couplings; and (iii) use non-stationary, signal-dependent variances, adaptive thresholding, and a form of Markov natural selection. We show that the resulting algorithm yields reliable results even in worst-case scenarios.

  13. Semi-analytic modeling and simulation of magnetized liner inertial fusion

    NASA Astrophysics Data System (ADS)

    McBride, R. D.; Slutz, S. A.; Hansen, S. B.

    2013-10-01

    Presented is a semi-analytic model of magnetized liner inertial fusion (MagLIF). This model accounts for several key aspects of MagLIF, including: (1) pre-heat of the fuel; (2) pulsed-power-driven liner implosion; (3) liner compressibility with an analytic equation of state, artificial viscosity, and internal magnetic pressure and heating; (4) adiabatic compression and heating of the fuel; (5) radiative losses and fuel opacity; (6) magnetic flux compression with Nernst thermoelectric losses; (7) magnetized electron and ion thermal conduction losses; (8) deuterium-deuterium and deuterium-tritium primary fusion reactions; and (9) magnetized alpha-particle heating. We will first show that this simplified model, with its transparent and accessible physics, can be used to reproduce the general 1D behavior presented throughout the original MagLIF paper. We will then use this model to illustrate the MagLIF parameter space, energetics, and efficiencies, and to show the experimental challenges that we will likely be facing as we begin testing MagLIF using the infrastructure presently available at the Z facility. Finally, we will demonstrate how this scenario could likely change as various facility upgrades are made over the next three to five years and beyond. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
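
    As a hedged illustration of what a semi-analytic treatment of items (2)-(4) can look like, the sketch below integrates a 0D energy balance for adiabatically compressed fuel under a prescribed liner trajectory; the radius law, loss coefficient, and units are placeholder assumptions, not the model of the paper:

    ```python
    import numpy as np

    def fuel_temperature(t, r_of_t, T0=250.0, c_rad=1e-4, gamma=5.0 / 3.0):
        """Integrate dT/dt = -2(gamma-1) T (dr/dt)/r - c_rad sqrt(T); eV, ns, mm."""
        T = np.empty_like(t)
        T[0] = T0
        r = r_of_t(t)
        for i in range(1, len(t)):
            dt = t[i] - t[i - 1]
            drdt = (r[i] - r[i - 1]) / dt
            comp = -2.0 * (gamma - 1.0) * T[i - 1] * drdt / r[i - 1]  # pdV heating, cylindrical
            rad = -c_rad * np.sqrt(T[i - 1])                          # crude radiative loss
            T[i] = T[i - 1] + dt * (comp + rad)
        return T

    t = np.linspace(0.0, 100.0, 2001)                   # ns
    radius = lambda t: 3.0 - 2.7 * (t / 100.0) ** 2     # mm; convergence ratio ~10
    print(f"peak fuel temperature ~ {fuel_temperature(t, radius).max():.0f} eV")
    ```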

  14. Simulation of Regional Explosion S-Phases (SIRES) Project

    SciTech Connect

    Myers, S; Preston, L; Larsen, S; Smith, K; Walter, W

    2004-07-15

    Generation of S-waves from explosion sources continues to be an intriguing area of seismological research. Empirical studies document a general decrease in regional S-phase amplitudes (compared to P-phases) for explosion sources. Although decreased S-phase amplitude for explosive (compressional) sources is intuitive, a comprehensive physical understanding of the many mechanisms that contribute to S-phase excitation does not currently exist. Despite the success of many regional discriminant and magnitude methods that rely on decreased S-phase amplitude for explosion sources, instances remain where explosions produce anomalous S-phase amplitudes that confound regional methods. Scattering of the Rg phase has been put forward in several studies as an important mechanism for the generation of explosion S-waves. In this study we construct a 3-dimensional model of the Nevada Test Site (NTS) and the surrounding region. Extensive databases of geologic information, including existing 3-dimensional models developed under past and ongoing NTS programs, are used in the construction of a local model. The detailed local model is merged into a regional model that extends several hundred kilometers from the NTS. In addition to deterministic geologic structure and topography, we introduce stochastic variability along geologic contacts and within geologic units. Model roughness made possible by the stochastic perturbations enhances scattering, allowing realistic simulation of the local and regional wavefield. In this phase of the project we report on version 1 of the NTS model and on preliminary validation tests. Validation simulations use e3d, a fully elastic, finite difference computer code. This code allows us to introduce 3D topographic effects, as well as 3D geologic variability. Simulations are compared to recordings of the 1993 NPE experiment. The validation data set consists of local and regional distance seismograms and provides a rigorous test of the distance and time evolution of

  15. Hanford tank waste operation simulator operational waste volume projection verification and validation procedure

    SciTech Connect

    HARMSEN, R.W.

    1999-10-28

    The Hanford Tank Waste Operation Simulator is tested to determine whether it can replace the FORTRAN-based Operational Waste Volume Projection computer simulation that has traditionally served to project double-shell tank utilization. Three test cases are used to compare the results of the two simulators; one incorporates the cleanup schedule of the Tri-Party Agreement.

  16. QUANTIFYING OBSERVATIONAL PROJECTION EFFECTS USING MOLECULAR CLOUD SIMULATIONS

    SciTech Connect

    Beaumont, Christopher N.; Offner, Stella S.R.; Shetty, Rahul; Glover, Simon C. O.; Goodman, Alyssa A.

    2013-11-10

    The physical properties of molecular clouds are often measured using spectral-line observations, which provide the only probes of the clouds' velocity structure. It is hard, though, to assess whether and to what extent intensity features in position-position-velocity (PPV) space correspond to 'real' density structures in position-position-position (PPP) space. In this paper, we create synthetic molecular cloud spectral-line maps of simulated molecular clouds, and present a new technique for measuring the reality of individual PPV structures. Using a dendrogram algorithm, we identify hierarchical structures in both PPP and PPV space. Our procedure projects density structures identified in PPP space into corresponding intensity structures in PPV space and then measures the geometric overlap of the projected structures with structures identified from the synthetic observation. The fractional overlap between a PPP and PPV structure quantifies how well the synthetic observation recovers information about the three-dimensional structure. Applying this machinery to a set of synthetic observations of CO isotopes, we measure how well spectral-line measurements recover mass, size, velocity dispersion, and virial parameter for a simulated star-forming region. By disabling various steps of our analysis, we investigate how much opacity, chemistry, and gravity affect measurements of physical properties extracted from PPV cubes. For the simulations used here, which offer a decent, but not perfect, match to the properties of a star-forming region like Perseus, our results suggest that superposition induces a ∼40% uncertainty in masses, sizes, and velocity dispersions derived from {sup 13}CO (J = 1-0). As would be expected, superposition and confusion is worst in regions where the filling factor of emitting material is large. The virial parameter is most affected by superposition, such that estimates of the virial parameter derived from PPV and PPP information typically disagree
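
    A minimal sketch of the projection-and-overlap machinery (names and axes are illustrative; the actual procedure operates on dendrogram-identified hierarchical structures):

    ```python
    import numpy as np

    def project_ppp_to_ppv(density, velocity, v_bins):
        """Histogram density along the line of sight (axis 0) into velocity bins."""
        nz, ny, nx = density.shape
        ppv = np.zeros((len(v_bins) - 1, ny, nx))
        for j in range(ny):
            for i in range(nx):
                ppv[:, j, i], _ = np.histogram(velocity[:, j, i], bins=v_bins,
                                               weights=density[:, j, i])
        return ppv

    def fractional_overlap(projected_mask, observed_mask):
        """Fraction of the projected PPP structure recovered in PPV space."""
        inter = np.logical_and(projected_mask, observed_mask).sum()
        return inter / max(projected_mask.sum(), 1)
    ```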

  17. Laser fusion

    SciTech Connect

    Smit, W.A.; Boskma, P.

    1980-12-01

    Unrestricted laser fusion offers nations an opportunity to circumvent arms control agreements and develop thermonuclear weapons. Early laser weapons research sought a clean, radiation-free bomb to replace the fission bomb, but this goal proved illusory because a fission bomb was needed to trigger the fusion reaction and additional radioactivity was induced by the generation of fast neutrons. As laser-implosion experiments focused on weapons physics, simulation of weapons effects, and applications for new weapons, military interest shifted from developing a laser-ignited hydrogen bomb to more sophisticated weapons and to civilian applications for power generation. Civilian and military research now overlap, making it possible for several countries to continue weapons activities and permitting proliferation of nuclear weapons. These countries are reluctant to include inertial confinement fusion research in the Non-Proliferation Treaty. 16 references. (DCK)

  18. A fully non-linear multi-species Fokker–Planck–Landau collision operator for simulation of fusion plasma

    DOE PAGES Beta

    Hager, Robert; Yoon, E. S.; Ku, S.; D'Azevedo, E. F.; Worley, P. H.; Chang, C. S.

    2016-04-04

    Fusion edge plasmas can be far from thermal equilibrium and require the use of a non-linear collision operator for accurate numerical simulations. The non-linear single-species Fokker–Planck–Landau collision operator developed by Yoon and Chang (2014) [9] is generalized to include multiple particle species. Moreover, the finite volume discretization used in this work naturally yields exact conservation of mass, momentum, and energy. The implementation of this new non-linear Fokker–Planck–Landau operator in the gyrokinetic particle-in-cell codes XGC1 and XGCa is described and results of a verification study are discussed. Finally, the numerical techniques that make our non-linear collision operator viable on high-performance computing systems are described, including specialized load balancing algorithms and nested OpenMP parallelization. As a result, the collision operator's good weak and strong scaling behavior is shown.

  19. A fully non-linear multi-species Fokker-Planck-Landau collision operator for simulation of fusion plasma

    NASA Astrophysics Data System (ADS)

    Hager, Robert; Yoon, E. S.; Ku, S.; D'Azevedo, E. F.; Worley, P. H.; Chang, C. S.

    2016-06-01

    Fusion edge plasmas can be far from thermal equilibrium and require the use of a non-linear collision operator for accurate numerical simulations. In this article, the non-linear single-species Fokker-Planck-Landau collision operator developed by Yoon and Chang (2014) [9] is generalized to include multiple particle species. The finite volume discretization used in this work naturally yields exact conservation of mass, momentum, and energy. The implementation of this new non-linear Fokker-Planck-Landau operator in the gyrokinetic particle-in-cell codes XGC1 and XGCa is described and results of a verification study are discussed. Finally, the numerical techniques that make our non-linear collision operator viable on high-performance computing systems are described, including specialized load balancing algorithms and nested OpenMP parallelization. The collision operator's good weak and strong scaling behavior is shown.
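
    The conservation property can be illustrated with a hedged sketch: in the paper, exact conservation follows from the finite-volume discretization itself, whereas the toy snippet below imposes the pre-collision mass, momentum, and energy on an updated 1D velocity distribution by an affine remap (exact in the continuum; the discrete interpolation leaves small residuals):

    ```python
    import numpy as np

    def moments(f, v, dv):
        n = np.sum(f) * dv                        # density
        u = np.sum(f * v) * dv / n                # mean velocity
        e = np.sum(f * (v - u) ** 2) * dv / n     # thermal energy proxy
        return n, u, e

    def restore_moments(f_new, v, dv, n0, u0, e0):
        n, u, e = moments(f_new, v, dv)
        f_scaled = f_new * (n0 / n)               # restore mass
        s = np.sqrt(e0 / e)                       # stretch restores thermal energy
        # Resample on the affinely remapped grid: g(v) = f((v - u0)/s + u) / s,
        # which shifts the mean back to u0 and rescales the spread.
        v_src = (v - u0) / s + u
        return np.interp(v_src, v, f_scaled, left=0.0, right=0.0) / s
    ```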

  20. Adaptive δf Monte Carlo Method for Simulation of RF-heating and Transport in Fusion Plasmas

    SciTech Connect

    Höök, J.; Hellsten, T.

    2009-11-26

    Essential for modeling heating and transport of fusion plasmas is determining the distribution function of the plasma species. Characteristic of RF heating is the creation of particle distributions with a high-energy tail. In the high-energy region the deviation from a Maxwellian distribution is large, while in the low-energy region the distribution is close to a Maxwellian due to the velocity dependence of the collision frequency. Because of geometry and orbit topology, Monte Carlo methods are frequently used. To avoid simulating the thermal part, δf methods are beneficial. Here we present a new δf Monte Carlo method with an adaptive scheme for reducing the total variance and sources, suitable for calculating the distribution function for RF heating.
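
    For orientation, a textbook δf weight update can be sketched as follows (this is the generic method, not the paper's adaptive variance-reduction scheme; the RF kick model and all coefficients are assumptions):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def f0(v, vth=1.0):
        """Background Maxwellian."""
        return np.exp(-0.5 * (v / vth) ** 2) / np.sqrt(2.0 * np.pi * vth**2)

    # Markers sampled from f0; an RF-like diffusion kicks only the tail (|v| > 2).
    v = rng.normal(0.0, 1.0, 100_000)
    w = np.zeros_like(v)                          # delta-f weights start at zero
    dt, n_steps, d_rf = 1e-2, 200, 0.05
    for _ in range(n_steps):
        kick = np.sqrt(2.0 * d_rf * dt) * rng.normal(size=v.size) * (np.abs(v) > 2.0)
        v_new = v + kick
        # dw = (1 - w) * (ln f0(old) - ln f0(new)) along the marker trajectory
        w += (1.0 - w) * (np.log(f0(v)) - np.log(f0(v_new)))
        v = v_new

    tail = np.abs(v) > 3.0
    print("delta-f tail density estimate:", np.sum(w[tail]) / v.size)
    ```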

  1. Transition from Beam-Target to Thermonuclear Fusion in High-Current Deuterium Z-Pinch Simulations.

    PubMed

    Offermann, Dustin T; Welch, Dale R; Rose, Dave V; Thoma, Carsten; Clark, Robert E; Mostrom, Chris B; Schmidt, Andrea E W; Link, Anthony J

    2016-05-13

    Fusion yields from dense Z-pinch plasmas are known to scale with the drive current, which is favorable for many potential applications. Decades of experimental studies, however, show an unexplained drop in yield for currents above a few mega-amperes (MA). In this work, simulations of DD Z-pinch plasmas have been performed in 1D and 2D for a constant pinch time and initial radius using the code Lsp, and observations of a shift in scaling are presented. The results show that yields below 3 MA are enhanced relative to pure thermonuclear scaling by beamlike particles accelerated in Rayleigh-Taylor-induced electric fields, while yields above 3 MA are reduced because of energy lost to the instability and the inability of the beamlike ions to enter the pinch region. PMID:27232025

  2. LINE: a code which simulates spectral line shapes for fusion reaction products generated by various speed distributions

    SciTech Connect

    Slaughter, D.

    1985-03-01

    A computer code is described which estimates the energy spectrum or "line shape" for the charged particles and γ-rays produced by the fusion of low-Z ions in a hot plasma. The simulation has several built-in ion velocity distributions characteristic of heated plasmas, and it also accepts arbitrary speed and angular distributions, although they must all be symmetric about the z-axis. An energy spectrum of one of the reaction products (ion, neutron, or γ-ray) is calculated at one angle with respect to the symmetry axis. The results are shown in tabular form and plotted graphically, and the moments of the spectrum to order ten are calculated both with respect to the origin and with respect to the mean.
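
    For illustration, the moment calculation described above can be sketched as follows (function names and the synthetic Gaussian line are assumptions):

    ```python
    import numpy as np

    def line_moments(energy, intensity, max_order=10):
        """Raw and central moments of a sampled line shape, to order max_order."""
        p = intensity / np.trapz(intensity, energy)   # normalize the spectrum
        mean = np.trapz(p * energy, energy)
        raw = [np.trapz(p * energy**k, energy) for k in range(max_order + 1)]
        central = [np.trapz(p * (energy - mean) ** k, energy)
                   for k in range(max_order + 1)]
        return raw, central

    # Example: a Gaussian DD-neutron-like line at 2.45 MeV with 0.1 MeV width.
    e = np.linspace(2.0, 2.9, 1000)
    spec = np.exp(-0.5 * ((e - 2.45) / 0.1) ** 2)
    raw, central = line_moments(e, spec)
    print("mean =", raw[1], " variance =", central[2])
    ```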

  3. MULTI-IFE-A one-dimensional computer code for Inertial Fusion Energy (IFE) target simulations

    NASA Astrophysics Data System (ADS)

    Ramis, R.; Meyer-ter-Vehn, J.

    2016-06-01

    The code MULTI-IFE is a numerical tool devoted to the study of Inertial Fusion Energy (IFE) microcapsules. It includes the relevant physics for the implosion and thermonuclear ignition and burning: hydrodynamics of two-component plasmas (ions and electrons), three-dimensional laser light ray-tracing, thermal diffusion, multigroup radiation transport, deuterium-tritium burning, and alpha particle diffusion. The corresponding differential equations are discretized in spherical one-dimensional Lagrangian coordinates. Two typical application examples, a high-gain laser-driven capsule and a low-gain radiation-driven marginally igniting capsule, are discussed. In addition to phenomena relevant for IFE, the code also includes components (planar and cylindrical geometries, transport coefficients at low temperature, explicit treatment of Maxwell's equations) that extend its range of applicability to laser-matter interaction at moderate intensities (<10¹⁶ W cm⁻²). The source code design has been kept simple and structured, with the aim of encouraging users' modifications for specialized purposes.

  4. Fast ignition in system Dynamic Hohlraum with Monte-Carlo simulations of fusion kinetic and radiation processes

    NASA Astrophysics Data System (ADS)

    Andreev, Alexander A.; Platonov, Konstantin Y.; Zacharov, Sergey V.; Gus'kov, Sergei Y.; Rozanov, Vladimir B.; Il'in, Dmitrii V.; Levkovskii, Aleksey A.; Sherman, Vladimir E.

    2004-06-01

    The scheme of fast ignition by a super-intense laser of a DT target placed in the cavity of a radiating plasma liner, created in a "dynamic hohlraum" system, is considered. It is shown that this scheme can provide effective thermonuclear (TN) fusion. The process of compression and preheating of the DT fuel of the shell target by X-ray radiation of the dynamic hohlraum is simulated by the code TRITON with parameters of the Z generator of Sandia National Laboratories. The optimum parameters of the target are obtained. The mechanism of ignitor creation by protons accelerated by ultra-short laser radiation is considered, and the corresponding laser parameters are evaluated. The mathematical simulation of the subsequent TN burn wave propagation in the DT target is carried out with the TERA code, based upon direct statistical simulation of the kinetics of fast charged particles and quanta of thermal radiation at each time step of the hydrodynamics. The released TN energy is obtained as a function of ignition energy. Theoretical explanations of the obtained dependencies are presented. The laser parameters necessary to produce G ≫ 1 are determined.

  5. Simulation and Experimental Study on the Efficiency of Traveling Wave Direct Energy Conversion for Application to Aneutronic Fusion Reactions

    NASA Astrophysics Data System (ADS)

    Tarditi, Alfonso; Chap, Andrew; Miley, George; Scott, John

    2013-10-01

    A study based on both particle-in-cell (PIC) simulation and experiments is being developed to study the physics of the Traveling Wave Direct Energy Converter (TWDEC), with a view toward application to aneutronic fusion reaction products and space propulsion. The PIC model investigates the key TWDEC physics in detail by simulating the time-dependent transfer of energy from the ion beam to an electric load connected to ring-type electrodes in cylindrical symmetry. An experimental effort is in progress on a TWDEC test article at NASA Johnson Space Center with the purpose of studying the conditions for improving the efficiency of the direct energy conversion process. Using a scaled-down ion energy source, the experiment is primarily focused on the effect of the (bunched) beam density on the efficiency and on the optimization of the electrode design. The simulation model is guiding the development of the experimental configuration and will provide details of the beam dynamics for direct comparison with experimental diagnostics. Work supported by NASA, Johnson Space Center.

  6. Data management on the fusion computational pipeline

    NASA Astrophysics Data System (ADS)

    Klasky, S.; Beck, M.; Bhat, V.; Feibush, E.; Ludäscher, B.; Parashar, M.; Shoshani, A.; Silver, D.; Vouk, M.

    2005-01-01

    Fusion energy science, like other science areas in DOE, is becoming increasingly data intensive and network distributed. We discuss data management techniques that are essential for scientists making discoveries from their simulations and experiments, with special focus on the techniques and support that Fusion Simulation Project (FSP) scientists may need. However, the discussion applies to a broader audience, since most of the fusion SciDACs and FSP proposals include a strong data management component. Simulations on ultrascale computing platforms imply an ability to efficiently integrate and network heterogeneous components (computational, storage, networks, codes, etc.), and to move large amounts of data over large distances. We discuss the workflow categories needed to support such research, as well as the automation and other aspects that can allow an FSP scientist to focus on the science and spend less time tending information technology.

  7. Projections of African drought extremes in CORDEX regional climate simulations

    NASA Astrophysics Data System (ADS)

    Gbobaniyi, Emiola; Nikulin, Grigory; Jones, Colin; Kjellström, Erik

    2013-04-01

    We investigate trends in drought extremes for different climate regions of the African continent over a combined historical and future period 1951-2100. Eight CMIP5 coupled global climate models (CanESM2, CNRM-CM5, HadGEM2-ES, NorESM1-M, EC-EARTH, MIROC5, GFDL-ESM2M and MPI-ESM-LR) under two forcing scenarios, the representative concentration pathways (RCPs) 4.5 and 8.5, with spatial resolutions varying from about 1° to 3°, are downscaled to 0.44° resolution by the Rossby Centre (SMHI) regional climate model RCA4. We use data from the ensuing ensembles of CORDEX-Africa regional climate simulations to explore three drought indices, namely the standardized precipitation index (SPI), the moisture index (MI), and the difference between precipitation and evaporation (P-E). Meteorological and agricultural drought conditions are assessed in our analyses, and a climate change signal is obtained for the SPI by computing gamma distribution fits for the future period with respect to a baseline present climate. Results for the RCP4.5 and RCP8.5 scenarios are inter-compared to assess uncertainties in the future projections. We show that there is a pronounced sensitivity to the choice of forcing GCM, which indicates that assessments of future drought conditions in Africa would benefit from large model ensembles. We also note that the results are sensitive to the choice of drought index. We discuss both spatial and temporal variability of drought extremes for different climate zones of Africa and the importance of the ensemble mean. Our study highlights the usefulness of CORDEX simulations in identifying possible future impacts of climate at local and regional scales.
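
    As a hedged illustration of one index used here, a minimal SPI computation might look like the following (synthetic data; operational SPI implementations additionally handle zero-precipitation months and fit the gamma distribution month by month):

    ```python
    import numpy as np
    from scipy import stats

    def spi(precip, window=3):
        """Standardized Precipitation Index from a monthly precipitation series."""
        acc = np.convolve(precip, np.ones(window), mode="valid")  # rolling totals
        a, loc, scale = stats.gamma.fit(acc, floc=0.0)            # gamma fit, loc = 0
        cdf = stats.gamma.cdf(acc, a, loc=loc, scale=scale)
        return stats.norm.ppf(cdf)                                # map to z-scores

    monthly = np.random.default_rng(2).gamma(2.0, 40.0, 600)      # synthetic record, mm
    print("fraction of months in extreme drought (SPI < -2):",
          np.mean(spi(monthly) < -2.0))
    ```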

  8. Appreciating the Complexity of Project Management Execution: Using Simulation in the Classroom

    ERIC Educational Resources Information Center

    Hartman, Nathan S.; Watts, Charles A.; Treleven, Mark D.

    2013-01-01

    As the popularity and importance of project management increase, so does the need for well-prepared project managers. This article discusses our experiences using a project management simulation in undergraduate and MBA classes to help students better grasp the complexity of project management. This approach gives students hands-on experience with…

  9. A Fast Iterated Orthogonal Projection Framework for Smoke Simulation.

    PubMed

    Yang, Yang; Yang, Xubo; Yang, Shuangcai

    2016-05-01

    We present a fast iterated orthogonal projection (IOP) framework for smoke simulations. By modifying the IOP framework with a different means for convergence, our framework significantly reduces the number of iterations required to converge to the desired precision. Our new iteration framework adds a divergence redistributor component to IOP that improves the impeded convergence logic of IOP. We tested Jacobi, Gauss-Seidel (GS), and successive over-relaxation (SOR) as divergence redistributors and used a multigrid scheme to generate a highly efficient Poisson solver. It provides a rapid convergence rate and requires less computation time. In all of our experiments, our method requires only 2-3 iterations to satisfy a convergence condition of 1e-5 and 5-7 iterations for 1e-10. Compared with the commonly used Incomplete Cholesky Preconditioned Conjugate Gradient (ICPCG) solver, our Poisson solver accelerates the overall speed approximately 7- to 30-fold for grids ranging from 128³ to 256³. Our solver accelerates more on larger grids because the iteration count required to satisfy the convergence condition is independent of the problem size. We use various experimental scenes and settings to demonstrate the efficiency of our method. In addition, we present a feasible method for both IOP and our fast IOP to support free surfaces. PMID:27045907
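
    For orientation, the pressure-projection step that IOP accelerates, with plain Jacobi as the divergence redistributor, can be sketched as follows (2D collocated grid, unit spacing, solid boundaries; a generic illustration, not the paper's multigrid-accelerated solver):

    ```python
    import numpy as np

    def project(u, v, iters=200):
        """Make (u, v) approximately divergence-free: solve lap(p) = div(u),
        then subtract grad(p)."""
        div = np.zeros_like(u)
        div[1:-1, 1:-1] = 0.5 * (u[1:-1, 2:] - u[1:-1, :-2]
                                 + v[2:, 1:-1] - v[:-2, 1:-1])
        p = np.zeros_like(u)
        for _ in range(iters):                        # Jacobi relaxation sweeps
            p[1:-1, 1:-1] = 0.25 * (p[1:-1, 2:] + p[1:-1, :-2]
                                    + p[2:, 1:-1] + p[:-2, 1:-1]
                                    - div[1:-1, 1:-1])
        u[1:-1, 1:-1] -= 0.5 * (p[1:-1, 2:] - p[1:-1, :-2])
        v[1:-1, 1:-1] -= 0.5 * (p[2:, 1:-1] - p[:-2, 1:-1])
        return u, v
    ```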

  10. Experiments with Memory-to-Memory Coupling for End-to-End fusion Simulation Workflows

    SciTech Connect

    Docan, Ciprian; Zhang, Fan; Parashar, Manish; Cummings, Julian; Podhorszki, Norbert; Klasky, Scott A

    2010-01-01

    Scientific applications are striving to accurately simulate multiple interacting physical processes that comprise complex phenomena being modeled. Efficient and scalable parallel implementations of these coupled simulations present challenging interaction and coordination requirements, especially when the coupled physical processes are computationally heterogeneous and progress at different speeds. In this paper, we present the design, implementation and evaluation of a memory-to-memory coupling framework for coupled scientific simulations on high-performance parallel computing platforms. The framework is driven by the coupling requirements of the Center for Plasma Edge Simulation, and it provides simple coupling abstractions as well as efficient asynchronous (RDMA-based) memory-to-memory data transport mechanisms that complement existing parallel programming systems and data sharing frameworks. The framework enables flexible coupling behaviors that are asynchronous in time and space, and it supports dynamic coupling between heterogeneous simulation processes without enforcing any synchronization constraints. We evaluate the performance and scalability of the coupling framework using a specific coupling scenario, on the Jaguar Cray XT5 system at Oak Ridge National Laboratory.

  11. Plasma viscosity with mass transport in spherical inertial confinement fusion implosion simulations

    SciTech Connect

    Vold, E. L.; Molvig, K.; Joglekar, A. S.; Ortega, M. I.; Moll, R.; Fenn, D.

    2015-11-15

    The effects of viscosity and small-scale atomic-level mixing on plasmas in inertial confinement fusion (ICF) currently represent challenges in ICF research. Many current ICF hydrodynamic codes ignore the effects of viscosity though recent research indicates viscosity and mixing by classical transport processes may have a substantial impact on implosion dynamics. We have implemented a Lagrangian hydrodynamic code in one-dimensional spherical geometry with plasma viscosity and mass transport and including a three temperature model for ions, electrons, and radiation treated in a gray radiation diffusion approximation. The code is used to study ICF implosion differences with and without plasma viscosity and to determine the impacts of viscosity on temperature histories and neutron yield. It was found that plasma viscosity has substantial impacts on ICF shock dynamics characterized by shock burn timing, maximum burn temperatures, convergence ratio, and time history of neutron production rates. Plasma viscosity reduces the need for artificial viscosity to maintain numerical stability in the Lagrangian formulation and also modifies the flux-limiting needed for electron thermal conduction.

  12. Plasma viscosity with mass transport in spherical inertial confinement fusion implosion simulations

    NASA Astrophysics Data System (ADS)

    Vold, E. L.; Joglekar, A. S.; Ortega, M. I.; Moll, R.; Fenn, D.; Molvig, K.

    2015-11-01

    The effects of viscosity and small-scale atomic-level mixing on plasmas in inertial confinement fusion (ICF) currently represent challenges in ICF research. Many current ICF hydrodynamic codes ignore the effects of viscosity though recent research indicates viscosity and mixing by classical transport processes may have a substantial impact on implosion dynamics. We have implemented a Lagrangian hydrodynamic code in one-dimensional spherical geometry with plasma viscosity and mass transport and including a three temperature model for ions, electrons, and radiation treated in a gray radiation diffusion approximation. The code is used to study ICF implosion differences with and without plasma viscosity and to determine the impacts of viscosity on temperature histories and neutron yield. It was found that plasma viscosity has substantial impacts on ICF shock dynamics characterized by shock burn timing, maximum burn temperatures, convergence ratio, and time history of neutron production rates. Plasma viscosity reduces the need for artificial viscosity to maintain numerical stability in the Lagrangian formulation and also modifies the flux-limiting needed for electron thermal conduction.

  13. A simulation-based and analytic analysis of the off-Hugoniot response of alternative inertial confinement fusion ablator materials

    NASA Astrophysics Data System (ADS)

    Moore, Alastair S.; Prisbrey, Shon; Baker, Kevin L.; Celliers, Peter M.; Fry, Jonathan; Dittrich, Thomas R.; Wu, Kuang-Jen J.; Kervin, Margaret L.; Schoff, Michael E.; Farrell, Mike; Nikroo, Abbas; Hurricane, Omar A.

    2016-09-01

    The attainment of self-propagating fusion burn in an inertial confinement target at the National Ignition Facility will require the use of an ablator with high rocket efficiency and ablation pressure. The ablation material used during the National Ignition Campaign (Lindl et al. 2014) [1], a glow-discharge polymer (GDP), does not couple to the multiple-shock-inducing radiation drive environment created by the laser power profile (Robey et al., 2012) as efficiently as simulations indicated. We investigate the performance of two other ablators, boron carbide (B4C) and high-density carbon (HDC), compared to the performance of GDP under the same hohlraum conditions. Ablation performance is determined through measurement of the shock speed produced in planar samples of the ablator material subjected to identical multiple-shock-inducing radiation drive environments similar to a generic three-shock ignition drive. Simulations are in better agreement with the off-Hugoniot performance of B4C than with either HDC or GDP, and analytic estimates of the ablation pressure indicate that while the pressures produced by B4C and GDP are similar when the ablator is allowed to release, the pressure reached by B4C appears to exceed that of HDC when backed by a Au/quartz layer.

  14. Teaching Engineering Statistics with Technology, Group Learning, Contextual Projects, Simulation Models and Student Presentations

    ERIC Educational Resources Information Center

    Romeu, Jorge Luis

    2008-01-01

    This article discusses our teaching approach in graduate level Engineering Statistics. It is based on the use of modern technology, learning groups, contextual projects, simulation models, and statistical and simulation software to entice student motivation. The use of technology to facilitate group projects and presentations, and to generate,…

  15. Subcascade formation in displacement cascade simulations: Implications for fusion reactor materials

    SciTech Connect

    Stoller, R.E.; Greenwood, L.R.

    1998-03-01

    Primary radiation damage formation in iron has been investigated by the method of molecular dynamics (MD) for cascade energies up to 40 keV. The initial energy E_MD given to the simulated PKA is approximately equivalent to the damage energy in the standard secondary displacement model by Norgett, Robinson, and Torrens (NRT); hence, E_MD is less than the corresponding PKA energy. Using the values of E_MD in Table 1, the corresponding E_PKA and the NRT defects in iron have been calculated using the procedure described in Ref. 1 with the recommended 40 eV displacement threshold. These values are also listed in Table 1. Note that the difference between E_MD and the PKA energy increases as the PKA energy increases, and that the highest simulated PKA energy of 61.3 keV is the average for a collision with a 1.77 MeV neutron. Thus, these simulations have reached well into the fast neutron energy regime. For purposes of comparison, the parameters for the maximum DT neutron energy of 14.1 MeV are also included in Table 1. Although the primary damage parameters derived from the MD cascades exhibited a strong dependence on cascade energy up to 10 keV, this dependence was diminished and slightly reversed between 20 and 40 keV, apparently due to the formation of well-defined subcascades in this energy region. Such an explanation is only qualitative at this time, and additional analysis of the high-energy cascades is underway in an attempt to obtain a quantitative measure of the relationship between cascade morphology and defect survival.
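
    The NRT estimate referenced above is simple enough to state as code (the standard Norgett-Robinson-Torrens formula, with the 40 eV iron displacement threshold from the abstract):

    ```python
    def nrt_defects(damage_energy_ev: float, threshold_ev: float = 40.0) -> float:
        """Norgett-Robinson-Torrens displacements: 0.8 * T_dam / (2 * E_d)."""
        return 0.8 * damage_energy_ev / (2.0 * threshold_ev)

    # A 40 keV damage-energy cascade in iron -> 400 NRT defects.
    print(nrt_defects(40e3))
    ```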

  16. Reflex Project: Using Model-Data Fusion to Characterize Confidence in Analyses and Forecasts of Terrestrial C Dynamics

    NASA Astrophysics Data System (ADS)

    Fox, A. M.; Williams, M.; Richardson, A.; Cameron, D.; Gove, J. H.; Ricciuto, D. M.; Tomalleri, E.; Trudinger, C.; van Wijk, M.; Quaife, T.; Li, Z.

    2008-12-01

    The Regional Flux Estimation Experiment, REFLEX, is a model-data fusion inter-comparison project aimed at comparing the strengths and weaknesses of various model-data fusion techniques for estimating carbon model parameters and predicting carbon fluxes and states. The key question addressed here is: what are the confidence intervals on (a) model parameters calibrated from eddy covariance (EC) and leaf area index (LAI) data and (b) model analyses and predictions of net ecosystem C exchange (NEE) and carbon stocks? The experiment has an explicit focus on how different algorithms and protocols quantify the confidence intervals on parameter estimates and model forecasts, given the same model and data. Nine participants contributed results using Metropolis algorithms, Kalman filters, and a genetic algorithm. Both observed daily NEE data from FluxNet sites and synthetic NEE data, generated by a model, were used to estimate the parameters and states of a simple C dynamics model. The results of the analyses supported the hypothesis that parameters linked to fast-response processes that mostly determine net ecosystem exchange of CO2 (NEE) were well constrained and well characterised. Parameters associated with turnover of wood and allocation to roots, only indirectly related to NEE, were poorly characterised. There was only weak agreement on estimations of uncertainty on NEE and its components, photosynthesis and ecosystem respiration, with some algorithms successfully locating the true values of these fluxes from synthetic experiments within relatively narrow 90% confidence intervals. This exercise has demonstrated that a range of techniques exist that can generate useful estimates of parameter probability density functions for C models from eddy covariance time series data. When these parameter PDFs are propagated to generate estimates of annual C fluxes there was a wide variation in size of the 90% confidence intervals. However, some algorithms were able to make
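
    A hedged sketch of the simplest technique in the comparison, a Metropolis random walk calibrating a toy two-parameter NEE model against synthetic observations (the model form, proposal width, and noise level are assumptions, not the REFLEX setup):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    days = np.arange(365)

    def nee_model(p_gpp, p_resp):
        gpp = p_gpp * np.maximum(np.sin(2 * np.pi * (days - 80) / 365), 0.0)
        resp = p_resp * (1.0 + 0.5 * np.sin(2 * np.pi * (days - 30) / 365))
        return resp - gpp                        # NEE = respiration - uptake

    obs = nee_model(8.0, 3.0) + rng.normal(0.0, 1.0, days.size)  # synthetic data

    def log_like(theta):
        return -0.5 * np.sum((nee_model(*theta) - obs) ** 2)

    theta, ll, chain = np.array([5.0, 5.0]), log_like((5.0, 5.0)), []
    for _ in range(20_000):
        prop = theta + rng.normal(0.0, 0.1, 2)   # random-walk proposal
        ll_prop = log_like(prop)
        if np.log(rng.random()) < ll_prop - ll:  # Metropolis acceptance
            theta, ll = prop, ll_prop
        chain.append(theta.copy())
    chain = np.array(chain)[5_000:]              # discard burn-in
    print("posterior means:", chain.mean(axis=0))
    print("90% interval widths:",
          np.percentile(chain, 95, axis=0) - np.percentile(chain, 5, axis=0))
    ```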

  17. Four-Dimensional Continuum Gyrokinetic Code: Neoclassical Simulation of Fusion Edge Plasmas

    NASA Astrophysics Data System (ADS)

    Xu, X. Q.

    2005-10-01

    We are developing a continuum gyrokinetic code, TEMPEST, to simulate edge plasmas. Our code represents velocity space via a grid in equilibrium energy and magnetic moment variables, and configuration space via poloidal magnetic flux and poloidal angle. The geometry is that of a fully diverted tokamak (single or double null) and so includes boundary conditions for both closed magnetic flux surfaces and open field lines. The 4-dimensional code includes kinetic electrons and ions, and electrostatic field-solver options, and simulates neoclassical transport. The present implementation is a Method of Lines approach where spatial finite-differences (higher order upwinding) and implicit time advancement are used. We present results of initial verification and validation studies: transition from collisional to collisionless limits of parallel end-loss in the scrape-off layer, self-consistent electric field, and the effect of the real X-point geometry and edge plasma conditions on the standard neoclassical theory, including a comparison of our 4D code with other kinetic neoclassical codes and experiments.

  18. Simulation of plasma–surface interactions in a fusion reactor by means of QSPA plasma streams: recent results and prospects

    NASA Astrophysics Data System (ADS)

    Garkusha, I. E.; Aksenov, N. N.; Byrka, O. V.; Makhlaj, V. A.; Herashchenko, S. S.; Malykhin, S. V.; Petrov, Yu V.; Staltsov, V. V.; Surovitskiy, S. V.; Wirtz, M.; Linke, J.; Sadowski, M. J.; Skladnik-Sadowska, E.

    2016-09-01

    This paper is devoted to plasma–surface interaction issues at the high heat loads typical of fusion reactors. For the International Thermonuclear Experimental Reactor (ITER), which is now under construction, knowledge of erosion processes and of the behaviour of various construction materials under extreme conditions is a very critical issue that will determine the successful realization of the project. The most important plasma–surface interaction (PSI) effects in 3D geometry have been studied using the QSPA Kh-50 powerful quasi-stationary plasma accelerator. Mechanisms of droplet and dust generation have been investigated in detail. It was found that droplet emission from castellated surfaces has a threshold character and a cyclic nature. It begins only after a certain number of irradiating plasma pulses, when molten and shifted material has accumulated at the edges of the castellated structure. This new erosion mechanism, connected with edge effects, results in an increase in the size of the emitted droplets (compared with those emitted from a flat surface). This mechanism can even induce the ejection of sub-mm particles. A concept of a new-generation QSPA facility, the current status of the device, and prospects for further experiments are also presented.

  19. Kinetic simulations of stimulated Raman backscattering and related processes for the shock-ignition approach to inertial confinement fusion

    SciTech Connect

    Riconda, C.; Weber, S.; Tikhonchuk, V. T.; Heron, A.

    2011-09-15

    A detailed description of stimulated Raman backscattering and related processes for the purpose of inertial confinement fusion requires multi-dimensional kinetic simulations of a full speckle in a high-temperature, large-scale, inhomogeneous plasma. In particular for the shock-ignition scheme operating at high laser intensities, kinetic aspects are predominant. High-intensity (Iλo² ≈ 5×10¹⁵ W μm²/cm²) as well as low-intensity (Iλo² ≈ 10¹⁵ W μm²/cm²) cases show the predominance of collisionless, collective processes for the interaction. While the two-plasmon decay instability and the cavitation scenario are hardly affected by intensity variation, inflationary Raman backscattering proves to be very sensitive. Brillouin backscattering evolves on longer time scales and dominates the reflectivities, although it is sensitive to the intensity. Filamentation and self-focusing do occur in all cases but on time scales too long to affect Raman backscattering.

  20. Exponential yield sensitivity to long-wavelength asymmetries in three-dimensional simulations of inertial confinement fusion capsule implosions

    SciTech Connect

    Haines, Brian M.

    2015-08-15

    In this paper, we perform a series of high-resolution 3D simulations of an OMEGA-type inertial confinement fusion (ICF) capsule implosion with varying levels of initial long-wavelength asymmetries in order to establish the physical energy loss mechanism for observed yield degradation due to long-wavelength asymmetries in symcap (gas-filled capsule) implosions. These simulations demonstrate that, as the magnitude of the initial asymmetries is increased, shell kinetic energy is increasingly retained in the shell instead of being converted to fuel internal energy. This is caused by the displacement of fuel mass away from and shell material into the center of the implosion due to complex vortical flows seeded by the long-wavelength asymmetries. These flows are not fully turbulent, but demonstrate mode coupling through non-linear instability development during shell stagnation and late-time shock interactions with the shell interface. We quantify this effect by defining a separation lengthscale between the fuel mass and internal energy and show that this is correlated with yield degradation. The yield degradation shows an exponential sensitivity to the RMS magnitude of the long-wavelength asymmetries. This strong dependence may explain the lack of repeatability frequently observed in OMEGA ICF experiments. In contrast to previously reported mechanisms for yield degradation due to turbulent instability growth, yield degradation is not correlated with mixing between shell and fuel material. Indeed, an integrated measure of mixing decreases with increasing initial asymmetry magnitude due to delayed shock interactions caused by growth of the long-wavelength asymmetries without a corresponding delay in disassembly.
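
    One possible reading of the separation-lengthscale diagnostic, sketched under the assumption that it is the distance between the fuel's center of mass and the centroid of its internal energy (the paper's precise definition may differ):

    ```python
    import numpy as np

    def separation_lengthscale(fuel_density, fuel_internal_energy, coords):
        """coords: (3, nz, ny, nx) array of cell-center positions."""
        m = fuel_density / fuel_density.sum()                  # mass weights
        e = fuel_internal_energy / fuel_internal_energy.sum()  # energy weights
        com_mass = np.array([np.sum(c * m) for c in coords])
        com_energy = np.array([np.sum(c * e) for c in coords])
        return np.linalg.norm(com_mass - com_energy)
    ```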

  1. Project Shuttle simulation math model coordination catalog, revision 1

    NASA Technical Reports Server (NTRS)

    1974-01-01

    A catalog is presented of subsystem and environment math models used or planned for space shuttle simulations. The purpose is to facilitate sharing of similar math models between shuttle simulations. It provides information on math model requirements, formulations, schedules, and contact persons for further information.

  2. Secretarial Administration: Project In/Vest: Insurance Simulation Insures Learning

    ERIC Educational Resources Information Center

    Geier, Charlene

    1978-01-01

    Describes a simulated model office set up in Greenfield High School, Wisconsin, to replicate various insurance occupations. Local insurance agents and students from other disciplines, such as distributive education, are involved in the simulation. The training is applicable to other business office positions, as it models not only an insurance…

  3. A 3d particle simulation code for heavy ion fusion accelerator studies

    SciTech Connect

    Friedman, A.; Bangerter, R.O.; Callahan, D.A.; Grote, D.P.; Langdon, A.B. ); Haber, I. )

    1990-06-08

    We describe WARP, a new particle-in-cell code being developed and optimized for ion beam studies in true geometry. We seek to model transport around bends, axial compression with strong focusing, multiple-beamlet interaction, and other inherently 3d processes that affect emittance growth. Constraints imposed by memory and running time are severe. Thus, we employ only two 3d field arrays (ρ and φ), and difference φ directly on each particle to get E, rather than interpolating E from three meshes; use of a single 3d array is feasible. A new method for PIC simulation of bent beams follows the beam particles in a family of rotated laboratory frames, thus "straightening" the bends. We are also incorporating an envelope calculation, an (r, z) model, and a 1d (axial) model within WARP. The BASIS development and run-time system is used, providing a powerful interactive environment in which the user has access to all variables in the code database. 10 refs., 3 figs.

  4. Computer Simulation Performed for Columbia Project Cooling System

    NASA Technical Reports Server (NTRS)

    Ahmad, Jasim

    2005-01-01

    This demo shows a high-fidelity simulation of the air flow in the main computer room housing the Columbia (10,240 Intel Itanium processors) system. The simulation assessed the performance of the cooling system, identified deficiencies, and recommended modifications to eliminate them. It used two in-house software packages on NAS supercomputers: Chimera Grid Tools to generate a geometric model of the computer room, and the OVERFLOW-2 code for fluid and thermal simulation. This state-of-the-art technology can be easily extended to provide a general capability for air flow analyses in any modern computer room.

  5. The Maya Project: Numerical Simulations of Black Hole Collisions

    NASA Astrophysics Data System (ADS)

    Smith, Kenneth; Calabrese, Gioel; Garrison, David; Kelly, Bernard; Laguna, Pablo; Lockitch, Keith; Pullin, Jorge; Shoemaker, Deirdre; Tiglio, Manuel

    2001-04-01

    The main objective of the MAYA project is the development of a numerical code to solve the vacuum Einstein's field equations for spacetimes containing multiple black hole singularities. Incorporating knowledge gained from previous similar efforts (Binary Black Holes Alliance and the AGAVE project) as well as one-dimensional numerical studies, MAYA has been built from the ground up within the architecture of Cactus 4.0, with particular attention paid to the software engineering aspects of code development. The goal of this new effort is to ultimately have a robust, efficient, readable, and stable numerical code for black hole evolution. This poster presents an overview of the project, focusing on the innovative aspects of the project as well as its current development status.

  6. A Student Project to use Geant4 Simulations for a TMS-PET combination

    SciTech Connect

    Altamirano, A.; Chamorro, A.; Hurtado, K.; Romero, C.; Wahl, D.; Zamudio, A.; Rueda, A.; Solano Salinas, C. J.

    2007-10-26

    Geant4 is one of the most powerful tools for MC simulation of detectors and their applications. We present a student project to simulate a combined Transcranial Magnetic Stimulation-Positron Emission Tomography (TMS-PET) system using Geant4. This project aims to study PET-TMS systems by implementing a model for the brain response to the TMS pulse and studying the simulated PET response. In order to increase the speed of the simulations we parallelise our programs and investigate the possibility of using GRID computing.

  7. Fusion Studies in Japan

    NASA Astrophysics Data System (ADS)

    Ogawa, Yuichi

    2016-05-01

    A new strategic energy plan adopted by the Japanese Cabinet in 2014 strongly supports the steady promotion of nuclear fusion development activities, including the ITER project and the Broader Approach activities, from a long-term viewpoint. The Atomic Energy Commission (AEC) of Japan formulated the Third Phase Basic Program to promote an experimental fusion reactor project. In 2005, the AEC reviewed this program and discussed selection and concentration among the many projects of fusion reactor development. In addition to the promotion of the ITER project, advanced tokamak research with JT-60SA, the helical plasma experiment LHD, the FIREX project in laser fusion research, and fusion engineering with IFMIF were highly prioritized. Although the basic concepts of the tokamak, helical, and laser fusion approaches are quite different, they share many common features, such as plasma physics in 3-D magnetic geometry and high power heat loads on plasma-facing components. Therefore, a synergetic scenario for fusion reactor development among the various plasma confinement concepts would be important.

  8. Numerical analysis of applied magnetic field dependence in Malmberg-Penning Trap for compact simulator of energy driver in heavy ion fusion

    NASA Astrophysics Data System (ADS)

    Sato, T.; Park, Y.; Soga, Y.; Takahashi, K.; Sasaki, T.; Kikuchi, T.; Harada, Nob

    2016-05-01

    To simulate the pulse compression process of space-charge-dominated beams in heavy ion fusion, we have carried out multi-particle numerical simulations of an equivalent beam in the Malmberg-Penning trap device. The results show that both transverse and longitudinal velocities increase during longitudinal compression, with a dependence on the external magnetic field strength. The influence of the space-charge effect, which is related to the external magnetic field, was observed as an increase of high-velocity particles at weak external magnetic field.

  9. Fission thrust sail as booster for high Δv fusion-based propulsion

    NASA Astrophysics Data System (ADS)

    Ceyssens, Frederik; Wouters, Kristof; Driesen, Maarten

    2015-12-01

    The fission thrust sail as a booster for nuclear-fusion-based rocket propulsion for future starships is introduced and studied. First-order calculations are used together with Monte Carlo simulations to assess system performance. If a D-D fusion rocket, such as that considered in Project Icarus, has relatively low efficiency (~30%) in converting fusion fuel to a directed exhaust, adding a fission sail is shown to be beneficial for the obtainable delta-v. In addition, this type of fission-fusion hybrid propulsion has the potential to improve acceleration and act as a micrometeorite shield.
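
    The premise can be checked on the back of an envelope with the Tsiolkovsky rocket equation; the exhaust velocities and mass ratios below are illustrative guesses, not the paper's numbers:

    ```python
    import math

    def stage_dv(v_exhaust, mass_ratio):
        """Tsiolkovsky rocket equation for one stage."""
        return v_exhaust * math.log(mass_ratio)

    c = 3.0e8                    # m/s
    v_fusion = 0.03 * c          # assumed effective exhaust velocity, D-D stage
    v_sail = 0.01 * c            # assumed effective exhaust velocity, fission sail
    dv_fusion_only = stage_dv(v_fusion, 5.0)
    dv_staged = stage_dv(v_sail, 2.0) + stage_dv(v_fusion, 5.0)  # sail boosts first
    print(f"fusion only: {dv_fusion_only / c:.3f} c")
    print(f"with sail  : {dv_staged / c:.3f} c")
    ```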

  10. The design and simulation of high-voltage Applied-B ion diodes for inertial confinement fusion

    NASA Astrophysics Data System (ADS)

    Slutz, S. A.; Seidel, D. B.; Coats, R. S.

    1987-06-01

    We present the design of the high-voltage (30 MV) Applied-B ion diode that is now being tested on the PBFA-II accelerator at Sandia National Laboratories. This diode design is the first application of a new set of numerical design tools developed over the past several years. Furthermore, this design represents a significant departure from previous designs due to the much higher voltage and the use of a nonprotonic ion, Li+. The higher voltage increases the magnetic field strength required to insulate the diode from the 1-2 T of previous diodes to 3-7 T. This represents a very large increase in the magnetic field energy and the magnetic forces exerted on the field-coil structures. Our new design incorporates changes in the field-coil locations to significantly reduce the field energy and the forces on the field-coil structures. The use of nonprotonic ions introduces a new complication in that these ions will be stripped when they penetrate material, i.e., the gas cell membrane. The importance of current neutralization, charge-exchange reactions, and the conservation of canonical angular momentum is discussed in the context of designing light-ion diodes suitable as drivers for inertial confinement fusion. We have simulated the performance of this diode design using the electromagnetic particle-in-cell code MAGIC. We find that the most sensitive point in the power flow is the transition from the self-magnetically insulated transmission line to the applied-field region of the diode.

  11. How historic simulation-observation discrepancy affects future warming projections in a very large model ensemble

    NASA Astrophysics Data System (ADS)

    Goodwin, Philip

    2016-01-01

    Projections of future climate made by model ensembles have credibility because the historic simulations by these models are consistent, or nearly consistent, with historic observations. However, it is not known how small inconsistencies between the ranges of observed and simulated historic climate change affect the future projections made by a model ensemble. Here, the impact of historical simulation-observation inconsistencies on future warming projections is quantified in a 4-million-member Monte Carlo ensemble from a new efficient Earth System Model (ESM). Of the 4 million ensemble members, a subset of 182,500 is consistent with the historic ranges of warming, heat uptake, and carbon uptake simulated by the Coupled Model Intercomparison Project phase 5 (CMIP5) ensemble. This simulation-consistent subset projects future warming ranges similar to the CMIP5 ensemble for all four RCP scenarios, indicating the new ESM represents an efficient tool to explore parameter space for future warming projections based on historic performance. A second subset of 14,500 ensemble members is consistent with historic observations of warming, heat uptake, and carbon uptake. This observation-consistent subset projects a narrower range for future warming, with the lower bounds of projected warming still similar to CMIP5, but the upper warming bounds reduced by 20-35%. These findings suggest that part of the upper range of twenty-first-century CMIP5 warming projections may reflect historical simulation-observation inconsistencies. However, the agreement of lower bounds for projected warming implies that the likelihood of warming exceeding dangerous levels over the twenty-first century is unaffected by small discrepancies between CMIP5 models and observations.
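
    The history-matching logic reduces to a few lines (a toy emulator with made-up ranges, scaled down from the paper's 4-million-member ensemble):

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    n = 400_000                                           # paper uses 4 million
    sensitivity = rng.lognormal(np.log(3.0), 0.4, n)      # K per CO2 doubling

    # Toy emulator: historic and future warming both scale with sensitivity.
    hist = 0.3 * sensitivity + rng.normal(0.0, 0.1, n)
    future = 1.2 * sensitivity + rng.normal(0.0, 0.3, n)

    obs_ok = (hist > 0.8) & (hist < 1.2)                  # "observed" range
    for label, sel in [("all members", slice(None)), ("obs-consistent", obs_ok)]:
        lo, hi = np.percentile(future[sel], [5, 95])
        print(f"{label:14s}: 5-95% future warming {lo:.2f}-{hi:.2f} K")
    ```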

  12. Improving Faculty Perceptions of and Intent to Use Simulation: An Intervention Project

    ERIC Educational Resources Information Center

    Tucker, Charles

    2013-01-01

    Human patient simulation is an innovative teaching strategy that can facilitate practice development and preparation for entry into today's healthcare environment for nursing students. Unfortunately, the use of human patient simulation has been limited due to the perceptions of nursing faculty members. This project sought to explore those…

  13. VALIDATION OF A SIMULATION PROCEDURE FOR GENERATING BREAST TOMOSYNTHESIS PROJECTION IMAGES.

    PubMed

    Petersson, Hannie; Warren, Lucy M; Tingberg, Anders; Dustler, Magnus; Timberg, Pontus

    2016-06-01

    In order to achieve optimal diagnostic performance in breast tomosynthesis (BT) imaging, the parameters of the imaging chain should be evaluated. For the purpose of such evaluations, a simulation procedure based on the Monte Carlo code system PENELOPE and the geometry of a Siemens BT system has been developed to generate BT projection images. In this work, the simulation procedure is validated by comparing contrast and sharpness in simulated images with contrast and sharpness in real images acquired with the BT system. The results of the study showed good agreement of sharpness between real and simulated reconstructed image planes, but contrast was higher in the simulated than in the real projection images. The developed simulation procedure could be used to generate BT images, but it is of interest to further investigate how the procedure could be modified to generate more realistic image noise and contrast. PMID:26842713

  14. A system simulation development project: Leveraging resources through partnerships

    NASA Technical Reports Server (NTRS)

    Drummond, Colin K.; Owen, A. Karl; Davis, Milt W.

    1995-01-01

    Partnerships between government agencies are an intellectually attractive method of conducting scientific research; the goal is to establish mutually beneficial participant roles for technology exchange that ultimately pay off in a stronger R&D program for each partner. Anticipated and current aerospace research budgetary pressures through the 1990s provide additional impetus for government research agencies to candidly assess their R&D for those simulation activities no longer unique enough to warrant 'going it alone,' or for those elements where partnerships or teams can offset development costs. This paper describes a specific inter-agency system simulation activity that leverages the development cost of mutually beneficial R&D. While the direct positive influence of partnerships on complex technology developments is our main thesis, we also address ongoing teaming issues and hope to impart to the reader the immense indirect (sometimes immeasurable) benefits that meaningful interagency partnerships can produce.

  15. Three dimensional projection environment for molecular design and surgical simulation.

    PubMed

    Wickstrom, Eric; Chen, Chang-Po; Devadhas, Devakumar; Wampole, Matthew; Jin, Yuan-Yuan; Sanders, Jeffrey M; Kairys, John C; Ankeny, Martha L; Hu, Rui; Barner, Kenneth E; Steiner, Karl V; Thakur, Mathew L

    2011-01-01

    We are developing agents for positron emission tomography (PET) imaging of cancer gene mRNA expression and software to fuse mRNA PET images with anatomical computerized tomography (CT) images to enable volumetric (3D) haptic (touch-and-feel) simulation of pancreatic cancer and surrounding organs prior to surgery in a particular patient. We have identified a novel ligand specific for epidermal growth factor receptor (EGFR) to direct PET agent uptake specifically into cancer cells, and created a volumetric haptic surgical simulation of human pancreatic cancer reconstructed from patient CT data. Young's modulus and the Poisson ratio for each tissue will be adjusted to fit the experience of participating surgeons. PMID:21335882

  16. Molecular dynamics simulation of the evolution of hydrophobic defects in one monolayer of a phosphatidylcholine bilayer: relevance for membrane fusion mechanisms.

    PubMed Central

    Tieleman, D Peter; Bentz, Joe

    2002-01-01

    The spontaneous formation of the phospholipid bilayer underlies the permeability barrier function of the biological membrane. Tears or defects that expose water to the acyl chains are spontaneously healed by lipid lateral diffusion. However, mechanical barriers, e.g., protein aggregates held in place, could sustain hydrophobic defects. Such defects have been postulated to occur in processes such as membrane fusion. This gives rise to a new question in bilayer structure: What do the lipids do in the absence of lipid lateral diffusion to minimize the free energy of a hydrophobic defect? As a first step to understand this rather fundamental question about bilayer structure, we performed molecular dynamics simulations of up to 10 ns of a planar bilayer from which lipids have been deleted randomly from one monolayer. In one set of simulations, approximately one-half of the lipids in the defect monolayer were restrained to form a mechanical barrier. In the second set, lipids were free to diffuse around. The question was simply whether the defects caused by removing a lipid would aggregate together, forming a large hydrophobic cavity, or whether the membrane would adjust in another way. When there are no mechanical barriers, the lipids in the defect monolayer simply spread out and thin with little effect on the other intact monolayer. In the presence of a mechanical barrier, the behavior of the lipids depends on the size of the defect. When 3 of 64 lipids are removed, the remaining lipids adjust the lower one-half of their chains, but the headgroup structure changes little and the intact monolayer is unaffected. When 6 to 12 lipids are removed, the defect monolayer thins, lipid disorder increases, and lipids from the intact monolayer move toward the defect monolayer. While this is a highly simplified model of a fusion site, this engagement of the intact monolayer into the fusion defect is strikingly consistent with recent results for influenza hemagglutinin mediated

  17. Simulation of slag control for the Plasma Hearth Project

    SciTech Connect

    Power, M.A.; Carney, K.P.; Peters, G.G.

    1996-12-31

    The goal of the Plasma Hearth Project is to stabilize alpha-emitting radionuclides in a vitreous slag and to reduce the effective storage volume of actinide-containing waste for long-term burial. The actinides have been shown to partition into the vitreous slag phase of the melt. The slag composition may be changed by adding glass-former elements to ensure that this removable slag has the most desired physical and chemical properties for long-term burial. A data acquisition and control system has been designed to regulate the composition of five elements in the slag.

  18. Simulation of particle acceleration in the PLASMONX project

    NASA Astrophysics Data System (ADS)

    Benedetti, Carlo

    2010-02-01

    In this paper I will present some numerical studies and parameter scans performed with the electromagnetic, relativistic, fully self-consistent particle-in-cell (PIC) code ALaDyn (Acceleration by LAser and DYNamics of charged particles), concerning electron acceleration via plasma waves in the framework of the INFN-PLASMONX (PLASma acceleration and MONochromatic X-ray production) project. In particular I will focus on the modelling of the SITE (Self Injection Test Experiment) which will be a relevant part of the commissioning of the FLAME laser. Some issues related to the quality of the accelerated bunch will be discussed.

  19. Simulated hydroclimatic impacts of projected Brazilian sugarcane expansion

    NASA Astrophysics Data System (ADS)

    Georgescu, M.; Lobell, D. B.; Field, C. B.; Mahalov, A.

    2013-03-01

    Sugarcane area is currently expanding in Brazil, largely in response to domestic and international demand for sugar-based ethanol. To investigate the potential hydroclimatic impacts of future expansion, a regional climate model is used to simulate 5 years of a scenario in which cerrado and cropland areas (~1.1×10⁶ km²) within south-central Brazil are converted to sugarcane. Results indicate a cooling of up to ~1.0°C during the peak of the growing season, mainly as a result of increased albedo of sugarcane relative to the previous landscape. After harvest, warming of similar magnitude occurs from a significant decline in evapotranspiration and a repartitioning toward greater sensible heating. Overall, annual temperature changes from large-scale conversion are expected to be small because of offsetting reductions in net radiation absorption and evapotranspiration. The decline in net water flux from land to the atmosphere implies a reduction in regional precipitation, which is consistent with progressively decreasing simulated average rainfall for the study period, upon conversion to sugarcane. However, rainfall changes were not robust across three ensemble members. The results suggest that sugarcane expansion will not drastically alter the regional energy or water balance, but could result in important local and seasonal effects.

  20. The BOUT Project: Validation and Benchmark of BOUT Code and Experimental Diagnostic Tools for Fusion Boundary Turbulence

    SciTech Connect

    Xu, X Q

    2001-08-09

    A boundary plasma turbulence code, BOUT, is presented. Preliminary encouraging results have been obtained when comparing simulations with probe measurements for a typical Ohmic discharge in the CT-7 tokamak. The validation and benchmarking of the BOUT code and of experimental diagnostic tools for fusion boundary plasma turbulence are proposed.

  1. The Tokamak Fusion Test Reactor decontamination and decommissioning project and the Tokamak Physics Experiment at the Princeton Plasma Physics Laboratory. Environmental Assessment

    SciTech Connect

    1994-05-27

    If the US is to meet the energy needs of the future, it is essential that new technologies emerge to compensate for dwindling supplies of fossil fuels and the eventual depletion of fissionable uranium used in present-day nuclear reactors. Fusion energy has the potential to become a major source of energy for the future. Power from fusion energy would provide a substantially reduced environmental impact as compared with other forms of energy generation. Since fusion utilizes no fossil fuels, there would be no release of chemical combustion products to the atmosphere. Additionally, there are no fission products formed to present handling and disposal problems, and runaway fuel reactions are impossible due to the small amounts of deuterium and tritium present. The purpose of the TPX Project is to support the development of the physics and technology to extend tokamak operation into the continuously operating (steady-state) regime, and to demonstrate advances in fundamental tokamak performance. The purpose of TFTR D&D is to ensure compliance with DOE Order 5820.2A, "Radioactive Waste Management," and to remove environmental and health hazards posed by the TFTR in a non-operational mode. There are two proposed actions evaluated in this environmental assessment (EA). The actions are related because one must take place before the other can proceed. The proposed actions assessed in this EA are the decontamination and decommissioning (D&D) of the Tokamak Fusion Test Reactor (TFTR), to be followed by the construction and operation of the Tokamak Physics Experiment (TPX). Both of these proposed actions would take place primarily within the TFTR Test Cell Complex at the Princeton Plasma Physics Laboratory (PPPL). The TFTR is located on "D-site" at the James Forrestal Campus of Princeton University in Plainsboro Township, Middlesex County, New Jersey, and is operated by PPPL under contract with the United States Department of Energy (DOE).

  2. Final Technical Report for Center for Plasma Edge Simulation Research

    SciTech Connect

    Pankin, Alexei Y.; Bateman, Glenn; Kritz, Arnold H.

    2012-02-29

    The CPES research carried out by the Lehigh fusion group has sought to satisfy the evolving requirements of the CPES project. Overall, the Lehigh group has focused on verification and validation of the codes developed and/or integrated in the CPES project. Consequently, contacts and interaction with experimentalists have been maintained during the course of the project. Prof. Arnold Kritz, the leader of the Lehigh Fusion Group, has participated in the executive management of the CPES project. The code development and simulation studies carried out by the Lehigh fusion group are described in more detail in the sections below.

  3. Inertial electrostatic confinement and nuclear fusion in the interelectrode plasma of a nanosecond vacuum discharge. II: Particle-in-cell simulations

    NASA Astrophysics Data System (ADS)

    Kurilenkov, Yu. K.; Tarakanov, V. P.; Gus'kov, S. Yu.

    2010-12-01

    Results of particle-in-cell simulations of ion acceleration by using the KARAT code in a cylindrical geometry in the problem formulation corresponding to an actual experiment with a low-energy vacuum discharge with a hollow cathode are presented. The fundamental role of the formed virtual cathode is analyzed. The space-time dynamics of potential wells related to the formation of the virtual cathode is discussed. Quasi-steady potential wells (with a depth of ˜80% of the applied voltage) cause acceleration of deuterium ions to energies about the electron beam energy (˜50 keV). In the well, a quasi-isotropic velocity distribution function of fast ions forms. The results obtained are compared with available data on inertial electrostatic confinement fusion (IECF). In particular, similar correlations between the structure of potential wells and the neutron yield, as well as the scaling of the fusion power density, which increases with decreasing virtual cathode radius and increasing potential well depth, are considered. The chosen electrode configuration and potential well parameters provide power densities of nuclear DD fusion in a nanosecond vacuum discharge noticeably higher than those achieved in other similar IECF systems.
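
    The record above reports particle-in-cell results from the electromagnetic code KARAT. As an illustration of the PIC cycle itself (charge deposition, field solve, gather, particle push), here is a minimal one-dimensional electrostatic loop in Python; the geometry, normalized units, and parameters are purely illustrative and unrelated to the actual discharge simulations:

      import numpy as np

      # Minimal 1D electrostatic particle-in-cell loop: deposit charge,
      # solve Poisson's equation by FFT, gather the field, push particles.
      # Periodic box, electrons over a fixed neutralizing ion background.
      ng, L, npart, dt, steps = 64, 2 * np.pi, 20000, 0.1, 200
      dx = L / ng
      rng = np.random.default_rng(0)

      x = rng.uniform(0, L, npart)
      v = rng.normal(0.0, 0.1, npart)
      v += 0.05 * np.sin(x)              # seed a long-wavelength perturbation
      qm, weight = -1.0, L / npart       # electron charge/mass; macro-weight

      k = 2 * np.pi * np.fft.rfftfreq(ng, d=dx)
      k[0] = 1.0                         # placeholder; the k=0 mode is zeroed

      for _ in range(steps):
          cells = (x / dx).astype(int) % ng
          # charge density: electrons (nearest-grid-point) + ion background
          rho = -weight * np.bincount(cells, minlength=ng) / dx + 1.0
          rho_k = np.fft.rfft(rho)
          E_k = -1j * rho_k / k          # from d2(phi)/dx2 = -rho, E = -dphi/dx
          E_k[0] = 0.0
          E = np.fft.irfft(E_k, ng)
          v += qm * E[cells] * dt        # accelerate in the gathered field
          x = (x + v * dt) % L           # advance positions and wrap

      print("field energy:", 0.5 * dx * np.sum(E ** 2))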

  4. Project ARGO: Gas phase formation in simulated microgravity

    NASA Technical Reports Server (NTRS)

    Powell, Michael R.; Waligora, James M.; Norfleet, William T.; Kumar, K. Vasantha

    1993-01-01

    The ARGO study investigated the reduced incidence of joint pain decompression sickness (DCS) encountered in microgravity as compared with an expected incidence of joint pain DCS experienced by test subjects in Earth-based laboratories (unit gravity) with similar protocols. Individuals who are decompressed from saturated conditions usually acquire joint pain DCS in the lower extremities. Our hypothesis is that the incidence of joint pain DCS can be limited by a significant reduction in the tissue gas micronuclei formed by stress-assisted nucleation. Reductions in dynamic and kinetic stresses in vivo are linked to hypokinetic and adynamic conditions of individuals in zero g. We employed the Doppler ultrasound bubble detection technique in simulated microgravity studies to determine quantitatively the degree of gas phase formation in the upper and lower extremities of test subjects during decompression. We found no evidence of right-to-left shunting through pulmonary vasculature. The volume of gas bubbles following decompression was examined and compared with the number following saline contrast injection. From this, we predict a reduced incidence of DCS on orbit, although the incidence of predicted mild DCS still remains larger than that encountered on orbit.

  5. Numerical model for simulating the dynamic response of an inertial confinement fusion cavity gas to a target explosion

    SciTech Connect

    McCarville, T.J.

    1982-01-01

    One of the methods suggested for protecting the first wall of an inertial confinement fusion cavity from the x-rays and ions emitted by an exploding target is to fill the cavity with a buffer gas. A computer code package is developed in this thesis for studying the radiative and hydrodynamic response of the gas to an exploding target.

  6. Scenario Based Education as a Framework for Understanding Students Engagement and Learning in a Project Management Simulation Game

    ERIC Educational Resources Information Center

    Misfeldt, Morten

    2015-01-01

    In this paper I describe how students use a project management simulation game based on an attack-defense mechanism where two teams of players compete by challenging each other's projects. The project management simulation game is intended to be played by pre-service construction workers and engineers. The gameplay has two parts: a planning part,…

  7. A Strategy for Autogeneration of Space Shuttle Ground Processing Simulation Models for Project Makespan Estimations

    NASA Technical Reports Server (NTRS)

    Madden, Michael G.; Wyrick, Roberta; O'Neill, Dale E.

    2005-01-01

    Space Shuttle Processing is a complicated and highly variable project. The planning and scheduling problem, categorized as a Resource Constrained - Stochastic Project Scheduling Problem (RC-SPSP), has a great deal of variability in the Orbiter Processing Facility (OPF) process flow from one flight to the next. Simulation modeling is a useful tool for estimating the makespan of the overall process. However, simulation requires a model to be developed, which is itself a labor- and time-consuming effort. With such a dynamic process, the model would often be out of synchronization with the actual process, limiting the applicability of the simulation answers in solving the actual estimation problem. Integration of TEAMS model enabling software with our existing schedule program software is the basis of our solution. This paper explains the approach used to develop an auto-generated simulation model from planning and schedule efforts and available data.

  8. Advances in POST2 End-to-End Descent and Landing Simulation for the ALHAT Project

    NASA Technical Reports Server (NTRS)

    Davis, Jody L.; Striepe, Scott A.; Maddock, Robert W.; Hines, Glenn D.; Paschall, Stephen, II; Cohanim, Babak E.; Fill, Thomas; Johnson, Michael C.; Bishop, Robert H.; DeMars, Kyle J.; Sostaric, Ronald r.; Johnson, Andrew E.

    2008-01-01

    Program to Optimize Simulated Trajectories II (POST2) is used as a basis for an end-to-end descent and landing trajectory simulation that is essential in determining design and integration capability and system performance of the lunar descent and landing system and environment models for the Autonomous Landing and Hazard Avoidance Technology (ALHAT) project. The POST2 simulation provides a six degree-of-freedom capability necessary to test, design and operate a descent and landing system for successful lunar landing. This paper presents advances in the development and model-implementation of the POST2 simulation, as well as preliminary system performance analysis, used for the testing and evaluation of ALHAT project system models.

  9. Label fusion strategy selection.

    PubMed

    Robitaille, Nicolas; Duchesne, Simon

    2012-01-01

    Label fusion is used in medical image segmentation to combine several different labels of the same entity into a single discrete label that is potentially more accurate, with respect to the exact, sought segmentation, than the best input element. Using simulated data, we compared three existing label fusion techniques, STAPLE, Voting, and Shape-Based Averaging (SBA), and observed that none could be considered superior in all cases; which technique performed best depended on the dissimilarity between the input elements. We thus developed an empirical, hybrid technique called SVS, which selects the most appropriate technique to apply based on this dissimilarity. We evaluated the label fusion strategies on two- and three-dimensional simulated data and showed that SVS is superior to any of the three existing methods examined. On real data, we used SVS to perform fusions of 10 segmentations of the hippocampus and amygdala in 78 subjects from the ICBM dataset. SVS selected SBA in almost all cases, which was the most appropriate method overall. PMID:22518113
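
    A Python sketch of two ingredients named above: Voting (per-voxel majority) fusion, and an agreement measure over the input segmentations of the kind that could drive an SVS-style strategy selection. The synthetic segmentations and the selection threshold are invented; the published SVS rule differs:

      import numpy as np

      def majority_vote(labels):
          """Fuse binary label maps of shape (n_raters, ...) by majority."""
          return (labels.mean(axis=0) >= 0.5).astype(np.uint8)

      def mean_pairwise_dice(labels):
          """Average Dice overlap between all pairs: high = similar inputs."""
          n, scores = labels.shape[0], []
          for i in range(n):
              for j in range(i + 1, n):
                  a, b = labels[i].astype(bool), labels[j].astype(bool)
                  scores.append(2 * (a & b).sum() / (a.sum() + b.sum()))
          return float(np.mean(scores))

      # Ten noisy binary segmentations of a square structure (synthetic).
      rng = np.random.default_rng(2)
      truth = np.zeros((32, 32), np.uint8)
      truth[8:24, 8:24] = 1
      raters = np.array([truth ^ (rng.random(truth.shape) < 0.05)
                         for _ in range(10)], dtype=np.uint8)

      agreement = mean_pairwise_dice(raters)
      # SVS-style idea: choose the strategy from the input dissimilarity
      # (the 0.8 threshold is purely illustrative, not the published rule).
      strategy = "Voting" if agreement > 0.8 else "SBA"
      print(strategy, round(agreement, 3), majority_vote(raters).sum())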

  10. Modeling and Simulation Optimization and Feasibility Studies for the Neutron Detection without Helium-3 Project

    SciTech Connect

    Ely, James H.; Siciliano, Edward R.; Swinhoe, Martyn T.; Lintereur, Azaree T.

    2013-01-01

    This report details the results of the modeling and simulation work accomplished for the ‘Neutron Detection without Helium-3’ project during the 2011 and 2012 fiscal years. The primary focus of the project is to investigate commercially available technologies that might be used in safeguards applications in the relatively near term. Other technologies that are being developed may be more applicable in the future, but are outside the scope of this study.

  11. Moomba Lower Daralingie Beds (LDB) gas storage project: Reservoir management using a novel numerical simulation technique

    SciTech Connect

    Jamal, F.G.

    1994-12-31

    Engineers managing underground gas storage projects are often faced with challenges involving gas migration, inventory variance, gas quality and inventory-pressures. This paper discusses a unique underground gas storage project where sales gas and ethane are stored in two different but communicating regions of the same reservoir. A commercially available reservoir simulator was used to model the fluid flow behavior in this reservoir, hence, providing a tool for better management and use of the existing gas storage facilities.

  12. Final Report for LDRD Project on Rapid Problem Setup for Mesh-Based Simulation (Rapsodi)

    SciTech Connect

    Brown, D L; Henshaw, W; Petersson, N A; Fast, P; Chand, K

    2003-02-07

    Under LLNL Exploratory Research LDRD funding, the Rapsodi project developed rapid setup technology for computational physics and engineering problems that require computational representations of complex geometry. Many simulation projects at LLNL involve the solution of partial differential equations in complex 3-D geometries. A significant bottleneck in carrying out these simulations arises in converting some specification of a geometry, such as a computer-aided design (CAD) drawing to a computationally appropriate 3-D mesh that can be used for simulation and analysis. Even using state-of-the-art mesh generation software, this problem setup step typically has required weeks or months, which is often much longer than required to carry out the computational simulation itself. The Rapsodi project built computational tools and designed algorithms that help to significantly reduce this setup time to less than a day for many realistic problems. The project targeted rapid setup technology for computational physics and engineering problems that use mixed-element unstructured meshes, overset meshes or Cartesian-embedded boundary (EB) meshes to represent complex geometry. It also built tools that aid in constructing computational representations of geometry for problems that do not require a mesh. While completely automatic mesh generation is extremely difficult, the amount of manual labor required can be significantly reduced. By developing novel, automated, component-based mesh construction procedures and automated CAD geometry repair and cleanup tools, Rapsodi has significantly reduced the amount of hand crafting required to generate geometry and meshes for scientific simulation codes.

  13. Monte-Carlo simulation of the kinetics of nuclear and radiative processes upon fast ignition of the fusion target in a `double liner' system

    NASA Astrophysics Data System (ADS)

    Andreev, Aleksandr A.; Gus'kov, Sergei Yu; Zakharov, S. V.; Il'in, Dmitrii V.; Levkovskii, Aleksei A.; Platonov, Konstantin Yu; Rozanov, Vladislav B.; Sherman, Vladimir E.

    2004-05-01

    A laser ignition scheme is considered for a fusion target placed in the cavity of a radiating plasma liner produced in a `double liner' system. It is shown that this scheme can be employed to realise an efficient thermonuclear burst. The precompression and heating of a deuterium-tritium target with an iron shell by a thermal radiation pulse was simulated using the TRITON mathematical code for the parameters of the Z-generator at the Sandia National Laboratories (USA). Laser and target parameters were optimised for the ignition of the deuterium-tritium fuel by protons accelerated by laser radiation. The propagation of the thermonuclear burning wave during the fast ignition was calculated employing the TERA mathematical code, which involves Monte-Carlo simulation of the kinetics of fast thermonuclear particles and hard gamma-ray quanta at each time step of hydrodynamic calculations. The dependence of the fusion energy gain G on the ignition energy is theoretically explained. The laser parameters required to obtain G >> 1 are determined.

  14. Particle-in-cell simulations of the excitation mechanism for fusion-product-driven ion cyclotron emission from tokamaks

    NASA Astrophysics Data System (ADS)

    Dendy, Richard; Cook, James; Chapman, Sandra

    2009-11-01

    Suprathermal ion cyclotron emission (ICE) was the first collective radiative instability, driven by fusion products, observed on JET and TFTR. Strong emission occurs at sequential cyclotron harmonics of the energetic ion population at the outer mid-plane. Its intensity scales linearly with fusion reactivity, including its time evolution during a discharge. The emission mechanism is probably the magnetoacoustic cyclotron instability (MCI), involving resonance between: fast Alfvén waves; cyclotron harmonic waves supported by the energetic particle population and by the background thermal plasma; and a subset of the centrally born fusion products, just inside the trapped-passing boundary, whose drift orbits make large radial excursions. The linear growth rate of the MCI has been intensively studied analytically, and yields good agreement with several key observational features of ICE. To address outstanding issues in the nonlinear ICE regime, we have developed a particle-in-cell code which self-consistently evolves electron and multi-species ion macroparticles and the electromagnetic field. We focus on the growth rate of the MCI, as it evolves from the linear into the nonlinear regime for JET-like parameters.

  15. Retrieval process development and enhancements project Fiscal year 1995: Simulant development technology task progress report

    SciTech Connect

    Golcar, G.R.; Bontha, J.R.; Darab, J.G.

    1997-01-01

    The mission of the Retrieval Process Development and Enhancements (RPD&E) project is to develop an understanding of retrieval processes, including emerging and existing technologies, gather data on these technologies, and relate the data to specific tank problems such that end-users have the requisite technical bases to make retrieval and closure decisions. The development of waste simulants is an integral part of this effort. The work of the RPD&E simulant-development task is described in this document. The key FY95 accomplishments of the RPD&E simulant-development task are summarized below.

  16. Final Report for "Community Petascale Project for Accelerator Science and Simulations".

    SciTech Connect

    Cary, J. R.; Bruhwiler, D. L.; Stoltz, P. H.; Cormier-Michel, E.; Cowan, B.; Schwartz, B. T.; Bell, G.; Paul, K.; Veitzer, S.

    2013-04-19

    This final report describes the work that has been accomplished over the past 5 years under the Community Petascale Project for Accelerator Science and Simulation (ComPASS) at Tech-X Corporation. Tech-X has been involved in the full range of ComPASS activities: simulation of laser plasma accelerator concepts, mainly in collaboration with the LOASIS program at LBNL; simulation of coherent electron cooling in collaboration with BNL; modeling of electron clouds in high intensity accelerators, in collaboration with researchers at Fermilab; and accurate modeling of superconducting RF cavities in collaboration with Fermilab, JLab and the Cockcroft Institute in the UK.

  17. Visual Analysis of Multi-Run Spatio-Temporal Simulations Using Isocontour Similarity for Projected Views.

    PubMed

    Fofonov, Alexey; Molchanov, Vladimir; Linsen, Lars

    2016-08-01

    Multi-run simulations are widely used to investigate how simulated processes evolve depending on varying initial conditions. Frequently, such simulations model the change of spatial phenomena over time. Isocontours have proven to be effective for the visual representation and analysis of 2D and 3D spatial scalar fields. We propose a novel visualization approach for multi-run simulation data based on isocontours. By introducing a distance function for isocontours, we generate a distance matrix used for a multidimensional scaling projection. Multiple simulation runs are represented by polylines in the projected view displaying change over time. We propose a fast calculation of isocontour differences based on a quasi-Monte Carlo approach. For interactive visual analysis, we support filtering and selection mechanisms on the multi-run plot and on linked views to physical space visualizations. Our approach can be effectively used for the visual representation of ensembles, for pattern and outlier detection, for the investigation of the influence of simulation parameters, and for a detailed analysis of the features detected. The proposed method is applicable to data of any spatial dimensionality and any spatial representation (gridded or unstructured). We validate our approach by performing a user study on synthetic data and applying it to different types of multi-run spatio-temporal simulation data. PMID:26561458
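
    A condensed Python sketch of the pipeline described above: a quasi-Monte Carlo estimate of a distance between isocontours (simplified here to the measure of the symmetric difference of superlevel sets), assembled into a distance matrix and projected with multidimensional scaling. The scalar fields are synthetic, and the distance is only a stand-in for the paper's isocontour distance:

      import numpy as np
      from scipy.stats import qmc
      from sklearn.manifold import MDS

      def level_set_distance(f, g, c, points):
          """Quasi-Monte Carlo estimate of the measure of the symmetric
          difference between the superlevel sets {f >= c} and {g >= c}."""
          return np.mean((f(points) >= c) ^ (g(points) >= c))

      def make_field(x0, y0):
          # Synthetic scalar field on [0,1]^2: a Gaussian bump at (x0, y0).
          return lambda p: np.exp(-((p[:, 0] - x0) ** 2 + (p[:, 1] - y0) ** 2) / 0.02)

      runs = [make_field(0.3 + 0.05 * i, 0.5) for i in range(8)]  # "multi-run" set
      points = qmc.Halton(d=2, seed=0).random(4096)  # low-discrepancy samples

      n = len(runs)
      D = np.zeros((n, n))
      for i in range(n):
          for j in range(i + 1, n):
              D[i, j] = D[j, i] = level_set_distance(runs[i], runs[j], 0.5, points)

      # Project the runs into 2D from the precomputed distance matrix.
      coords = MDS(n_components=2, dissimilarity="precomputed",
                   random_state=0).fit_transform(D)
      print(coords.round(3))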

  18. Simulating Limb Formation in the U.S. EPA Virtual Embryo - Risk Assessment Project

    EPA Science Inventory

    The U.S. EPA’s Virtual Embryo project (v-Embryo™) is a computer model simulation of morphogenesis that integrates cell and molecular level data from mechanistic and in vitro assays with knowledge about normal development processes to assess in silico the effects of chemicals on d...

  19. Improving frost-simulation subroutines of the Water Erosion Prediction Project (WEPP) model

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Erosion models play an important role in assessing the influence of human activities on the environment. For cold areas, adequate frost simulation is crucial for predicting surface runoff and water erosion. The Water Erosion Prediction Project (WEPP) model, physically-based erosion-prediction softwa...

  20. The Virtual Liver Project: Simulating Tissue Injury Through Molecular and Cellular Processes

    EPA Science Inventory

    Efficiently and humanely testing the safety of thousands of environmental chemicals is a challenge. The US EPA Virtual Liver Project (v-Liver™) is aimed at simulating the effects of environmental chemicals computationally in order to estimate the risk of toxic outcomes in humans...

  1. The Virtual Liver Project: Modeling Tissue Response To Chemicals Through Multiscale Simulation

    EPA Science Inventory

    The US EPA Virtual Liver Project is aimed at simulating the risk of toxic effects from environmental chemicals in silico. The computational systems model of organ injury due to chronic chemical exposure is based on: (i) the dynamics of perturbed molecular pathways, (ii) their lin...

  2. Simulating the magnetized liner inertial fusion plasma confinement with smaller-scale experiments

    SciTech Connect

    Ryutov, D. D.; Cuneo, M. E.; Herrmann, M. C.; Sinars, D. B.; Slutz, S. A.

    2012-06-20

    The recently proposed magnetized liner inertial fusion approach to Z-pinch driven fusion [Slutz et al., Phys. Plasmas 17, 056303 (2010)] is based on the use of an axial magnetic field to provide plasma thermal insulation from the walls of the imploding liner. The characteristic plasma transport regimes in the proposed approach cover parameter domains that have not been studied yet in either magnetic confinement or inertial confinement experiments. In this article, an analysis is presented of the scalability of the key physical processes that determine the plasma confinement. The dimensionless scaling parameters are identified and the conclusion is drawn that the plasma behavior in scaled-down experiments can correctly represent the full-scale plasma, provided these parameters are approximately the same in the two systems. This observation is important because smaller-scale experiments typically have better diagnostic access and allow more experiments per year.

  3. Fusion in diffusion MRI for improved fibre orientation estimation: An application to the 3T and 7T data of the Human Connectome Project.

    PubMed

    Sotiropoulos, Stamatios N; Hernández-Fernández, Moisés; Vu, An T; Andersson, Jesper L; Moeller, Steen; Yacoub, Essa; Lenglet, Christophe; Ugurbil, Kamil; Behrens, Timothy E J; Jbabdi, Saad

    2016-07-01

    Determining the acquisition parameters in diffusion magnetic resonance imaging (dMRI) is governed by a series of trade-offs. Images of lower resolution have less spatial specificity but higher signal to noise ratio (SNR). At the same time higher angular contrast, important for resolving complex fibre patterns, also yields lower SNR. Considering these trade-offs, the Human Connectome Project (HCP) acquires high quality dMRI data for the same subjects at different field strengths (3T and 7T), which are publicly released. Due to differences in the signal behavior and in the underlying scanner hardware, the HCP 3T and 7T data have complementary features in k- and q-space. The 3T dMRI has higher angular contrast and resolution, while the 7T dMRI has higher spatial resolution. Given the availability of these datasets, we explore the idea of fusing them together with the aim of combining their benefits. We extend a previously proposed data-fusion framework and apply it to integrate both datasets from the same subject into a single joint analysis. We use a generative model for performing parametric spherical deconvolution and estimate fibre orientations by simultaneously using data acquired under different protocols. We illustrate unique features from each dataset and how they are retained after fusion. We further show that this allows us to complement benefits and improve brain connectivity analysis compared to analyzing each of the datasets individually. PMID:27071694

  4. Regional climate change projections over southern Africa: Benefits of a high resolution climate change simulation

    NASA Astrophysics Data System (ADS)

    Haensler, A.; Hagemann, S.; Jacob, D.

    2009-12-01

    The southern African region is known to be a biodiversity hotspot, but future climate change is likely to have a major influence on the biodiversity. To estimate the impacts of climate change on the biosphere, high resolution climate information is needed for both current and future conditions. In the framework of the BIOTA South project we are therefore applying the regional climate model (RCM) REMO of the Max-Planck-Institute for Meteorology (MPI-M) over the southern African region. The model is integrated for a transient climate change simulation for the time period 1960 to 2100 at 1/2 degree and 1/6 degree horizontal resolution using a double-nesting approach. The 1/6 degree simulation is the first long-term climate projection for southern Africa on such a high horizontal resolution. The boundary forcing for the 1/2 degree projection is taken from a global ECHAM5/MPIOM IPCC A1B scenario simulation. In the current study we will analyse projected changes in the hydrological cycle, thereby focusing on the Orange river catchment and on the main BIOTA research transect, which spans from the north-east corner of Namibia to the Cape region in the south. In order to quantify the impact of model resolution on the projected changes we will intercompare the two REMO simulations and the ECHAM5/MPIOM forcing data. A comparison of the high resolution REMO validation simulation with its ERA40 forcing data already revealed added value in the representation of the seasonal rainfall characteristics for the region. The benefits of using high resolution RCM data for climate change studies will be highlighted and uncertainties introduced by the application of an RCM will be discussed.

  5. Comparisons of Simulated Hydrodynamics and Water Quality for Projected Demands in 2046, Pueblo Reservoir, Southeastern Colorado

    USGS Publications Warehouse

    Ortiz, Roderick F.; Galloway, Joel M.; Miller, Lisa D.; Mau, David P.

    2008-01-01

    Pueblo Reservoir is one of southeastern Colorado's most valuable water resources. The reservoir provides irrigation, municipal, and industrial water to various entities throughout the region. The reservoir also provides flood control, recreational activities, sport fishing, and wildlife enhancement to the region. The Bureau of Reclamation is working to meet its goal to issue a Final Environmental Impact Statement (EIS) on the Southern Delivery System project (SDS). SDS is a regional water-delivery project that has been proposed to provide a safe, reliable, and sustainable water supply through the foreseeable future (2046) for Colorado Springs, Fountain, Security, and Pueblo West. Discussions with the Bureau of Reclamation and the U.S. Geological Survey led to a cooperative agreement to simulate the hydrodynamics and water quality of Pueblo Reservoir. This work has been completed and described in a previously published report, U.S. Geological Survey Scientific Investigations Report 2008-5056. Additionally, there was a need to make comparisons of simulated hydrodynamics and water quality for projected demands associated with the various EIS alternatives and plans by Pueblo West to discharge treated water into the reservoir. Plans by Pueblo West are fully independent of the SDS project. This report compares simulated hydrodynamics and water quality for projected demands in Pueblo Reservoir resulting from changes in inflow and water quality entering the reservoir, and from changes to withdrawals from the reservoir as projected for the year 2046. Four of the seven EIS alternatives were selected for scenario simulations. The four U.S. Geological Survey simulation scenarios were the No Action scenario (EIS Alternative 1), the Downstream Diversion scenario (EIS Alternative 2), the Upstream Return-Flow scenario (EIS Alternative 4), and the Upstream Diversion scenario (EIS Alternative 7). Additionally, the results of an Existing Conditions scenario (water years 2000 through

  6. Toward the credibility of Northeast United States summer precipitation projections in CMIP5 and NARCCAP simulations

    NASA Astrophysics Data System (ADS)

    Thibeault, Jeanne M.; Seth, A.

    2015-10-01

    Precipitation projections for the northeast United States and nearby Canada (Northeast) are examined for 15 Fifth Phase of the Coupled Model Intercomparison Project (CMIP5) models. A process-based evaluation of atmospheric circulation features associated with wet Northeast summers is performed to examine whether credibility can be differentiated within the multimodel ensemble. Based on these evaluations, and an analysis of the interannual statistical properties of area-averaged precipitation, model subsets were formed. Multimodel precipitation projections from each subset were compared to the multimodel projection from all of the models. Higher-resolution North American Regional Climate Change Assessment Program (NARCCAP) regional climate models (RCMs) were subjected to a similar evaluation, grouping into subsets, and examination of future projections. CMIP5 models adequately simulate most large-scale circulation features associated with wet Northeast summers, though all have errors in simulating observed sea level pressure and moisture divergence anomalies in the western tropical Atlantic/Gulf of Mexico. Relevant large-scale processes simulated by the RCMs resemble those of their driving global climate models (GCMs), which are not always realistic. Future RCM studies could benefit from a process analysis of potential driving GCMs prior to dynamical downscaling. No CMIP5 or NARCCAP models were identified as clearly more credible, but six GCMs and four RCMs performed consistently better. Among the "Better" models, there is no consistency in the direction of future summer precipitation change. CMIP5 projections suggest that the Northeast precipitation response depends on the dynamics of the North Atlantic anticyclone and associated circulation and moisture convergence patterns, which vary among "Better" models. Even when model credibility cannot be clearly differentiated, examination of simulated processes provides important insights into their evolution under

  7. Improved Arctic sea ice thickness projections using bias corrected CMIP5 simulations

    NASA Astrophysics Data System (ADS)

    Melia, N.; Haines, K.; Hawkins, E.

    2015-07-01

    Projections of Arctic sea ice thickness (SIT) have the potential to inform stakeholders about accessibility to the region, but are currently rather uncertain. The latest suite of CMIP5 Global Climate Models (GCMs) produce a wide range of simulated SIT in the historical period (1979-2014) and exhibit various spatial and temporal biases when compared with the Pan-Arctic Ice Ocean Modelling and Assimilation System (PIOMAS) sea ice reanalysis. We present a new method to constrain such GCM simulations of SIT to narrow projection uncertainty via a statistical bias correction technique. The bias correction successfully constrains the spatial SIT distribution and temporal variability in the CMIP5 projections whilst retaining the climatic fluctuations from individual ensemble members. The bias correction acts to reduce the uncertainty in projections of SIT and reveals the significant contributions of sea ice internal variability in the first half of the century and of scenario uncertainty from mid-century onwards. The projected date of ice-free conditions in the Arctic under the RCP8.5 high emission scenario occurs in the 2050s, which is a decade earlier than without the bias correction, with potentially significant implications for stakeholders in the Arctic such as the shipping industry. The bias correction methodology developed could be similarly applied to other variables to narrow uncertainty in climate projections more generally.
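
    One simple form such a statistical bias correction can take is a mean-and-variance adjustment of each simulated series against the reanalysis over the common historical window, applied unchanged to the future period. The sketch below uses invented SIT series and is not necessarily the paper's exact procedure:

      import numpy as np

      def bias_correct(sim_hist, sim_future, ref_hist):
          """Mean-and-variance adjustment: rescale a simulated series so its
          historical mean and variability match a reference (e.g. PIOMAS),
          then apply the same mapping to the future period."""
          mu_s, sd_s = sim_hist.mean(), sim_hist.std()
          mu_r, sd_r = ref_hist.mean(), ref_hist.std()
          return mu_r + (sim_future - mu_s) * (sd_r / sd_s)

      # Invented annual-mean sea ice thickness series (metres):
      # reference 1979-2014, one GCM member 1979-2014 and 2015-2100.
      rng = np.random.default_rng(3)
      years_h, years_f = np.arange(36), np.arange(86)
      ref_hist = 2.0 - 0.02 * years_h + rng.normal(0, 0.10, 36)
      sim_hist = 3.0 - 0.02 * years_h + rng.normal(0, 0.25, 36)   # thick bias
      sim_future = 2.3 - 0.02 * years_f + rng.normal(0, 0.25, 86)

      corrected = bias_correct(sim_hist, sim_future, ref_hist)
      print("raw future mean (m):", round(sim_future.mean(), 2))
      print("bias-corrected mean (m):", round(corrected.mean(), 2))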

  8. Verification of gyrokinetic particle simulation of current-driven instability in fusion plasmas. I. Internal kink mode

    NASA Astrophysics Data System (ADS)

    McClenaghan, J.; Lin, Z.; Holod, I.; Deng, W.; Wang, Z.

    2014-12-01

    The gyrokinetic toroidal code (GTC) capability has been extended for simulating internal kink instability with kinetic effects in toroidal geometry. The global simulation domain covers the magnetic axis, which is necessary for simulating current-driven instabilities. GTC simulation in the fluid limit of the kink modes in cylindrical geometry is verified by benchmarking with a magnetohydrodynamic eigenvalue code. Gyrokinetic simulations of the kink modes in the toroidal geometry find that ion kinetic effects significantly reduce the growth rate even when the banana orbit width is much smaller than the radial width of the perturbed current layer at the mode rational surface.

  9. Verification of gyrokinetic particle simulation of current-driven instability in fusion plasmas. I. Internal kink mode

    SciTech Connect

    McClenaghan, J.; Lin, Z.; Holod, I.; Deng, W.; Wang, Z.

    2014-12-15

    The gyrokinetic toroidal code (GTC) capability has been extended for simulating internal kink instability with kinetic effects in toroidal geometry. The global simulation domain covers the magnetic axis, which is necessary for simulating current-driven instabilities. GTC simulation in the fluid limit of the kink modes in cylindrical geometry is verified by benchmarking with a magnetohydrodynamic eigenvalue code. Gyrokinetic simulations of the kink modes in the toroidal geometry find that ion kinetic effects significantly reduce the growth rate even when the banana orbit width is much smaller than the radial width of the perturbed current layer at the mode rational surface.

  10. Simulations of the fusion of necklace-ring pattern in the complex Ginzburg-Landau equation by lattice Boltzmann method

    NASA Astrophysics Data System (ADS)

    Zhang, Jianying; Yan, Guangwu

    2016-04-01

    A lattice Boltzmann model for solving the (2+1) dimensional cubic-quintic complex Ginzburg-Landau equation (CQCGLE) is proposed. Different from the classic lattice Boltzmann models, this lattice Boltzmann model is based on uniformly distributed lattice points in a two-dimensional space, and the evolution of the model is about a spatial axis rather than time. The algorithm provides advantages similar to the lattice Boltzmann method in that it is easily adapted to complex Ginzburg-Landau equations. Numerical results reproduce the phenomena of the fusion of necklace-ring pattern and the effect of non-linearity on the soliton in the CQCGLE.

  11. Sensor fusion for synthetic vision

    NASA Technical Reports Server (NTRS)

    Pavel, M.; Larimer, J.; Ahumada, A.

    1991-01-01

    Display methodologies are explored for fusing images gathered by millimeter wave sensors with images rendered from an on-board terrain data base to facilitate visually guided flight and ground operations in low visibility conditions. An approach to fusion based on multiresolution image representation and processing is described which facilitates fusion of images differing in resolution within and between images. To investigate possible fusion methods, a workstation-based simulation environment is being developed.

  12. Measurements and analyses of decay radioactivity induced in simulated deuterium-tritium neutron environments for fusion reactor structural materials

    SciTech Connect

    Ikeda, Y.; Konno, C.; Kosako, K.; Oyama, Y.; Maekawa, F.; Maekawa, H.; Kumar, A.; Youssef, M.Z.; Abdou, M.A.

    1995-08-01

    To meet urgent requirements for data validation, an experimental analysis has been carried out for isotopic radioactivity induced by deuterium-tritium neutron irradiation in structural materials. The primary objective is to examine the adequacy of the activation cross sections implemented in the current activation calculation codes considered for use in fusion reactor nuclear design. Four activation cross-section libraries, namely JENDL, LIB90, REAC*63, and REAC*175, were investigated in the current analysis. The isotopic induced radioactivity calculations using these four libraries are compared with experimental values obtained in the Japan Atomic Energy Research Institute/U.S. Department of Energy collaborative program on fusion blanket neutronics. The ten materials studied are aluminum, silicon, titanium, vanadium, chromium, MnCu alloy, iron, nickel, niobium, and Type 316 stainless steel. The adequacy of the cross sections is investigated through the calculation-to-experiment analysis. As a result, most of the discrepancies in the calculations from experiments can be explained by inadequate activation cross sections. In addition, uncertainties due to neutron energy groups and neutron transport calculation are considered. The JENDL library gives the best agreement with experiments, followed by REAC*175, LIB90, and REAC*63, in this order. 45 refs., 32 figs., 5 tabs.

  13. A Hybrid Parachute Simulation Environment for the Orion Parachute Development Project

    NASA Technical Reports Server (NTRS)

    Moore, James W.

    2011-01-01

    A parachute simulation environment (PSE) has been developed that aims to take advantage of legacy parachute simulation codes and modern object-oriented programming techniques. This hybrid simulation environment provides the parachute analyst with a natural and intuitive way to construct simulation tasks while preserving the pedigree and authority of established parachute simulations. NASA currently employs four simulation tools for developing and analyzing air-drop tests performed by the CEV Parachute Assembly System (CPAS) Project. These tools were developed at different times, in different languages, and with different capabilities in mind. As a result, each tool has a distinct interface and set of inputs and outputs. However, regardless of the simulation code that is most appropriate for the type of test, engineers typically perform similar tasks for each drop test such as prediction of loads, assessment of altitude, and sequencing of disreefs or cut-aways. An object-oriented approach to simulation configuration allows the analyst to choose models of real physical test articles (parachutes, vehicles, etc.) and sequence them to achieve the desired test conditions. Once configured, these objects are translated into traditional input lists and processed by the legacy simulation codes. This approach minimizes the number of simulation inputs that the engineer must track while configuring an input file. An object-oriented approach to simulation output allows a common set of post-processing functions to perform routine tasks such as plotting and timeline generation with minimal sensitivity to the simulation that generated the data. Flight test data may also be translated into the common output class to simplify test reconstruction and analysis.
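
    The object-to-legacy-input translation described above can be sketched in Python as follows; the class layout and keyword names are hypothetical, not those of the CPAS tools:

      from dataclasses import dataclass, field

      @dataclass
      class Parachute:
          name: str
          drag_area: float                                 # ft^2
          reef_stages: list = field(default_factory=list)  # (time s, fraction)

      @dataclass
      class TestCase:
          vehicle_mass: float                              # slugs
          release_altitude: float                          # ft
          chutes: list = field(default_factory=list)

          def to_legacy_input(self):
              """Flatten the object graph into the keyword/value input list
              a legacy simulation expects (all keywords invented)."""
              lines = [f"MASS = {self.vehicle_mass}",
                       f"ALT0 = {self.release_altitude}"]
              for i, chute in enumerate(self.chutes, start=1):
                  lines.append(f"CHUTE{i}_CDS = {chute.drag_area}")
                  for j, (t, frac) in enumerate(chute.reef_stages, start=1):
                      lines.append(f"CHUTE{i}_REEF{j} = {t}, {frac}")
              return "\n".join(lines)

      case = TestCase(vehicle_mass=280.0, release_altitude=25000.0,
                      chutes=[Parachute("main", 5000.0, [(0.0, 0.1), (4.0, 1.0)])])
      print(case.to_legacy_input())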

  14. Post-processing of multi-hydrologic model simulations for improved streamflow projections

    NASA Astrophysics Data System (ADS)

    Khajehei, Sepideh; Ahmadalipour, Ali; Moradkhani, Hamid

    2016-04-01

    Hydrologic model outputs are prone to bias and uncertainty due to knowledge deficiencies in the model and data. Uncertainty in hydroclimatic projections arises from uncertainty in the hydrologic model as well as the epistemic or aleatory uncertainties in GCM parameterization and development. This study is conducted to: 1) evaluate the recently developed multi-variate post-processing method for historical simulations and 2) assess the effect of post-processing on uncertainty and reliability of future streamflow projections in both high-flow and low-flow conditions. The first objective is performed for the historical period of 1970-1999. Future streamflow projections are generated for 10 statistically downscaled GCMs from two widely used downscaling methods, Bias Corrected Statistically Downscaled (BCSD) and Multivariate Adaptive Constructed Analogs (MACA), over the period of 2010-2099 for two representative concentration pathways, RCP4.5 and RCP8.5. Three semi-distributed hydrologic models were employed and calibrated at 1/16 degree latitude-longitude resolution for over 100 points across the Columbia River Basin (CRB) in the Pacific Northwest, USA. Streamflow outputs are post-processed through a Bayesian framework based on copula functions. The post-processing approach relies on a transfer function developed from the bivariate joint distribution between observations and simulations in the historical period. Results show that application of the post-processing technique leads to considerably higher accuracy in historical simulations and also reduces model uncertainty in future streamflow projections.
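
    A minimal Python sketch of the transfer-function idea: fit a Gaussian copula to paired historical simulations and observations, then map a new simulated flow to the conditional median in observation space. The full framework in the study is Bayesian and richer; the data here are synthetic:

      import numpy as np
      from scipy.stats import norm, rankdata

      def copula_rho(sim_hist, obs_hist):
          """Correlation of the normal scores of paired sim/obs series
          (the dependence parameter of a Gaussian copula)."""
          u = rankdata(sim_hist) / (len(sim_hist) + 1)
          v = rankdata(obs_hist) / (len(obs_hist) + 1)
          return np.corrcoef(norm.ppf(u), norm.ppf(v))[0, 1]

      def transfer(sim_new, sim_hist, obs_hist, rho):
          """Map new simulated flows to the conditional median of the
          observations under the fitted Gaussian copula."""
          grid = np.sort(sim_hist)
          ranks = np.arange(1, len(sim_hist) + 1) / (len(sim_hist) + 1)
          u = np.interp(sim_new, grid, ranks)
          z = rho * norm.ppf(u)                     # conditional median, z-space
          return np.quantile(obs_hist, norm.cdf(z)) # back to flow units

      rng = np.random.default_rng(4)
      obs = rng.gamma(3.0, 50.0, 300)               # synthetic daily flows
      sim = 0.8 * obs + rng.normal(0.0, 20.0, 300)  # biased, noisy model output

      rho = copula_rho(sim, obs)
      print("copula rho:", round(rho, 2))
      print("post-processed:", transfer(np.array([120.0, 300.0]), sim, obs, rho))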

  15. The SIMRAND methodology: Theory and application for the simulation of research and development projects

    NASA Technical Reports Server (NTRS)

    Miles, R. F., Jr.

    1986-01-01

    A research and development (R&D) project often involves a number of decisions that must be made concerning which subset of systems or tasks is to be undertaken to achieve the goal of the R&D project. To help in this decision making, SIMRAND (SIMulation of Research ANd Development Projects) is a methodology for the selection of the optimal subset of systems or tasks to be undertaken on an R&D project. Using alternative networks, the SIMRAND methodology models the alternative subsets of systems or tasks under consideration. Each path through an alternative network represents one way of satisfying the project goals. Equations are developed that relate the system or task variables to the measure of preference. Uncertainty is incorporated by treating the variables of the equations probabilistically as random variables, with cumulative distribution functions assessed by technical experts. Analytical techniques of probability theory are used to reduce the complexity of the alternative networks. Cardinal utility functions over the measure of preference are assessed for the decision makers. A run of the SIMRAND I computer program combines, in a Monte Carlo simulation model, the network structure, the equations, the cumulative distribution functions, and the utility functions.
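
    The SIMRAND loop, sampling expert-assessed distributions for the tasks on each alternative path and ranking the alternatives by expected utility, can be illustrated as below; the networks, distributions, and utility function are all invented for the sketch:

      import numpy as np

      rng = np.random.default_rng(5)

      # Alternative networks: each path is one way of meeting the project
      # goal; tasks carry expert-assessed triangular distributions
      # (low, mode, high) in months. All values are placeholders.
      paths = {
          "A: develop both subsystems": [(3, 6, 12), (4, 5, 9)],
          "B: reuse subsystem 1":       [(1, 2, 4), (4, 5, 9)],
          "C: reuse both subsystems":   [(1, 2, 4), (2, 3, 8)],
      }

      def utility(months):
          # Risk-averse cardinal utility over total duration (illustrative).
          return -np.expm1(0.15 * months)

      def expected_utility(tasks, n=100_000):
          total = sum(rng.triangular(lo, mode, hi, n) for lo, mode, hi in tasks)
          return utility(total).mean()

      scores = {name: expected_utility(tasks) for name, tasks in paths.items()}
      for name, s in scores.items():
          print(f"{name}: {s:.3f}")
      print("preferred alternative:", max(scores, key=scores.get))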

  16. Improved Arctic sea ice thickness projections using bias-corrected CMIP5 simulations

    NASA Astrophysics Data System (ADS)

    Melia, N.; Haines, K.; Hawkins, E.

    2015-12-01

    Projections of Arctic sea ice thickness (SIT) have the potential to inform stakeholders about accessibility to the region, but are currently rather uncertain. The latest suite of CMIP5 global climate models (GCMs) produce a wide range of simulated SIT in the historical period (1979-2014) and exhibit various biases when compared with the Pan-Arctic Ice-Ocean Modelling and Assimilation System (PIOMAS) sea ice reanalysis. We present a new method to constrain such GCM simulations of SIT via a statistical bias correction technique. The bias correction successfully constrains the spatial SIT distribution and temporal variability in the CMIP5 projections whilst retaining the climatic fluctuations from individual ensemble members. The bias correction acts to reduce the spread in projections of SIT and reveals the significant contributions of climate internal variability in the first half of the century and of scenario uncertainty from the mid-century onwards. The projected date of ice-free conditions in the Arctic under the RCP8.5 high emission scenario occurs in the 2050s, which is a decade earlier than without the bias correction, with potentially significant implications for stakeholders in the Arctic such as the shipping industry. The bias correction methodology developed could be similarly applied to other variables to reduce spread in climate projections more generally.

  17. Beam dynamics simulations and measurements at the Project X Test Facility

    SciTech Connect

    Gianfelice-Wendt, E.; Scarpine, V.E.; Webber, R.C.

    2011-03-01

    Project X, under study at Fermilab, is a multitask high-power superconducting RF proton beam facility, aiming to provide high intensity protons for rare-process experiments and nuclear physics at low energy, and simultaneously for the production of neutrinos, as well as muon beams in the long term. A beam test facility - formerly known as the High Intensity Neutrino Source (HINS) - is under commissioning for testing critical components of the project, e.g. dynamics and diagnostics at low beam energies, broadband beam chopping, RF power generation and distribution. In this paper we describe the layout of the test facility and present beam dynamics simulations and measurements.

  18. Overview of High-Fidelity Modeling Activities in the Numerical Propulsion System Simulations (NPSS) Project

    NASA Technical Reports Server (NTRS)

    Veres, Joseph P.

    2002-01-01

    A high-fidelity simulation of a commercial turbofan engine has been created as part of the Numerical Propulsion System Simulation Project. The high-fidelity computer simulation utilizes computer models that were developed at NASA Glenn Research Center in cooperation with turbofan engine manufacturers. The average-passage (APNASA) Navier-Stokes based viscous flow computer code is used to simulate the 3D flow in the compressors and turbines of the advanced commercial turbofan engine. The 3D National Combustion Code (NCC) is used to simulate the flow and chemistry in the advanced aircraft combustor. The APNASA turbomachinery code and the NCC combustor code exchange boundary conditions at the interface planes at the combustor inlet and exit. This computer simulation technique can evaluate engine performance at steady operating conditions. The 3D flow models provide detailed knowledge of the airflow within the fan and compressor, the high and low pressure turbines, and the flow and chemistry within the combustor. The models simulate the performance of the engine at operating conditions that include sea level takeoff and the altitude cruise condition.
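
    The interface-plane exchange between the turbomachinery and combustor codes amounts to a fixed-point coupling loop. A toy Python version, with invented component models and a relaxed shared boundary condition, shows the pattern (it is not the NPSS implementation):

      # Toy fixed-point coupling between a turbomachinery solver and a
      # combustor solver exchanging interface-plane states, in the spirit
      # of the APNASA/NCC exchange described above. Component models and
      # numbers are invented placeholders for the 3D codes.
      def compressor(inlet, speed):
          return {"p": inlet["p"] * 18.0, "T": inlet["T"] * 2.4,
                  "mdot": 0.9 * speed}

      def combustor(inlet, fuel_flow):
          return {"p": inlet["p"] * 0.95,
                  "T": inlet["T"] + 1200.0 * fuel_flow,
                  "mdot": inlet["mdot"]}

      state = {"p": 101325.0, "T": 288.0, "mdot": 100.0}
      for iteration in range(100):
          comb_inlet = compressor(state, speed=110.0)
          comb_exit = combustor(comb_inlet, fuel_flow=1.0)
          # Under-relax the shared mass flow and test for convergence.
          new_mdot = 0.5 * state["mdot"] + 0.5 * comb_exit["mdot"]
          if abs(new_mdot - state["mdot"]) < 1e-8:
              break
          state["mdot"] = new_mdot
      print(iteration, "iterations; coupled mass flow:", round(state["mdot"], 3))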

  19. Integrated Vehicle Health Management Project-Modeling and Simulation for Wireless Sensor Applications

    NASA Technical Reports Server (NTRS)

    Wallett, Thomas M.; Mueller, Carl H.; Griner, James H., Jr.

    2009-01-01

    This paper describes the efforts in modeling and simulating electromagnetic transmission and reception as in a wireless sensor network through a realistic wing model for the Integrated Vehicle Health Management project at the Glenn Research Center. A computer model in a standard format for an S-3 Viking aircraft was obtained, converted to a Microwave Studio software format, and scaled to proper dimensions in Microwave Studio. The left wing portion of the model was used with two antenna models, one transmitting and one receiving, to simulate radio frequency transmission through the wing. Transmission and reception results were inconclusive.

  20. Hardware Accelerated Simulated Radiography

    SciTech Connect

    Laney, D; Callahan, S; Max, N; Silva, C; Langer, S; Frank, R

    2005-04-12

    We present the application of hardware-accelerated volume rendering algorithms to the simulation of radiographs as an aid to scientists designing experiments, validating simulation codes, and understanding experimental data. The techniques presented take advantage of 32-bit floating-point texture capabilities to obtain validated solutions to the radiative transport equation for X-rays. An unsorted hexahedron projection algorithm is presented for curvilinear hexahedra that produces simulated radiographs in the absorption-only regime. A sorted tetrahedral projection algorithm is presented that simulates radiographs of emissive materials. We apply the tetrahedral projection algorithm to the simulation of experimental diagnostics for inertial confinement fusion experiments on a laser at the University of Rochester. We show that the hardware-accelerated solution is faster than the current technique used by scientists.
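
    In the absorption-only regime the radiograph reduces to a Beer-Lambert line integral of the attenuation coefficient along each ray. A CPU stand-in for the hardware-accelerated projection, on a regular grid with a parallel beam (object and parameters invented):

      import numpy as np

      def absorption_radiograph(mu, dl, i0=1.0):
          """Parallel-beam, absorption-only radiograph of a voxel grid:
          I = I0 * exp(-integral of mu along each ray), integrating along
          axis 0 of the grid. A CPU stand-in for the GPU projection."""
          return i0 * np.exp(-mu.sum(axis=0) * dl)

      # Invented object: a dense sphere inside a weakly attenuating slab.
      n = 128
      z, y, x = np.mgrid[0:n, 0:n, 0:n]
      mu = np.full((n, n, n), 1e-3)        # background attenuation per voxel
      mu[(x - 64) ** 2 + (y - 64) ** 2 + (z - 64) ** 2 < 20 ** 2] = 5e-2

      image = absorption_radiograph(mu, dl=1.0)
      print("transmission range:", image.min().round(3), "-", image.max().round(3))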

  1. Fusion breeder

    SciTech Connect

    Moir, R.W.

    1982-04-20

    The fusion breeder is a fusion reactor designed with special blankets to maximize the transmutation by 14 MeV neutrons of uranium-238 to plutonium or thorium to uranium-233 for use as a fuel for fission reactors. Breeding fissile fuels has not been a goal of the US fusion energy program. This paper suggests it is time for a policy change to make the fusion breeder a goal of the US fusion program and the US nuclear energy program. The purpose of this paper is to suggest this policy change be made and tell why it should be made, and to outline specific research and development goals so that the fusion breeder will be developed in time to meet fissile fuel needs.

  2. The Importance of Simulation Workflow and Data Management in the Accelerated Climate Modeling for Energy Project

    NASA Astrophysics Data System (ADS)

    Bader, D. C.

    2015-12-01

    The Accelerated Climate Modeling for Energy (ACME) Project is concluding its first year. Supported by the Office of Science in the U.S. Department of Energy (DOE), its vision is to be "an ongoing, state-of-the-science Earth system modeling, simulation and prediction project that optimizes the use of DOE laboratory resources to meet the science needs of the nation and the mission needs of DOE." Included in the "laboratory resources" is a large investment in computational, network and information technologies that will be utilized both to build better and more accurate climate models and to broadly disseminate the data they generate. Current model diagnostic analysis and data dissemination technologies will not scale to the size of the simulations and the complexity of the models envisioned by ACME and other top-tier international modeling centers. In this talk, the ACME Workflow component's plans to meet these future needs will be described and early implementation examples will be highlighted.

  3. Effects of non-local electron transport in one-dimensional and two-dimensional simulations of shock-ignited inertial confinement fusion targets

    NASA Astrophysics Data System (ADS)

    Marocchino, A.; Atzeni, S.; Schiavi, A.

    2014-01-01

    In some regions of a laser driven inertial fusion target, the electron mean-free path can become comparable to or even longer than the electron temperature gradient scale-length. This can be particularly important in shock-ignited (SI) targets, where the laser-spike heated corona reaches temperatures of several keV. In this case, thermal conduction cannot be described by a simple local conductivity model and a Fick's law. Fluid codes usually employ flux-limited conduction models, which preserve causality, but lose important features of the thermal flow. A more accurate thermal flow modeling requires convolution-like non-local operators. In order to improve the simulation of SI targets, the non-local electron transport operator proposed by Schurtz-Nicolaï-Busquet [G. P. Schurtz et al., Phys. Plasmas 7, 4238 (2000)] has been implemented in the DUED fluid code. Both one-dimensional (1D) and two-dimensional (2D) simulations of SI targets have been performed. 1D simulations of the ablation phase highlight that while the shock profile and timing might be mocked up with a flux limiter, the electron temperature profiles exhibit relatively different behavior, with no major effects on the final gain. The spike, instead, can only roughly be reproduced with a fixed flux-limiter value. 1D target gain is however unaffected, provided some minor tuning of the laser pulses. 2D simulations show that the use of a non-local thermal conduction model does not affect the robustness to mispositioning of targets driven by quasi-uniform laser irradiation. 2D simulations performed with only two final polar intense spikes yield encouraging results and support further studies.
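
    For contrast with the non-local operator, the flux-limited local model that fluid codes fall back on can be sketched in a few lines: the Spitzer-Harm flux is blended with a free-streaming bound through a harmonic-mean limiter. Normalized units and the limiter value f below are illustrative assumptions, not the settings used in DUED.

    ```python
    import numpy as np

    # Flux-limited local conduction (the model the non-local SNB operator
    # replaces). Normalized units; kappa0 and f are illustrative choices.

    def limited_heat_flux(T, x, kappa0=1.0, f=0.06):
        q_sh = -kappa0 * T**2.5 * np.gradient(T, x)  # Spitzer-Harm local flux
        q_fs = f * T**1.5                            # free-streaming bound ~ n_e*T*v_th (n_e = 1)
        return q_sh / (1.0 + np.abs(q_sh) / q_fs)    # harmonic-mean flux limiter
    ```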

  4. Effects of non-local electron transport in one-dimensional and two-dimensional simulations of shock-ignited inertial confinement fusion targets

    SciTech Connect

    Marocchino, A.; Atzeni, S.; Schiavi, A.

    2014-01-15

    In some regions of a laser driven inertial fusion target, the electron mean-free path can become comparable to or even longer than the electron temperature gradient scale-length. This can be particularly important in shock-ignited (SI) targets, where the laser-spike heated corona reaches temperatures of several keV. In this case, thermal conduction cannot be described by a simple local conductivity model and a Fick's law. Fluid codes usually employ flux-limited conduction models, which preserve causality, but lose important features of the thermal flow. A more accurate thermal flow modeling requires convolution-like non-local operators. In order to improve the simulation of SI targets, the non-local electron transport operator proposed by Schurtz-Nicolaï-Busquet [G. P. Schurtz et al., Phys. Plasmas 7, 4238 (2000)] has been implemented in the DUED fluid code. Both one-dimensional (1D) and two-dimensional (2D) simulations of SI targets have been performed. 1D simulations of the ablation phase highlight that while the shock profile and timing might be mocked up with a flux limiter, the electron temperature profiles exhibit relatively different behavior, with no major effects on the final gain. The spike, instead, can only roughly be reproduced with a fixed flux-limiter value. 1D target gain is however unaffected, provided some minor tuning of the laser pulses. 2D simulations show that the use of a non-local thermal conduction model does not affect the robustness to mispositioning of targets driven by quasi-uniform laser irradiation. 2D simulations performed with only two final polar intense spikes yield encouraging results and support further studies.

  5. Project Report on DOE Young Investigator Grant (Contract No. DE-FG02-02ER25525) Dynamic Scheduling and Fusion of Irregular Computation (August 15, 2002 to August 14, 2005)

    SciTech Connect

    Ding, Chen

    2005-08-16

    Computer simulation has become increasingly important in many scientific disciplines, but its performance and scalability are severely limited by the memory throughput on today's computer systems. With the support of this grant, we first designed training-based prediction, which accurately predicts the memory performance of large applications before their execution. Then we developed optimization techniques using dynamic computation fusion and large-scale data transformation. The research work has three major components. The first is modeling and prediction of cache behavior. We have developed a new technique, which uses reuse distance information from training inputs and then extracts a parameterized model of the program's cache miss rates for any input size and for any size of fully associative cache. Using the model we have built a web-based tool using three-dimensional visualization. The new model can help to build cost-effective computer systems, design better benchmark suites, and improve task scheduling on heterogeneous systems. The second component is global computation for improving cache performance. We have developed an algorithm for dynamic data partitioning using sampling theory and probability distribution. Recent work from a number of groups shows that manual or semi-manual computation fusion has significant benefits in physical, mechanical, and biological simulations as well as information retrieval and machine verification. We have developed an automatic tool that measures the potential of computation fusion. The new system can be used by high-performance application programmers to estimate the potential of locality improvement for a program before trying complex transformations for a specific cache system. The last component studies models of spatial locality and the problem of data layout. In scientific programs, most data are stored in arrays. Grand challenge problems such as hydrodynamics simulation and data mining may use
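
    The reuse-distance model in the first component admits a compact illustration: a fully associative LRU cache of C blocks misses on a reference exactly when the number of distinct blocks touched since the previous access to the same block is at least C. A minimal sketch (an O(n^2) list-based version; the project's tooling is far more scalable):

    ```python
    # Reuse distance = number of distinct addresses since the last access to
    # the same address; a fully associative LRU cache of C blocks misses
    # exactly when that distance is >= C (cold misses count as infinite).

    def reuse_distances(trace):
        stack, dists = [], []               # LRU stack, oldest first
        for addr in trace:
            if addr in stack:
                i = stack.index(addr)
                dists.append(len(stack) - 1 - i)
                stack.pop(i)
            else:
                dists.append(float("inf"))  # first touch: cold miss
            stack.append(addr)
        return dists

    def miss_rate(trace, cache_blocks):
        d = reuse_distances(trace)
        return sum(x >= cache_blocks for x in d) / len(d)

    print(miss_rate(["a", "b", "a", "c", "b", "a"], cache_blocks=2))  # ~0.83
    ```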

  6. Interactive visualization system to analyze corrugated millimeter-waveguide component of ECH in nuclear fusion with FDTD simulation

    NASA Astrophysics Data System (ADS)

    Kashima, N.; Nakamura, H.; Tamura, Y.; Ito, A. M.; Kubo, S.

    2014-03-01

    We have simulated the distribution of electromagnetic waves through a system composed of miter bends by the Finite-Difference Time-Domain (FDTD) method. We have developed an interactive visualization system, built on a new interactive GUI combining a virtual reality system and an Android tablet, to analyze the FDTD simulation. The effect of grooves in the waveguide system has been investigated quantitatively with this visualization system. Comparing the waveguide system with and without grooves, the grooves are confirmed to suppress the surface current at the metal surface. The surface current at complex shapes such as the miter bend has also been investigated.
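
    The FDTD method referenced above advances electric and magnetic fields on staggered grids in a leapfrog fashion; a one-dimensional version captures the update structure of the full 3D waveguide computation. Grid size, source, and normalized units below are illustrative only:

    ```python
    import numpy as np

    # 1D FDTD (Yee scheme) in normalized units with Courant number 1.
    nx, nt = 200, 500
    ez = np.zeros(nx)        # electric field at integer grid points
    hy = np.zeros(nx - 1)    # magnetic field, staggered half a cell

    for n in range(nt):
        hy += ez[1:] - ez[:-1]          # update H from the curl of E
        ez[1:-1] += hy[1:] - hy[:-1]    # update E from the curl of H
        ez[nx // 2] += np.sin(0.1 * n)  # soft sinusoidal source at the center
    ```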

  7. Fusion Implementation

    SciTech Connect

    J.A. Schmidt

    2002-02-20

    If a fusion DEMO reactor can be brought into operation during the first half of this century, fusion power production can have a significant impact on carbon dioxide production during the latter half of the century. An assessment of fusion implementation scenarios shows that the resource demands and waste production associated with these scenarios are manageable factors. If fusion is implemented during the latter half of this century it will be one element of a portfolio of (hopefully) carbon dioxide limiting sources of electrical power. It is time to assess the regional implications of fusion power implementation. An important attribute of fusion power is the wide range of possible regions of the country, or countries in the world, where power plants can be located. Unlike most renewable energy options, fusion energy will function within a local distribution system and not require costly, and difficult, long distance transmission systems. For example, the East Coast of the United States is a prime candidate for fusion power deployment by virtue of its distance from renewable energy sources. As fossil fuels become less and less available as an energy option, the transmission of energy across bodies of water will become very expensive. On a global scale, fusion power will be particularly attractive for regions separated from sources of renewable energy by oceans.

  8. Hanford Waste Simulants Created to Support the Research and Development on the River Protection Project - Waste Treatment Plant

    SciTech Connect

    Eibling, R.E.

    2001-07-26

    The development of nonradioactive waste simulants to support bench and pilot-scale testing for the River Protection Project - Waste Treatment Plant is crucial to the design of the facility. This report documents the simulants developed to support the SRTC programs and the strategies used to produce them.

  9. Haughton-Mars Project (HMP)/NASA 2006 Lunar Medical Contingency Simulation: An Overview

    NASA Technical Reports Server (NTRS)

    Scheuring, R. A.; Jones, J. A.; Lee, P.; Comtois, J. M.; Chappell, S.; Rafiq, A.; Braham, S.; Hodgson, E.; Sullivan, P.; Wilkinson, N.

    2006-01-01

    Medical requirements are currently being developed for NASA's space exploration program. Lunar surface operations for crews returning to the moon will be performed on a daily basis to conduct scientific research and construct a lunar habitat. Inherent to aggressive surface activities is the potential risk of injury to crew members. To develop an evidence base for handling medical contingencies on the lunar surface, a simulation project was conducted using the moon-Mars analog environment at Devon Island, Nunavut, in the high Canadian Arctic. A review of the Apollo lunar surface activities and personal communications with Apollo lunar crew members provided a knowledge base of plausible scenarios that could potentially injure an astronaut during a lunar extravehicular activity. Objectives were established to 1) demonstrate stabilization, field extraction, and transfer of an injured crew member to the habitat and 2) evaluate audio, visual and biomedical communication capabilities with ground controllers at multiple mission control centers. The simulation project's objectives were achieved. Among these objectives were 1) extracting a crew member from a sloped terrain by a two-member team in a 1-g analog environment, 2) establishing real-time communication to multiple space centers, 3) providing biomedical data to flight controllers and crew members, and 4) establishing a medical diagnosis and treatment plan from a remote site. The simulation project provided evidence for the types of equipment and methods needed for planetary space exploration. During the project, the crew members were confronted with a number of unexpected scenarios including environmental, communications, EVA suit, and navigation challenges. These trials provided insight into the challenges of carrying out a medical contingency in an austere environment. The knowledge gained from completing the objectives of this project will be incorporated into the exploration medical requirements involving an incapacitated

  10. Information integration for data fusion

    SciTech Connect

    Bray, O.H.

    1997-01-01

    Data fusion has been identified by the Department of Defense as a critical technology for the U.S. defense industry. Data fusion requires combining expertise in two areas - sensors and information integration. Although data fusion is a rapidly growing area, there is little synergy and use of common, reusable, and/or tailorable objects and models, especially across different disciplines. The Laboratory-Directed Research and Development project had two purposes: to see if a natural language-based information modeling methodology could be used for data fusion problems, and if so, to determine whether this methodology would help identify commonalities across areas and achieve greater synergy. The project confirmed both of the initial hypotheses: that the natural language-based information modeling methodology could be used effectively in data fusion areas and that commonalities could be found that would allow synergy across various data fusion areas. The project found five common objects that are the basis for all of the data fusion areas examined: targets, behaviors, environments, signatures, and sensors. Many of the objects and the specific facts related to these objects were common across several areas and could easily be reused. In some cases, even the terminology remained the same. In other cases, different areas had their own terminology, but the concepts were the same. This commonality is important with the growing use of multisensor data fusion. Data fusion is much more difficult if each type of sensor uses its own objects and models rather than building on a common set. This report introduces data fusion, discusses how the synergy generated by this LDRD would have benefited an earlier successful project and contains a summary information model from that project, describes a preliminary management information model, and explains how information integration can facilitate cross-treaty synergy for various arms control treaties.

  11. Comparing ensemble projections of flooding against flood estimation by continuous simulation

    NASA Astrophysics Data System (ADS)

    Smith, Andrew; Freer, Jim; Bates, Paul; Sampson, Christopher

    2014-04-01

    Climate impact studies focused on the projection of changing flood risk are increasingly utilized to inform future flood risk policy. These studies typically use the output from global (GCMs) and regional climate models (RCMs). However the direct application of GCM/RCM output is controversial as often significant biases exist in predicted rainfall; instead a number of alternative 'correction' approaches have emerged. In this study an ensemble of RCMs from the ENSEMBLES and UKCP09 projects are applied, via a number of application techniques, to explore the possible impacts of climate change on flooding in the Avon catchment, in the UK. The analysis is conducted under a continuous simulation methodology, using a stochastic rainfall generator to drive the HBV-light rainfall run-off model under a parameter uncertainty framework. This permitted a comparison between the projections produced by differing application approaches, whilst also considering the uncertainty associated with flood risk projections under observed conditions. The results from each of the application approaches project an increase in annual maximum flows under the future (2061-2099) climate scenario. However the magnitude and spread of the projected changes varied significantly. These findings highlight the need to incorporate multiple approaches in climate impact studies focusing on flood risk. Additionally these results outline the significant uncertainties associated with return period estimates under current climate conditions, suggesting that uncertainty over this observed record already poses a challenge to develop robust risk management plans.

  12. High-resolution global climate modelling: the UPSCALE project, a large-simulation campaign

    NASA Astrophysics Data System (ADS)

    Mizielinski, M. S.; Roberts, M. J.; Vidale, P. L.; Schiemann, R.; Demory, M.-E.; Strachan, J.; Edwards, T.; Stephens, A.; Lawrence, B. N.; Pritchard, M.; Chiu, P.; Iwi, A.; Churchill, J.; del Cano Novales, C.; Kettleborough, J.; Roseblade, W.; Selwood, P.; Foster, M.; Glover, M.; Malcolm, A.

    2014-08-01

    The UPSCALE (UK on PRACE: weather-resolving Simulations of Climate for globAL Environmental risk) project constructed and ran an ensemble of HadGEM3 (Hadley Centre Global Environment Model 3) atmosphere-only global climate simulations over the period 1985-2011, at resolutions of N512 (25 km), N216 (60 km) and N96 (130 km) as used in current global weather forecasting, seasonal prediction and climate modelling respectively. Alongside these present climate simulations a parallel ensemble looking at extremes of future climate was run, using a time-slice methodology to consider conditions at the end of this century. These simulations were primarily performed using a 144 million core hour, single year grant of computing time from PRACE (the Partnership for Advanced Computing in Europe) in 2012, with additional resources supplied by the Natural Environment Research Council (NERC) and the Met Office. Almost 400 terabytes of simulation data were generated on the HERMIT supercomputer at the High Performance Computing Center Stuttgart (HLRS), and transferred to the JASMIN super-data cluster provided by the Science and Technology Facilities Council Centre for Data Archival (STFC CEDA) for analysis and storage. In this paper we describe the implementation of the project, present the technical challenges in terms of optimisation, data output, transfer and storage that such a project involves and include details of the model configuration and the composition of the UPSCALE data set. This data set is available for scientific analysis to allow assessment of the value of model resolution in both present and potential future climate conditions.

  13. High resolution global climate modelling; the UPSCALE project, a large simulation campaign

    NASA Astrophysics Data System (ADS)

    Mizielinski, M. S.; Roberts, M. J.; Vidale, P. L.; Schiemann, R.; Demory, M.-E.; Strachan, J.; Edwards, T.; Stephens, A.; Lawrence, B. N.; Pritchard, M.; Chiu, P.; Iwi, A.; Churchill, J.; del Cano Novales, C.; Kettleborough, J.; Roseblade, W.; Selwood, P.; Foster, M.; Glover, M.; Malcolm, A.

    2014-01-01

    The UPSCALE (UK on PRACE: weather-resolving Simulations of Climate for globAL Environmental risk) project constructed and ran an ensemble of HadGEM3 (Hadley Centre Global Environment Model 3) atmosphere-only global climate simulations over the period 1985-2011, at resolutions of N512 (25 km), N216 (60 km) and N96 (130 km) as used in current global weather forecasting, seasonal prediction and climate modelling respectively. Alongside these present climate simulations a parallel ensemble looking at extremes of future climate was run, using a time-slice methodology to consider conditions at the end of this century. These simulations were primarily performed using a 144 million core hour, single year grant of computing time from PRACE (the Partnership for Advanced Computing in Europe) in 2012, with additional resources supplied by the Natural Environment Research Council (NERC) and the Met Office. Almost 400 terabytes of simulation data were generated on the HERMIT supercomputer at the High Performance Computing Center Stuttgart (HLRS), and transferred to the JASMIN super-data cluster provided by the Science and Technology Facilities Council Centre for Data Archival (STFC CEDA) for analysis and storage. In this paper we describe the implementation of the project, present the technical challenges in terms of optimisation, data output, transfer and storage that such a project involves and include details of the model configuration and the composition of the UPSCALE dataset. This dataset is available for scientific analysis to allow assessment of the value of model resolution in both present and potential future climate conditions.

  14. Towards the petaflop for Lattice QCD simulations the PetaQCD project

    NASA Astrophysics Data System (ADS)

    Anglès d'Auriac, Jean-Christian; Barthou, Denis; Becirevic, Damir; Bilhaut, René; Bodin, François; Boucaud, Philippe; Brand-Foissac, Olivier; Carbonell, Jaume; Eisenbeis, Christine; Gallard, Pascal; Grosdidier, Gilbert; Guichon, Pierre; Honoré, Pierre-François; Le Meur, Guy; Pène, Olivier; Rilling, Louis; Roudeau, Patrick; Seznec, André; Stocchi, Achille; Touze, François

    2010-04-01

    The study and design of a very ambitious petaflop cluster exclusively dedicated to Lattice QCD simulations started in early '08 among a consortium of 7 laboratories (IN2P3, CNRS, INRIA, CEA) and 2 SMEs. This consortium received a grant from the French ANR agency in July '08, and the PetaQCD project kickoff took place in January '09. Building upon several years of fruitful collaborative studies in this area, the aim of this project is to demonstrate that the simulation of a 256 x 128^3 lattice can be achieved through the HMC/ETMC software, using a machine with efficient speed/cost/reliability/power consumption ratios. It is expected that this machine can be built out of a rather limited number of processors (e.g. between 1000 and 4000), although capable of a sustained petaflop CPU performance. The proof-of-concept should be a mock-up cluster built as much as possible with off-the-shelf components, and two particularly attractive axes will be mainly investigated, in addition to fast all-purpose multi-core processors: the use of the new brand of IBM Cell processors (with on-chip accelerators) and the very recent Nvidia GP-GPUs (off-chip co-processors). This cluster will obviously be massively parallel, and heterogeneous. Communication issues between processors, implied by the Physics of the simulation and the lattice partitioning, will certainly be a major key to the project.

  15. Simulations of high-gain shock-ignited inertial-confinement-fusion implosions using less than 1 MJ of direct KrF-laser energy

    NASA Astrophysics Data System (ADS)

    Bates, Jason W.; Schmitt, Andrew J.; Fyfe, David E.; Obenschain, Steve P.; Zalesak, Steve T.

    2010-06-01

    In this paper, we report on recent numerical simulations of inertial-confinement-fusion (ICF) implosions using the FAST radiation hydrocode at the U.S. Naval Research Laboratory. Our study focuses on three classes of shock-ignited target designs utilizing less than 1 MJ of direct, krypton-fluoride (KrF) laser energy, which was "zoomed" to maximize the coupling efficiency. In the shock-ignition approach [R. Betti, C.D. Zhou, K.S. Anderson, et al., Phys. Rev. Lett. 98 (2007) 155001], a moderate-intensity, compressive laser pulse is followed by a short-duration high-intensity spike that launches a spherically-convergent shock wave to ignite a thick shell of compressed fuel. Such an arrangement appears to offer several significant advantages, including a low ignition threshold, high gain, and less susceptibility to the deleterious effects of hydrodynamic and laser-plasma instabilities. According to one-dimensional simulations, fusion gains over 200 can be achieved with shock-ignited targets using less than 750 kJ of laser energy. This represents a significant improvement in performance over conventional centrally-ignited designs. To examine the stability of these targets, several two-dimensional simulations were also performed that incorporated realistic perturbation sources such as laser imprinting and roughness spectra for inner/outer pellet surfaces. Although the simulations indicate that appreciable low-mode distortion of the fuel shell can occur at late time as a result of these perturbations, high gains are still achieved in many cases owing to the low in-flight aspect ratios of shock-ignited targets. We should remark, though, that the high convergence ratios of these same designs suggest that other sources of low-mode asymmetries, which were not considered in this study (e.g., beam misalignment and energy-balance errors), may be important in determining overall pellet stability and performance. We discuss these issues, as well as other salient design

  16. A fusion of minds

    NASA Astrophysics Data System (ADS)

    Corfield, Richard

    2013-02-01

    Mystery still surrounds the visit of the astronomer Sir Bernard Lovell to the Soviet Union in 1963. But his collaboration - and that of other British scientists - eased geopolitical tensions at the height of the Cold War and paved the way for today's global ITER fusion project, as Richard Corfield explains.

  17. Simulations in the Introductory Astronomy Laboratory: Six Years of Project CLEA

    NASA Astrophysics Data System (ADS)

    Marschall, L. A.

    1998-12-01

    Since 1992, Project CLEA (Contemporary Laboratory Experiences in Astronomy) has been developing computer-based exercises aimed at the introductory astronomy laboratory. These exercises simulate important techniques of astronomical research using digital data and Windows-based software. Each of the 9 exercises developed to date consists of software, technical guides for teachers, and student manuals for the exercises. CLEA software is used widely at many institutions, in a variety of settings from middle school to upper-level astronomy classes. The current design philosophy and goals of Project CLEA will be discussed, along with the results of both formal and informal assessments of the strengths and weaknesses of this approach. Plans for future development will be presented. Project CLEA is supported by grants from Gettysburg College and the National Science Foundation.

  18. Improved point scale climate projections using a block bootstrap simulation and quantile matching method

    NASA Astrophysics Data System (ADS)

    Kokic, Philip; Jin, Huidong; Crimp, Steven

    2013-08-01

    Statistical downscaling methods are commonly used to address the scale mismatch between coarse resolution Global Climate Model output and the regional or local scales required for climate change impact assessments. The effectiveness of a downscaling method can be measured against four broad criteria: consistency with the existing baseline data in terms of means, trends and distributional characteristics; consistency with the broader scale climate data used to generate the projections; the degree of transparency and repeatability; and the plausibility of results produced. Many existing downscaling methods fail to fulfil all of these criteria. In this paper we examine a block bootstrap simulation technique combined with a quantile prediction and matching method for simulating future daily climate data. By utilising this method the distributional properties of the projected data will be influenced by the distribution of the observed data, the trends in predictors derived from the Global Climate Models and the relationship of these predictors to the observed data. Using observed data from several climate stations in Vanuatu and Fiji and out-of-sample validation techniques, we show that the method is successful at projecting various climate characteristics including the variability and auto-correlation of daily temperature and rainfall, the correlations between these variables and between spatial locations. This paper also illustrates how this novel method can produce more effective point scale projections and a more credible alternative to other approaches in the Pacific region.
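
    The two ingredients of the method, block resampling of observed daily data (which preserves short-range autocorrelation) and quantile matching toward a projected distribution, can be sketched as follows. All inputs are synthetic placeholders, and the paper's quantile prediction step is reduced here to an assumed distributional shift:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    obs = rng.gamma(2.0, 2.0, size=3650)  # stand-in for an observed daily series

    def block_bootstrap(x, n_days, block=30):
        """Resample whole blocks so day-to-day autocorrelation survives."""
        starts = rng.integers(0, len(x) - block, size=n_days // block + 1)
        return np.concatenate([x[s:s + block] for s in starts])[:n_days]

    def quantile_match(sim, target_quantiles):
        """Map each simulated value onto the target distribution by rank."""
        ranks = np.searchsorted(np.sort(sim), sim) / len(sim)
        return np.interp(ranks, np.linspace(0, 1, len(target_quantiles)),
                         target_quantiles)

    # Assumed future distribution: observed quantiles scaled by 10%.
    future_q = np.quantile(obs, np.linspace(0, 1, 101)) * 1.1
    future_series = quantile_match(block_bootstrap(obs, 3650), future_q)
    ```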

  19. Final report on LDRD project: Simulation/optimization tools for system variability analysis

    SciTech Connect

    R. L. Bierbaum; R. F. Billau; J. E. Campbell; K. D. Marx; R. J. Sikorski; B. M. Thompson; S. D. Wix

    1999-10-01

    This work was conducted during FY98 (Proposal Number 98-0036) and FY99 (Proposal Number 99-0818) under the auspices of the Sandia National Laboratories Laboratory-Directed Research and Development (LDRD) program. Electrical simulation typically treats a single data point in the very large input space of component properties. For electrical simulation to reach its full potential as a design tool, it must be able to address the unavoidable variability and uncertainty in component properties. Component variability is strongly related to the design margin (and reliability) of the end product. During the course of this project, both tools and methodologies were developed to enable analysis of variability in the context of electrical simulation tools. Two avenues to link relevant tools were also developed, and the resultant toolset was applied to a major component.

  20. ITER Fusion Energy

    ScienceCinema

    Dr. Norbert Holtkamp

    2010-01-08

    ITER (in Latin "the way") is designed to demonstrate the scientific and technological feasibility of fusion energy. Fusion is the process by which two light atomic nuclei combine to form a heavier one and thus release energy. In the fusion process two isotopes of hydrogen - deuterium and tritium - fuse together to form a helium atom and a neutron. Thus fusion could provide large scale energy production without greenhouse effects; essentially limitless fuel would be available all over the world. The principal goals of ITER are to generate 500 megawatts of fusion power for periods of 300 to 500 seconds with a fusion power multiplication factor, Q, of at least 10 (50 MW of input power producing 500 MW of fusion power). The ITER Organization was officially established in Cadarache, France, on 24 October 2007. The seven members engaged in the project - China, the European Union, India, Japan, Korea, Russia and the United States - represent more than half the world's population. The costs for ITER are shared by the seven members. The cost for the construction will be approximately 5.5 billion Euros; a similar amount is foreseen for the twenty-year phase of operation and the subsequent decommissioning.

  1. Magnetized Target Fusion

    NASA Technical Reports Server (NTRS)

    Griffin, Steven T.

    2002-01-01

    Magnetized target fusion (MTF) is under consideration as a means of building a low mass, high specific impulse, and high thrust propulsion system for interplanetary travel. This unique combination is the result of the generation of a high temperature plasma by the nuclear fusion process. This plasma can then be deflected by magnetic fields to provide thrust. Fusion is initiated by a small fraction of the energy generated in the magnetic coils due to the plasma's compression of the magnetic field. The power gain from a fusion reaction is such that inefficiencies due to thermal neutrons and coil losses can be overcome. Since the fusion reaction products are directly used for propulsion and the power to initiate the reaction is directly obtained from the thrust generation, no massive power supply for energy conversion is required. The result should be a low engine mass, high specific impulse and high thrust system. The key is to successfully initiate fusion as a proof-of-principle for this application. Currently MSFC is implementing MTF proof-of-principle experiments. This involves many technical details and ancillary investigations. Of these, selected pertinent issues include the properties, orientation and timing of the plasma guns and the convergence and interface development of the "pusher" plasma. Computer simulations of the target plasma's behavior under compression and the convergence and mixing of the gun plasma are under investigation. This work is to focus on the gun characterization and development as it relates to plasma initiation and repeatability.

  2. Laser fusion monthly -- August 1980

    SciTech Connect

    Ahlstrom, H.G.

    1980-08-01

    This report documents the monthly progress of laser fusion research at Lawrence Livermore National Laboratory. First, it gives facilities reports for both the Shiva and Argus projects. Topics discussed include: the laser system for the Nova Project; the fusion experiments analysis facility; the optical/x-ray streak camera; the Shiva Dante System temporal response; the 2ω₀ experiment; and planning for an ICF engineering test facility.

  3. Image fusion

    NASA Technical Reports Server (NTRS)

    Pavel, M.

    1993-01-01

    The topics covered include the following: a system overview of the basic components of a system designed to improve the ability of a pilot to fly through low-visibility conditions such as fog; the role of visual sciences; fusion issues; sensor characterization; sources of information; image processing; and image fusion.

  4. Survey of ion-acoustic-instability particle simulations and relevance to laser-fusion thermal-transport inhibition

    SciTech Connect

    Mead, W.C.

    1980-09-11

    Ion acoustic turbulence is examined as one mechanism which could contribute to the inhibition of electron thermal transport which has been inferred from many laser-plasma experiments. The behavior of the ion acoustic instability is discussed from the viewpoint of the literature of 2-dimensional particle-in-cell simulations. Simulation techniques, limitations, and reported saturation mechanisms and levels are discussed. A scaling law for the effective collision frequency ν* can be fit to several workers' results to within an order of magnitude. The inferred ν* is shown to be 1-2 orders of magnitude too small to account for the transport inhibition seen in Nd-laser-produced plasmas. Several differences between the simulation conditions and laser-produced plasma conditions are noted.

  5. Fusion of psychiatric and medical high fidelity patient simulation scenarios: effect on nursing student knowledge, retention of knowledge, and perception.

    PubMed

    Kameg, Kirstyn M; Englert, Nadine Cozzo; Howard, Valerie M; Perozzi, Katherine J

    2013-12-01

    High fidelity patient simulation (HFPS) has become an increasingly popular teaching methodology in nursing education. To date, there have not been any published studies investigating HFPS scenarios incorporating medical and psychiatric nursing content. This study utilized a quasi-experimental design to assess if HFPS improved student knowledge and retention of knowledge utilizing three parallel 30-item Elsevier HESI(TM) Custom Exams. A convenience sample of 37 senior level nursing students participated in the study. The results of the study revealed the mean HESI test scores decreased following the simulation intervention although an analysis of variance (ANOVA) determined the difference was not statistically significant (p = .297). Although this study did not reveal improved student knowledge following the HFPS experiences, the findings did provide preliminary evidence that HFPS may improve knowledge in students who are identified as "at-risk." Additionally, students responded favorably to the simulations and viewed them as a positive learning experience. PMID:24274245

  6. Differentiating Self-Projection from Simulation during Mentalizing: Evidence from fMRI

    PubMed Central

    Schurz, Matthias; Kogler, Christoph; Scherndl, Thomas; Kronbichler, Martin; Kühberger, Anton

    2015-01-01

    We asked participants to predict which of two colors a similar other (student) and a dissimilar other (retiree) likes better. We manipulated if color-pairs were two hues from the same color-category (e.g. green) or two conceptually different colors (e.g. green versus blue). In the former case, the mental state that has to be represented (i.e., the percept of two different hues of green) is predominantly non-conceptual or phenomenal in nature, which should promote mental simulation as a strategy for mentalizing. In the latter case, the mental state (i.e. the percept of green versus blue) can be captured in thought by concepts, which facilitates the use of theories for mentalizing. In line with the self-projection hypothesis, we found that cortical midline areas including vmPFC / orbitofrontal cortex and precuneus were preferentially activated for mentalizing about a similar other. However, activation was not affected by the nature of the color-difference, suggesting that self-projection subsumes simulation-like processes but is not limited to them. This indicates that self-projection is a universal strategy applied in different contexts—irrespective of the availability of theories for mentalizing. Along with midline activations linked to self-projection, we also observed activation in right lateral frontal and dorsal parietal areas showing a theory-like pattern. Taken together, this shows that mentalizing does not operate based on simulation or theory, but that both strategies are used concurrently to predict the choices of others. PMID:25807390

  7. Differentiating self-projection from simulation during mentalizing: evidence from fMRI.

    PubMed

    Schurz, Matthias; Kogler, Christoph; Scherndl, Thomas; Kronbichler, Martin; Kühberger, Anton

    2015-01-01

    We asked participants to predict which of two colors a similar other (student) and a dissimilar other (retiree) likes better. We manipulated if color-pairs were two hues from the same color-category (e.g. green) or two conceptually different colors (e.g. green versus blue). In the former case, the mental state that has to be represented (i.e., the percept of two different hues of green) is predominantly non-conceptual or phenomenal in nature, which should promote mental simulation as a strategy for mentalizing. In the latter case, the mental state (i.e. the percept of green versus blue) can be captured in thought by concepts, which facilitates the use of theories for mentalizing. In line with the self-projection hypothesis, we found that cortical midline areas including vmPFC / orbitofrontal cortex and precuneus were preferentially activated for mentalizing about a similar other. However, activation was not affected by the nature of the color-difference, suggesting that self-projection subsumes simulation-like processes but is not limited to them. This indicates that self-projection is a universal strategy applied in different contexts--irrespective of the availability of theories for mentalizing. Along with midline activations linked to self-projection, we also observed activation in right lateral frontal and dorsal parietal areas showing a theory-like pattern. Taken together, this shows that mentalizing does not operate based on simulation or theory, but that both strategies are used concurrently to predict the choices of others. PMID:25807390

  8. Using historical and projected future climate model simulations as drivers of agricultural and biological models (Invited)

    NASA Astrophysics Data System (ADS)

    Stefanova, L. B.

    2013-12-01

    Climate model evaluation is frequently performed as a first step in analyzing climate change simulations. Atmospheric scientists are accustomed to evaluating climate models through the assessment of model climatology and biases, the models' representation of large-scale modes of variability (such as ENSO, PDO, AMO, etc) and the relationship between these modes and local variability (e.g. the connection between ENSO and the wintertime precipitation in the Southeast US). While these provide valuable information about the fidelity of historical and projected climate model simulations from an atmospheric scientist's point of view, the application of climate model data to fields such as agriculture, ecology and biology may require additional analyses focused on the particular application's requirements and sensitivities. Typically, historical climate simulations are used to determine a mapping between the model and observed climate, either through a simple (additive for temperature or multiplicative for precipitation) or a more sophisticated (such as quantile matching) bias correction on a monthly or seasonal time scale. Plants, animals and humans however are not directly affected by monthly or seasonal means. To assess the impact of projected climate change on living organisms and related industries (e.g. agriculture, forestry, conservation, utilities, etc.), derivative measures such as the heating degree-days (HDD), cooling degree-days (CDD), growing degree-days (GDD), accumulated chill hours (ACH), wet season onset (WSO) and duration (WSD), among others, are frequently useful. We will present a comparison of the projected changes in such derivative measures calculated by applying: (a) the traditional temperature/precipitation bias correction described above versus (b) a bias correction based on the mapping between the historical model and observed derivative measures themselves. In addition, we will present and discuss examples of various application-based climate
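
    The derivative measures named above are simple accumulations over daily mean temperatures; a minimal sketch follows, using the conventional 65 °F base for heating and cooling degree-days (the base values are common conventions, not choices made in the talk):

    ```python
    import numpy as np

    # Degree-days accumulate only the positive excess relative to a base
    # temperature: HDD uses (base - T)+, while CDD and GDD use (T - base)+.

    def degree_days(tmean, base, kind="heating"):
        diff = base - tmean if kind == "heating" else tmean - base
        return float(np.sum(np.clip(diff, 0.0, None)))

    tmean_f = np.array([50.0, 60.0, 70.0, 80.0])            # daily means, deg F
    print(degree_days(tmean_f, base=65.0))                  # HDD: 15 + 5 = 20
    print(degree_days(tmean_f, base=65.0, kind="cooling"))  # CDD: 5 + 15 = 20
    ```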

  9. Microstructural Evolution and Mechanical Properties of Fusion Welds and Simulated Heat-Affected Zones in an Iron-Copper Based Multi-Component Steel

    NASA Astrophysics Data System (ADS)

    Farren, Jeffrey David

    NUCu-140 is a copper-precipitation strengthened steel that exhibits excellent mechanical properties with a relatively simple chemical composition and processing schedule. As a result, NUCu-140 is a candidate material for use in many naval and structural applications. Before NUCu-140 can be implemented as a replacement for currently utilized materials, a comprehensive welding strategy must be developed under a wide range of welding conditions. This research represents an initial step toward understanding the microstructural and mechanical property evolution that occurs during fusion welding of NUCu-140. The following dissertation is presented as a series of four chapters. Chapter one is a review of the relevant literature on the iron-copper system including the precipitation of copper in steel, the development of the NUCu family of alloys, and the formation of acicular ferrite in steel weldments. Chapter two is a detailed study of the precipitate, microstructural, and mechanical property evolution of NUCu-140 fusion welds. Microhardness testing, tensile testing, local-electrode atom probe (LEAP) tomography, MatCalc kinetic simulations, and Russell-Brown strengthening results for gas-tungsten and gas-metal arc welds are presented. Chapter three is a thorough study of the microstructural and mechanical property evolution that occurs in the four critical regions of the HAZ. Simulated HAZ specimens were produced and evaluated using microhardness, tensile testing, and charpy impact testing. MatCalc simulations and R-B strengthening calculations were also performed in an effort to model the experimentally observed mechanical property trends. Chapter 4 is a brief investigation into the capabilities of MatCalc and the R-B model to determine if the two techniques could be used as predictive tools for a series of binary iron-copper alloys without the aid of experimentally measured precipitate data. The mechanical property results show that local softening occurs in the heat

  10. Ion-kinetic simulations of D-3He gas-filled inertial confinement fusion target implosions with moderate to large Knudsen number

    NASA Astrophysics Data System (ADS)

    Larroche, O.; Rinderknecht, H. G.; Rosenberg, M. J.; Hoffman, N. M.; Atzeni, S.; Petrasso, R. D.; Amendt, P. A.; Séguin, F. H.

    2016-01-01

    Experiments designed to investigate the transition to non-collisional behavior in D3He-gas inertial confinement fusion target implosions display increasingly large discrepancies with respect to simulations by standard hydrodynamics codes as the expected ion mean-free-paths λc increase with respect to the target radius R (i.e., when the Knudsen number NK=λc/R grows). To take properly into account large NK's, multi-ion-species Vlasov-Fokker-Planck computations of the inner gas in the capsules have been performed, for two different values of NK, one moderate and one large. The results, including nuclear yield, reactivity-weighted ion temperatures, nuclear emissivities, and surface brightness, have been compared with the experimental data and with the results of hydrodynamical simulations, some of which include an ad hoc modeling of kinetic effects. The experimental results are quite accurately rendered by the kinetic calculations in the smaller-NK case, much better than by the hydrodynamical calculations. The kinetic effects at play in this case are thus correctly understood. However, in the higher-NK case, the agreement is much worse. The remaining discrepancies are shown to arise from kinetic phenomena (e.g., inter-species diffusion) occurring at the gas-pusher interface, which should be investigated in future work.
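
    The regime parameter in these experiments is simply the ratio of the ion mean free path to the target radius. A minimal sketch of the estimate, with a schematic mean-free-path scaling λc ∝ T²/(n ln Λ) and a deliberately rough prefactor (the scaling, not the constant, is the point):

    ```python
    # Knudsen number NK = lambda_c / R. The mean-free-path prefactor below is
    # schematic; only the T^2 / (n * ln_lambda) scaling is meaningful here.

    def knudsen_number(T_keV, n_cm3, R_cm, ln_lambda=5.0, prefactor=1e21):
        mfp_cm = prefactor * T_keV**2 / (n_cm3 * ln_lambda)
        return mfp_cm / R_cm

    # Hotter, thinner fill gas or a smaller capsule pushes NK up,
    # toward the non-collisional regime studied above.
    print(knudsen_number(T_keV=5.0, n_cm3=1e22, R_cm=0.02))
    ```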

  11. Ion-kinetic simulations of D-3He gas-filled inertial confinement fusion target implosions with moderate to large Knudsen number

    DOE PAGES Beta

    Larroche, O.; Rinderknecht, H. G.; Rosenberg, M. J.; Hoffman, N. M.; Atzeni, S.; Petrasso, R. D.; Amendt, P. A.; Seguin, F. H.

    2016-01-06

    Experiments designed to investigate the transition to non-collisional behavior in D3He-gas inertial confinement fusion target implosions display increasingly large discrepancies with respect to simulations by standard hydrodynamics codes as the expected ion mean-free-paths λc increase with respect to the target radius R (i.e., when the Knudsen number NK = λc/R grows). To take properly into account large NK's, multi-ion-species Vlasov-Fokker-Planck computations of the inner gas in the capsules have been performed, for two different values of NK, one moderate and one large. The results, including nuclear yield, reactivity-weighted ion temperatures, nuclear emissivities, and surface brightness, have been compared with the experimental data and with the results of hydrodynamical simulations, some of which include an ad hoc modeling of kinetic effects. The experimental results are quite accurately rendered by the kinetic calculations in the smaller-NK case, much better than by the hydrodynamical calculations. The kinetic effects at play in this case are thus correctly understood. However, in the higher-NK case, the agreement is much worse. Furthermore, the remaining discrepancies are shown to arise from kinetic phenomena (e.g., inter-species diffusion) occurring at the gas-pusher interface, which should be investigated in future work.

  12. Geophysical data fusion for subsurface imaging

    NASA Astrophysics Data System (ADS)

    Hoekstra, P.; Vandergraft, J.; Blohm, M.; Porter, D.

    1993-08-01

    A geophysical data fusion methodology is under development to combine data from complementary geophysical sensors and incorporate geophysical understanding to obtain three dimensional images of the subsurface. The research reported here is the first phase of a three phase project. The project focuses on the characterization of thin clay lenses (aquitards) in a highly stratified sand and clay coastal geology to depths of up to 300 feet. The sensor suite used in this work includes time-domain electromagnetic induction (TDEM) and near surface seismic techniques. During this first phase of the project, enhancements to the acquisition and processing of TDEM data were studied, by use of simulated data, to assess improvements for the detection of thin clay layers. Secondly, studies were made of the use of compressional wave and shear wave seismic reflection data by using state-of-the-art high frequency vibrator technology. Finally, a newly developed processing technique, called 'data fusion' was implemented to process the geophysical data, and to incorporate a mathematical model of the subsurface strata. Examples are given of the results when applied to real seismic data collected at Hanford, WA, and for simulated data based on the geology of the Savannah River Site.

  13. Simulations of Aperture Synthesis Imaging Radar for the EISCAT_3D Project

    NASA Astrophysics Data System (ADS)

    La Hoz, C.; Belyey, V.

    2012-12-01

    EISCAT_3D is a project to build the next generation of incoherent scatter radars endowed with multiple 3-dimensional capabilities that will replace the current EISCAT radars in Northern Scandinavia. Aperture Synthesis Imaging Radar (ASIR) is one of the technologies adopted by the EISCAT_3D project to endow it with imaging capabilities in 3-dimensions that includes sub-beam resolution. Complemented by pulse compression, it will provide 3-dimensional images of certain types of incoherent scatter radar targets resolved to about 100 metres at 100 km range, depending on the signal-to-noise ratio. This ability will open new research opportunities to map small structures associated with non-homogeneous, unstable processes such as aurora, summer and winter polar radar echoes (PMSE and PMWE), Natural Enhanced Ion Acoustic Lines (NEIALs), structures excited by HF ionospheric heating, meteors, space debris, and others. To demonstrate the feasibility of the antenna configurations and the imaging inversion algorithms a simulation of synthetic incoherent scattering data has been performed. The simulation algorithm incorporates the ability to control the background plasma parameters with non-homogeneous, non-stationary components over an extended 3-dimensional space. Control over the positions of a number of separated receiving antennas, their signal-to-noise-ratios and arriving phases allows realistic simulation of a multi-baseline interferometric imaging radar system. The resulting simulated data is fed into various inversion algorithms. This simulation package is a powerful tool to evaluate various antenna configurations and inversion algorithms. Results applied to realistic design alternatives of EISCAT_3D will be described.

  14. The Geoengineering Model Intercomparison Project Phase 6 (GeoMIP6). Simulation Design and Preliminary Results

    SciTech Connect

    Kravitz, Benjamin S.; Robock, Alan; Tilmes, S.; Boucher, Olivier; English, J.; Irvine, Peter; Jones, Andrew; Lawrence, M. G.; Maccracken, Michael C.; Muri, Helene O.; Moore, John; Niemeier, Ulrike; Phipps, Steven; Sillmann, Jana; Storelvmo, Trude; Wang, Hailong; Watanabe, Shingo

    2015-10-27

    We present a suite of new climate model experiment designs for the Geoengineering Model Intercomparison Project (GeoMIP). This set of experiments, named GeoMIP6 (to be consistent with the Coupled Model Intercomparison Project Phase 6), builds on the previous GeoMIP project simulations, and has been expanded to address several further important topics, including key uncertainties in extreme events, the use of geoengineering as part of a portfolio of responses to climate change, and the relatively new idea of cirrus cloud thinning to allow more longwave radiation to escape to space. We discuss experiment designs, as well as the rationale for those designs, showing preliminary results from individual models when available. We also introduce a new feature, called the GeoMIP Testbed, which provides a platform for simulations that will be performed with a few models and subsequently assessed to determine whether the proposed experiment designs will be adopted as core (Tier 1) GeoMIP experiments. This is meant to encourage various stakeholders to propose new targeted experiments that address their key open science questions, with the goal of making GeoMIP more relevant to a broader set of communities.

  15. The Geoengineering Model Intercomparison Project Phase 6 (GeoMIP6): simulation design and preliminary results

    NASA Astrophysics Data System (ADS)

    Kravitz, B.; Robock, A.; Tilmes, S.; Boucher, O.; English, J. M.; Irvine, P. J.; Jones, A.; Lawrence, M. G.; MacCracken, M.; Muri, H.; Moore, J. C.; Niemeier, U.; Phipps, S. J.; Sillmann, J.; Storelvmo, T.; Wang, H.; Watanabe, S.

    2015-10-01

    We present a suite of new climate model experiment designs for the Geoengineering Model Intercomparison Project (GeoMIP). This set of experiments, named GeoMIP6 (to be consistent with the Coupled Model Intercomparison Project Phase 6), builds on the previous GeoMIP project simulations, and has been expanded to address several further important topics, including key uncertainties in extreme events, the use of geoengineering as part of a portfolio of responses to climate change, and the relatively new idea of cirrus cloud thinning to allow more longwave radiation to escape to space. We discuss experiment designs, as well as the rationale for those designs, showing preliminary results from individual models when available. We also introduce a new feature, called the GeoMIP Testbed, which provides a platform for simulations that will be performed with a few models and subsequently assessed to determine whether the proposed experiment designs will be adopted as core (Tier 1) GeoMIP experiments. This is meant to encourage various stakeholders to propose new targeted experiments that address their key open science questions, with the goal of making GeoMIP more relevant to a broader set of communities.

  16. Toward Unanimous Projections for Sea Ice Using CMIP5 Multi-model Simulations

    NASA Astrophysics Data System (ADS)

    Yang, S.; Christensen, J. H.; Langen, P. P.; Thejll, P.

    2015-12-01

    Coupled global climate models have been used to provide future climate projections as major objective tools based on physical laws that govern the dynamics and thermodynamics of the climate system. However, while climate models in general predict declines in Arctic sea ice cover (i.e., ice extent and volume) from the late 20th century through the next decades in response to the increase of anthropogenic forcing, the model-simulated Arctic sea ice demonstrates considerable biases in both the mean and the declining trend in comparison with the observations over the satellite era (1979-present). The models also show wide inter-model spread in hindcast and projected sea ice decline, raising the question of uncertainty in model-predicted polar climate. In order to address the model uncertainty in the Arctic sea ice projection, we analyze the Arctic sea ice extent under the context of surface air temperature (SAT) as simulated in the historical, RCP4.5 and RCP8.5 experiments by 27 CMIP5 models. These 27 models are all we could obtain from the CMIP5 archive with sufficient grid information for processing the sea ice data. Unlike many previous studies in which only a limited number of models were selected based on metrics of modeled sea ice characteristics for getting projected ice with reduced uncertainty, our analysis is applied to all model simulations with no discrimination. It is found that the changes in total Arctic sea ice in various seasons from one model are closely related to the changes in global mean SAT in the corresponding model. This relationship appears very similar in all models and agrees well with that in the observational data. In particular, the ratios of the total Arctic sea ice changes in March, September and the annual mean with respect to the baseline climatology (1979-2008) are seen to linearly correlate with the global mean annual SAT anomaly, suggesting a unanimous projection of the sea ice extent may be possible with this relationship. Further analysis is
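
    Since the proposed relationship is linear, the "unanimous" projection amounts to pooling every model's sea-ice anomaly against its own global-mean SAT anomaly and reading the fit off at a chosen warming level. A minimal sketch with synthetic numbers standing in for the 27-model diagnostics:

    ```python
    import numpy as np

    # Pool per-model (SAT anomaly, sea-ice anomaly) pairs and fit one line;
    # all values are synthetic stand-ins for the CMIP5 diagnostics.
    rng = np.random.default_rng(1)
    sat_anom = rng.normal(1.5, 0.5, size=27)             # global-mean SAT anomaly, K
    ice_anom = -2.0 * sat_anom + rng.normal(0, 0.2, 27)  # ice-extent anomaly, 1e6 km^2

    slope, intercept = np.polyfit(sat_anom, ice_anom, 1)
    print(f"projected ice change at +2 K: {slope * 2.0 + intercept:.2f} million km^2")
    ```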

  17. Projected strengthening of Amazonian dry season by constrained climate model simulations

    NASA Astrophysics Data System (ADS)

    Boisier, Juan P.; Ciais, Philippe; Ducharne, Agnès; Guimberteau, Matthieu

    2015-07-01

    The vulnerability of Amazonian rainforest, and the ecological services it provides, depends on an adequate supply of dry-season water, either as precipitation or stored soil moisture. How the rain-bearing South American monsoon will evolve across the twenty-first century is thus a question of major interest. Extensive savanization, with its loss of forest carbon stock and uptake capacity, is an extreme although very uncertain scenario. We show that the contrasting rainfall projections simulated for Amazonia by 36 global climate models (GCMs) can be reproduced with empirical precipitation models, calibrated with historical GCM data as functions of the large-scale circulation. A set of these simple models was therefore calibrated with observations and used to constrain the GCM simulations. In agreement with the current hydrologic trends, the resulting projection towards the end of the twenty-first century is for a strengthening of the monsoon seasonal cycle, and a dry-season lengthening in southern Amazonia. With this approach, the increase in the area subjected to lengthy, savannah-prone dry seasons is substantially larger than the GCM-simulated one. Our results confirm the dominant picture shown by the state-of-the-art GCMs, but suggest that the 'model democracy' view can significantly underestimate these impacts.
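
    The constraining step described above can be caricatured in a few lines: fit an empirical precipitation model to observations as a function of large-scale circulation predictors, then drive the fitted model with GCM-projected predictors. Everything below is a synthetic placeholder for the paper's calibrated models:

    ```python
    import numpy as np

    # Least-squares empirical model: precipitation ~ circulation indices.
    rng = np.random.default_rng(2)
    circ_obs = rng.normal(size=(480, 3))  # observed monthly circulation indices
    precip_obs = circ_obs @ np.array([1.0, -0.5, 0.3]) + rng.normal(0, 0.1, 480)

    coef, *_ = np.linalg.lstsq(circ_obs, precip_obs, rcond=None)

    # Apply the observation-calibrated model to GCM-projected circulation.
    circ_gcm = rng.normal(0.3, 1.0, size=(480, 3))
    precip_constrained = circ_gcm @ coef
    ```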

  18. Projected changes in atmospheric river events in Arizona as simulated by global and regional climate models

    NASA Astrophysics Data System (ADS)

    Rivera, Erick R.; Dominguez, Francina

    2015-12-01

    Inland-penetrating atmospheric rivers (ARs) affect the United States Southwest and significantly contribute to cool season precipitation. In this study, we examine the results from an ensemble of dynamically downscaled simulations from the North American Regional Climate Change Assessment Program (NARCCAP) and their driving general circulation models (GCMs) in order to determine statistically significant changes in the intensity of the cool season ARs impacting Arizona and the associated precipitation. Future greenhouse gas emissions follow the A2 emission scenario from the Intergovernmental Panel on Climate Change Fourth Assessment Report simulations. We find that there is a consistent and clear intensification of the AR-related water vapor transport in both the global and regional simulations which reflects the increase in water vapor content due to warmer atmospheric temperatures, according to the Clausius-Clapeyron relationship. However, the response of AR-related precipitation intensity to increased moisture flux and column-integrated water vapor is weak and no significant changes are projected either by the GCMs or the NARCCAP models. This lack of robust precipitation variations can be explained in part by the absence of meaningful changes in both the large-scale water vapor flux convergence and the maximum positive relative vorticity in the GCMs. Additionally, some global models show a robust decrease in relative humidity which may also be responsible for the projected precipitation patterns.
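
    The Clausius-Clapeyron argument invoked above is easy to quantify: saturation vapor pressure rises by roughly 7% per kelvin near surface temperatures. A short sketch using the Magnus approximation (the baseline temperature and warming are assumed values):

      import numpy as np

      def e_sat(t_c):
          """Saturation vapor pressure (hPa), Magnus/Bolton approximation."""
          return 6.112 * np.exp(17.67 * t_c / (t_c + 243.5))

      t0, dt = 10.0, 3.0                        # assumed baseline (degC) and warming (K)
      gain = e_sat(t0 + dt) / e_sat(t0) - 1.0
      print(f"{100.0 * gain:.1f}% more saturation vapor pressure for +{dt} K")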

  19. CMIP5 simulated climate conditions of the Greater Horn of Africa (GHA). Part II: projected climate

    NASA Astrophysics Data System (ADS)

    Otieno, Vincent O.; Anyah, R. O.

    2013-10-01

    This is the second of a two-part paper series analyzing and evaluating simulations of contemporary climate from the fifth phase of the Coupled Model Intercomparison Project (CMIP5), together with projections for the Greater Horn of Africa (GHA) under the IPCC AR5 Representative Concentration Pathway (RCP) 4.5 and 8.5 scenarios. In the first part (Otieno and Anyah in Clim Dyn, 2012) we focused on the historical simulations, whereas this second part primarily focuses on future projections based on the two scenarios. Six Earth System Models (ESMs) from the CMIP5 archive have been used to characterize projected changes in seasonal and annual mean precipitation, temperature and the hydrological cycle by the middle of the twenty-first century over the GHA region, based on the IPCC Fifth Assessment Report (AR5) RCP4.5 and RCP8.5 scenarios. Nearly all the model outputs analyzed reproduce the correct mean annual cycle of precipitation, with some biases among the models in capturing the correct peak of the precipitation cycle, especially the March-April-May (MAM) seasonal rainfall over the equatorial GHA region. However, there is significant variation among models in projected precipitation anomalies, with some models projecting an average increase while others project a decrease in precipitation during different seasons. The ensemble mean of the ESMs indicates that the GHA region has been experiencing a steady increase in both precipitation and temperature beginning in the early 1980s and 1970s, respectively, in both the RCP4.5 and RCP8.5 scenarios. Based on the ensemble means, temperatures are projected to increase steadily and uniformly in all seasons at a rate of 0.3/0.5 °C per decade under the RCP4.5/8.5 scenarios over the northern GHA region, leading to an approximate temperature increase of 2/3 °C by the middle of the century. On the other hand, temperatures will likely increase at a rate of 0.3/0.4 °C per decade under the RCP4.5/8.5 scenarios in both the equatorial and southern GHA regions, leading to an approximate

  20. Early Career. Harnessing nanotechnology for fusion plasma-material interface research in an in-situ particle-surface interaction facility

    SciTech Connect

    Allain, Jean Paul

    2014-08-08

    This project consisted of fundamental and applied research on advanced in-situ particle-beam interactions with surfaces and interfaces, aimed at discovering novel materials able to tolerate the intense conditions at the plasma-material interface (PMI) in future fusion burning plasma devices. The project established a novel facility capable not only of characterizing new fusion nanomaterials but, more importantly, of probing and manipulating materials at the nanoscale while performing subsequent single-effect in-situ testing of their performance under simulated fusion PMI environments.

  1. WRF4G project: Advances in running climate simulations on the EGI Infrastructure

    NASA Astrophysics Data System (ADS)

    Blanco, Carlos; Cofino, Antonio S.; Fernández Quiruelas, Valvanuz; García, Markel; Fernández, Jesús

    2014-05-01

    The Weather Research and Forecasting for Grid (WRF4G) project is a two-year Spanish National R&D project, which started in 2011. It is now a well-established project, involving scientists and technical staff from several institutions, which contributes results to international initiatives such as CORDEX and European FP7 projects such as SPECS and EUPORIAS. The aim of the WRF4G project is to homogenize access to hybrid Distributed Computing Infrastructures (DCIs), such as HPC and Grid infrastructures, for climate researchers. Additionally, it provides a productive interface for carrying out ambitious climate experiments such as regional hindcast/forecast and sensitivity studies. Although Grid infrastructures are very powerful, they have some drawbacks for executing climate applications such as the WRF model. This makes it necessary to encapsulate the applications in a middleware layer that provides the appropriate services for monitoring and management. The challenge of the WRF4G project is therefore to develop a generic adaptation framework (the WRF4G framework) and disseminate it to the scientific community. The framework aims to simplify model access by freeing climate scientists from technical and computational details. In this contribution, we present new advances in the WRF4G framework, including new components for designing experiments, simulation monitoring and data management. Additionally, we show how WRF4G makes it possible to run complex experiments on EGI infrastructures concurrently over several VOs, such as esr and earth.vo.ibergrid. http://www.meteo.unican.es/software/wrf4g This work has been partially funded by the European Regional Development Fund (ERDF) and the Spanish National R&D Plan 2008-2011 (CGL2011-28864, WRF4G).

  2. World, We Have Problems: Simulation for Large Complex, Risky Projects, and Events

    NASA Technical Reports Server (NTRS)

    Elfrey, Priscilla

    2010-01-01

    Prior to a spacewalk during the NASA STS-129 mission in November 2009, Columbia Broadcasting System (CBS) correspondent William Harwood reported that astronauts "were awakened again", as they had been the day before. Fearing that something not properly connected was causing a leak, the crew, both on the ground and in space, stopped and checked everything. The alarm proved false. The crew did complete its work ahead of schedule, but the incident reminds us that correctly connecting hundreds and thousands of entities, subsystems and systems, finding leaks, loosening stuck valves, and adding replacements to very large complex systems over time does not occur magically. Everywhere, major projects present similar pressures. Lives are at risk. Responsibility is heavy. Large natural and human-created disasters introduce parallel difficulties as people work across the boundaries of their countries, disciplines, languages, and cultures, facing known immediate dangers as well as the unexpected. NASA has long accepted that when humans have to go where humans cannot go, simulation is the sole solution. The Agency uses simulation to achieve consensus, reduce ambiguity and uncertainty, understand problems, make decisions, support design, do planning and troubleshooting, as well as for operations, training, testing, and evaluation. Simulation is at the heart of all such complex systems, products, projects, programs, and events. Difficult, hazardous short- and, especially, long-term activities have a persistent need for simulation, from the first insight into a possibly workable idea or answer until the final report, perhaps beyond our lifetime, is put in the archive. With simulation we create a common mental model, try out breakdowns of machinery or teamwork, and find opportunities for improvement. Lifecycle simulation proves to be increasingly important as risks and consequences intensify. Across the world, disasters are increasing. We anticipate more of them, as the results of global warming

  3. The APOSTLE project: Local Group kinematic mass constraints and simulation candidate selection

    NASA Astrophysics Data System (ADS)

    Fattahi, Azadeh; Navarro, Julio F.; Sawala, Till; Frenk, Carlos S.; Oman, Kyle A.; Crain, Robert A.; Furlong, Michelle; Schaller, Matthieu; Schaye, Joop; Theuns, Tom; Jenkins, Adrian

    2016-03-01

    We use a large sample of isolated dark matter halo pairs drawn from cosmological N-body simulations to identify candidate systems whose kinematics match that of the Local Group (LG) of galaxies. We find, in agreement with the `timing argument' and earlier work, that the separation and approach velocity of the Milky Way (MW) and Andromeda (M31) galaxies favour a total mass for the pair of ~5 × 10^12 M⊙. A mass this large, however, is difficult to reconcile with the small relative tangential velocity of the pair, as well as with the small deceleration from the Hubble flow observed for the most distant LG members. Halo pairs that match these three criteria have average masses a factor of ~2 times smaller than suggested by the timing argument, but with large dispersion. Guided by these results, we have selected 12 halo pairs with total mass in the range 1.6-3.6 × 10^12 M⊙ for the APOSTLE project (A Project Of Simulating The Local Environment), a suite of hydrodynamical resimulations at various numerical resolution levels (reaching up to ~10^4 M⊙ per gas particle) that use the subgrid physics developed for the EAGLE project. These simulations reproduce, by construction, the main kinematics of the MW-M31 pair, and produce satellite populations whose overall number, luminosities, and kinematics are in good agreement with observations of the MW and M31 companions. The APOSTLE candidate systems thus provide an excellent testbed to confront directly many of the predictions of the Λ cold dark matter cosmology with observations of our local Universe.
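
    The timing argument mentioned above treats the MW-M31 pair as a radial Kepler orbit launched at the Big Bang and solves for the total mass consistent with today's separation and approach velocity. A sketch with illustrative input values (not the paper's exact numbers):

      import numpy as np
      from scipy.optimize import brentq

      G = 4.30091e-6            # Newton's constant in kpc (km/s)^2 / Msun
      r, v = 770.0, -109.0      # separation (kpc) and approach velocity (km/s)
      t = 13.8 / 0.9778         # age of the Universe in kpc/(km/s) units

      # Radial Kepler orbit: r = a(1 - cos E), t = sqrt(a^3/GM)(E - sin E),
      # v = sqrt(GM/a) sin E / (1 - cos E). Eliminating a and M leaves:
      f = lambda E: np.sin(E) * (E - np.sin(E)) / (1.0 - np.cos(E))**2 - v * t / r

      E = brentq(f, np.pi + 1e-6, 2.0 * np.pi - 1e-6)   # approaching branch
      a = r / (1.0 - np.cos(E))
      M = a**3 * ((E - np.sin(E)) / t)**2 / G
      print(f"timing-argument mass ~ {M:.1e} Msun")      # a few times 10^12 Msun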

  4. 55Fe effect on enhancing ferritic steel He/dpa ratio in fission reactor irradiations to simulate fusion conditions

    SciTech Connect

    Liu, Haibo; Abdou, Mohamed A.; Greenwood, Lawrence R.

    2013-11-01

    How to increase the ferritic steel He(appm)/dpa ratio in a fission reactor neutron spectrum is an important question for fusion reactor material testing. An early experiment showed that an accelerated He(appm)/dpa ratio of about 2.3 was achieved for 96% enriched 54Fe in iron with 458.2 effective full power days (EFPD) of irradiation in the High Flux Isotope Reactor (HFIR), ORNL. Greenwood suggested that the transmutation-produced 55Fe has a thermal-neutron helium production cross section that may have contributed to this result. In the current work, the ferritic steel He(appm)/dpa ratio is studied in the neutron spectrum of HFIR with 55Fe thermal-neutron helium production taken into account. The available ENDF-B format 55Fe incident-neutron cross section file from TENDL, Netherlands, is first input into the calculation model. A benchmark calculation for the same sample as used in the aforementioned experiment was used to adjust and evaluate the TENDL 55Fe (n,α) cross section values. The analysis shows that a decrease by a factor of 6700 in the TENDL 55Fe (n,α) cross section in the intermediate and low energy regions is required in order to fit the experimental results. The best fit to the cross section value at thermal neutron energy is about 27 mb. With the adjusted 55Fe (n,α) cross sections, calculations show that the 54Fe and 55Fe isotopes can be enriched by the isotopic tailoring technique in a ferritic steel sample irradiated in HFIR to significantly enhance the helium production rate. The results show that a 70% enriched 54Fe and 30% enriched 55Fe ferritic steel sample would produce a He(appm)/dpa ratio of about 13 initially in the HFIR peripheral target position (PTP). After one year of irradiation, the ratio decreases to about 10. This new calculation can be used to guide future isotopic tailoring experiments designed to increase the He(appm)/dpa ratio in fission reactors. A benchmark experiment is suggested to evaluate the 55Fe (n,α) cross section.
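
    The isotopic-tailoring bookkeeping can be sketched as a two-step chain, 54Fe(n,g)55Fe followed by 55Fe(n,a), integrated under an assumed constant thermal flux. Apart from the 27 mb figure quoted above, every number below is an assumption for illustration; a real calculation folds full cross sections over the HFIR spectrum and tracks dpa as well.

      import numpy as np

      phi = 2.0e15             # assumed thermal flux (n/cm^2/s)
      sig_g54 = 2.25e-24       # assumed 54Fe(n,g) thermal cross section, ~2.25 b
      sig_a55 = 27.0e-27       # 55Fe(n,a) cross section, the 27 mb fit quoted above
      lam55 = np.log(2.0) / (2.744 * 3.156e7)   # 55Fe decay constant (1/s)

      n54, n55, he = 0.70, 0.30, 0.0   # tailored initial atom fractions, and helium
      dt = 3600.0                      # 1 h step, explicit Euler
      for _ in range(8760):            # one year
          cap = sig_g54 * phi * n54 * dt             # 54Fe -> 55Fe captures
          loss = (sig_a55 * phi + lam55) * n55 * dt  # 55Fe burnup + decay
          he += sig_a55 * phi * n55 * dt             # helium from 55Fe(n,a)
          n54, n55 = n54 - cap, n55 + cap - loss

      print(f"helium after one year ~ {1.0e6 * he:.0f} appm")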

  5. MEchatronic REspiratory System SImulator for Neonatal Applications (MERESSINA) project: a novel bioengineering goal

    PubMed Central

    Scaramuzzo, Rosa T; Ciantelli, Massimiliano; Baldoli, Ilaria; Bellanti, Lisa; Gentile, Marzia; Cecchi, Francesca; Sigali, Emilio; Tognarelli, Selene; Ghirri, Paolo; Mazzoleni, Stefano; Menciassi, Arianna; Cuttano, Armando; Boldrini, Antonio; Laschi, Cecilia; Dario, Paolo

    2013-01-01

    Respiratory function is mandatory for extrauterine life, but is sometimes impaired in newborns due to prematurity, congenital malformations, or acquired pathologies. Mechanical ventilation is standard care, but long-term complications, such as bronchopulmonary dysplasia, are still widely reported. Therefore, continuous medical education is mandatory for the correct management of assistance devices. Commercially available breathing-function simulators rarely capture the relevant anatomical and physiological realities. The aim of this study is to develop a high-fidelity mechatronic simulator of neonatal airways and lungs for staff training and mechanical ventilator testing. The project is divided into three phases: (1) a review study on respiratory physiology and pathophysiology and on already available single- and multi-compartment models; (2) the prototyping phase; and (3) the on-field system validation. PMID:23966804

  6. Simulation of the time-projection chamber with triple GEMs for the LAMPS at RAON

    NASA Astrophysics Data System (ADS)

    Jhang, Genie; Lee, Jung Woo; Moon, Byul; Hong, Byungsik; Ahn, Jung Keun; Lee, Jong-Won; Lee, Kyong Sei; Kim, Young Jin; Lee, Hyo Sang

    2016-03-01

    The time-projection chamber (TPC) with triple gas-electron multipliers (GEMs) is designed for the large-acceptance multipurpose spectrometer (LAMPS) at RAON, the new radioactive ion-beam facility in Korea (RAON is a pure Korean word adopted as the name of the accelerator complex). The simulation environment has been set up to test the performance of the designed chamber, and the software package for analysis has been developed. Particle identification has been demonstrated to be possible up to 2 GeV/c in momentum for particles with charge numbers 1 and 2 using simulated heavy-ion events. The transverse-momentum resolutions are expected to be about 2% for protons and about 1.3% for pions in the relatively high-momentum region. The total reconstruction efficiencies are estimated to be about 90% and 80% for charged pions and protons, respectively.
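
    The scale of such transverse-momentum resolutions can be checked against the standard Gluckstern formula for a sagitta measurement. The parameter values below are assumptions for illustration, not the actual LAMPS TPC design numbers.

      import math

      def sigma_pt_over_pt(pt, B=0.5, L=1.0, sigma_x=300e-6, N=40):
          """Gluckstern estimate: pt in GeV/c, B in T, lever arm L in m,
          single-point resolution sigma_x in m, N measured points."""
          return sigma_x * pt / (0.3 * B * L**2) * math.sqrt(720.0 / (N + 4))

      for pt in (0.2, 0.5, 1.0):
          print(f"pT = {pt} GeV/c -> sigma_pT/pT = {100 * sigma_pt_over_pt(pt):.2f}%")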

  7. Introducing the Illustris Project: simulating the coevolution of dark and visible matter in the Universe

    NASA Astrophysics Data System (ADS)

    Vogelsberger, Mark; Genel, Shy; Springel, Volker; Torrey, Paul; Sijacki, Debora; Xu, Dandan; Snyder, Greg; Nelson, Dylan; Hernquist, Lars

    2014-10-01

    We introduce the Illustris Project, a series of large-scale hydrodynamical simulations of galaxy formation. The highest resolution simulation, Illustris-1, covers a volume of (106.5 Mpc)^3, has a dark matter mass resolution of 6.26 × 10^6 M⊙, and an initial baryonic matter mass resolution of 1.26 × 10^6 M⊙. At z = 0 gravitational forces are softened on scales of 710 pc, and the smallest hydrodynamical gas cells have an extent of 48 pc. We follow the dynamical evolution of 2 × 1820^3 resolution elements and in addition passively evolve 1820^3 Monte Carlo tracer particles, reaching a total particle count of more than 18 billion. The galaxy formation model includes: primordial and metal-line cooling with self-shielding corrections, stellar evolution, stellar feedback, gas recycling, chemical enrichment, supermassive black hole growth, and feedback from active galactic nuclei. Here we describe the simulation suite, and contrast basic predictions of our model for the present-day galaxy population with observations of the local universe. At z = 0 our simulation volume contains about 40 000 well-resolved galaxies covering a diverse range of morphologies and colours including early-type, late-type and irregular galaxies. The simulation reproduces reasonably well the cosmic star formation rate density, the galaxy luminosity function, and baryon conversion efficiency at z = 0. It also qualitatively captures the impact of galaxy environment on the red fractions of galaxies. The internal velocity structure of selected well-resolved disc galaxies obeys the stellar and baryonic Tully-Fisher relation together with flat circular velocity curves. In the well-resolved regime, the simulation reproduces the observed mix of early-type and late-type galaxies. Our model predicts a halo mass dependent impact of baryonic effects on the halo mass function and the masses of haloes caused by feedback from supernovae and active galactic nuclei.
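
    The quoted element count is quick to verify (a sketch of the arithmetic only):

      n = 1820**3                     # gas cells; an equal number of DM particles
      total = 2 * n + n               # dynamical elements + Monte Carlo tracers
      print(f"{total / 1e9:.1f} billion resolution elements")   # 18.1 billion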

  8. Changing Climate Extremes in the Northeast: CMIP5 Simulations and Projections

    NASA Astrophysics Data System (ADS)

    Thibeault, J. M.; Seth, A.

    2013-12-01

    Extreme climate events are known to have severe impacts on human and natural systems. As greenhouse warming progresses, a major concern is the potential for an increase in the frequency and intensity of extreme events. The Northeast (defined as the Northeast US, southern Quebec, and southeastern Ontario) is sensitive to climate extremes. The region is prone to flooding and drought, which poses challenges for infrastructure and water resource management, and increases risks to agriculture and forests. Extreme heat can be dangerous to human health, especially in the large urban centers of the Northeast. Annual average temperatures have steadily increased since the 1970s, accompanied by more frequent extremely hot weather, a longer growing season, and fewer frost days. Heavy precipitation events have become more frequent in recent decades. This research examines multi-model projections of annual and monthly extreme indices for the Northeast, using extreme indices computed by the Expert Team on Climate Change Detection and Indices (ETCCDI) for twenty-three global climate models participating in the Coupled Model Intercomparison Project Phase 5 (CMIP5) for the 20th century historical and RCP8.5 experiments. Model simulations are compared to HadEX2 and ERA-Interim gridded observations. CMIP5 simulations are consistent with observations: conditions in the Northeast are already becoming warmer and wetter. Projections indicate significant shifts toward warmer and wetter conditions by mid-century (2041-2070). Most indices are projected to be largely outside their late 20th century ranges by the late century (2071-2099). These results provide important information to stakeholders developing plans to lessen the adverse impacts of a warmer and wetter climate in the Northeast.
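
    ETCCDI-style indices of the kind analyzed here are simple to compute from daily series. A sketch with synthetic data standing in for model output, showing frost days (FD) and warm days (TX90p):

      import numpy as np

      rng = np.random.default_rng(1)
      tmin = rng.normal(5.0, 8.0, size=365)             # daily Tmin (degC), synthetic
      tmax_base = rng.normal(15.0, 9.0, size=365 * 30)  # 30-yr baseline Tmax
      tmax = rng.normal(16.5, 9.0, size=365)            # a warmer "future" year

      fd = int((tmin < 0.0).sum())                      # FD: days with Tmin < 0 degC
      p90 = np.percentile(tmax_base, 90)                # baseline 90th percentile
      tx90p = 100.0 * (tmax > p90).mean()               # share of days above it
      print(f"FD = {fd} days, TX90p = {tx90p:.1f}% of days")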

  9. Evaluation of Tropospheric Water Vapor Simulations from the Atmospheric Model Intercomparison Project

    NASA Technical Reports Server (NTRS)

    Gaffen, Dian J.; Rosen, Richard D.; Salstein, David A.; Boyle, James S.

    1997-01-01

    Simulations of humidity from 28 general circulation models for the period 1979-88 from the Atmospheric Model Intercomparison Project are compared with observations from radiosondes over North America and the globe and with satellite microwave observations over the Pacific basin. The simulations of decadal mean values of precipitable water (W) integrated over each of these regions tend to be less moist than the real atmosphere in all three cases; the median model values are approximately 5% less than the observed values. The spread among the simulations is larger over regions of high terrain, which suggests that differences in methods of resolving topographic features are important. The mean elevation of the North American continent is substantially higher in the models than is observed, which may contribute to the overall dry bias of the models over that area. The authors do not find a clear association between the mean topography of a model and its mean W simulation, however, which suggests that the bias over land is not purely a matter of orography. The seasonal cycle of W is reasonably well simulated by the models, although over North America they have a tendency to become moister more quickly in the spring than is observed. The interannual component of the variability of W is not well captured by the models over North America. Globally, the simulated W values show a signal correlated with the Southern Oscillation index but the observations do not. This discrepancy may be related to deficiencies in the radiosonde network, which does not sample the tropical ocean regions well. Overall, the interannual variability of W, as well as its climatology and mean seasonal cycle, are better described by the median of the 28 simulations than by individual members of the ensemble. Tests to learn whether simulated precipitable water, evaporation, and precipitation values may be related to aspects of model formulation yield few clear signals, although the authors find, for
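
    Precipitable water, the quantity compared across the 28 models, is the mass-weighted column integral of specific humidity, W = (1/g) ∫ q dp. A sketch with an illustrative sounding, not radiosonde data:

      import numpy as np

      g = 9.81                                                  # m/s^2
      p = np.array([1000, 925, 850, 700, 500, 300]) * 100.0     # pressure levels (Pa)
      q = np.array([12.0, 10.0, 8.0, 5.0, 2.0, 0.3]) / 1000.0   # specific humidity (kg/kg)

      # Integrate from surface (high p) to top; negate because p decreases.
      W = -np.trapz(q, p) / g        # kg/m^2, numerically equal to mm of water
      print(f"W = {W:.1f} mm")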

  10. Validation of Solar Sail Simulations for the NASA Solar Sail Demonstration Project

    NASA Technical Reports Server (NTRS)

    Braafladt, Alexander C.; Artusio-Glimpse, Alexandra B.; Heaton, Andrew F.

    2014-01-01

    NASA's Solar Sail Demonstration project partner L'Garde is currently assembling a flight-like sail assembly for a series of ground demonstration tests beginning in 2015. For future missions of this sail that might validate solar sail technology, it is necessary to have an accurate sail thrust model. One of the primary requirements of a proposed potential technology validation mission will be to demonstrate solar sail thrust over a set time period, which for this project is nominally 30 days. This requirement would be met by comparing a L'Garde-developed trajectory simulation to the as-flown trajectory. The current sail simulation baseline for L'Garde is a Systems Tool Kit (STK) plug-in that includes a custom-designed model of the L'Garde sail. The STK simulation has been verified for a flat plate model by comparing it to the NASA-developed Solar Sail Spaceflight Simulation Software (S5). S5 matched STK with a high degree of accuracy, and the results of the validation indicate that the L'Garde STK model is accurate enough to meet the potential future mission requirements. Additionally, since the L'Garde sail deviates considerably from a flat plate, a force model for the non-flat sail provided by L'Garde was also tested and compared to a flat plate model in S5. This result will be used in the future as a basis of comparison to the non-flat sail model being developed for STK.
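
    The flat-plate reference used in such comparisons is the ideal specular sail, whose thrust along the sail normal is F = 2(SA/c)cos²α. A sketch with assumed sail parameters (not L'Garde's design values):

      import math

      S, c = 1361.0, 2.998e8        # solar irradiance at 1 AU (W/m^2), speed of light (m/s)
      A = 1200.0                    # assumed sail area (m^2)
      for deg in (0, 30, 60):
          a = math.radians(deg)     # sun incidence angle on the sail normal
          F = 2.0 * S * A / c * math.cos(a)**2    # ideal specular flat plate
          print(f"alpha = {deg:2d} deg -> thrust = {1e3 * F:.2f} mN")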

  11. Fusion Power.

    ERIC Educational Resources Information Center

    Dingee, David A.

    1979-01-01

    Discusses the extraordinary potential, the technical difficulties, and the financial problems that are associated with research and development of fusion power plants as a major source of energy. (GA)

  12. Numerical tokamak turbulence project (OFES grand challenge)

    SciTech Connect

    Beer, M; Cohen, B I; Crotinger, J; Dawson, J; Decyk, V; Dimits, A M; Dorland, W D; Hammett, G W; Kerbel, G D; Leboeuf, J N; Lee, W W; Lin, Z; Nevins, W M; Reynders, J; Shumaker, D E; Smith, S; Sydora, R; Waltz, R E; Williams, T

    1999-08-27

    The primary research objective of the Numerical Tokamak Turbulence Project (NTTP) is to develop a predictive ability in modeling turbulent transport due to drift-type instabilities in the core of tokamak fusion experiments, through the use of three-dimensional kinetic and fluid simulations and the derivation of reduced models.

  13. Education and Public Outreach at The Pavilion Lake Research Project: Fusion of Science and Education using Web 2.0

    NASA Astrophysics Data System (ADS)

    Cowie, B. R.; Lim, D. S.; Pendery, R.; Laval, B.; Slater, G. F.; Brady, A. L.; Dearing, W. L.; Downs, M.; Forrest, A.; Lees, D. S.; Lind, R. A.; Marinova, M.; Reid, D.; Seibert, M. A.; Shepard, R.; Williams, D.

    2009-12-01

    The Pavilion Lake Research Project (PLRP) is an international multi-disciplinary science and exploration effort to explain the origin and preservation potential of freshwater microbialites in Pavilion Lake, British Columbia, Canada. Using multiple exploration platforms including one-person DeepWorker submersibles, Autonomous Underwater Vehicles, and SCUBA divers, the PLRP acts as an analogue research site for conducting science in extreme environments, such as the Moon or Mars. In 2009, the PLRP integrated several Web 2.0 technologies to provide a pilot-scale Education and Public Outreach (EPO) program targeting the internet-savvy generation. The seamless integration of multiple technologies, including Google Earth, Wordpress, YouTube, Twitter and Facebook, facilitated the rapid distribution of exciting and accessible science and exploration information over multiple channels. Field updates, science reports, and multimedia including videos, interactive maps, and immersive visualization were rapidly available through multiple social media channels, partly due to the ease of integration of these technologies. Additionally, the successful application of videoconferencing via a readily available technology (Skype) has greatly increased the capacity of our team to conduct real-time education and public outreach from remote locations. The improved communication afforded by Web 2.0 has increased the quality of EPO provided by the PLRP, and has enabled a higher level of interaction between the science team and the community at large. Feedback from these online interactions suggests that remote communication via Web 2.0 technologies was an effective tool for increasing public discourse and awareness of the science and exploration activity at Pavilion Lake.

  14. Effects of baseline conditions on the simulated hydrologic response to projected climate change

    USGS Publications Warehouse

    Koczot, Kathryn M.; Markstrom, Steven L.; Hay, Lauren E.

    2011-01-01

    Changes in temperature and precipitation projected from five general circulation models, using one late-twentieth-century and three twenty-first-century emission scenarios, were downscaled to three different baseline conditions. Baseline conditions are periods of measured temperature and precipitation data selected to represent twentieth-century climate. The hydrologic effects of the climate projections are evaluated using the Precipitation-Runoff Modeling System (PRMS), a watershed hydrology simulation model. The Almanor Catchment in the North Fork of the Feather River basin, California, is used as a case study. Differences and similarities between PRMS simulations of hydrologic components (i.e., snowpack formation and melt, evapotranspiration, and streamflow) are examined, and results indicate that the selection of a specific time period used for baseline conditions has a substantial effect on some, but not all, hydrologic variables. This effect seems to be amplified in hydrologic variables that accumulate over time, such as soil-moisture content. Results also indicate that uncertainty related to the selection of baseline conditions should be evaluated using a range of different baseline conditions. This is particularly important for studies in basins with highly variable climate, such as the Almanor Catchment.

  15. Final Report on Project 01-ERD-017 ''Smart Nanostructures From Computer Simulations''

    SciTech Connect

    Grossman, J C; Williamson, A J

    2004-02-13

    This project had two main objectives. The first major goal was to develop new, powerful computational simulation capabilities. It was important that these tools combine the accuracy needed to describe the quantum mechanical nature of nanoscale systems with the efficiency required to be applied to realistic, experimentally derived materials. The second major goal was to apply these computational methods to calculate and predict the properties of quantum dots (initially composed of silicon, and later of other elements) which could be used to build novel nanotechnology devices. The driving motivation has been that, through the development and successful application of these tools, we would generate a new capability at LLNL that could be used to make nanostructured materials ''smarter'', e.g., by predicting how to engineer specific, desired properties. To carry out the necessary work to successfully complete this project and deliver on our goals, we established a two-pronged effort from the beginning: (1) to work on developing new, more efficient algorithms and quantum simulation tools, and (2) to solve problems and make predictions regarding properties of quantum dots that were being studied experimentally here at Livermore.

  16. The Geoengineering Model Intercomparison Project Phase 6 (GeoMIP6): simulation design and preliminary results

    NASA Astrophysics Data System (ADS)

    Kravitz, B.; Robock, A.; Tilmes, S.; Boucher, O.; English, J. M.; Irvine, P. J.; Jones, A.; Lawrence, M. G.; MacCracken, M.; Muri, H.; Moore, J. C.; Niemeier, U.; Phipps, S. J.; Sillmann, J.; Storelvmo, T.; Wang, H.; Watanabe, S.

    2015-06-01

    We present a suite of new climate model experiment designs for the Geoengineering Model Intercomparison Project (GeoMIP). This set of experiments, named GeoMIP6 (to be consistent with the Coupled Model Intercomparison Project Phase 6), builds on the previous GeoMIP simulations, and has been expanded to address several further important topics, including key uncertainties in extreme events, the use of geoengineering as part of a portfolio of responses to climate change, and the relatively new idea of cirrus cloud thinning to allow more longwave radiation to escape to space. We discuss experiment designs, as well as the rationale for those designs, showing preliminary results from individual models when available. We also introduce a new feature, called the GeoMIP Testbed, which provides a platform for simulations that will be performed with a few models and subsequently assessed to determine whether the proposed experiment designs will be adopted as core (Tier 1) GeoMIP experiments. This is meant to encourage various stakeholders to propose new targeted experiments that address their key open science questions, with the goal of making GeoMIP more relevant to a broader set of communities.

  17. A National Collaboratory to Advance the Science of High Temperature Plasma Physics for Magnetic Fusion

    SciTech Connect

    Schissel, David P.; Abla, G.; Burruss, J. R.; Feibush, E.; Fredian, T. W.; Goode, M. M.; Greenwald, M. J.; Keahey, K.; Leggett, T.; Li, K.; McCune, D. C.; Papka, M. E.; Randerson, L.; Sanderson, A.; Stillerman, J.; Thompson, M. R.; Uram, T.; Wallace, G.

    2012-12-20

    This report summarizes the work of the National Fusion Collaboratory (NFC) Project to develop a persistent infrastructure to enable scientific collaboration for magnetic fusion research. The original objective of the NFC project was to develop and deploy a national FES Grid (FusionGrid) that would be a system for secure sharing of computation, visualization, and data resources over the Internet. The goal of FusionGrid was to allow scientists at remote sites to participate as fully in experiments and computational activities as if they were working on site, thereby creating a unified virtual organization of the geographically dispersed U.S. fusion community. The vision for FusionGrid was that experimental and simulation data, computer codes, analysis routines, visualization tools, and remote collaboration tools would be thought of as network services. In this model, an application service provider (ASP) provides and maintains software resources as well as the necessary hardware resources. The project would create a robust, user-friendly collaborative software environment and make it available to the US FES community. This Grid's resources would be protected by a shared security infrastructure, including strong authentication to identify users and authorization to allow stakeholders to control their own resources. In this environment, access to services is stressed rather than data or software portability.

  18. High Level Information Fusion (HLIF) with nested fusion loops

    NASA Astrophysics Data System (ADS)

    Woodley, Robert; Gosnell, Michael; Fischer, Amber

    2013-05-01

    Situation modeling and threat prediction require higher levels of data fusion in order to provide actionable information. Beyond the sensor data and sources the analyst has access to, the use of out-sourced and re-sourced data is becoming common. Through the years, some common frameworks have emerged for dealing with information fusion—perhaps the most ubiquitous being the JDL Data Fusion Group and their initial 4-level data fusion model. Since these initial developments, numerous models of information fusion have emerged, hoping to better capture the human-centric process of data analyses within a machine-centric framework. 21st Century Systems, Inc. has developed Fusion with Uncertainty Reasoning using Nested Assessment Characterizer Elements (FURNACE) to address challenges of high level information fusion and handle bias, ambiguity, and uncertainty (BAU) for Situation Modeling, Threat Modeling, and Threat Prediction. It combines JDL fusion levels with nested fusion loops and state-of-the-art data reasoning. Initial research has shown that FURNACE is able to reduce BAU and improve the fusion process by allowing high level information fusion (HLIF) to affect lower levels without the double counting of information or other biasing issues. The initial FURNACE project was focused on the underlying algorithms to produce a fusion system able to handle BAU and repurposed data in a cohesive manner. FURNACE supports analyst's efforts to develop situation models, threat models, and threat predictions to increase situational awareness of the battlespace. FURNACE will not only revolutionize the military intelligence realm, but also benefit the larger homeland defense, law enforcement, and business intelligence markets.

  19. The numerical simulation and analysis of three-dimensional seawater intrusion and protection projects in porous media

    NASA Astrophysics Data System (ADS)

    Yuan, Yirang; Liang, Dong; Rui, Hongxing

    2009-01-01

    For the three-dimensional seawater intrusion and protection system, a model of the dynamics of fluids in porous media and modified upwind finite-difference fractional-steps schemes are put forward. Based on numerical simulation of the practical situation in the Laizhou Bay area of Shandong Province, predictive numerical simulation and analysis of the consequences of protection projects, underground dams, tidal barrage projects and the applied modular form of project adjustment have been completed. Using the theory and techniques of a priori estimates for differential equations, convergence results have been obtained.
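
    The upwind idea at the core of such schemes can be shown in one dimension: difference against the upstream neighbour so information follows the flow. A minimal sketch with a uniform velocity and illustrative numbers (the paper's fractional-steps machinery is omitted):

      import numpy as np

      nx, dx, dt, u = 100, 10.0, 1.0, 2.0    # cells, m, s, velocity m/s (u > 0)
      c = np.zeros(nx); c[:10] = 35.0        # salinity front (e.g., g/L)

      for _ in range(200):
          # Upwind difference: use the upstream neighbour. The CFL number
          # u*dt/dx = 0.2 <= 1 keeps the explicit update stable.
          c[1:] = c[1:] - u * dt / dx * (c[1:] - c[:-1])

      print(f"front has advanced ~{200 * u * dt:.0f} m")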

  20. Simulated effect of deep-sea sedimentation and terrestrial weathering on projections of ocean acidification

    NASA Astrophysics Data System (ADS)

    Cao, Long; Zheng, Meidi; Caldeira, Ken

    2016-04-01

    Projections of ocean acidification have often been based on ocean carbon cycle models that do not represent deep-sea sedimentation and terrestrial weathering. Here we use an Earth system model of intermediate complexity to quantify the effect of sedimentation and weathering on projections of ocean acidification under an intensive CO2 emission scenario that releases 5000 Pg C after year 2000. In our simulations, atmospheric CO2 reaches a peak concentration of 2123 ppm near year 2300 with a maximum reduction in surface pH of 0.8. Consideration of deep-sea sedimentation and terrestrial weathering has negligible effect on these peak changes. Only after several millennia do sedimentation and weathering feedbacks substantially affect projected ocean acidification. Ten thousand years from today, in the constant-alkalinity simulation, surface pH is reduced by ~0.7, with 95% of the polar oceans undersaturated with respect to calcite and no ocean having a calcite saturation horizon (CSH) deeper than 1000 m. With the consideration of sediment feedback alone, surface pH is reduced by ~0.5, with 35% of the polar oceans experiencing calcite undersaturation and 8% of the global ocean having a CSH deeper than 1000 m. With the addition of weathering feedback, depending on the weathering parameterizations, surface pH is reduced by 0.2-0.4, with no polar oceans experiencing calcite undersaturation and 30-80% of the ocean having a CSH deeper than 1000 m. Our results indicate that deep-sea sedimentation and terrestrial weathering play an important role in long-term ocean acidification, but have little effect on mitigating ocean acidification in the coming centuries.
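
    The link between pCO2 and the quoted pH declines can be sketched by solving the carbonate-alkalinity balance for [H+] at fixed alkalinity. The equilibrium constants below are rough surface-seawater values assumed for illustration, so the resulting pH shifts are indicative only; a real model uses full temperature- and salinity-dependent chemistry.

      import math
      from scipy.optimize import brentq

      K0, K1, K2 = 3.2e-2, 1.4e-6, 1.1e-9   # assumed Henry's law and carbonic acid constants
      ALK = 2.3e-3                           # carbonate alkalinity (mol/kg), held fixed

      def ph_at(pco2_uatm):
          co2 = K0 * pco2_uatm * 1e-6        # Henry's law: dissolved CO2 (mol/kg)
          # Carbonate alkalinity = [HCO3-] + 2[CO3--], each written via [H+]:
          f = lambda h: K1 * co2 / h + 2.0 * K1 * K2 * co2 / h**2 - ALK
          return -math.log10(brentq(f, 1e-12, 1e-4))

      for pco2 in (280, 400, 2123):          # pre-industrial, modern, peak scenario
          print(f"pCO2 = {pco2:4d} uatm -> pH ~ {ph_at(pco2):.2f}")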

  1. Simulation of Regional Explosion S-Phases (SiRES) Project

    SciTech Connect

    Myers, S C; Wagner, J; Larsen, S; Rodgers, A; Mayeda, K; Smith, K; Walter, W

    2003-07-01

    Seismic monitoring aims to locate and identify all events that generate elastic waves in the solid earth. Amplitudes and arrival times of seismic phases are commonly exploited to accomplish these goals. For large events that produce strong body and surface waves out to 90°, events can be accurately located and the Ms/mb discriminant can be used to distinguish earthquakes from explosions. As event yield/magnitude decreases, the probability of detecting signals at distant stations diminishes, and monitoring relies heavily on regional-distance stations (between about 2° and 13°). Many regional discriminant and magnitude methods make use of S-phase amplitude measurements, which are expected to diminish for explosion sources. Despite the general success of regional monitoring methods, there remain instances when explosions produce anomalously large S-phases that confound regional discriminants. The 2-year Simulation of Regional Explosion S-phases (SiRES) project explores the phenomenology of regional S-phase generation through numerical simulation. In the first year of this study we construct a 3-dimensional model of the Nevada Test Site (NTS) and the surrounding region. Extensive databases of geologic information, including existing 3-dimensional models developed under past and ongoing NTS programs, are used to construct the regional model. In addition to deterministic geologic structure and topography, we introduce stochastic variability within geologic units and along boundaries. The stochastic variability provides a more realistic simulation of the regional wave field, which is known to consist largely of scattered energy. Here we introduce the project and report on the first few months of progress.

  2. Massively parallel simulation with DOE's ASCI supercomputers : an overview of the Los Alamos Crestone project

    SciTech Connect

    Weaver, R. P.; Gittings, M. L.

    2004-01-01

    The Los Alamos Crestone Project is part of the Department of Energy's (DOE) Accelerated Strategic Computing Initiative, or ASCI Program. The main goal of this software development project is to investigate the use of continuous adaptive mesh refinement (CAMR) techniques for application to problems of interest to the Laboratory. There are many code development efforts in the Crestone Project, both unclassified and classified codes. In this overview I will discuss the unclassified SAGE and RAGE codes. The SAGE (SAIC adaptive grid Eulerian) code is a one-, two-, and three-dimensional multimaterial Eulerian massively parallel hydrodynamics code for use in solving a variety of high-deformation flow problems. The RAGE CAMR code is built from the SAGE code by adding various radiation packages, improved setup utilities and graphics packages, and is used for problems in which radiation transport of energy is important. The goal of these massively parallel versions of the codes is to run extremely large problems in a reasonable amount of calendar time. Our target is scalable performance to ~10,000 processors on a 1 billion CAMR computational cell problem that requires hundreds of variables per cell, multiple physics packages (e.g. radiation and hydrodynamics), and implicit matrix solves for each cycle. A general description of the RAGE code has been published in [1], [2], [3] and [4]. Currently, the largest simulations we do are three-dimensional, using around 500 million computational cells and running for literally months of calendar time using ~2000 processors. Current ASCI platforms range from several 3-teraOPS supercomputers to one 12-teraOPS machine at Lawrence Livermore National Laboratory, the White machine, and one 20-teraOPS machine installed at Los Alamos, the Q machine. Each machine is a system comprised of many component parts that must perform in unity for the successful run of these simulations. Key features of any massively parallel system

  3. Simulation technology used for risky assessment in deep exploration project in China

    NASA Astrophysics Data System (ADS)

    jiao, J.; Huang, D.; Liu, J.

    2013-12-01

    Deep exploration has been carried out in China for five years, employing various heavy-duty instruments and equipment for gravity, magnetic, seismic and electromagnetic prospecting, as well as an ultra-deep drilling rig for obtaining deep samples. Deep exploration is a large and complex systems-engineering effort that crosses multiple disciplines and requires great investment. Advanced technical means are necessary for the verification, appraisal, and optimization of geophysical prospecting equipment under development. To reduce the risks of application and exploration, efficient and intelligible management concepts and skills have to be enhanced in order to consolidate the management measures and workflows that benefit this ambitious project. Therefore, evidence, prediction, evaluation and related decision strategies have to be considered simultaneously to meet practical scientific requirements, technical limits, and extendable attempts. Simulation is proposed as a tool for carrying out dynamic tests on actual or imagined systems. In practice, it is necessary to combine simulation techniques with the instruments and equipment to accomplish R&D tasks. In this paper, simulation techniques are introduced into the R&D process for heavy-duty equipment and high-end engineering project technology. Based on information recently provided by a drilling group, a digital model is constructed by combining geographical data, 3D visualization, database management, and virtual reality technologies. The result is an R&D strategy in which data processing, instrument application, expected results and uncertainty, and even the effects of the operational workflow environment are simulated systematically or simultaneously, in order to obtain an optimal outcome as well as an equipment-updating strategy. The simulation technology makes it possible to adjust, verify, appraise and optimize the primary plan in response to changes in

  4. The Southwest Indian Ocean thermocline dome in CMIP5 models: Historical simulation and future projection

    NASA Astrophysics Data System (ADS)

    Zheng, Xiao-Tong; Gao, Lihui; Li, Gen; Du, Yan

    2016-04-01

    Using 20 models of the Coupled Model Intercomparison Project Phase 5 (CMIP5), the simulation of the Southwest Indian Ocean (SWIO) thermocline dome is evaluated and its role in shaping the Indian Ocean Basin (IOB) mode following El Niño is investigated. In most of the CMIP5 models, due to an easterly wind bias along the equator, the simulated SWIO thermocline is too deep, which can further influence the amplitude of the interannual IOB mode. A model with a shallow (deep) thermocline dome tends to simulate a strong (weak) IOB mode, including key attributes such as the SWIO SST warming, the antisymmetric pattern during boreal spring, and the second North Indian Ocean warming during boreal summer. Under global warming, the thermocline dome deepens with the easterly wind trend along the equator in most of the models. However, the IOB amplitude does not follow this change of the SWIO thermocline among the models; rather, it follows future changes in both ENSO forcing and local convection feedback, suggesting a decreasing effect of the deepening SWIO thermocline dome on the change in the IOB mode in the future.

  5. Numerical simulations of instabilities in the implosion process of inertial confined fusion in 2D cylindrical coordinates

    NASA Astrophysics Data System (ADS)

    Yong, Heng; Zhai, ChuanLei; Jiang, Song; Song, Peng; Dai, ZhenSheng; Gu, JianFa

    2016-01-01

    In this paper, we introduce a multi-material arbitrary Lagrangian-Eulerian method for the hydrodynamic radiative multi-group diffusion model in 2D cylindrical coordinates. The basic idea in the construction of the method is the following: In the Lagrangian step, a closure model of radiation-hydrodynamics is used to give the equations of state for materials in mixed cells. In the mesh rezoning step, we couple the rezoning principle with the Lagrangian interface tracking method and an Eulerian interface capturing scheme to compute interfaces sharply according to their deformation and to keep cells in good geometric quality. In the interface reconstruction step, a dual-material Moment-of-Fluid method is introduced to obtain a unique interface in mixed cells. In the remapping step, a conservative remapping algorithm for conserved quantities is presented. A number of numerical tests are carried out, and the numerical results show that the new method can simulate instabilities in complex flow fields under large deformation, and is accurate and robust.
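
    The conservation requirement on the remapping step can be illustrated in one dimension: redistribute cell masses by geometric overlap so the total is unchanged. A first-order sketch (real ALE remaps are higher order):

      import numpy as np

      old_edges = np.linspace(0.0, 1.0, 11)          # 10 old cells
      new_edges = np.sort(np.random.default_rng(2).uniform(0.0, 1.0, 9))
      new_edges = np.concatenate([[0.0], new_edges, [1.0]])
      rho_old = np.linspace(1.0, 2.0, 10)            # old cell densities

      mass_new = np.zeros(len(new_edges) - 1)
      for j in range(len(new_edges) - 1):
          for i in range(len(old_edges) - 1):
              # length of overlap between new cell j and old cell i
              lo = max(new_edges[j], old_edges[i])
              hi = min(new_edges[j + 1], old_edges[i + 1])
              if hi > lo:
                  mass_new[j] += rho_old[i] * (hi - lo)

      rho_new = mass_new / np.diff(new_edges)
      # Total mass is conserved exactly by construction:
      print(np.isclose(mass_new.sum(), (rho_old * np.diff(old_edges)).sum()))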

  6. Two-dimensional simulations of thermonuclear burn in ignition-scale inertial confinement fusion targets under compressed axial magnetic fields

    SciTech Connect

    Perkins, L. J.; Logan, B. G.; Zimmerman, G. B.; Werner, C. J.

    2013-07-15

    We report for the first time on full 2-D radiation-hydrodynamic implosion simulations that explore the impact of highly compressed imposed magnetic fields on the ignition and burn of perturbed spherical implosions of ignition-scale cryogenic capsules. Using perturbations that highly convolute the cold fuel boundary of the hotspot and prevent ignition without applied fields, we impose initial axial seed fields of 20–100 T (potentially attainable using present experimental methods) that compress to greater than 4 × 10^4 T (400 MG) under implosion, thereby relaxing hotspot areal densities and pressures required for ignition and propagating burn by ~50%. The compressed field is high enough to suppress transverse electron heat conduction, and to allow alphas to couple energy into the hotspot even when highly deformed by large low-mode amplitudes. This might permit the recovery of ignition, or at least significant alpha particle heating, in submarginal capsules that would otherwise fail because of adverse hydrodynamic instabilities.
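
    The quoted amplification follows from flux freezing: an axial field threading a converging fuel column scales as B ≈ B0 (R0/R)². A one-line check with an assumed convergence ratio:

      for B0 in (20.0, 50.0, 100.0):    # seed fields (T), the range quoted above
          conv = 30.0                   # assumed radial convergence ratio R0/R
          print(f"B0 = {B0:5.1f} T -> B ~ {B0 * conv**2:.0f} T")   # 50 T -> 4.5e4 T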

  7. Future of Inertial Fusion Energy

    SciTech Connect

    Nuckolls, J H; Wood, L L

    2002-09-04

    In the past 50 years, fusion R&D programs have made enormous technical progress. Projected billion-dollar scale research facilities are designed to approach net energy production. In this century, scientific and engineering progress must continue until the economics of fusion power plants improves sufficiently to win large scale private funding in competition with fission and non-nuclear energy systems. This economic advantage must be sustained: trillion dollar investments will be required to build enough fusion power plants to generate ten percent of the world's energy. For Inertial Fusion Energy, multi-billion dollar driver costs must be reduced by up to an order of magnitude, to a small fraction of the total cost of the power plant. Major cost reductions could be achieved via substantial improvements in target performance-both higher gain and reduced ignition energy. Large target performance improvements may be feasible through a combination of design innovations, e.g., ''fast ignition,'' propagation down density gradients, and compression of fusion fuel with a combination of driver and chemical energy. The assumptions that limit projected performance of fusion targets should be carefully examined. The National Ignition Facility will enable development and testing of revolutionary targets designed to make possible economically competitive fusion power plants.

  8. Converting Snow Depth to SWE: The Fusion of Simulated Data with Remote Sensing Retrievals and the Airborne Snow Observatory

    NASA Astrophysics Data System (ADS)

    Bormann, K.; Marks, D. G.; Painter, T. H.; Hedrick, A. R.; Deems, J. S.

    2015-12-01

    Snow cover monitoring has greatly benefited from remote sensing technology but, despite their critical importance, spatially distributed measurements of snow water equivalent (SWE) in mountain terrain remain elusive. Current methods of monitoring SWE rely on point measurements and are insufficient for distributed snow science and effective management of water resources. Many studies have shown that the spatial variability in SWE is largely controlled by the spatial variability in snow depth. JPL's Airborne Snow Observatory mission (ASO) combines LiDAR and spectrometer instruments to retrieve accurate and very high-resolution snow depth measurements at the watershed scale, along with other products such as snow albedo. To make best use of these high-resolution snow depths, spatially distributed snow density data are required to convert the measured snow depths to SWE. Snow density is a spatially and temporally variable property that cannot yet be reliably extracted from remote sensing techniques, and is difficult to extrapolate to basin scales. However, some physically based snow models have shown skill in simulating bulk snow densities and therefore provide a pathway for snow depth to SWE conversion. Leveraging model ability where remote sensing options are non-existent, ASO employs a physically based snow model (iSnobal) to resolve distributed snow density dynamics across the basin. After an adjustment scheme guided by in-situ data, these density estimates are used to derive the elusive spatial distribution of SWE from the observed snow depth distributions from ASO. In this study, we describe how the process of fusing model data with remote sensing retrievals is undertaken in the context of ASO, along with estimates of uncertainty in the final SWE volume products. This work will likely be of interest to those working in snow hydrology, water resource management and the broader remote sensing community.
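
    The depth-to-SWE conversion at the heart of this approach is a pointwise product of lidar depth and modeled bulk density. A sketch with illustrative values:

      import numpy as np

      depth = np.array([0.8, 1.5, 2.3])            # lidar snow depths (m), illustrative
      density = np.array([320.0, 350.0, 410.0])    # modeled bulk densities (kg/m^3)
      swe_mm = depth * density                     # kg/m^2, i.e. mm of water equivalent
      print(swe_mm)                                # [256. 525. 943.]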

  9. Regional-scale rainfall projections: Simulations for the New Guinea region using the CCAM model

    NASA Astrophysics Data System (ADS)

    Smith, Ian; Moise, Aurel; Katzfey, Jack; Nguyen, Kim; Colman, Rob

    2013-02-01

    A common problem with global climate models is the fact that their grids do not always resolve important topographic features which determine the spatial variability of rainfall at regional scales. Here we present and compare simulations of rainfall for the relatively mountainous New Guinea region from six relatively coarse resolution climate models and the corresponding results using a higher resolution model (the Conformal Cubic Atmospheric Model—CCAM). While the large-scale climatological mean rainfall from both the coarse models and CCAM tend to be similar, unsurprisingly, the CCAM results better reflect some of the important topographic effects. However, the results for projected changes (under the A2 emissions scenario) to rainfall for later this century reveal some important differences. The coarse-scale results indicate relatively smooth patterns of projected change consistent with the representations of the underlying topography, but over New Guinea, there is little agreement on the sign of the change. The CCAM projections show greater spatial detail and better agreement among the six members. These indicate that West Papua and the relatively wet northern and southern mountain slopes may get wetter during December to February—the peak of the Austral monsoon season, and the highland regions may actually become drier during June to August—the dry season. These results are consistent with the theoretical concept that warmer temperatures may lead to increases over already wet regions and decreases over the relatively drier regions—the so-called "rich-get-richer" mechanism. They also highlight the fact that the climate of mountainous regions can be relatively complex and indicate potential difficulties that can arise when attempting to synthesize regional-scale projections from coarse-scale models.

  10. (Fusion energy research)

    SciTech Connect

    Phillips, C.A.

    1988-01-01

    This report discusses the following topics: principal parameters achieved in experimental devices (FY88); tokamak fusion test reactor; Princeton beta Experiment-Modification; S-1 Spheromak; current drive experiment; x-ray laser studies; spacecraft glow experiment; plasma deposition and etching of thin films; theoretical plasma; tokamak modeling; compact ignition tokamak; international thermonuclear experimental reactor; Engineering Department; Project Planning and Safety Office; quality assurance and reliability; and technology transfer.

  11. Hardware-Accelerated Simulated Radiography

    SciTech Connect

    Laney, D; Callahan, S; Max, N; Silva, C; Langer, S; Frank, R

    2005-08-04

    We present the application of hardware accelerated volume rendering algorithms to the simulation of radiographs as an aid to scientists designing experiments, validating simulation codes, and understanding experimental data. The techniques presented take advantage of 32-bit floating point texture capabilities to obtain solutions to the radiative transport equation for X-rays. The hardware accelerated solutions are accurate enough to enable scientists to explore the experimental design space with greater efficiency than the methods currently in use. An unsorted hexahedron projection algorithm is presented for curvilinear hexahedral meshes that produces simulated radiographs in the absorption-only regime. A sorted tetrahedral projection algorithm is presented that simulates radiographs of emissive materials. We apply the tetrahedral projection algorithm to the simulation of experimental diagnostics for inertial confinement fusion experiments on a laser at the University of Rochester.
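
    In the absorption-only regime, each detector pixel records the attenuated beam I = I0 exp(-∫ mu ds). A dense-grid sketch of that line integral (axis-aligned rays; the paper's GPU hexahedron projection handles general curvilinear meshes):

      import numpy as np

      mu = np.zeros((64, 64, 64))       # attenuation coefficients (1/cm)
      mu[24:40, 24:40, 24:40] = 0.7     # an absorbing block in vacuum
      ds = 0.1                          # path length per cell (cm)

      # Rays parallel to z: the path integral is a sum along the z axis.
      optical_depth = mu.sum(axis=2) * ds
      image = np.exp(-optical_depth)    # transmitted fraction per pixel
      print(image.min(), image.max())   # ~0.33 behind the block, 1.0 elsewhere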

  12. To Create Space on Earth: The Space Environment Simulation Laboratory and Project Apollo

    NASA Technical Reports Server (NTRS)

    Walters, Lori C.

    2003-01-01

    Few undertakings in the history of humanity can compare to the great technological achievement known as Project Apollo. Among those who witnessed Armstrong's flickering television image were thousands of people who had directly contributed to this historic moment. Among those in this vast anonymous cadre were the personnel of the Space Environment Simulation Laboratory (SESL) at the Manned Spacecraft Center (MSC) in Houston, Texas. SESL houses two large thermal-vacuum chambers with solar simulation capabilities. At a time when NASA engineers had a limited understanding of the effects of the extremes of space on hardware and crews, SESL was designed to literally create the conditions of space on Earth. With interior dimensions of 90 feet in height and a 55-foot diameter, Chamber A dwarfed the Apollo command/service module (CSM) it was constructed to test. The chamber's vacuum pumping capacity of 1 × 10^-6 torr can simulate an altitude greater than 130 miles above the Earth. A "lunar plane" capable of rotating a 150,000-pound test vehicle 180° replicates the revolution of a craft in space. To reproduce the temperature extremes of space, interior chamber walls cool to -280°F as two banks of carbon arc modules simulate the unfiltered solar light/heat of the Sun. With capabilities similar to those of Chamber A, early Chamber B tests included the Gemini modular maneuvering unit, Apollo EVA mobility unit and the lunar module. Since Gemini astronaut Charles Bassett first ventured into the chamber in 1966, Chamber B has assisted astronauts in testing hardware and preparing them for work in the harsh extremes of space.

  13. POST2 End-To-End Descent and Landing Simulation for the Autonomous Landing and Hazard Avoidance Technology Project

    NASA Technical Reports Server (NTRS)

    Fisher, Jody L.; Striepe, Scott A.

    2007-01-01

    The Program to Optimize Simulated Trajectories II (POST2) is used as a basis for an end-to-end descent and landing trajectory simulation that is essential in determining the design and performance capability of lunar descent and landing system models and lunar environment models for the Autonomous Landing and Hazard Avoidance Technology (ALHAT) project. This POST2-based ALHAT simulation provides descent and landing simulation capability by integrating lunar environment and lander system models (including terrain, sensor, guidance, navigation, and control models), along with the data necessary to design and operate a landing system for robotic, human, and cargo lunar-landing success. This paper presents the current and planned development and model validation of the POST2-based end-to-end trajectory simulation used for testing and performance evaluation of ALHAT project systems and models.

  14. Simulation of extreme reservoir level distribution with the SCHADEX method (EXTRAFLO project)

    NASA Astrophysics Data System (ADS)

    Paquet, Emmanuel; Penot, David; Garavaglia, Federico

    2013-04-01

    Peak-to-volume ratios and hydrographs are applied to each simulated event. This allows accounting for different flood dynamics, depending on the season, the generating precipitation event, the soil saturation state, etc. In both cases, a hydraulic simulation of dam operation is performed in order to compute the distribution of maximum reservoir levels. Results are detailed for an extreme return level, showing that a 1000-year return level of the reservoir level can be reached during flood events whose components (peaks, volumes) are not necessarily associated with such a return period. The presentation will be illustrated by the example of a fictitious dam on the Tech River at Reynes (South of France, 477 km²). This study has been carried out within the EXTRAFLO project, Task 8 (https://extraflo.cemagref.fr/). References: Paquet, E., Gailhard, J. and Garçon, R. (2006), Evolution of the GRADEX method: improvement by atmospheric circulation classification and hydrological modeling, La Houille Blanche, 5, 80-90. doi:10.1051/lhb:2006091. Paquet, E., Garavaglia, F., Garçon, R. and Gailhard, J. (2012), The SCHADEX method: a semi-continuous rainfall-runoff simulation for extreme flood estimation, Journal of Hydrology, under revision.

  15. Accelerator & Fusion Research Division 1991 summary of activities

    SciTech Connect

    Not Available

    1991-12-01

    This report discusses research projects in the following areas: Heavy-ion fusion accelerator research; magnetic fusion energy; advanced light source; center for x-ray optics; exploratory studies; superconducting magnets; and bevalac operations.

  16. Accelerator Fusion Research Division 1991 summary of activities

    SciTech Connect

    Berkner, Klaus H.

    1991-12-01

    This report discusses research projects in the following areas: Heavy-ion fusion accelerator research; magnetic fusion energy; advanced light source; center for x-ray optics; exploratory studies; superconducting magnets; and bevalac operations.

  17. Simulator Network Project Report: A tool for improvement of teaching materials and targeted resource usage in Skills Labs

    PubMed Central

    Damanakis, Alexander; Blaum, Wolf E.; Stosch, Christoph; Lauener, Hansjörg; Richter, Sabine; Schnabel, Kai P.

    2013-01-01

    During the last decade, medical education in the German-speaking world has been striving to become more practice-oriented. This is currently being achieved in many schools through the implementation of simulation-based instruction in Skills Labs. Simulators are thus an essential part of this type of medical training, and their acquisition and operation by a Skills Lab require a large outlay of resources. Therefore, the Practical Skills Committee of the Medical Education Society (GMA) introduced a new project, which aims to improve the flow of information between the Skills Labs and enable a transparent assessment of the simulators via an online database (the Simulator Network). PMID:23467581

  18. A NATIONAL COLLABORATORY TO ADVANCE THE SCIENCE OF HIGH TEMPERATURE PLASMA PHYSICS FOR MAGNETIC FUSION

    SciTech Connect

    Allen R. Sanderson; Christopher R. Johnson

    2006-08-01

    This report summarizes the work of the University of Utah, which was a member of the National Fusion Collaboratory (NFC) Project funded by the United States Department of Energy (DOE) under the Scientific Discovery through Advanced Computing (SciDAC) program to develop a persistent infrastructure to enable scientific collaboration for magnetic fusion research. A five-year project initiated in 2001, the NFC built on past collaborative work performed within the U.S. fusion community and added a component of computer science research done with the USDOE Office of Science, Office of Advanced Scientific Computing Research. The project was itself a collaboration, uniting fusion scientists from General Atomics, MIT, and PPPL with computer scientists from ANL, LBNL, Princeton University, and the University of Utah to form a coordinated team. The group leveraged existing computer science technology where possible and extended or created new capabilities where required. The complete final report is attached as an addendum. Within the collaboration, the primary technical responsibility of the University of Utah was to develop and deploy an advanced scientific visualization service. To achieve this goal, the SCIRun Problem Solving Environment (PSE) is used on FusionGrid as an advanced scientific visualization service. SCIRun is open source software that gives the user the ability to create complex 3D visualizations and 2D graphics. This capability allows for the exploration of complex simulation results and the comparison of simulation and experimental data. SCIRun on FusionGrid gives the scientist a no-license-cost visualization capability that rivals present-day commercial visualization packages. To accelerate the usage of SCIRun within the fusion community, a stand-alone application built on top of SCIRun was developed and deployed. This application, FusionViewer, allows users who are unfamiliar with SCIRun to quickly create

  19. Cirrus Parcel Model Comparison Project. Phase 1; The Critical Components to Simulate Cirrus Initiation Explicitly

    NASA Technical Reports Server (NTRS)

    Lin, Ruei-Fong; Starr, David OC; DeMott, Paul J.; Cotton, Richard; Sassen, Kenneth; Jensen, Eric; Einaudi, Franco (Technical Monitor)

    2001-01-01

    The Cirrus Parcel Model Comparison Project, a project of the GCSS (GEWEX Cloud System Studies) Working Group on Cirrus Cloud Systems, involves the systematic comparison of current models of ice crystal nucleation and growth for specified, typical, cirrus cloud environments. In Phase I of the project reported here, simulated cirrus cloud microphysical properties are compared for situations of "warm" (-40 C) and "cold" (-60 C) cirrus, both subject to updrafts of 4, 20 and 100 centimeters per second. Five models participated. The various models employ explicit microphysical schemes wherein the size distribution of each class of particles (aerosols and ice crystals) is resolved into bins or treated separately. Simulations are made including both the homogeneous and heterogeneous ice nucleation mechanisms. A single initial aerosol population of sulfuric acid particles is prescribed for all simulations. To isolate the treatment of the homogeneous freezing (of haze droplets) nucleation process, the heterogeneous nucleation mechanism is disabled for a second parallel set of simulations. Qualitative agreement is found for the homogeneous-nucleation-only simulations, e.g., the number density of nucleated ice crystals increases with the strength of the prescribed updraft. However, significant quantitative differences are found. Detailed analysis reveals that the homogeneous nucleation rate, haze particle solution concentration, and water vapor uptake rate by ice crystal growth (particularly as controlled by the deposition coefficient) are critical components that lead to differences in predicted microphysics. Systematic bias exists between results based on a modified classical theory approach and models using an effective freezing temperature approach to the treatment of nucleation. Each approach is constrained by critical freezing data from laboratory studies, but each includes assumptions that can only be justified by further laboratory research. Consequently, it is not yet

  20. A New Approach to Image Fusion Based on Cokriging

    NASA Technical Reports Server (NTRS)

    Memarsadeghi, Nargess; LeMoigne, Jacqueline; Mount, David M.; Morisette, Jeffrey T.

    2005-01-01

    We consider the image fusion problem involving remotely sensed data. We introduce cokriging as a method to perform fusion. We investigate the advantages of fusing Hyperion with ALI. The evaluation is performed by comparing the classification of the fused data with that of the input images and by calculating well-chosen quantitative fusion quality metrics. We consider the Invasive Species Forecasting System (ISFS) project as our fusion application. The fusion of ALI with Hyperion data is studied using PCA and wavelet-based fusion. We then propose utilizing a geostatistical interpolation method called cokriging as a new approach for image fusion.
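
    For readers unfamiliar with the geostatistics, ordinary kriging predicts a value at an unsampled location as a weighted sum of observations, with weights obtained by solving a covariance system under an unbiasedness constraint; cokriging extends the same system with cross-covariances between the two sensors (here, Hyperion and ALI). Below is a minimal single-variable sketch with an assumed Gaussian covariance model and invented data, not the paper's implementation.

        import numpy as np

        def ordinary_kriging(coords, values, targets, length=1.0, sill=1.0):
            """Ordinary kriging with a Gaussian covariance model.

            Cokriging generalizes this system by adding cross-covariance
            blocks between the two image sources; the single-variable core
            is shown here for clarity.
            """
            def cov(a, b):
                d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
                return sill * np.exp(-(d / length) ** 2)

            n = len(coords)
            K = np.ones((n + 1, n + 1))        # kriging matrix, last row/col
            K[:n, :n] = cov(coords, coords)    # enforce unbiasedness (weights sum to 1)
            K[n, n] = 0.0
            rhs = np.ones((n + 1, len(targets)))
            rhs[:n, :] = cov(coords, targets)
            w = np.linalg.solve(K, rhs)        # weights + Lagrange multiplier
            return w[:n, :].T @ values         # predictions at the targets

        obs = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
        vals = np.array([1.0, 2.0, 1.5])
        print(ordinary_kriging(obs, vals, np.array([[0.5, 0.5]])))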

  1. Simulated lesion, human observer performance comparison between thin-section dedicated breast CT images versus computed thick-section simulated projection images of the breast

    PubMed Central

    Chen, L; Boone, JM; Abbey, CK; Hargreaves, J; Bateni, C; Lindfors, KK; Yang, K; Nosratieh, A; Hernandez, A; Gazi, P

    2015-01-01

    Objectives The objective of this study was to compare the lesion detection performance of human observers between thin-section computed tomography images of the breast and thick-section (>40 mm) simulated projection images of the breast. Methods Three radiologists and six physicists each executed a two-alternative forced-choice (2AFC) study involving simulated spherical lesions placed mathematically into breast images produced on a prototype dedicated breast CT scanner. The breast image data sets from 88 patients were used to create 352 pairs of image data. Spherical lesions with diameters of 1, 2, 3, 5, and 11 mm were simulated and adaptively positioned into 3D breast CT image data sets; the native thin-section (0.33 mm) images were averaged to produce images with different slice thicknesses; average section thicknesses of 0.33 mm, 0.71 mm, 1.5 mm, and 2.9 mm were representative of breast CT, while the average 43 mm slice thickness served to produce simulated projection images of the breast. Results The percent correct of the human observers' responses was evaluated in the 2AFC experiments. Radiologists' lesion detection performance was significantly (p<0.05) better for thin-section images, compared to thick-section images similar to mammography, for all but the 1 mm diameter lesions. For example, the average of the three radiologists' performance for 3 mm diameter lesions was 92% correct for thin-section breast CT images while it was 67% for the simulated projection images. A gradual reduction in observer performance was observed as the section thickness increased beyond about 1 mm. While a performance difference based on breast density was seen in both the breast CT and the projection image results, the average radiologist performance using breast CT images in dense breasts outperformed the performance using simulated projection images in fatty breasts for all lesion diameters except 11 mm. The average radiologist performance outperformed that of the
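
    Two operations from the Methods are easy to make concrete: synthesizing thicker sections by averaging groups of native 0.33 mm sections, and scoring a 2AFC experiment as the fraction of trials in which the lesion-bearing alternative was chosen. A minimal sketch with invented data follows; the grouping factor of 130 is our own illustration of how ~43 mm sections could be formed from 0.33 mm ones, not a value taken from the paper.

        import numpy as np

        def average_sections(volume, n_avg):
            """Average groups of thin CT sections (axis 0) into thicker ones,
            e.g. n_avg=130 turns 0.33 mm native sections into ~43 mm sections."""
            n = (volume.shape[0] // n_avg) * n_avg     # drop any incomplete group
            v = volume[:n].reshape(-1, n_avg, *volume.shape[1:])
            return v.mean(axis=1)

        def percent_correct(chose_signal):
            """Per-observer 2AFC score: fraction of trials in which the
            alternative containing the lesion was selected."""
            return 100.0 * np.mean(chose_signal)

        vol = np.random.default_rng(0).normal(size=(260, 16, 16))
        print(average_sections(vol, 130).shape)        # (2, 16, 16)
        print(percent_correct([True, True, False, True]))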

  2. PROJECT CLEA: Two Decades of Astrophysics Research Simulations for Astronomy Education

    NASA Astrophysics Data System (ADS)

    Marschall, Laurence A.; Snyder, G.; Cooper, P.

    2013-01-01

    Since 1992, Project CLEA (Contemporary Laboratory Experiences in Astronomy) has been developing simulations for the astronomy laboratory that engage students in the experience of modern astrophysical research. Though designed for introductory undergraduate courses, CLEA software can be flexibly configured for use in high-school classes and in upper-level observational astronomy classes, and has found usage in a wide spectrum of classrooms and on-line courses throughout the world. Now at the two-decade mark, CLEA has produced 16 exercises covering a variety of planetary, stellar, and extragalactic research topics at wavelengths from radio to X-ray. Project CLEA's most recent product, VIREO, the Virtual Educational Observatory, is a flexible all-sky environment for developing a variety of further exercises. We review the current CLEA offerings and look to the future, especially describing further challenges in developing and maintaining the functionality of CLEA and similar activities as the current investigators wind down the funded development process. This research was sponsored by the National Science Foundation, Gettysburg College, and NASA's XMM-Newton mission.

  3. QUANTUM ESPRESSO: a modular and open-source software project for quantum simulations of materials

    NASA Astrophysics Data System (ADS)

    Giannozzi, Paolo; Baroni, Stefano; Bonini, Nicola; Calandra, Matteo; Car, Roberto; Cavazzoni, Carlo; Ceresoli, Davide; Chiarotti, Guido L.; Cococcioni, Matteo; Dabo, Ismaila; Dal Corso, Andrea; de Gironcoli, Stefano; Fabris, Stefano; Fratesi, Guido; Gebauer, Ralph; Gerstmann, Uwe; Gougoussis, Christos; Kokalj, Anton; Lazzeri, Michele; Martin-Samos, Layla; Marzari, Nicola; Mauri, Francesco; Mazzarello, Riccardo; Paolini, Stefano; Pasquarello, Alfredo; Paulatto, Lorenzo; Sbraccia, Carlo; Scandolo, Sandro; Sclauzero, Gabriele; Seitsonen, Ari P.; Smogunov, Alexander; Umari, Paolo; Wentzcovitch, Renata M.

    2009-09-01

    QUANTUM ESPRESSO is an integrated suite of computer codes for electronic-structure calculations and materials modeling, based on density-functional theory, plane waves, and pseudopotentials (norm-conserving, ultrasoft, and projector-augmented wave). The acronym ESPRESSO stands for opEn Source Package for Research in Electronic Structure, Simulation, and Optimization. It is freely available to researchers around the world under the terms of the GNU General Public License. QUANTUM ESPRESSO builds upon newly-restructured electronic-structure codes that have been developed and tested by some of the original authors of novel electronic-structure algorithms and applied in the last twenty years by some of the leading materials modeling groups worldwide. Innovation and efficiency are still its main focus, with special attention paid to massively parallel architectures, and a great effort being devoted to user friendliness. QUANTUM ESPRESSO is evolving towards a distribution of independent and interoperable codes in the spirit of an open-source project, where researchers active in the field of electronic-structure calculations are encouraged to participate in the project by contributing their own codes or by implementing their own ideas into existing codes.

  4. Modeling multiple communities of interest for interactive simulation and gaming: the dynamic adversarial gaming algorithm project

    NASA Astrophysics Data System (ADS)

    Santos, Eugene, Jr.; Zhao, Qunhua; Pratto, Felicia; Pearson, Adam R.; McQueary, Bruce; Breeden, Andy; Krause, Lee

    2007-04-01

    Nowadays, there is an increasing demand for the military to conduct operations that are beyond traditional warfare. In these operations, analyzing and understanding those who are involved in the situation, how they are going to behave, and why they behave in certain ways is critical for success. The challenge lies in the fact that behavior does not simply follow universal or fixed doctrines; it is significantly influenced by soft factors (i.e. cultural factors, societal norms, etc.). In addition, there is rarely just one isolated enemy; the behaviors and responses of all groups in the region, and the dynamics of the interactions among them, compose an important part of the whole picture. The Dynamic Adversarial Gaming Algorithm (DAGA) project aims to provide a wargaming environment for automated simulation of the dynamics of geopolitical crises, to be eventually applied to the military simulation and training domain and/or the commercial gaming arena. The focus of DAGA is on modeling communities of interest (COIs), where various individuals, groups, and organizations as well as their interactions are captured. The framework should provide a context for COIs to interact with each other and influence others' behaviors. These behaviors must incorporate soft factors by modeling cultural knowledge. We do so by representing cultural variables and their influence on behavior using probabilistic networks. In this paper, we describe our COI modeling, the development of cultural networks, the interaction architecture, and a prototype of DAGA.
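
    The "probabilistic networks" mentioned here are Bayesian networks: directed graphs whose nodes carry conditional probability tables, so that a cultural variable can shift the probability of a behavior, and observing a behavior updates belief about the culture. A deliberately tiny two-node sketch, with all probabilities invented and far simpler than DAGA's actual cultural networks:

        # Toy two-node probabilistic network: a cultural variable influences
        # the probability of a cooperative response.  All numbers invented.
        p_culture = {"collectivist": 0.6, "individualist": 0.4}   # P(culture)
        p_coop = {"collectivist": 0.8, "individualist": 0.4}      # P(coop | culture)

        def p_joint(culture, coop):
            pc = p_coop[culture] if coop else 1.0 - p_coop[culture]
            return p_culture[culture] * pc

        # Marginal probability of cooperation, then the posterior over the
        # cultural variable given that cooperation was observed (Bayes' rule).
        p_c = sum(p_joint(k, True) for k in p_culture)
        posterior = {k: p_joint(k, True) / p_c for k in p_culture}
        print(round(p_c, 3), {k: round(v, 3) for k, v in posterior.items()})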

  5. Container cargo simulation modeling for measuring impacts of infrastructure investment projects in Pearl River Delta

    NASA Astrophysics Data System (ADS)

    Li, Jia-Qi; Shibasaki, Ryuichi; Li, Bo-Wei

    2010-03-01

    In the Pearl River Delta (PRD), there is severe competition between container ports, particularly those in Hong Kong, Shenzhen, and Guangzhou, for collecting international maritime container cargo. In addition, the second phase of the Nansha terminal in Guangzhou’s port and the first phase of the Da Chang Bay container terminal in Shenzhen opened last year. Under these circumstances, there is an increasing need to quantitatively measure the impact these infrastructure investments have on regional cargo flows. The analysis should include the effects of container terminal construction, berth deepening, and access road construction. The authors have been developing a model for international cargo simulation (MICS) which can simulate the movement of cargo. The volume of origin-destination (OD) container cargo in the East Asian region was used as an input, in order to evaluate the effects of international freight transportation policies. This paper focuses on the PRD area and, by incorporating a more detailed network, evaluates the impact of several infrastructure investment projects on freight movement.

  6. Modeling and simulating aircraft stability and control—The SimSAC project

    NASA Astrophysics Data System (ADS)

    Rizzi, Arthur

    2011-11-01

    This paper overviews the SimSAC Project, Simulating Aircraft Stability And Control Characteristics for Use in Conceptual Design. It reports on the three major tasks: developing design software, validating the software on benchmark tests, and applying the software to design exercises. CEASIOM, the Computerized Environment for Aircraft Synthesis and Integrated Optimization Methods, is a framework tool that integrates discipline-specific tools for conceptual design. At this early stage of design it is very useful to be able to predict the flying and handling qualities of a configuration. In order to do this, the aerodynamic database needs to be computed for the configuration being studied, which then has to be coupled to the stability and control tools to carry out the analysis. The benchmarks for validation are the F12 windtunnel model of a generic long-range airliner and the TCR windtunnel model of a sonic-cruise passenger transport concept. The design, simulate and evaluate (DSE) exercise demonstrates how the software works as a design tool. The exercise begins with a design specification and uses conventional design methods to prescribe a baseline configuration. Then CEASIOM improves upon this baseline by analyzing its flying and handling qualities. Six such exercises are presented.

  7. Monte Carlo simulations for the space radiation superconducting shield project (SR2S)

    NASA Astrophysics Data System (ADS)

    Vuolo, M.; Giraudo, M.; Musenich, R.; Calvelli, V.; Ambroglini, F.; Burger, W. J.; Battiston, R.

    2016-02-01

    Astronauts on deep-space long-duration missions will be exposed for a long time to galactic cosmic rays (GCR) and Solar Particle Events (SPE). The exposure to space radiation could lead to both acute and late effects in the crew members, and well-defined countermeasures do not exist nowadays. The simplest solution, optimized passive shielding, is not able to reduce the dose deposited by GCRs below the current dose limits, therefore other solutions, such as active shielding employing superconducting magnetic fields, are under study. In the framework of the EU FP7 SR2S Project - Space Radiation Superconducting Shield - a toroidal magnetic system based on MgB2 superconductors has been analyzed through detailed Monte Carlo simulations using the Geant4 interface GRAS. Spacecraft and magnets were modeled together with a simplified mechanical structure supporting the coils. Radiation transport through magnetic fields and materials was simulated for a deep-space mission scenario, considering for the first time the effect of secondary particles produced in the passage of space radiation through the active shielding and spacecraft structures. When the structures supporting the active shielding systems and the habitat are modeled, the radiation protection efficiency of the magnetic field decreases severely compared to that reported in previous studies, in which only the magnetic field was modeled around the crew. This is due to the large production of secondary radiation taking place in the material surrounding the habitat.
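
    The core physics task, transporting charged particles through a magnetic field, is handled in the project by Geant4/GRAS. As a stand-alone illustration of what such transport involves, the sketch below pushes a proton through an idealized toroidal field with the standard Boris algorithm; the field strength, geometry, and initial state are invented placeholders, not SR2S parameters.

        import numpy as np

        Q, M = 1.602e-19, 1.672e-27                # proton charge [C], mass [kg]

        def toroidal_B(r, B0=1.0, R0=5.0):
            """Idealized toroidal field, |B| ~ B0*R0/R, circulating about z."""
            x, y, _ = r
            R = np.hypot(x, y)
            return B0 * R0 / R * np.array([-y / R, x / R, 0.0])

        def boris_push(r, v, dt, steps):
            """Boris algorithm: energy-conserving rotation of v in the local B
            (electric field taken as zero here)."""
            for _ in range(steps):
                t = Q * toroidal_B(r) * dt / (2.0 * M)
                s = 2.0 * t / (1.0 + t @ t)
                v_prime = v + np.cross(v, t)
                v = v + np.cross(v_prime, s)       # rotated velocity
                r = r + v * dt
            return r, v

        r, v = np.array([6.0, 0.0, 0.0]), np.array([0.0, 1e6, 1e5])
        r, v = boris_push(r, v, dt=1e-9, steps=1000)
        print(r, np.linalg.norm(v))                # speed is (nearly) conserved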

  8. Monte Carlo simulations for the space radiation superconducting shield project (SR2S).

    PubMed

    Vuolo, M; Giraudo, M; Musenich, R; Calvelli, V; Ambroglini, F; Burger, W J; Battiston, R

    2016-02-01

    Astronauts on deep-space long-duration missions will be exposed for a long time to galactic cosmic rays (GCR) and Solar Particle Events (SPE). The exposure to space radiation could lead to both acute and late effects in the crew members, and well-defined countermeasures do not exist nowadays. The simplest solution, optimized passive shielding, is not able to reduce the dose deposited by GCRs below the current dose limits, therefore other solutions, such as active shielding employing superconducting magnetic fields, are under study. In the framework of the EU FP7 SR2S Project - Space Radiation Superconducting Shield - a toroidal magnetic system based on MgB2 superconductors has been analyzed through detailed Monte Carlo simulations using the Geant4 interface GRAS. Spacecraft and magnets were modeled together with a simplified mechanical structure supporting the coils. Radiation transport through magnetic fields and materials was simulated for a deep-space mission scenario, considering for the first time the effect of secondary particles produced in the passage of space radiation through the active shielding and spacecraft structures. When the structures supporting the active shielding systems and the habitat are modeled, the radiation protection efficiency of the magnetic field decreases severely compared to that reported in previous studies, in which only the magnetic field was modeled around the crew. This is due to the large production of secondary radiation taking place in the material surrounding the habitat. PMID:26948010

  9. The boreal summer intraseasonal oscillation simulated by four Chinese AGCMs participating in the CMIP5 project

    NASA Astrophysics Data System (ADS)

    Zhao, Chongbo; Zhou, Tianjun; Song, Lianchun; Ren, Hongli

    2014-09-01

    The performances of four Chinese AGCMs participating in the Coupled Model Intercomparison Project Phase 5 (CMIP5) in simulating the boreal summer intraseasonal oscillation (BSISO) are assessed. The authors focus on the major characteristics of the BSISO: its intensity, significant period, and propagation. The results show that the four AGCMs can reproduce boreal summer intraseasonal signals of precipitation; however, their limitations are also evident. Compared with the Climate Prediction Center Merged Analysis of Precipitation (CMAP) data, the models underestimate the strength of the intraseasonal oscillation (ISO) over the eastern equatorial Indian Ocean (IO) during the boreal summer (May to October), but overestimate the intraseasonal variability over the western Pacific (WP). In the model results, westward propagation dominates, whereas eastward propagation dominates in the CMAP data. The northward propagation in these models is tilted southwest-northeast, which also differs from the CMAP result. Thus, there is no northeast-southwest tilted rain belt evolution off the equator during the BSISO's eastward journey in the models. The biases in the BSISO are consistent with those in the summer mean state, especially the vertical shear. Analysis also shows that there is a positive feedback between the intraseasonal precipitation and the summer mean precipitation. The positive feedback processes may amplify the models' biases in the BSISO simulation.
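
    Isolating the BSISO signal in either model output or CMAP data requires band-passing daily precipitation anomalies to the 20-100-day intraseasonal band. A minimal FFT version is sketched below; published BSISO analyses typically use Lanczos or Butterworth filters, so the hard spectral cut here is an illustrative stand-in, and the test series is synthetic.

        import numpy as np

        def intraseasonal_bandpass(x, dt_days=1.0, low=20.0, high=100.0):
            """Keep only the 20-100-day band of a daily time series with a
            hard FFT band-pass filter."""
            freq = np.fft.rfftfreq(len(x), d=dt_days)       # cycles per day
            spec = np.fft.rfft(x - x.mean())
            keep = (freq >= 1.0 / high) & (freq <= 1.0 / low)
            return np.fft.irfft(spec * keep, n=len(x))

        t = np.arange(730.0)                                # two years, daily
        x = (np.sin(2 * np.pi * t / 45)                     # 45-day ISO signal
             + np.sin(2 * np.pi * t / 365)                  # annual cycle
             + np.random.default_rng(0).normal(scale=0.3, size=t.size))
        filtered = intraseasonal_bandpass(x)
        print(np.corrcoef(filtered, np.sin(2 * np.pi * t / 45))[0, 1])  # high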

  10. Terascale Optimal PDE Simulations

    SciTech Connect

    David Keyes

    2009-07-28

    The Terascale Optimal PDE Solvers (TOPS) Integrated Software Infrastructure Center (ISIC) was created to develop and implement algorithms and support scientific investigations performed by DOE-sponsored researchers. These simulations often involve the solution of partial differential equations (PDEs) on terascale computers. The TOPS Center researched, developed and deployed an integrated toolkit of open-source, optimal complexity solvers for the nonlinear partial differential equations that arise in many DOE application areas, including fusion, accelerator design, global climate change and reactive chemistry. The algorithms created as part of this project were also designed to reduce current computational bottlenecks by orders of magnitude on terascale computers, enabling scientific simulation on a scale heretofore impossible.
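
    "Optimal complexity" in the TOPS sense means solvers whose cost grows linearly with problem size, the textbook example being multigrid. The sketch below is a self-contained 1-D Poisson V-cycle illustrating that idea; it is not TOPS code, and the smoother and transfer operators are the simplest reasonable choices.

        import numpy as np

        def v_cycle(u, f, h, n_smooth=3):
            """One multigrid V-cycle for -u'' = f with zero Dirichlet BCs."""
            def smooth(u, iters):
                for _ in range(iters):                 # damped Jacobi sweeps
                    u[1:-1] += 0.5 * (0.5 * (u[:-2] + u[2:] + h * h * f[1:-1]) - u[1:-1])
                return u

            u = smooth(u, n_smooth)
            if len(u) > 3:
                res = np.zeros_like(u)                 # fine-grid residual
                res[1:-1] = f[1:-1] - (2 * u[1:-1] - u[:-2] - u[2:]) / h**2
                rc = res[::2].copy()                   # full-weighting restriction
                rc[1:-1] = 0.25 * res[1:-2:2] + 0.5 * res[2:-1:2] + 0.25 * res[3::2]
                err = v_cycle(np.zeros_like(rc), rc, 2 * h, n_smooth)
                u[::2] += err                          # coarse-grid correction
                u[1:-1:2] += 0.5 * (err[:-1] + err[1:])
                u = smooth(u, n_smooth)
            return u

        n = 129                                        # grid must be 2^k + 1
        x = np.linspace(0.0, 1.0, n)
        f = np.pi**2 * np.sin(np.pi * x)               # exact solution sin(pi x)
        u = np.zeros(n)
        for _ in range(10):
            u = v_cycle(u, f, 1.0 / (n - 1))
        print(np.abs(u - np.sin(np.pi * x)).max())     # ~discretization error

    Each V-cycle costs O(N) work, so the total cost per digit of accuracy scales linearly with the number of unknowns, which is the property the TOPS solvers pursue at terascale.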

  11. Extreme rainfall in Serbia, May 2014, simulation using WRF NMM and RainFARM: DRIHM project

    NASA Astrophysics Data System (ADS)

    Dekić, Ljiljana; Mihalović, Ana; Dimitrijević, Vladimir; Rebora, Nicola; Parodi, Antonio

    2015-04-01

    In May 2014 the Balkan region was affected by continuous heavy rainfall, the heaviest in 120 years of recorded observations, causing extensive flooding. Serbia suffered human casualties, huge infrastructure and industrial destruction, and agricultural damage. Cyclone development and trajectory were very well predicted by the operational WRF NMM numerical model of the Republic Hydrometeorological Service of Serbia (RHMSS), but the extreme precipitation could not be predicted with sufficient precision. Simulating extreme rainfall situations with different numerical weather prediction models can expose weaknesses of a model and point out the importance of the chosen physical approach and parameterization schemes. The FP7 Distributed Research Infrastructure for Hydro-Meteorology (DRIHM) project provides a framework for using different models in forecasting extreme weather events. One of the DRIHM components is the Rainfall Filtered Autoregressive Model (RainFARM) for stochastic rainfall downscaling. An objective of the DRIHM project was the development of standards and conversion of data for seamless use of meteorological and hydrological models in flood prediction. This paper describes numerical tests and results of the WRF NMM nonhydrostatic model and of RainFARM downscaling applied to WRF NMM outputs. Different physics options in WRF NMM and their influence on precipitation amounts were investigated. RainFARM was applied to every physical option, downscaling from 4 km to 500 m and 100 m horizontal resolution with 100 ensemble members. We analyzed locations on the catchments in Serbia where flooding was strongest and most destructive. Statistical evaluation of the ensemble output gives new insight into the sub-scale precipitation
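
    RainFARM belongs to the family of spectral stochastic downscaling methods: the coarse field's power spectrum is extrapolated to small scales with a power law, small-scale Fourier modes get random phases, and the resulting field is nonlinearly transformed and renormalized so that it conserves the coarse-grid rainfall. The sketch below is a greatly simplified, single-realization version of that idea; the spectral slope, grid sizes, and log-normal transform are illustrative assumptions, not the published algorithm.

        import numpy as np

        def rainfarm_like(coarse, factor=8, slope=3.0, seed=0):
            """Greatly simplified RainFARM-style stochastic downscaling."""
            rng = np.random.default_rng(seed)
            ny, nx = coarse.shape
            NY, NX = ny * factor, nx * factor
            ky = np.fft.fftfreq(NY)[:, None]
            kx = np.fft.fftfreq(NX)[None, :]
            k = np.hypot(ky, kx)
            k[0, 0] = 1.0                            # avoid division by zero
            amp = k ** (-slope / 2.0)                # power-law spectrum ~ k^-slope
            phases = np.exp(2j * np.pi * rng.random((NY, NX)))
            g = np.real(np.fft.ifft2(amp * phases))  # correlated Gaussian field
            r = np.exp(g / g.std())                  # positive, skewed "rain"
            # Rescale each coarse block so block means match the input field.
            blocks = r.reshape(ny, factor, nx, factor).mean(axis=(1, 3))
            return r * np.kron(coarse / blocks, np.ones((factor, factor)))

        coarse = np.random.default_rng(1).gamma(2.0, 5.0, size=(4, 4))
        fine = rainfarm_like(coarse)                 # one 32 x 32 realization
        print(np.abs(fine.reshape(4, 8, 4, 8).mean(axis=(1, 3)) - coarse).max())

    Repeating the call with 100 different seeds would yield an ensemble of equally plausible fine-scale realizations, which is the spirit in which the 100 ensemble members above are generated.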

  12. Security on the US Fusion Grid

    SciTech Connect

    Burruss, Justin R.; Fredian, Tom W.; Thompson, Mary R.

    2005-06-01

    The National Fusion Collaboratory project is developing and deploying new distributed computing and remote collaboration technologies with the goal of advancing magnetic fusion energy research. This work has led to the development of the US Fusion Grid (FusionGrid), a computational grid composed of collaborative, compute, and data resources from the three large US fusion research facilities and with users both in the US and in Europe. Critical to the development of FusionGrid was the creation and deployment of technologies to ensure security in a heterogeneous environment. These solutions to the problems of authentication, authorization, data transfer, and secure data storage, as well as the lessons learned during the development of these solutions, may be applied outside of FusionGrid and scale to future computing infrastructures such as those for next-generation devices like ITER.

  13. Data security on the national fusion grid

    SciTech Connect

    Burruss, Justin R.; Fredian, Tom W.; Thompson, Mary R.

    2005-06-01

    The National Fusion Collaboratory project is developing and deploying new distributed computing and remote collaboration technologies with the goal of advancing magnetic fusion energy research. This work has led to the development of the US Fusion Grid (FusionGrid), a computational grid composed of collaborative, compute, and data resources from the three large US fusion research facilities and with users both in the US and in Europe. Critical to the development of FusionGrid was the creation and deployment of technologies to ensure security in a heterogeneous environment. These solutions to the problems of authentication, authorization, data transfer, and secure data storage, as well as the lessons learned during the development of these solutions, may be applied outside of FusionGrid and scale to future computing infrastructures such as those for next-generation devices like ITER.

  14. Review of alternative concepts for magnetic fusion

    SciTech Connect

    Krakowski, R.A.; Miller, R.L.; Hagenson, R.L.

    1980-01-01

    Although the Tokamak represents the mainstay of the world's quest for magnetic fusion power, with the tandem mirror serving as a primary backup concept in the US fusion program, a wide range of alternative fusion concepts (AFC's) have been and are being pursued. This review presents a summary of past and present reactor projections of a majority of AFC's. Whenever possible, quantitative results are given.

  15. The EAGLE project: simulating the evolution and assembly of galaxies and their environments

    NASA Astrophysics Data System (ADS)

    Schaye, Joop; Crain, Robert A.; Bower, Richard G.; Furlong, Michelle; Schaller, Matthieu; Theuns, Tom; Dalla Vecchia, Claudio; Frenk, Carlos S.; McCarthy, I. G.; Helly, John C.; Jenkins, Adrian; Rosas-Guevara, Y. M.; White, Simon D. M.; Baes, Maarten; Booth, C. M.; Camps, Peter; Navarro, Julio F.; Qu, Yan; Rahmati, Alireza; Sawala, Till; Thomas, Peter A.; Trayford, James

    2015-01-01

    We introduce the Virgo Consortium's Evolution and Assembly of GaLaxies and their Environments (EAGLE) project, a suite of hydrodynamical simulations that follow the formation of galaxies and supermassive black holes in cosmologically representative volumes of a standard Λ cold dark matter universe. We discuss the limitations of such simulations in light of their finite resolution and poorly constrained subgrid physics, and how these affect their predictive power. One major improvement is our treatment of feedback from massive stars and active galactic nuclei (AGN) in which thermal energy is injected into the gas without the need to turn off cooling or decouple hydrodynamical forces, allowing winds to develop without predetermined speed or mass loading factors. Because the feedback efficiencies cannot be predicted from first principles, we calibrate them to the present-day galaxy stellar mass function and the amplitude of the galaxy-central black hole mass relation, also taking galaxy sizes into account. The observed galaxy stellar mass function is reproduced to ≲ 0.2 dex over the full resolved mass range, 10⁸ < M*/M⊙ ≲ 10¹¹, a level of agreement close to that attained by semi-analytic models, and unprecedented for hydrodynamical simulations. We compare our results to a representative set of low-redshift observables not considered in the calibration, and find good agreement with the observed galaxy specific star formation rates, passive fractions, Tully-Fisher relation, total stellar luminosities of galaxy clusters, and column density distributions of intergalactic C IV and O VI. While the mass-metallicity relations for gas and stars are consistent with observations for M* ≳ 10⁹ M⊙ (M* ≳ 10¹⁰ M⊙ at intermediate resolution), they are insufficiently steep at lower masses. For the reference model, the gas fractions and temperatures are too high for clusters of galaxies, but for galaxy groups these discrepancies can be resolved by adopting a higher

  16. Cold fusion, Alchemist's dream

    SciTech Connect

    Clayton, E.D.

    1989-09-01

    In this report the following topics relating to cold fusion are discussed: muon catalysed cold fusion; piezonuclear fusion; sundry explanations pertaining to cold fusion; cosmic ray muon catalysed cold fusion; vibrational mechanisms in excited states of D{sub 2} molecules; barrier penetration probabilities within the hydrogenated metal lattice/piezonuclear fusion; branching ratios of D{sub 2} fusion at low energies; fusion of deuterons into {sup 4}He; secondary D+T fusion within the hydrogenated metal lattice; {sup 3}He to {sup 4}He ratio within the metal lattice; shock induced fusion; and anomalously high isotopic ratios of {sup 3}He/{sup 4}He.

  17. Testbed for large volume surveillance through distributed fusion and resource management

    NASA Astrophysics Data System (ADS)

    Valin, Pierre; Guitouni, Adel; Bossé, Éloi; Wehn, Hans; Yates, Richard; Zwick, Harold

    2007-04-01

    DRDC Valcartier has initiated, through a PRECARN partnership project, the development of an advanced simulation testbed for evaluating the effectiveness of Network Enabled Operations in a coastal large-volume surveillance situation. The main focus of this testbed is to study concepts like distributed information fusion, dynamic resource and network configuration management, and self-synchronising units and agents. This article presents the requirements, design, and first implementation builds, and reports on some preliminary results. The testbed makes it possible to model distributed nodes performing information fusion, dynamic resource management planning and scheduling, as well as configuration management, given multiple constraints on the resources and their communications networks. Two situations are simulated: cooperative and non-cooperative target search. A cooperative surface target behaves in ways to be detected (and rescued), while an elusive target attempts to avoid detection. The current simulation consists of a networked set of surveillance assets including aircraft (UAVs, helicopters, maritime patrol aircraft) and ships. These assets have electro-optical and infrared sensors, and scanning and imaging radar capabilities. Since full data sharing over datalinks is not feasible, own-platform data fusion must be simulated to evaluate the implementation and performance of distributed information fusion. A special emphasis is put on higher-level fusion concepts using knowledge-based rules, with level 1 fusion already providing tracks. Surveillance platform behavior is also simulated in order to evaluate different dynamic resource management algorithms. Additionally, communication networks are modeled to simulate different information exchange concepts. The testbed allows the evaluation of a range of control strategies, from independent platform search, through various levels of platform collaboration, up to centralized control of search platforms.

  18. The METAFOR project: providing community metadata standards for climate models, simulations and CMIP5

    NASA Astrophysics Data System (ADS)

    Callaghan, Sarah; Guilyardi, Eric

    2010-05-01

    The results of climate models are now of more than purely academic interest: governments and the private sector also have a need to discover the results in order to prepare for and mitigate against the potentially severe impacts of global climate change. Climate modelling is a complex process, which requires accurate and complete metadata (data describing data) in order to identify, assess and use the climate data stored in digital repositories. The EU funded METAFOR project has developed a Common Information Model (CIM) to describe in a standard way climate data and the models and modelling environments that produce this data. To establish the CIM, METAFOR first considered the metadata models developed by many groups engaged in similar efforts in Europe and worldwide (for example the US Earth System Curator), explored fragmentation and gaps as well as duplication of information present in these metadata models, and reviewed current problems in identifying, accessing or using climate data present in existing repositories. The CIM documents the "simulation context and models", i.e. the whys and wherefores and issues associated with any particular simulation. Climate modelling is a complex process with a wide degree of variability between different models and different modelling groups. To accommodate this, the CIM has been designed to be highly generic and flexible. The climate modelling process which is "an activity undertaken using software on computers to produce data" is described as separate UML packages. This fairly generic structure can be paired with more specific "controlled vocabularies" in order to restrict the range of valid CIM instances. METAFOR has been charged by the Working Group on Coupled Modelling (WGCM) via the Coupled Model Inter-comparison Project (CMIP) panel to define and collect model and experiment metadata for CMIP5. To do this, a web-based questionnaire will collect information and metadata from the CMIP5 modelling groups on the details

  19. A system for testing distributed information fusion applications for maritime surveillance

    NASA Astrophysics Data System (ADS)

    Wehn, Hans; Happe, Jens; Guitouni, Adel; Valin, Pierre; Bossé, Éloi

    2008-03-01

    A PRECARN partnership project, called CanCoastWatch (CCW), is bringing together a team of researchers from industry, government, and academia to create an advanced simulation test bed for evaluating the effectiveness of Network Enabled Operations in a coastal wide-area surveillance situation. The test bed allows experimenting with higher-level distributed information fusion, dynamic resource management, and configuration management given multiple constraints on the resources and their communications networks. The test bed provides general services that are useful for testing many fusion applications, including a multi-layer plug-and-play architecture and a general multi-agent framework based on John Boyd's OODA loop.
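
    As a concrete reading of a "multi-agent framework based on John Boyd's OODA loop": each agent repeatedly observes its environment, orients by fusing detections into a local picture, decides on an action, and acts. The toy sketch below shows only that control structure; the classes and behavior are invented for illustration, not CCW interfaces.

        from dataclasses import dataclass

        @dataclass
        class Track:
            position: float
            identified: bool = False

        class OODAAgent:
            """Toy surveillance agent structured as an observe-orient-decide-act loop."""
            def __init__(self, sensor_range=10.0):
                self.sensor_range = sensor_range
                self.picture = []

            def observe(self, world, own_pos):
                return [t for t in world if abs(t.position - own_pos) < self.sensor_range]

            def orient(self, detections):
                self.picture = sorted(detections, key=lambda t: t.identified)

            def decide(self):
                unknown = [t for t in self.picture if not t.identified]
                return unknown[0] if unknown else None

            def act(self, target):
                if target is not None:
                    target.identified = True       # e.g. task an imaging sensor

            def step(self, world, own_pos):
                self.orient(self.observe(world, own_pos))
                self.act(self.decide())

        world = [Track(3.0), Track(8.0), Track(25.0)]
        agent = OODAAgent()
        for _ in range(3):
            agent.step(world, own_pos=5.0)
        print([t.identified for t in world])       # [True, True, False]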

  20. Climate projections of future extreme events accounting for modelling uncertainties and historical simulation biases

    NASA Astrophysics Data System (ADS)

    Brown, Simon J.; Murphy, James M.; Sexton, David M. H.; Harris, Glen R.

    2014-11-01

    A methodology is presented for providing projections of absolute future values of extreme weather events that takes into account key uncertainties in predicting future climate. This is achieved by characterising both observed and modelled extremes with a single form of non-stationary extreme value (EV) distribution that depends on global mean temperature and which includes terms that account for model bias. Such a distribution allows the prediction of future "observed" extremes for any period in the twenty-first century. Uncertainty in modelling future climate, arising from a wide range of atmospheric, oceanic, sulphur cycle and carbon cycle processes, is accounted for by using probabilistic distributions of future global temperature and EV parameters. These distributions are generated by Bayesian sampling of emulators with samples weighted by their likelihood with respect to a set of observational constraints. The emulators are trained on a large perturbed parameter ensemble of global simulations of the recent past, and the equilibrium response to doubled CO2. Emulated global EV parameters are converted to the relevant regional scale through downscaling relationships derived from a smaller perturbed parameter regional climate model ensemble. The simultaneous fitting of the EV model to regional model data and observations allows the characterisation of how observed extremes may change in the future irrespective of biases that may be present in the regional models simulation of the recent past climate. The clearest impact of a parameter perturbation in this ensemble was found to be the depth to which plants can access water. Members with shallow soils tend to be biased hot and dry in summer for the observational period. These biases also appear to have an impact on the potential future response for summer temperatures with some members with shallow soils having increases for extremes that reduce with extreme severity. We apply this methodology for London, using the
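
    The statistical core described here is a non-stationary extreme value distribution whose location parameter depends on global mean temperature. A minimal maximum-likelihood sketch of that covariate structure follows, using SciPy's GEV (note SciPy's shape convention c = -xi); the paper's full model adds bias terms and Bayesian sampling of emulators, which are omitted, and the test data are synthetic.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import genextreme

        def fit_nonstationary_gev(maxima, T_global):
            """MLE for a GEV whose location is linear in global mean
            temperature: mu(t) = mu0 + mu1 * T(t)."""
            def nll(p):
                mu0, mu1, log_sigma, xi = p
                mu = mu0 + mu1 * T_global          # covariate-dependent location
                ll = genextreme.logpdf(maxima, c=-xi, loc=mu, scale=np.exp(log_sigma))
                return -ll.sum() if np.all(np.isfinite(ll)) else 1e10

            start = [maxima.mean(), 0.0, np.log(maxima.std()), 0.1]
            return minimize(nll, start, method="Nelder-Mead").x

        # Synthetic check: warming shifts the annual maxima up by 2 per degree.
        rng = np.random.default_rng(0)
        T = np.linspace(0.0, 1.5, 60)
        x = genextreme.rvs(c=-0.1, loc=20.0 + 2.0 * T, scale=1.5, random_state=rng)
        print(fit_nonstationary_gev(x, T))         # ~[20, 2, log(1.5), 0.1]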

  1. Design, Results, Evolution and Status of the ATLAS Simulation at Point1 Project

    NASA Astrophysics Data System (ADS)

    Ballestrero, S.; Batraneanu, S. M.; Brasolin, F.; Contescu, C.; Fazio, D.; Di Girolamo, A.; Lee, C. J.; Pozo Astigarraga, M. E.; Scannicchio, D. A.; Sedov, A.; Twomey, M. S.; Wang, F.; Zaytsev, A.

    2015-12-01

    During the LHC Long Shutdown 1 (LS1) period, which started in 2013, the Simulation at Point1 (Sim@P1) project has taken advantage, in an opportunistic way, of the TDAQ (Trigger and Data Acquisition) HLT (High-Level Trigger) farm of the ATLAS experiment. This farm provides more than 1300 compute nodes, which are particularly suited for running event generation and Monte Carlo production jobs that are mostly CPU and not I/O bound. It is capable of running up to 2700 Virtual Machines (VMs), each with 8 CPU cores, for a total of up to 22000 parallel jobs. This contribution reviews the design, the results, and the evolution of the Sim@P1 project, which operates a large-scale OpenStack-based virtualized platform deployed on top of the ATLAS TDAQ HLT farm computing resources. During LS1, Sim@P1 was one of the most productive ATLAS sites: it delivered more than 33 million CPU-hours and generated more than 1.1 billion Monte Carlo events. The design aspects are presented: the virtualization platform exploited by Sim@P1 avoids interference with TDAQ operations and guarantees the security and usability of the ATLAS private network. The cloud mechanism allows the separation of the needed support on both the infrastructural (hardware, virtualization layer) and logical (Grid site support) levels. This paper focuses on the operational aspects of such a large system during the upcoming LHC Run 2 period: simple, reliable, and efficient tools are needed to quickly switch from Sim@P1 to TDAQ mode and back, to exploit the resources when they are not used for data acquisition, even for short periods. The evolution of the central OpenStack infrastructure is described, as it was upgraded from the Folsom to the Icehouse release, including the scalability issues addressed.

  2. Project MANTIS: A MANTle Induction Simulator for coupling geodynamic and electromagnetic modeling

    NASA Astrophysics Data System (ADS)

    Weiss, C. J.

    2009-12-01

    A key component of testing geodynamic hypotheses arising from 3D mantle convection simulations is the ability to easily translate the predicted physicochemical state into the model space relevant for an independent geophysical observation, such as the Earth's seismic, geodetic, or electromagnetic response. In this contribution a new parallel code for simulating low-frequency, global-scale electromagnetic induction phenomena is introduced that uses the same Earth discretization as the popular CitcomS mantle convection code. Hence, projection of the CitcomS model into the model space of electrical conductivity is greatly simplified, and focuses solely on the node-to-node, physics-based relationship between these Earth parameters without the need for "upscaling", "downscaling", averaging, or harmonizing with some other model basis such as spherical harmonics. Preliminary performance tests of the MANTIS code on shared and distributed memory parallel compute platforms show favorable scaling (>70% efficiency) for up to 500 processors. As with CitcomS, an OpenDX visualization widget (VISMAN) is also provided for 3D rendering and interactive interrogation of model results. Details of the MANTIS code will be briefly discussed here, focusing on compatibility with CitcomS modeling, as will preliminary results in which the electromagnetic response of a CitcomS model is evaluated. [Figure: VISMAN rendering of a tomography-derived electrical conductivity model overlain by a 1x1 deg crustal conductivity map; grey scale represents the log_10 magnitude of conductivity [S/m]; arrows are horizontal components of a hypothetical magnetospheric source field used to electromagnetically excite the conductivity model.]
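
    The "node-to-node, physics-based relationship" between convection output and electrical conductivity is typically an Arrhenius-type law, since mantle conductivity depends exponentially on temperature. An illustrative mapping is sketched below; the pre-exponential factor and activation energy are placeholder values, not MANTIS inputs.

        import numpy as np

        K_B = 8.617e-5                             # Boltzmann constant [eV/K]

        def temperature_to_conductivity(T, sigma0=1e2, E_a=1.5):
            """Illustrative node-to-node projection of a mantle temperature
            field onto electrical conductivity via an Arrhenius law,
            sigma = sigma0 * exp(-E_a / (k_B * T)).  sigma0 [S/m] and the
            activation energy E_a [eV] are invented placeholders."""
            return sigma0 * np.exp(-E_a / (K_B * np.asarray(T)))

        # Nodal temperatures [K] as they might come from a convection model.
        T_nodes = np.array([1500.0, 1800.0, 2200.0])
        print(temperature_to_conductivity(T_nodes))  # conductivity rises with T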

  3. Interannual tropical rainfall variability in general circulation model simulations associated with the atmospheric model intercomparison project

    SciTech Connect

    Sperber, K.R.; Palmer, T.N.

    1996-11-01

    The interannual variability of rainfall over the Indian subcontinent, the African Sahel, and the Nordeste region of Brazil has been evaluated in 32 models for the period 1979-88 as part of the Atmospheric Model Intercomparison Project (AMIP). The interannual variations of Nordeste rainfall are the most readily captured, owing to the intimate link with Pacific and Atlantic sea surface temperatures. The precipitation variations over India and the Sahel are less well simulated. Additionally, an Indian monsoon wind shear index was calculated for each model; the subset of models that captured the observed link between the shear index and rainfall also had a rainfall climatology in better agreement with observations, indicating a link between systematic model error and the ability to simulate interannual variations. A suite of six European Centre for Medium-Range Weather Forecasts (ECMWF) AMIP runs (differing only in their initial conditions) has also been examined. As observed, all-India rainfall was enhanced in 1988 relative to 1987 in each of these realizations. All-India rainfall variability during other years showed little or no predictability, possibly due to internal chaotic dynamics associated with intraseasonal monsoon fluctuations and/or unpredictable land surface process interactions. The interannual variations of Nordeste rainfall were best represented. The State University of New York at Albany/National Center for Atmospheric Research GENESIS model was run in five initial condition realizations. In this model, the Nordeste rainfall variability was also best reproduced. However, for all regions the skill was less than that of the ECMWF model. The relationships of the all-India and Sahel rainfall/SST teleconnections with horizontal resolution, convection scheme closure, and numerics have been evaluated. 64 refs., 13 figs., 3 tabs.

  4. Sensor fusion for intelligent process control.

    SciTech Connect

    Connors, John J.; Hill, Kevin; Hanekamp, David; Haley, William F.; Gallagher, Robert J.; Gowin, Craig; Farrar, Arthur R.; Sheaffer, Donald A.; DeYoung, Mark A.; Bertram, Lee A.; Dodge, Craig; Binion, Bruce; Walsh, Peter M.; Houf, William G.; Desam, Padmabhushana R.; Tiwary, Rajiv; Stokes, Michael R.; Miller, Alan J.; Michael, Richard W.; Mayer, Raymond M.; Jiao, Yu; Smith, Philip J.; Arbab, Mehran; Hillaire, Robert G.

    2004-08-01

    An integrated system for the fusion of product and process sensors and controls for production of flat glass was envisioned, having as its objective the maximization of throughput and product quality subject to emission limits, furnace refractory wear, and other constraints. Although the project was prematurely terminated, stopping the work short of its goal, the tasks that were completed show the value of the approach and objectives. Though the demonstration was to have been done on a flat glass production line, the approach is applicable to control of production in the other sectors of the glass industry. Furthermore, the system architecture is also applicable in other industries utilizing processes in which product uniformity is determined by ability to control feed composition, mixing, heating and cooling, chemical reactions, and physical processes such as distillation, crystallization, drying, etc. The first phase of the project, with Visteon Automotive Systems as industrial partner, was focused on simulation and control of the glass annealing lehr. That work produced the analysis and computer code that provide the foundation for model-based control of annealing lehrs during steady state operation and through color and thickness changes. In the second phase of the work, with PPG Industries as the industrial partner, the emphasis was on control of temperature and combustion stoichiometry in the melting furnace, to provide a wider operating window, improve product yield, and increase energy efficiency. A program of experiments with the furnace, CFD modeling and simulation, flow measurements, and sensor fusion was undertaken to provide the experimental and theoretical basis for an integrated, model-based control system utilizing the new infrastructure installed at the demonstration site for the purpose. In spite of the fact that the project was terminated during the first year of the second phase of the work, the results of these first steps toward implementation

  5. The JUMP student project: two weeks of space simulation in a Mars-like environment.

    NASA Astrophysics Data System (ADS)

    de Crombrugghe, Guerric; de Lobkowicz, Ysaline; van Vynckt, Delphine; Reydams, Marc; Denies, Jonathan; Jago, Alban; Le Maire, Victor

    JUMP is a student initiative whose aim is to simulate, for two weeks, the life of astronauts in a Mars-like environment. The simulation will be held in the Mars Desert Research Station (MDRS), a habitat installed by the Mars Society (MS) in the Utah desert. The crew is composed of six students, helped by a remote support team of four students, all from different backgrounds (engineering, physics, mathematics, biology, and architecture) and degrees (bachelor, master, PhD), under the supervision of researchers from several institutes. Several research projects will be conducted during the simulation. We shall report on the science and technical results, and implications for Earth-Mars comparative studies. JASE: The Jump Astronaut Safety Experiment (JASE) consists of a deployable Yagi antenna with basic electronics, providing an extremely light and simple way to warn of solar flares and observe Jupiter bursts. JADE: The Jump Angular Detection Experiment (JADE) is an innovative angular particle detector used to determine the irradiation of the surface and monitor the charged-particle distribution in Mars' atmosphere. Even if its resolution is low, it is a very light solution compared to pixel detectors. JAPE: The Jump Astronaut Potatoes Experiment (JAPE) will try to grow, and eat, in a space-like environment high-performance potatoes developed by the Groupe de Recherche en Physiologie Végétale (GRPV) of the UCL in the frame of the Micro-Ecological Life Support System Alternative (MELiSSA) project of the ESA. JABE: The Jump soil Analysis with a Backpack drill Experiment (JABE) aims to validate a sampling procedure, generate vertical profiles of humidity with a MEMS sensor, and analyze soil samples with a spectrometer. The crew will therefore use a backpack drill, which is portable, fast, and easy to use. JARE: The goal of the Jump Astronaut-Rover interaction Experiment (JARE) is to determine how a rover can help an astronaut in his tasks, and how it is possible to improve this

  6. Fusion neutronics experiments and analysis

    SciTech Connect

    Not Available

    1992-01-01

    UCLA has led the neutronics R&D effort in the US for the past several years through the well-established USDOE/JAERI Collaborative Program on Fusion Neutronics. Significant contributions have been made in providing solid bases for advancing the neutronics testing capabilities in fusion reactors. This resulted from the hands-on experience gained from conducting several fusion integral experiments to quantify the prediction uncertainties of key blanket design parameters such as tritium production rate, activation, and nuclear heating, and, when possible, to narrow the gap between calculational results and measurements through improving the nuclear database and code capabilities. The current focus is to conduct the experiments in an annular configuration where the test assembly totally surrounds a simulated line source. The simulated line source is the first of its kind in the scope of fusion integral experiments and represents a significant contribution to the world of fusion neutronics. These line-source simulation experiments, started in 1989, have proceeded through Phase IIIA to Phase IIIC.

  7. A Study of the Efficacy of Project-Based Learning Integrated with Computer-Based Simulation--STELLA

    ERIC Educational Resources Information Center

    Eskrootchi, Rogheyeh; Oskrochi, G. Reza

    2010-01-01

    Incorporating computer-simulation modelling into project-based learning may be effective but requires careful planning and implementation. Teachers, especially, need pedagogical content knowledge which refers to knowledge about how students learn from materials infused with technology. This study suggests that students learn best by actively…

  8. COMPASS, the COMmunity Petascale Project for Accelerator Science and Simulation, a broad computational accelerator physics initiative

    SciTech Connect

    J.R. Cary; P. Spentzouris; J. Amundson; L. McInnes; M. Borland; B. Mustapha; B. Norris; P. Ostroumov; Y. Wang; W. Fischer; A. Fedotov; I. Ben-Zvi; R. Ryne; E. Esarey; C. Geddes; J. Qiang; E. Ng; S. Li; C. Ng; R. Lee; L. Merminga; H. Wang; D.L. Bruhwiler; D. Dechow; P. Mullowney; P. Messmer; C. Nieter; S. Ovtchinnikov; K. Paul; P. Stoltz; D. Wade-Stein; W.B. Mori; V. Decyk; C.K. Huang; W. Lu; M. Tzoufras; F. Tsung; M. Zhou; G.R. Werner; T. Antonsen; T. Katsouleas

    2007-06-01

    Accelerators are the largest and most costly scientific instruments of the Department of Energy, with uses across a broad range of science, including colliders for particle physics and nuclear science and light sources and neutron sources for materials studies. COMPASS, the Community Petascale Project for Accelerator Science and Simulation, is a broad, four-office (HEP, NP, BES, ASCR) effort to develop computational tools for the prediction and performance enhancement of accelerators. The tools being developed can be used to predict the dynamics of beams in the presence of optical elements and space charge forces, the calculation of electromagnetic modes and wake fields of cavities, the cooling induced by comoving beams, and the acceleration of beams by intense fields in plasmas generated by beams or lasers. In SciDAC-1, the computational tools had multiple successes in predicting the dynamics of beams and beam generation. In SciDAC-2 these tools will be petascale enabled to allow the inclusion of an unprecedented level of physics for detailed prediction.

  9. COMPASS, the COMmunity Petascale Project for Accelerator Science And Simulation, a Broad Computational Accelerator Physics Initiative

    SciTech Connect

    Cary, J.R.; Spentzouris, P.; Amundson, J.; McInnes, L.; Borland, M.; Mustapha, B.; Norris, B.; Ostroumov, P.; Wang, Y.; Fischer, W.; Fedotov, A.; Ben-Zvi, I.; Ryne, R.; Esarey, E.; Geddes, C.; Qiang, J.; Ng, E.; Li, S.; Ng, C.; Lee, R.; Merminga, L.; /Jefferson Lab /Tech-X, Boulder /UCLA /Colorado U. /Maryland U. /Southern California U.

    2007-11-09

    Accelerators are the largest and most costly scientific instruments of the Department of Energy, with uses across a broad range of science, including colliders for particle physics and nuclear science and light sources and neutron sources for materials studies. COMPASS, the Community Petascale Project for Accelerator Science and Simulation, is a broad, four-office (HEP, NP, BES, ASCR) effort to develop computational tools for the prediction and performance enhancement of accelerators. The tools being developed can be used to predict the dynamics of beams in the presence of optical elements and space-charge forces, to calculate the electromagnetic modes and wake fields of cavities, and to model the cooling induced by comoving beams and the acceleration of beams by intense fields in plasmas generated by beams or lasers. In SciDAC-1, the computational tools had multiple successes in predicting the dynamics of beams and beam generation. In SciDAC-2, these tools will be petascale-enabled to allow the inclusion of an unprecedented level of physics for detailed prediction.

  10. COMPASS, the COMmunity Petascale project for Accelerator Science and Simulation, a broad computational accelerator physics initiative

    SciTech Connect

    Cary, J.R.; Spentzouris, P.; Amundson, J.; McInnes, L.; Borland, M.; Mustapha, B.; Ostroumov, P.; Wang, Y.; Fischer, W.; Fedotov, A.; Ben-Zvi, I.; Ryne, R.; Esarey, E.; Geddes, C.; Qiang, J.; Ng, E.; Li, S.; Ng, C.; Lee, R.; Merminga, L.; Wang, H.; Bruhwiler, D.L.; Dechow, D.; Mullowney, P.; Messmer, P.; Nieter, C.; Ovtchinnikov, S.; Paul, K.; Stoltz, P.; Wade-Stein, D.; Mori, W.B.; Decyk, V.; Huang, C.K.; Lu, W.; Tzoufras, M.; Tsung, F.; Zhou, M.; Werner, G.R.; Antonsen, T.; Katsouleas, T.; Morris, B.

    2007-07-16

    Accelerators are the largest and most costly scientific instruments of the Department of Energy, with uses across a broad range of science, including colliders for particle physics and nuclear science and light sources and neutron sources for materials studies. COMPASS, the Community Petascale Project for Accelerator Science and Simulation, is a broad, four-office (HEP, NP, BES, ASCR) effort to develop computational tools for the prediction and performance enhancement of accelerators. The tools being developed can be used to predict the dynamics of beams in the presence of optical elements and space-charge forces, to calculate the electromagnetic modes and wake fields of cavities, and to model the cooling induced by comoving beams and the acceleration of beams by intense fields in plasmas generated by beams or lasers. In SciDAC-1, the computational tools had multiple successes in predicting the dynamics of beams and beam generation. In SciDAC-2, these tools will be petascale-enabled to allow the inclusion of an unprecedented level of physics for detailed prediction.

  11. Evaluation of Arctic Sea Ice Thickness Simulated by Arctic Ocean Model Intercomparison Project Models

    NASA Technical Reports Server (NTRS)

    Johnson, Mark; Proshuntinsky, Andrew; Aksenov, Yevgeny; Nguyen, An T.; Lindsay, Ron; Haas, Christian; Zhang, Jinlun; Diansky, Nikolay; Kwok, Ron; Maslowski, Wieslaw; Hakkinen, Sirpa; Ashik, Igor; De Cuevas, Beverly

    2012-01-01

    Six Arctic Ocean Model Intercomparison Project model simulations are compared with estimates of sea ice thickness derived from pan-Arctic satellite freeboard measurements (2004-2008); airborne electromagnetic measurements (2001-2009); ice draft data from moored instruments in Fram Strait, the Greenland Sea, and the Beaufort Sea (1992-2008) and from submarines (1975-2000); and drill hole data from the Arctic basin, Laptev, and East Siberian marginal seas (1982-1986) and coastal stations (1998-2009). Although the six models differ in numerical methods, resolution, domain, forcing, and boundary conditions, they generally overestimate the thickness of measured ice thinner than approximately 2 m and underestimate the thickness of ice measured thicker than approximately 2 m. In the regions of flat immobile landfast ice (shallow Siberian seas with depths less than 25-30 m), the models generally overestimate the total observed sea ice thickness by more than a factor of 4 and overestimate the observed September and October ice growth rates by more than one standard deviation. The models do not reproduce conditions of fast ice formation and growth. Instead, the modeled fast ice is replaced with pack ice that drifts, generating ridges of increasing ice thickness in addition to thermodynamic ice growth. Considering all observational data sets, the better correlations and smaller differences from observations come from the Estimating the Circulation and Climate of the Ocean, Phase II and the Pan-Arctic Ice Ocean Modeling and Assimilation System models.

  12. Future projections of precipitation characteristics in East Asia simulated by the MRI CGCM2

    NASA Astrophysics Data System (ADS)

    Kitoh, Akio; Hosaka, Masahiro; Adachi, Yukimasa; Kamiguchi, Kenji

    2005-07-01

    Projected changes in precipitation characteristics around the mid-21st century and the end of the century are analyzed using the daily precipitation output of the 3-member ensemble Meteorological Research Institute global ocean-atmosphere coupled general circulation model (MRI-CGCM2) simulations under the Special Report on Emissions Scenarios (SRES) A2 and B2 scenarios. It is found that both the frequency and the intensity of precipitation increase over about 40% of the globe, while both decrease over about 20%. These fractions vary by only a few percent from decade to decade of the 21st century and between the A2 and B2 scenarios. Over the rest of the globe (about one third), the precipitation frequency decreases but its intensity increases, suggesting a shift of the precipitation distribution toward more intense events under global warming. South China is such a region, where the summertime wet-day frequency decreases but the precipitation intensity increases. This is related to increased atmospheric moisture content due to global warming and to an intensified, more westward-extended North Pacific subtropical anticyclone, which may be related to an El Niño-like mean sea surface temperature change. On the other hand, a decrease in summer precipitation is noted in North China, further strengthening the south-to-north precipitation contrast in the future.

  13. COMPASS, the COMmunity petascale project for accelerator science and simulation, a broad computational accelerator physics initiative

    NASA Astrophysics Data System (ADS)

    Cary, J. R.; Spentzouris, P.; Amundson, J.; McInnes, L.; Borland, M.; Mustapha, B.; Norris, B.; Ostroumov, P.; Wang, Y.; Fischer, W.; Fedotov, A.; Ben-Zvi, I.; Ryne, R.; Esarey, E.; Geddes, C.; Qiang, J.; Ng, E.; Li, S.; Ng, C.; Lee, R.; Merminga, L.; Wang, H.; Bruhwiler, D. L.; Dechow, D.; Mullowney, P.; Messmer, P.; Nieter, C.; Ovtchinnikov, S.; Paul, K.; Stoltz, P.; Wade-Stein, D.; Mori, W. B.; Decyk, V.; Huang, C. K.; Lu, W.; Tzoufras, M.; Tsung, F.; Zhou, M.; Werner, G. R.; Antonsen, T.; Katsouleas, T.

    2007-07-01

    Accelerators are the largest and most costly scientific instruments of the Department of Energy, with uses across a broad range of science, including colliders for particle physics and nuclear science and light sources and neutron sources for materials studies. COMPASS, the Community Petascale Project for Accelerator Science and Simulation, is a broad, four-office (HEP, NP, BES, ASCR) effort to develop computational tools for the prediction and performance enhancement of accelerators. The tools being developed can be used to predict the dynamics of beams in the presence of optical elements and space charge forces, the calculation of electromagnetic modes and wake fields of cavities, the cooling induced by comoving beams, and the acceleration of beams by intense fields in plasmas generated by beams or lasers. In SciDAC-1, the computational tools had multiple successes in predicting the dynamics of beams and beam generation. In SciDAC-2 these tools will be petascale enabled to allow the inclusion of an unprecedented level of physics for detailed prediction.

  14. Development studies for the ILC: Measurements and simulations for a time projection chamber with GEM technology

    NASA Astrophysics Data System (ADS)

    Ledermann, Bernhard; Kaminski, Jochen; Kappler, Steffen; Müller, Thomas

    2007-10-01

    A Time Projection Chamber (TPC) with Gas Electron Multiplier (GEM) technology is well suited for use as the central tracker at the International Linear Collider (ILC). To study the high potential of this detector type, a small prototype of 25 cm length was built in Karlsruhe and used in several experimental setups. In this publication the results of these measurements and of additional Monte Carlo simulations are presented. By introducing the so-called equivalent drift distance, all results could be combined, leading to a recommended configuration of the multi-GEM tower for the ILC-TPC. It is shown that, for the conditions considered in the TESLA-TDR, the transverse spatial resolution at the ILC can reach 65 μm for 10 cm of drift and 190 μm for 200 cm of drift. This, together with the expectations for the longitudinal spatial resolution, the energy resolution of the specific ionization, and the single-pad-row efficiency, should meet the requirements of a future ILC-TPC.

  15. Final Technical Report for Project "Improving the Simulation of Arctic Clouds in CCSM3"

    SciTech Connect

    Stephen J. Vavrus

    2008-11-15

    This project has focused on the simulation of Arctic clouds in CCSM3 and on how the modeled cloud amount (and climate) can be improved substantially by altering the parameterized low cloud fraction. The new formula, dubbed 'freezedry', alleviates the bias of excessive low clouds during polar winter by reducing the cloud amount under very dry conditions. During winter, freezedry decreases the low cloud amount over the coldest regions in high latitudes by over 50% locally and more than 30% averaged across the Arctic (Fig. 1). The cloud reduction causes an Arctic-wide drop of 15 W m^-2 in surface cloud radiative forcing (CRF) during winter and about a 50% decrease in mean annual Arctic CRF. Consequently, wintertime surface temperatures fall by up to 4 K on land and 2-8 K over the Arctic Ocean, significantly reducing the model's pronounced warm bias (Fig. 1). While improving the polar climate simulation in CCSM3, freezedry has virtually no influence outside of very cold regions (Fig. 2) or during summer (Fig. 3), space and time domains that were not targeted. Furthermore, the simplicity of this parameterization allows it to be readily incorporated into other GCMs, many of which also suffer from excessive wintertime polar cloudiness, based on results from the CMIP3 archive (Vavrus et al., 2008). Freezedry also affects CCSM3's sensitivity to greenhouse forcing. In a transient-CO2 experiment, the model version with freezedry warms up to 20% less in the North Polar and South Polar regions (1.5 K and 0.5 K smaller warming, respectively) (Fig. 4). Paradoxically, the muted high-latitude response occurs despite a much larger increase in cloud amount with freezedry during non-summer months (when clouds warm the surface), apparently because of the colder modern reference climate. These results of the freezedry parameterization have recently been published (Vavrus and Waliser, 2008: An improved parameterization for simulating Arctic cloud amount in
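    The record above describes the freezedry formula only qualitatively. A minimal sketch of such a humidity-dependent damping of the diagnosed low-cloud fraction is given below; the reference humidity q_ref and the 0.15 floor are illustrative assumptions, not the published CCSM3 coefficients (see Vavrus and Waliser, 2008, for the actual formulation).

      import numpy as np

      def freezedry(cloud_frac, q, q_ref=0.003, floor=0.15):
          # Damp the diagnosed low-cloud fraction under very dry (very cold)
          # conditions: no change for moist air, up to a (1 - floor) reduction
          # when the specific humidity q (kg/kg) is far below q_ref.
          damping = np.clip(q / q_ref, floor, 1.0)
          return cloud_frac * damping

      # Example: at q = 0.0003 kg/kg (a dry polar-winter value), the diagnosed
      # cloud fraction is cut to 15% of its original value.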

  16. Fusion: an energy source for synthetic fuels

    SciTech Connect

    Fillo, J A; Powell, J; Steinberg, M

    1980-01-01

    The decreasing availability of fossil fuels emphasizes the need to develop systems which will produce synthetic fuel to substitute for and supplement the natural supply. An important first step in the synthesis of liquid and gaseous fuels is the production of hydrogen. Thermonuclear fusion offers an inexhaustible source of energy for the production of hydrogen from water. Depending on design, electric generation efficiencies of approx. 40 to 60% and hydrogen production efficiencies by high temperature electrolysis of approx. 50 to 70% are projected for fusion reactors using high temperature blankets. Fusion/coal symbiotic systems appear economically promising for the first generation of commercial fusion synfuels plants. Coal production requirements and the environmental effects of large-scale coal usage would be greatly reduced by a fusion/coal system. In the long term, there could be a gradual transition to an inexhaustible energy system based solely on fusion.

  17. Advanced fission and fossil plant economics-implications for fusion

    SciTech Connect

    Delene, J.G.

    1994-09-01

    In order for fusion energy to be a viable option for electric power generation, it must either directly compete with future alternatives or serve as a reasonable backup if the alternatives become unacceptable. This paper discusses projected costs for the most likely competitors with fusion power for baseload electric capacity and what these costs imply for fusion economics. The competitors examined include advanced nuclear fission and advanced fossil-fired plants. The projected costs and their basis are discussed. The estimates for these technologies are compared with cost estimates for magnetic and inertial confinement fusion plants. The conclusion of the analysis is that fusion faces formidable economic competition. Although the cost level for fusion appears greater than that for fission or fossil, the costs are not so high as to preclude fusion's potential competitiveness.

  18. Neural Network Approach To Sensory Fusion

    NASA Astrophysics Data System (ADS)

    Pearson, John C.; Gelfand, Jack J.; Sullivan, W. E.; Peterson, Richard M.; Spence, Clay D.

    1988-08-01

    We present a neural network model for sensory fusion based on the design of the visual/acoustic target localization system of the barn owl. This system adaptively fuses its separate visual and acoustic representations of object position into a single joint representation used for head orientation. The building block in this system, as in much of the brain, is the neuronal map. Neuronal maps are large arrays of locally interconnected neurons that represent information in a map-like form, that is, parameter values are systematically encoded by the position of neural activation in the array. The computational load is distributed to a hierarchy of maps, and the computation is performed in stages by transforming the representation from map to map via the geometry of the projections between the maps and the local interactions within the maps. For example, azimuthal position is computed from the frequency and binaural phase information encoded in the signals of the acoustic sensors, while elevation is computed in a separate stream using binaural intensity information. These separate streams are merged in their joint projection onto the external nucleus of the inferior colliculus, a two-dimensional array of cells which contains a map of acoustic space. This acoustic map, and the visual map of the retina, jointly project onto the optic tectum, creating a fused visual/acoustic representation of position in space that is used for object localization. In this paper we describe our mathematical model of the stage of visual/acoustic fusion in the optic tectum. The model assumes that the acoustic projection from the external nucleus onto the tectum is roughly topographic and one-to-many, while the visual projection from the retina onto the tectum is topographic and one-to-one. A simple process of self-organization alters the strengths of the acoustic connections, effectively forming a focused beam of strong acoustic connections whose inputs are coincident with the visual inputs.

  19. Public Relations on Fusion in Europe

    NASA Astrophysics Data System (ADS)

    Ongena, J.; van Oost, G.; Paris, P. J.

    2000-10-01

    A summary will be presented of PR efforts on fusion energy research in Europe. A 3-D movie of a fusion research experimental reactor was completed at the start of this year, made entirely with virtual animation. Two versions exist: a short 3-minute video-clip version and a longer version of nearly 8 minutes. Both can be viewed in 3-D, using special projections and passive glasses, or as normal VHS video projections. A new CD-ROM for individual and classroom use will be presented, discussing (i) the different forms of energy, (ii) the general principles of fusion, (iii) current research efforts, and (iv) future prospects of fusion. This CD-ROM is now produced in English, German, French, Spanish, Italian, and Portuguese. Several new brochures and leaflets intended to increase public awareness of fusion in Europe will be on display.

  20. Vector boson fusion searches for dark matter at the LHC

    NASA Astrophysics Data System (ADS)

    Brooke, James; Buckley, Matthew R.; Dunne, Patrick; Penning, Bjoern; Tamanas, John; Zgubič, Miha

    2016-06-01

    The vector boson fusion (VBF) event topology at the Large Hadron Collider (LHC) allows efficient suppression of dijet backgrounds and is therefore a promising target for new physics searches. We consider dark matter models which interact with the standard model through the electroweak sector—either through new scalar and pseudoscalar mediators which can be embedded into the Higgs sector or via effective operators suppressed by some higher scale—and therefore have significant VBF production cross sections. Using realistic simulations of the ATLAS and CMS analysis chain, including estimates of major error sources, we project the discovery and exclusion potential of the LHC for these models over the next decade.

  1. Establishment of an Institute for Fusion Studies. Technical progress report, November 1, 1994--October 31, 1995

    SciTech Connect

    1995-07-01

    The Institute for Fusion Studies is a national center for theoretical fusion plasma physics research. Its purposes are to (1) conduct research on theoretical questions concerning the achievement of controlled fusion energy by means of magnetic confinement, including both fundamental problems of long-range significance and shorter-term issues; (2) serve as a national and international center for information exchange by hosting exchange visits, conferences, and workshops; and (3) train students and postdoctoral research personnel for the fusion energy program and plasma physics research areas. During FY 1995, a number of significant scientific advances were achieved at the IFS, both in long-range fundamental problems and in near-term strategic issues, consistent with the Institute's mandate. Examples of these achievements include tokamak edge physics, analytical and computational studies of ion-temperature-gradient-driven turbulent transport, alpha-particle-excited toroidal Alfven eigenmode nonlinear behavior, sophisticated simulations for the Numerical Tokamak Project, and a variety of non-tokamak and non-fusion basic plasma physics applications. Many of these projects were done in collaboration with scientists from other institutions. Research discoveries are briefly described in this report.

  2. Z-Pinch Fusion Propulsion

    NASA Technical Reports Server (NTRS)

    Miernik, Janie

    2011-01-01

    Fusion-based nuclear propulsion has the potential to enable fast interplanetary transportation. Shorter trips are better for humans in the harmful radiation environment of deep space. Nuclear propulsion and power plants can enable high Isp and high payload mass fractions because they require less fuel mass. Fusion energy research has characterized the Z-Pinch dense plasma focus method. (1) Lightning is a form of pinched-plasma electrical discharge phenomenon. (2) Wire-array Z-Pinch experiments are commonly studied, and nuclear power plant configurations have been proposed. (3) In the field of Nuclear Weapons Effects (NWE) testing in the defense industry, nuclear weapon x-rays are simulated through Z-Pinch phenomena.

  3. Photo-fusion reactions in a new compact device for ELI

    NASA Astrophysics Data System (ADS)

    Moustaizis, S. D.; Auvray, P.; Hora, H.; Lalousis, P.; Larour, J.; Mourou, G.

    2012-07-01

    In the last few years, significant progress in technological, experimental, and numerical studies of fusion processes in high-density, high-temperature plasmas produced by high-intensity laser pulse interaction with clusters in a strong externally applied magnetic field has enabled us to propose a compact photo-fusion magnetic device for high neutron production. For the purposes of the project, a pulsed magnetic field driver with values up to 110 Tesla has been developed, which allows increasing the trapping time of the high-density plasma in the device and improving the neutron yield. Numerical simulations show that the proposed device is capable of producing up to 10^9-10^10 neutrons per laser shot with an external magnetic field of 150 Tesla. The proposed device can be used for experiments and numerical code validation concerning different conventional and/or exotic fusion fuels.

  4. Photo-fusion reactions in a new compact device for ELI

    SciTech Connect

    Moustaizis, S. D.; Auvray, P.; Hora, H.; Lalousis, P.; Larour, J.; Mourou, G.

    2012-07-09

    In the last few years, significant progress in technological, experimental, and numerical studies of fusion processes in high-density, high-temperature plasmas produced by high-intensity laser pulse interaction with clusters in a strong externally applied magnetic field has enabled us to propose a compact photo-fusion magnetic device for high neutron production. For the purposes of the project, a pulsed magnetic field driver with values up to 110 Tesla has been developed, which allows increasing the trapping time of the high-density plasma in the device and improving the neutron yield. Numerical simulations show that the proposed device is capable of producing up to 10^9-10^10 neutrons per laser shot with an external magnetic field of 150 Tesla. The proposed device can be used for experiments and numerical code validation concerning different conventional and/or exotic fusion fuels.

  5. Inertial confinement fusion

    SciTech Connect

    Powers, L.; Condouris, R.; Kotowski, M.; Murphy, P.W.

    1992-01-01

    This issue of the ICF Quarterly contains seven articles that describe recent progress in Lawrence Livermore National Laboratory's ICF program. The Department of Energy recently initiated an effort to design a 1--2 MJ glass laser, the proposed National Ignition Facility (NIF). These articles span various aspects of a program which is aimed at moving forward toward such a facility by continuing to use the Nova laser to gain understanding of NIF-relevant target physics, by developing concepts for an NIF laser driver, and by envisioning a variety of applications for larger ICF facilities. This report discusses research on the following topics: Stimulated Rotational Raman Scattering in Nitrogen; A Maxwell Equation Solver in LASNEX for the Simulation of Moderately Intense Ultrashort Pulse Experiments; Measurements of Radial Heat-Wave Propagation in Laser-Produced Plasmas; Laser-Seeded Modulation Growth on Directly Driven Foils; Stimulated Raman Scattering in Large-Aperture, High-Fluence Frequency-Conversion Crystals; Fission Product Hazard Reduction Using Inertial Fusion Energy; Use of Inertial Confinement Fusion for Nuclear Weapons Effects Simulations.

  6. The New England Climate Adaptation Project: Enhancing Local Readiness to Adapt to Climate Change through Role-Play Simulations

    NASA Astrophysics Data System (ADS)

    Rumore, D.; Kirshen, P. H.; Susskind, L.

    2014-12-01

    Despite scientific consensus that the climate is changing, local efforts to prepare for and manage climate change risks remain limited. How can we raise concern about climate change risks and enhance local readiness to adapt to climate change's effects? In this presentation, we will share the lessons learned from the New England Climate Adaptation Project (NECAP), a participatory action research project that tested science-based role-play simulations as a tool for educating the public about climate change risks and simulating collective risk management efforts. NECAP was a 2-year effort involving the Massachusetts Institute of Technology, the Consensus Building Institute, the National Estuarine Research Reserve System, and four coastal New England municipalities. During 2012-2013, the NECAP team produced downscaled climate change projections, a summary risk assessment, and a stakeholder assessment for each partner community. Working with local partners, we used these assessments to create a tailored, science-based role-play simulation for each site. Through a series of workshops in 2013, NECAP engaged between 115 and 170 diverse stakeholders and members of the public in each partner municipality in playing the simulation and in a follow-up conversation about local climate change risks and possible adaptation strategies. Data were collected through before-and-after surveys administered to all workshop participants, follow-up interviews with 25 percent of workshop participants, public opinion polls conducted before and after our intervention, and meetings with public officials. This presentation will report our research findings and explain how science-based role-play simulations can be used to help communicate local climate change risks and enhance local readiness to adapt.

  7. Multi-model ensemble simulation and projection in the climate change in the Mekong River Basin. Part I: temperature.

    PubMed

    Huang, Yong; Wang, Fengyou; Li, Yi; Cai, Tijiu

    2014-11-01

    This paper evaluates the performance of the Coupled Model Intercomparison Project phase 5 (CMIP5) models in simulating annual and decadal temperature in the Mekong River Basin from 1950 to 2005. Using a Bayesian multi-model averaging method, future projections of temperature variation under different scenarios are also analyzed. The results show that the models perform better in space than in time: they capture the warming characteristics of the Mekong River Basin, but the simulation accuracy is still limited. Bayesian multi-model averaging improves the annual and decadal temperature simulation compared to any single model result. The projected temperature in the Mekong River Basin will increase by 0.88 °C/100 years, 2.15 °C/100 years, and 4.96 °C/100 years for the RCP2.6, RCP4.5, and RCP8.5 scenarios, respectively, over the twenty-first century. The findings will help local people and policy-makers formulate regional strategies against the potential menaces of the warming scenarios. PMID:25027780
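    The abstract does not spell out the averaging step. A minimal sketch of Bayesian-style multi-model weighting, assuming Gaussian errors against the observed series, is given below; full Bayesian model averaging would also fit the weights and error variances (e.g., by expectation-maximization), and sigma here is an illustrative assumption.

      import numpy as np

      def bma_weights(model_hist, obs, sigma=0.5):
          # Weight each model by its Gaussian likelihood against observations;
          # model_hist has shape (n_models, n_years), obs has shape (n_years,).
          loglik = np.array([-0.5 * np.sum(((m - obs) / sigma) ** 2)
                             for m in model_hist])
          w = np.exp(loglik - loglik.max())  # shift by max for numerical stability
          return w / w.sum()

      def bma_projection(weights, model_proj):
          # Weighted multi-model mean of the projected series.
          return np.average(model_proj, axis=0, weights=weights)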

  8. Projecting Wind Energy Potential Under Climate Change with Ensemble of Climate Model Simulations

    NASA Astrophysics Data System (ADS)

    Jain, A.; Shashikanth, K.; Ghosh, S.; Mukherjee, P. P.

    2013-12-01

    Recent years have witnessed increasing global concern over energy sustainability and security, triggered by a number of issues such as (though not limited to) fossil fuel depletion, energy resource geopolitics, the economic efficiency versus population growth debate, environmental concerns, and climate change. Wind energy is a renewable and sustainable form of energy in which wind turbines convert the kinetic energy of wind into electrical energy. Global warming and differential surface heating may significantly impact wind velocity and hence wind energy potential. Sustainable design of wind mills requires understanding the impacts of climate change on wind energy potential, which we evaluate here with multiple General Circulation Models (GCMs). GCMs simulate the climate variables globally under the greenhouse emission scenarios provided as Representative Concentration Pathways (RCPs). Here we use new-generation climate model outputs from the Coupled Model Intercomparison Project phase 5 (CMIP5). We first compute the wind energy potential with reanalysis data (NCEP/NCAR) at a spatial resolution of 2.5°, where the gridded wind-speed data are fitted to a Weibull distribution and the wind energy densities are computed at each grid point from the Weibull parameters. The same methodology is then applied to the CMIP5 outputs (the resultant of U-wind and V-wind) of MRI, CMCC, BCC, CanESM, and INMCM4 for historical runs. This is performed separately for the four seasons globally: MAM, JJA, SON, and DJF. We observe that the multi-model average of wind energy density for the historical period has a significant bias with respect to the reanalysis product. We therefore develop a quantile-based superensemble approach in which GCM quantiles corresponding to selected CDF values are regressed against reanalysis data; this regression addresses both the bias in individual GCMs and their combination. With the superensemble, we observe that the historical wind energy density resembles quite well with
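    The per-grid-cell computation sketched in the abstract (fit a Weibull distribution, then derive the energy density) can be written compactly. The sketch below uses the closed-form mean power density 0.5*rho*c^3*Gamma(1 + 3/k) for a Weibull distribution with shape k and scale c; a standard sea-level air density is assumed.

      import numpy as np
      from scipy.special import gamma
      from scipy.stats import weibull_min

      def wind_energy_density(speeds, rho=1.225):
          # Fit a two-parameter Weibull (location fixed at zero) to the
          # wind-speed samples of one grid cell, then return the mean wind
          # power density E[0.5*rho*v^3] = 0.5*rho*c^3*Gamma(1 + 3/k) in W/m^2.
          k, _, c = weibull_min.fit(speeds, floc=0)
          return 0.5 * rho * c ** 3 * gamma(1.0 + 3.0 / k)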

  9. Accelerators for Fusion Materials Testing

    NASA Astrophysics Data System (ADS)

    Knaster, Juan; Okumura, Yoshikazu

    Fusion materials research is a worldwide endeavor as old as the parallel one working toward the long-term stable confinement of ignited plasma. In a fusion reactor, the preservation of the required minimum thermomechanical properties of the in-vessel components exposed to the severe irradiation and heat flux conditions is an indispensable factor for safe operation; it is also an essential goal for the economic viability of fusion. Energy from fusion power will be extracted from the 14 MeV neutron freed as a product of the deuterium-tritium fusion reactions; thus, this kinetic energy must be absorbed and efficiently evacuated and electricity eventually generated by the conventional methods of a thermal power plant. Worldwide technological efforts to understand the degradation of materials exposed to 14 MeV neutron fluxes >10^18 m^-2 s^-1, as expected in future fusion power plants, have been intense over the last four decades. Existing neutron sources can reach suitable dpa (“displacement-per-atom”, the figure of merit to assess materials degradation from being exposed to neutron irradiation), but the differences in the neutron spectrum of fission reactors and spallation sources do not allow one to unravel the physics and to anticipate the degradation of materials exposed to fusion neutrons. Fusion irradiation conditions can be achieved through Li (d, xn) nuclear reactions with suitable deuteron beam current and energy, and an adequate flowing lithium screen. This idea triggered in the late 1970s at Los Alamos National Laboratory (LANL) a campaign working toward the feasibility of continuous wave (CW) high-current linacs framed by the Fusion Materials Irradiation Test (FMIT) project. These efforts continued with the Low Energy Demonstrating Accelerator (LEDA) (a validating prototype of the canceled Accelerator Production of Tritium (APT) project), which was proposed in 2002 to the fusion community as a 6.7 MeV, 100 mA CW beam injector for a Li (d, xn) source to bridge

  10. Accelerators for Fusion Materials Testing

    NASA Astrophysics Data System (ADS)

    Knaster, Juan; Okumura, Yoshikazu

    Fusion materials research is a worldwide endeavor as old as the parallel one working toward the long-term stable confinement of ignited plasma. In a fusion reactor, the preservation of the required minimum thermomechanical properties of the in-vessel components exposed to the severe irradiation and heat flux conditions is an indispensable factor for safe operation; it is also an essential goal for the economic viability of fusion. Energy from fusion power will be extracted from the 14 MeV neutron freed as a product of the deuterium-tritium fusion reactions; thus, this kinetic energy must be absorbed and efficiently evacuated and electricity eventually generated by the conventional methods of a thermal power plant. Worldwide technological efforts to understand the degradation of materials exposed to 14 MeV neutron fluxes > 10^18 m^-2 s^-1, as expected in future fusion power plants, have been intense over the last four decades. Existing neutron sources can reach suitable dpa ("displacement-per-atom", the figure of merit to assess materials degradation from being exposed to neutron irradiation), but the differences in the neutron spectrum of fission reactors and spallation sources do not allow one to unravel the physics and to anticipate the degradation of materials exposed to fusion neutrons. Fusion irradiation conditions can be achieved through Li (d, xn) nuclear reactions with suitable deuteron beam current and energy, and an adequate flowing lithium screen. This idea triggered in the late 1970s at Los Alamos National Laboratory (LANL) a campaign working toward the feasibility of continuous wave (CW) high-current linacs framed by the Fusion Materials Irradiation Test (FMIT) project. These efforts continued with the Low Energy Demonstrating Accelerator (LEDA) (a validating prototype of the canceled Accelerator Production of Tritium (APT) project), which was proposed in 2002 to the fusion community as a 6.7 MeV, 100 mA CW beam injector for a Li (d, xn) source to bridge

  11. Fusion Blanket Development in FDF

    NASA Astrophysics Data System (ADS)

    Wong, C. P. C.; Smith, J. P.; Stambaugh, R. D.

    2008-11-01

    To satisfy the electricity and tritium self-sufficiency missions of a Fusion Development Facility (FDF), suitable blanket designs will need to be evaluated, selected and developed. To demonstrate closure of the fusion fuel cycle, 2-3 main tritium breeding blankets will be used to cover most of the available chamber surface area in order to reach the project goal of achieving a tritium breeding ratio, TBR > 1. To demonstrate the feasibility of electricity and tritium production for subsequent devices such as the fusion demonstration power reactor (DEMO), several advanced test blankets will need to be selected and tested on the FDF to demonstrate high coolant outlet temperature necessary for efficient electricity production. Since the design goals for the main and test blankets are different, the design criteria of these blankets will also be different. The considerations in performing the evaluation of blanket and structural material options in concert with the maintenance approach for the FDF will be reported in this paper.

  12. Laser fusion experiments at LLL

    SciTech Connect

    Ahlstrom, H.G.

    1980-06-16

    These notes present the experimental basis and status for laser fusion as developed at LLL. Two other chapters, one authored by K.A. Brueckner and the other by C. Max, present the theoretical implosion physics and laser plasma interaction physics. The notes consist of six sections. The first is an introductory section which provides some of the history of inertial fusion and a simple explanation of the concepts involved. The second section presents an extensive discussion of diagnostic instrumentation used in the LLL Laser Fusion Program. The third section is a presentation of laser facilities and capabilities at LLL. The purpose here is to define capability, not to derive how it was obtained. The fourth and fifth sections present the experimental data on laser-plasma interaction and implosion physics. The last chapter is a short projection of the future.

  13. Superconducting magnets for fusion applications

    SciTech Connect

    Henning, C.D.

    1987-07-02

    Fusion magnet technology has made spectacular advances in the past decade; to wit, the Mirror Fusion Test Facility and the Large Coil Project. However, further advances are still required for advanced economical fusion reactors. Higher fields to 14 T and radiation-hardened superconductors and insulators will be necessary. Coupled with high rates of nuclear heating and pulsed losses, the next-generation magnets will need still higher current density, better stability and quench protection. Cable-in-conduit conductors coupled with polyimide insulations and better steels seem to be the appropriate path. Neutron fluences up to 10^19 neutrons/cm^2 in niobium tin are achievable. In the future, other amorphous superconductors could raise these limits further to extend reactor life or decrease the neutron shielding and corresponding reactor size.

  14. Educational Simulations: A Project Report. New Approaches for Behaviorally Exceptional Youth.

    ERIC Educational Resources Information Center

    Santa Cruz County Superintendent of Schools, CA.

    Evaluated was the use of 12 simulation games with approximately 650 adolescents in 19 corrective schools in Santa Cruz county including ranch schools, juvenile hall schools, drug dependent minor programs, and youth authority facilities. Topics of the simulation games were peer pressure, looking for and keeping a job, mathematics, driving…

  15. Incorporating Reflective Practice into Team Simulation Projects for Improved Learning Outcomes

    ERIC Educational Resources Information Center

    Wills, Katherine V.; Clerkin, Thomas A.

    2009-01-01

    The use of simulation games in business courses is a popular method for providing undergraduate students with experiences similar to those they might encounter in the business world. As such, in 2003 the authors were pleased to find a classroom simulation tool that combined the decision-making and team experiences of a senior management group with…

  16. High-Fidelity Simulation Meets Athletic Training Education: An Innovative Collaborative Teaching Project

    ERIC Educational Resources Information Center

    Palmer, Elizabeth; Edwards, Taylor; Racchini, James

    2014-01-01

    High-fidelity simulation is frequently used in nursing education to provide students with simulated experiences prior to and throughout clinical coursework that involves direct patient care. These high-tech exercises take advantage of the benefits of a standardized patient or mock patient encounter, while eliminating some of the drawbacks…

  17. Virtual Airspace Modeling and Simulation (VAMS) Project First Technical Interchange Meeting

    NASA Technical Reports Server (NTRS)

    Beard, Robert; Kille, Robert; Kirsten, Richard; Rigterink, Paul; Sielski, Henry; Gratteau, Melinda F. (Editor)

    2002-01-01

    A three-day NASA Virtual Airspace Modeling and Simulation (VAMS) Project Technical Interchange Meeting (TIM) was held at the NASA Ames Research Center in Mountain View, CA, on May 21 through May 23, 2002. The purpose of this meeting was to share initial concept information sponsored by the VAMS Project. An overall goal of the VAMS Project is to develop validated, blended, robust, and transitionable air transportation system concepts over the next five years that will achieve NASA's long-term Enterprise Aviation Capacity goals. This document describes the presentations at the TIM and their related questions and answers, and presents the TIM recommendations.

  18. Simulating Carbon Dynamics and Species Composition Under Projected Changes in Climate in the Puget Sound, Washington, USA

    NASA Astrophysics Data System (ADS)

    Laflower, D.; Hurteau, M. D.

    2014-12-01

    Changing climate has the potential to directly and indirectly alter forest carbon dynamics and species composition, particularly in temperature- or precipitation-limited systems. In light-limited systems, species-specific responses to changing climate could result in an indirect effect of climate through altered competitive interactions. Joint Base Lewis-McChord, in Washington, contains one of the largest intact forested areas in the Puget Sound. Management priorities include development of late-successional forests and conservation. We sought to quantify how projected changes in climate would affect species diversity and carbon (C) sequestration given these management priorities. We used LANDIS-II to simulate forest dynamics over 100 years under current climate and under projected climate for two emission scenarios. Preliminary analyses indicate a decrease in soil C, relative to current climate, beginning mid-century for both emission scenarios. Under the low emission scenario, the decrease is compensated by increased aboveground C, while the high scenario experiences a decline in aboveground C. Total ecosystem C was consistent between the baseline and low emission climates throughout the simulation. By late-century, the high scenario had an average decrease of 10 Mg C ha^-1. Douglas-fir (DF) accounts for the largest fraction of aboveground biomass (AGB) in the study area. Interestingly, DF AGB was fairly consistent between climate scenarios through mid-century but diverged during late-century, with the high scenario having the greatest amount of DF AGB (mean 368 Mg ha^-1) and current climate having the lowest (mean 341 Mg ha^-1). We found the inverse relationship when examining all other species. Given the uncertainty associated with climate projections, future simulations will include a larger suite of climate projections and address the secondary effects of climate change (e.g., increased wildfire, disease, or insect outbreaks) that can impact productivity.

  19. The Virtual ChemLab Project: A Realistic and Sophisticated Simulation of Inorganic Qualitative Analysis

    NASA Astrophysics Data System (ADS)

    Woodfield, Brian F.; Catlin, Heidi R.; Waddoups, Gregory L.; Moore, Melissa S.; Swan, Richard; Allen, Rob; Bodily, Greg

    2004-11-01

    We have created a set of sophisticated and realistic laboratory simulations for use in freshman- and sophomore-level chemistry classes and laboratories called Virtual ChemLab. We have completed simulations for Inorganic Qualitative Analysis, Organic Synthesis and Organic Qualitative Analysis, Experiments in Quantum Chemistry, Gas Properties, Titration Experiments, and Calorimetric and Thermochemical Experiments. The purpose of our simulations is to reinforce concepts taught in the classroom, provide an environment for creative learning, and emphasize the thinking behind instructional laboratory experiments. We have used the inorganic simulation extensively with thousands of students in our department at Brigham Young University. We have learned from our evaluation that: (i) students enjoy using these simulations and find them to be an asset in learning effective problem-solving strategies, (ii) students like the fact that they can both reproduce experimental procedures and explore various topics in ways they choose, and (iii) students naturally divide themselves into two groups: creative learners, who excel in an open-ended environment of virtual laboratories, and structured learners, who struggle in this same environment. In this article, we describe the Inorganic Qualitative Analysis simulation; we also share specific evaluation findings from using the inorganic simulation in classroom and laboratory settings.

  20. Fusion energy

    NASA Astrophysics Data System (ADS)

    1990-09-01

    The main purpose of the International Thermonuclear Experimental Reactor (ITER) is to develop an experimental fusion reactor through the united efforts of many technologically advanced countries. The ITER terms of reference, issued jointly by the European Community, Japan, the USSR, and the United States, call for an integrated international design activity and constitute the basis of current activities. Joint work on ITER is carried out under the auspices of the International Atomic Energy Agency (IAEA), according to the terms of the quadripartite agreement reached between the European Community, Japan, the USSR, and the United States. The site for joint technical work sessions is at the Max Planck Institute of Plasma Physics, Garching, Federal Republic of Germany. The ITER activities have two phases: a definition phase performed in 1988 and the present design phase (1989 to 1990). During the definition phase, a set of ITER technical characteristics and supporting research and development (R and D) activities were developed and reported. The present conceptual design phase of ITER lasts until the end of 1990. The objectives of this phase are to develop the design of ITER, perform a safety and environmental analysis, develop site requirements, define future R and D needs, and estimate cost, manpower, and schedule for construction and operation. A final report will be submitted at the end of 1990. This paper summarizes progress in the ITER program during the 1989 design phase.

  1. Fusion energy

    SciTech Connect

    Not Available

    1990-09-01

    The main purpose of the International Thermonuclear Experimental Reactor (ITER) is to develop an experimental fusion reactor through the united efforts of many technologically advanced countries. The ITER terms of reference, issued jointly by the European Community, Japan, the USSR, and the United States, call for an integrated international design activity and constitute the basis of current activities. Joint work on ITER is carried out under the auspices of the International Atomic Energy Agency (IAEA), according to the terms of the quadripartite agreement reached between the European Community, Japan, the USSR, and the United States. The site for joint technical work sessions is at the Max Planck Institute of Plasma Physics, Garching, Federal Republic of Germany. The ITER activities have two phases: a definition phase performed in 1988 and the present design phase (1989-1990). During the definition phase, a set of ITER technical characteristics and supporting research and development (R&D) activities were developed and reported. The present conceptual design phase of ITER lasts until the end of 1990. The objectives of this phase are to develop the design of ITER, perform a safety and environmental analysis, develop site requirements, define future R&D needs, and estimate cost, manpower, and schedule for construction and operation. A final report will be submitted at the end of 1990. This paper summarizes progress in the ITER program during the 1989 design phase.

  2. Magnetic-Nozzle Studies for Fusion Propulsion Applications: Gigawatt Plasma Source Operation and Magnetic Nozzle Analysis

    NASA Technical Reports Server (NTRS)

    Gilland, James H.; Mikellides, Ioannis; Mikellides, Pavlos; Gregorek, Gerald; Marriott, Darin

    2004-01-01

    This project has been a multiyear effort to assess the feasibility of a key process inherent to virtually all fusion propulsion concepts: the expansion of a fusion-grade plasma through a diverging magnetic field. Current fusion energy research touches on this process only indirectly through studies of plasma divertors designed to remove the fusion products from a reactor. This project was aimed at directly addressing propulsion system issues without the expense of constructing a fusion reactor. Instead, the program designed, constructed, and operated a facility suitable for simulating fusion-reactor-grade edge plasmas and for examining their expansion in an expanding magnetic nozzle. The approach was to create and accelerate a dense (up to 10^20 m^-3) plasma, stagnate it in a converging magnetic field to convert kinetic energy to thermal energy, and examine the subsequent expansion of the hot (hundreds of eV) plasma in a magnetic nozzle. Throughout the project, there has been a parallel effort between theoretical and numerical design and modeling on the one hand and the experiment itself on the other. In particular, the MACH2 code was used to design and predict the performance of the magnetoplasmadynamic (MPD) plasma accelerator and to design and predict the expected behavior of the magnetic field coils that could be added later. Progress to date includes the theoretical accelerator design and construction, development of the power and vacuum systems to accommodate the powers and mass flow rates of interest to our research, operation of the accelerator and comparison to theoretical predictions, and computational analysis of future magnetic field coils and the expected performance of an integrated source-nozzle experiment.

  3. Basics of Fusion-Fission Research Facility (FFRF) as a Fusion Neutron Source

    SciTech Connect

    Leonid E. Zakharov

    2011-06-03

    FFRF, standing for the Fusion-Fission Research Facility, represents an option for the next-step project of ASIPP (Hefei, China), aiming at a first fusion-fission multifunctional device [1]. FFRF relies strongly on new Lithium Wall Fusion plasma regimes, the development of which has already started in the US and China. With R/a = 4 m/1 m, Ipl = 5 MA, Btor = 4-6 T, PDT = 50-100 MW, Pfission = 80-4000 MW, and a 1 m thick blanket, FFRF has the unique fusion mission of a stationary fusion neutron source. Its pioneering mission of merging fusion and fission consists of accumulating design, experimental, and operational data for future hybrid applications.

  4. Modeling and Simulation of Longitudinal Dynamics for Low Energy Ring_High Energy Ring at the Positron-Electron Project

    SciTech Connect

    Rivetta, Claudio; Mastorides, T.; Fox, J.D.; Teytelman, D.; Van Winkle, D.; /SLAC

    2007-03-06

    A time domain dynamic modeling and simulation tool for beam-cavity interactions in the Low Energy Ring (LER) and High Energy Ring (HER) at the Positron-Electron Project (PEP-II) is presented. Dynamic simulation results for PEP-II are compared to measurements of the actual machine. The motivation for this tool is to explore the stability margins and performance limits of PEP-II radio-frequency (RF) systems at future higher currents and upgraded RF configurations. It also serves as a test bed for new control algorithms and can define the ultimate limits of the low-level RF (LLRF) architecture. The time domain program captures the dynamic behavior of the beam-cavity-LLRF interaction based on a reduced model. The ring current is represented by macrobunches. Multiple RF stations in the ring are represented via one or two macrocavities. Each macrocavity captures the overall behavior of all the two- and four-cavity RF stations. Station models include nonlinear elements in the klystron and signal processing. This enables modeling the principal longitudinal impedance control loops interacting via the longitudinal beam model. The dynamics of the simulation model are validated by comparing the measured growth rates for the LER with simulation results. The simulated behavior of the LER at increased operation currents is presented via low-mode instability growth rates. Different control strategies are compared, and the effects of both the imperfections in the LLRF signal processing and the nonlinear drivers and klystrons are explored.
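    The records above describe the reduced model only in outline; its flavor can be conveyed by a toy version: a single macrobunch executing synchrotron oscillations, with an anti-damping term standing in for beam loading and a proportional term standing in for the LLRF feedback. All names and parameter values below are illustrative assumptions, not PEP-II settings.

      import numpy as np

      def net_growth_rate(turns=20000, f_s=5e3, f_rev=136e3,
                          tau_load=1e-3, fb_gain=0.0):
          # Toy macrobunch model: a synchrotron rotation per turn, an
          # anti-damping term standing in for beam loading (growth time
          # tau_load, in seconds), and proportional feedback (fb_gain).
          w_s = 2 * np.pi * f_s / f_rev      # synchrotron phase advance per turn
          phi, dp = 1e-3, 0.0                # phase and momentum offsets
          amps = np.empty(turns)
          for n in range(turns):
              dp += -w_s * phi + (1.0 / (tau_load * f_rev) - fb_gain) * dp
              phi += w_s * dp
              amps[n] = np.hypot(phi, dp)
          # Slope of log-amplitude vs turn, converted to 1/s: positive means
          # the mode grows, negative means the feedback wins.
          return np.polyfit(np.arange(turns), np.log(amps), 1)[0] * f_rev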

  5. Modeling and simulation of longitudinal dynamics for Low Energy Ring High Energy Ring at the Positron-Electron Project

    NASA Astrophysics Data System (ADS)

    Rivetta, C.; Mastorides, T.; Fox, J. D.; Teytelman, D.; van Winkle, D.

    2007-02-01

    A time domain dynamic modeling and simulation tool for beam-cavity interactions in the Low Energy Ring (LER) and High Energy Ring (HER) at the Positron-Electron Project (PEP-II) is presented. Dynamic simulation results for PEP-II are compared to measurements of the actual machine. The motivation for this tool is to explore the stability margins and performance limits of PEP-II radio-frequency (RF) systems at future higher currents and upgraded RF configurations. It also serves as a test bed for new control algorithms and can define the ultimate limits of the low-level RF (LLRF) architecture. The time domain program captures the dynamic behavior of the beam-cavity-LLRF interaction based on a reduced model. The ring current is represented by macrobunches. Multiple RF stations in the ring are represented via one or two macrocavities. Each macrocavity captures the overall behavior of all the two- and four-cavity RF stations. Station models include nonlinear elements in the klystron and signal processing. This enables modeling the principal longitudinal impedance control loops interacting via the longitudinal beam model. The dynamics of the simulation model are validated by comparing the measured growth rates for the LER with simulation results. The simulated behavior of the LER at increased operation currents is presented via low-mode instability growth rates. Different control strategies are compared, and the effects of both the imperfections in the LLRF signal processing and the nonlinear drivers and klystrons are explored.

  6. Three-dimensional numerical reservoir simulation of the EGS Demonstration Project at The Geysers geothermal field

    NASA Astrophysics Data System (ADS)

    Borgia, Andrea; Rutqvist, Jonny; Oldenburg, Curt M.; Hutchings, Lawrence; Garcia, Julio; Walters, Mark; Hartline, Craig; Jeanne, Pierre; Dobson, Patrick; Boyle, Katie

    2013-04-01

    The Enhanced Geothermal System (EGS) Demonstration Project, currently underway at the Northwest Geysers, California, aims to demonstrate the feasibility of stimulating a deep high-temperature reservoir (up to 400 °C) through water injection over a 2-year period. On October 6, 2011, injection of 25 l/s started from the Prati 32 well at a depth interval of 1850-2699 m below sea level. After a period of almost 2 months, the injection rate was raised to 63 l/s. The flow rate was then decreased to 44 l/s after an additional 3.5 months and maintained at 25 l/s up to August 20, 2012. Significant well-head pressure changes were recorded at Prati State 31 well, which is separated from Prati 32 by about 500 m at reservoir level. More subdued pressure increases occur at greater distances. The water injection caused induced seismicity in the reservoir in the vicinity of the well. Microseismic monitoring and interpretation shows that the cloud of seismic events is mainly located in the granitic intrusion below the injection zone, forming a cluster elongated SSE-NNW (azimuth 170°) that dips steeply to the west. In general, the magnitude of the events increases with depth and the hypocenter depth increases with time. This seismic cloud is hypothesized to correlate with enhanced permeability in the high-temperature reservoir and its variation with time. Based on the existing borehole data, we use the GMS™ GUI to construct a realistic three-dimensional (3D) geologic model of the Northwest Geysers geothermal field. This model includes, from the top down, a low permeability graywacke layer that forms the caprock for the reservoir, an isothermal steam zone (known as the normal temperature reservoir) within metagraywacke, a hornfels zone (where the high-temperature reservoir is located), and a felsite layer that is assumed to extend downward to the magmatic heat source. We then map this model onto a rectangular grid for use with the TOUGH2 multiphase, multicomponent, non

  7. Economic potential of magnetic fusion energy

    SciTech Connect

    Henning, C.D.

    1981-03-10

    Scientific feasibility of magnetic fusion is no longer seriously in doubt. Rapid advances have been made in both tokamak and mirror research, leading to a demonstration in the TFTR tokamak at Princeton in 1982 and the tandem mirror MFTF-B at Livermore in 1985. Accordingly, the basis is established for an aggressive engineering thrust to develop a reactor within this century. However, care must be taken to guide the fusion program towards an economically and environmentally viable goal. While the fusion fuels are essentially free, capital costs of reactors appear to be at least as large as those of current power plants. Accordingly, the price of electricity will not decline, and capital availability for reactor construction will be important. Details of reactor cost projections are discussed and mechanisms suggested for fusion power implementation. Also discussed are some environmental and safety aspects of magnetic fusion.

  8. The drive-wise project: driving simulator training increases real driving performance in healthy older drivers

    PubMed Central

    Casutt, Gianclaudio; Theill, Nathan; Martin, Mike; Keller, Martin; Jäncke, Lutz

    2014-01-01

    Background: Age-related cognitive decline is often associated with unsafe driving behavior. We hypothesized that 10 active training sessions in a driving simulator increase cognitive and on-road driving performance. In addition, driving simulator training should outperform cognitive training. Methods: Ninety-one healthy active drivers (62–87 years) were randomly assigned to one of three groups: (1) a driving simulator training group, (2) an attention training group (vigilance and selective attention), or (3) a control group. The main outcome variables were on-road driving and cognitive performance. Seventy-seven participants (85%) completed the training and were included in the analyses. Training gains were analyzed using a multiple regression analysis with planned orthogonal comparisons. Results: The driving simulator-training group showed an improvement in on-road driving performance compared to the attention-training group. In addition, both training groups increased cognitive performance compared to the control group. Conclusion: Driving simulator training offers the potential to enhance driving skills in older drivers. Compared to the attention training, the simulator training seems to be a more powerful program for increasing older drivers' safety on the road. PMID:24860497

  9. A Reliability-Based Track Fusion Algorithm

    PubMed Central

    Xu, Li; Pan, Liqiang; Jin, Shuilin; Liu, Haibo; Yin, Guisheng

    2015-01-01

    The common track fusion algorithms in multi-sensor systems have some defects, such as serious imbalances between accuracy and computational cost, identical treatment of all sensor information regardless of its quality, and high fusion errors at inflection points. To address these defects, a track fusion algorithm based on reliability (TFR) is presented for multi-sensor, multi-target environments. To improve the information quality, outliers in the local tracks are eliminated first. The reliability of each local track is then calculated, and the local tracks with high reliability are chosen for the state estimation fusion. In contrast to the existing methods, TFR reduces high fusion errors at the inflection points of system tracks and obtains high accuracy at a lower computational cost. Simulation results verify the effectiveness and the superiority of the algorithm in dense sensor environments. PMID:25950174
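
    The abstract outlines a three-step recipe: outlier elimination, reliability scoring, and fusion of the high-reliability tracks. A minimal sketch of that general idea follows; the reliability measure used here is a simple inverse-residual placeholder and the thresholds are arbitrary, so this is not the paper's TFR formulation.

    ```python
    # Hedged sketch of reliability-weighted track fusion for one target at one
    # time step. The reliability definition and thresholds are placeholders.
    import numpy as np

    def fuse_tracks(estimates, residuals, mad_k=3.0):
        """estimates: (n_sensors, dim) local track states;
        residuals: (n_sensors,) recent innovation magnitudes per sensor."""
        est = np.asarray(estimates, float)
        res = np.asarray(residuals, float)
        # 1) eliminate outlier tracks via median absolute deviation
        dev = np.linalg.norm(est - np.median(est, axis=0), axis=1)
        keep = dev <= mad_k * (np.median(dev) + 1e-12)
        est, res = est[keep], res[keep]
        # 2) score reliability (here: inverse residual energy)
        rel = 1.0 / (res ** 2 + 1e-12)
        # 3) fuse only the more reliable half, weighted by reliability
        sel = rel >= np.median(rel)
        w = rel[sel] / rel[sel].sum()
        return (w[:, None] * est[sel]).sum(axis=0)

    # Example: four sensors estimate a 2D position; the last one is far off.
    print(fuse_tracks([[10.1, 5.0], [9.9, 5.2], [10.0, 4.9], [14.0, 8.0]],
                      [0.3, 0.2, 0.25, 1.5]))
    ```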

  10. Simulation of African Easterly Waves and its Projection in Response to Anthropogenic Greenhouse Forcing in a High Resolution AGCM

    NASA Astrophysics Data System (ADS)

    Kunhu Bangalth, Hamza; Raj, Jerry; Bhaskar Gunturu, Udaya; Stenchikov, Georgiy

    2016-04-01

    African Easterly Waves (AEWs) are the primary synoptic-scale disturbances over tropical Africa and the Atlantic, propagating westward from East Africa towards the Atlantic during summer. AEWs play a pivotal role in the initiation and organization of convective rainfall over this region and often act as precursors for Atlantic tropical cyclones. The present study uses a high-resolution AGCM, the High Resolution Atmospheric Model (HiRAM) developed at GFDL, to investigate the projected changes in AEW characteristics in response to anthropogenic greenhouse forcing. Ensembles of simulations are conducted at a spatial resolution of ~25 km, with observed SST and with SSTs from two coarse-resolution Earth System Models (ESM2M and ESM2G) developed at GFDL, for the historical period (1975-2004). Future projections (to 2050) are also conducted for two Representative Concentration Pathways (RCPs), RCP4.5 and RCP8.5. To test the ability of HiRAM to properly simulate the three-dimensional structure and the space-time variability of AEWs, the historical simulations are compared against two reanalysis products, ERA-Interim and MERRA, and against the parent ESMs. Space-time spectral analysis and complex empirical orthogonal function analysis have been conducted to investigate the dispersion characteristics and modes of variability, respectively. The representation of AEWs in HiRAM is comparable to the reanalyses and is improved relative to the coarse-resolution parent ESMs.
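
    For readers unfamiliar with the space-time spectral analysis mentioned above, the core computation is a two-dimensional Fourier transform over (time, longitude), giving power as a function of frequency and zonal wavenumber; AEW activity then shows up as westward-propagating power at synoptic periods. Below is a minimal, generic sketch with no detrending, segmenting, or background-spectrum removal; the input field is an assumption (e.g., 700-hPa meridional-wind anomalies along a latitude band), not the study's exact diagnostic.

    ```python
    # Hedged sketch of a raw wavenumber-frequency (space-time) power spectrum.
    import numpy as np

    def space_time_spectrum(v, dt_days=1.0):
        """v: array of shape (ntime, nlon), e.g. meridional-wind anomalies
        averaged over a latitude band. Returns power, frequency, wavenumber."""
        v = v - v.mean(axis=0, keepdims=True)        # anomalies per longitude
        nt, nlon = v.shape
        v = v * np.hanning(nt)[:, None]              # taper in time
        spec = np.fft.fft2(v) / (nt * nlon)
        power = np.abs(np.fft.fftshift(spec)) ** 2
        freq = np.fft.fftshift(np.fft.fftfreq(nt, d=dt_days))          # cycles/day
        wavenum = np.fft.fftshift(np.fft.fftfreq(nlon, d=1.0 / nlon))  # zonal wavenumber
        return power, freq, wavenum
    ```

    Westward- and eastward-propagating variance fall in different quadrants of the (wavenumber, frequency) plane; a full analysis would follow the usual practice of segmenting in time and removing a smoothed background spectrum.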

  11. The accomplishment of the Engineering Design Activities of IFMIF/EVEDA: The European-Japanese project towards a Li(d,xn) fusion relevant neutron source

    NASA Astrophysics Data System (ADS)

    Knaster, J.; Ibarra, A.; Abal, J.; Abou-Sena, A.; Arbeiter, F.; Arranz, F.; Arroyo, J. M.; Bargallo, E.; Beauvais, P.-Y.; Bernardi, D.; Casal, N.; Carmona, J. M.; Chauvin, N.; Comunian, M.; Delferriere, O.; Delgado, A.; Diaz-Arocas, P.; Fischer, U.; Frisoni, M.; Garcia, A.; Garin, P.; Gobin, R.; Gouat, P.; Groeschel, F.; Heidinger, R.; Ida, M.; Kondo, K.; Kikuchi, T.; Kubo, T.; Le Tonqueze, Y.; Leysen, W.; Mas, A.; Massaut, V.; Matsumoto, H.; Micciche, G.; Mittwollen, M.; Mora, J. C.; Mota, F.; Nghiem, P. A. P.; Nitti, F.; Nishiyama, K.; Ogando, F.; O'hira, S.; Oliver, C.; Orsini, F.; Perez, D.; Perez, M.; Pinna, T.; Pisent, A.; Podadera, I.; Porfiri, M.; Pruneri, G.; Queral, V.; Rapisarda, D.; Roman, R.; Shingala, M.; Soldaini, M.; Sugimoto, M.; Theile, J.; Tian, K.; Umeno, H.; Uriot, D.; Wakai, E.; Watanabe, K.; Weber, M.; Yamamoto, M.; Yokomine, T.

    2015-08-01

    The International Fusion Materials Irradiation Facility (IFMIF), presently in its Engineering Validation and Engineering Design Activities (EVEDA) phase under the frame of the Broader Approach Agreement between Europe and Japan, accomplished its EDA phase in summer 2013, on schedule, with the release of the engineering design report of the IFMIF plant, which is described here. Many design improvements over former phases are implemented, particularly: a reduction of beam losses and operational costs thanks to the superconducting accelerator concept; the relocation of the quench tank outside the test cell (TC), with a reduction of tritium inventory and a simplification of its replacement in case of failure; the separation of the irradiation modules from the shielding block, gaining irradiation flexibility, enhanced remote-handling reliability, and reduced cost; and the water cooling of the liner and biological shielding of the TC, enhancing the efficiency and economy of the related sub-systems. In addition, the maintenance strategy has been modified to allow a shorter yearly stop of the irradiation operations and a more careful management of the irradiated samples. The design of the IFMIF plant is intimately linked with the EVA phase carried out since the entry into force of IFMIF/EVEDA in June 2007. These activities and their on-going accomplishment have been thoroughly described elsewhere (Knaster J et al. [19]); combined with the present paper, they allow a clear understanding of the maturity of the European-Japanese international efforts. The released IFMIF Intermediate Engineering Design Report (IIEDR), which could be complemented if required with the outcome of the on-going EVA, will allow decision making on construction and/or serve as the basis for the definition of the next step, aligned with the evolving needs of our fusion community.

  12. Reconfigurable computing for Monte Carlo simulations: Results and prospects of the Janus project

    NASA Astrophysics Data System (ADS)

    Baity-Jesi, M.; Baños, R. A.; Cruz, A.; Fernandez, L. A.; Gil-Narvion, J. M.; Gordillo-Guerrero, A.; Guidetti, M.; Iñiguez, D.; Maiorano, A.; Mantovani, F.; Marinari, E.; Martin-Mayor, V.; Monforte-Garcia, J.; Muñoz Sudupe, A.; Navarro, D.; Parisi, G.; Pivanti, M.; Perez-Gaviro, S.; Ricci-Tersenghi, F.; Ruiz-Lorenzo, J. J.; Schifano, S. F.; Seoane, B.; Tarancon, A.; Tellez, P.; Tripiccione, R.; Yllanes, D.

    2012-08-01

    We describe Janus, a massively parallel FPGA-based computer optimized for the simulation of spin glasses, theoretical models for the behavior of glassy materials. FPGAs (as compared to GPUs or many-core processors) provide a complementary approach to massively parallel computing. In particular, our model problem is formulated in terms of binary variables, and floating-point operations can be (almost) completely avoided. The FPGA architecture allows us to run many independent threads with almost no latencies in memory access, thus updating up to 1024 spins per cycle. We describe Janus in detail and we summarize the physics results obtained in four years of operation of this machine; we discuss two types of physics applications: long simulations on very large systems (which try to mimic and provide understanding about the experimental non-equilibrium dynamics), and low-temperature equilibrium simulations using an artificial parallel tempering dynamics. The time scale of our non-equilibrium simulations spans eleven orders of magnitude (from picoseconds to a tenth of a second). On the other hand, our equilibrium simulations are unprecedented both because of the low temperatures reached and for the large systems that we have brought to equilibrium. A finite-time scaling ansatz emerges from the detailed comparison of the two sets of simulations. Janus has made it possible to perform spin-glass simulations that would take several decades on more conventional architectures. The paper ends with an assessment of the potential of possible future versions of the Janus architecture, based on state-of-the-art technology.
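
    The remark above that the model problem is formulated in binary variables, so floating-point operations can be almost completely avoided, is easy to illustrate in software: for ±1 spins and ±1 couplings, the energy change of a spin flip takes only a handful of integer values, so the Metropolis acceptance probabilities can be tabulated once outside the update loop. The sketch below is a generic two-dimensional Edwards-Anderson model, far simpler than the machine and models described above, and is illustrative only.

    ```python
    # Hedged sketch: Metropolis sweep for a 2D Edwards-Anderson spin glass with
    # tabulated acceptance probabilities (illustrative of, not identical to, Janus).
    import numpy as np

    rng = np.random.default_rng(1)
    L, T = 64, 1.0
    spins = rng.choice([-1, 1], size=(L, L))
    Jr = rng.choice([-1, 1], size=(L, L))   # quenched coupling to right neighbor
    Jd = rng.choice([-1, 1], size=(L, L))   # quenched coupling to down neighbor

    def local_field(s, x, y):
        """Integer local field of site (x, y) on the periodic lattice."""
        return (Jr[x, y] * s[x, (y + 1) % L] + Jr[x, (y - 1) % L] * s[x, (y - 1) % L]
                + Jd[x, y] * s[(x + 1) % L, y] + Jd[(x - 1) % L, y] * s[(x - 1) % L, y])

    # Flipping s[x, y] changes the energy by dE = 2*s*h with h in {-4,...,4},
    # so every acceptance probability fits in a small precomputed table.
    accept = {dE: min(1.0, float(np.exp(-dE / T))) for dE in range(-8, 9, 2)}

    def sweep(s):
        for x in range(L):
            for y in range(L):
                if rng.random() < accept[2 * s[x, y] * local_field(s, x, y)]:
                    s[x, y] = -s[x, y]

    for _ in range(10):
        sweep(spins)
    ```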

  13. Stanford's sedsim project: Dynamic three-dimensional simulation of geologic processes that affect clastic sediments

    NASA Astrophysics Data System (ADS)

    Lee, Young-Hoon; Harbaugh, John W.

    Simulation experiments using SEDSIM reproduced patterns of fluid flow and sedimentation typical of fluvial environments. In the bent-channel experiment, SEDSIM reproduced flow velocities, fluid depths, bottom shear stresses, and rates of sediment transport comparable to those in natural channels. The results also show that fluid velocities were high near the cut bank of the simulated channel, and that channel deposits occur as fining-upwards sequences. Coarse-grained sediments were deposited in the upper channel as submerged dunes or transverse bars that subsequently migrated downstream. After six simulated years, fine-grained sediments covered the coarse-grained sediment, and the sediment load and fluid flow reached equilibrium. Thus, SEDSIM appears to be reasonably effective in representing flow and sedimentation in meandering channels, and should provide insight in understanding the spatial distribution of sediment bodies in fluvial deposits and the internal structures within these bodies.

  14. Validation of CME Detection Software (CACTus) by Means of Simulated Data, and Analysis of Projection Effects on CME Velocity Measurements

    NASA Astrophysics Data System (ADS)

    Bonte, K.; Jacobs, C.; Robbrecht, E.; de Groof, A.; Berghmans, D.; Poedts, S.

    2011-05-01

    In the context of space weather forecasting, automated detection of coronal mass ejections (CMEs) becomes more and more important for efficiently handling the large data flow expected from recently launched and future solar missions. In this paper we validate the detection software package "CACTus" by applying the program to synthetic data from our 3D time-dependent CME simulations instead of observational data. The main strength of this study is that we know in advance what should be detected. We describe the sensitivities and strengths of automated detection, specifically for the CACTus program, resulting in a better understanding of CME detection on the one hand and the calibration of the CACTus software on the other, suggesting possible improvements of the package. In addition, the simulation is an ideal tool to investigate projection effects on CME velocity measurements.

  15. Current status of the numerical reservoir simulation of the Peace River in-situ project

    SciTech Connect

    Myhill, N.A.; Henderson, I.G.; Schmitz, R.V.

    1982-01-01

    The initial warm-up phase of a large, 31-well, 50-acre steam drive pilot at Peace River is described. This pilot is a field test of a laboratory-developed pressure-cycle steam drive process. The operational prognosis for this pilot is based on the results of over 100 physical vacuum-model experiments. The prognosis includes a warm-up phase followed by two pressure cycles. To gain confidence in the simulator, prior Peace River tests and physical model experiments were matched; some of these matches are outlined. The results of some of these simulations, and the insight they give into the process mechanics, are presented.

  16. Project GeoSim: A GIS-Based Simulation Laboratory for Introductory Geography.

    ERIC Educational Resources Information Center

    Shaffer, Clifford A.

    This report describes a multidisciplinary project by members of Virginia Polytechnic Institute and State University's Departments of Geography and Computer Science, and College of Education, to develop computer-aided education software for introductory geography at the college and high school levels. GeoSim's goal was to produce major changes in…

  17. The Virtual ChemLab Project: A Realistic and Sophisticated Simulation of Inorganic Qualitative Analysis

    ERIC Educational Resources Information Center

    Woodfield, Brian F.; Catlin, Heidi R.; Waddoups, Gregory L.; Moore, Melissa S.; Swan, Richard; Bodily, Greg; Allen, Rob

    2004-01-01

    The Virtual ChemLab project is an instructional laboratory that provides practical experience by connecting theory with laboratory practice, teaching laboratory techniques, and developing cognitive processes. The lab gives students the freedom to explore and to repeat procedures, and focuses on the underlying principles of…

  18. Simulation of a Forensic Chemistry Problem: A Multidisciplinary Project for Secondary School Chemistry Students.

    ERIC Educational Resources Information Center

    Long, G. A.

    1995-01-01

    Describes a project that uses a multidisciplinary approach to problem solving in analyzing a crime scene and suspect evidence. Requires each student to work effectively in a team, communicate in both written and oral forms, perform hands-on laboratory manipulations, and realize that the entire class was depending on their individual contributions…

  19. A Tire Gasification Senior Design Project That Integrates Laboratory Experiments and Computer Simulation

    ERIC Educational Resources Information Center

    Weiss, Brian; Castaldi, Marco J.

    2006-01-01

    A reactor to convert waste rubber tires to useful products, such as CO and H2, was investigated in a university undergraduate design project. The student worked individually with mentorship from a faculty advisor, who provided professional critique. The student was able to research the background of the field and conceive of a novel…

  20. Image computing techniques to extrapolate data for dust tracking in case of an experimental accident simulation in a nuclear fusion plant.

    PubMed

    Camplani, M; Malizia, A; Gelfusa, M; Barbato, F; Antonelli, L; Poggi, L A; Ciparisse, J F; Salgado, L; Richetta, M; Gaudio, P

    2016-01-01

    In this paper, a preliminary shadowgraph-based analysis of dust particle re-suspension due to a loss of vacuum accident (LOVA) in ITER-like nuclear fusion reactors is presented. Dust particles are produced through different mechanisms in nuclear fusion devices, and one of the main issues is that dust particles can be re-suspended by events such as a LOVA. Shadowgraphy is based on an expanded, collimated beam of light, emitted by a laser or a lamp, that propagates transversely to the flow field. In the STARDUST facility, the dust moves in the flow and causes variations of refractive index that can be detected by using a CCD camera. The STARDUST fast camera setup allows dust particles moving in the vessel to be detected and tracked, yielding information about the velocity field of the mobilized dust. In particular, the acquired images are processed so that, in each frame, the moving dust particles are detected by applying a background subtraction technique based on the mixture-of-Gaussians algorithm. The resulting foreground masks are then filtered with morphological operations. Finally, a multi-object tracking algorithm is used to track the detected particles throughout the experiment. For each particle, a Kalman filter-based tracker is applied; the particle dynamics are described by taking position, velocity, and acceleration as state variables. The results demonstrate that it is possible to obtain the dust particles' velocity field during a LOVA by automatically processing the data obtained with the shadowgraph approach. PMID:26827318
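
    The per-frame pipeline described above (mixture-of-Gaussians background subtraction, morphological filtering, blob extraction) maps directly onto standard OpenCV calls. The following is a minimal sketch under assumed parameters; the video file name, thresholds, and area limits are illustrative, and the per-particle Kalman tracking step is omitted for brevity.

    ```python
    # Hedged sketch of the detection stage: MOG background subtraction,
    # morphological cleanup, and centroid extraction (parameters illustrative).
    import cv2

    cap = cv2.VideoCapture("shadowgraph.avi")        # hypothetical input file
    backsub = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=16)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (3, 3))

    detections = []                                   # per-frame dust centroids
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        fg = backsub.apply(frame)                     # mixture-of-Gaussians mask
        _, fg = cv2.threshold(fg, 200, 255, cv2.THRESH_BINARY)  # drop shadow label
        fg = cv2.morphologyEx(fg, cv2.MORPH_OPEN, kernel)       # remove speckle
        n, _, stats, centroids = cv2.connectedComponentsWithStats(fg)
        detections.append([tuple(centroids[i]) for i in range(1, n)   # 0 = background
                           if 2 <= stats[i, cv2.CC_STAT_AREA] <= 200])
    cap.release()
    ```

    Each frame's centroid list would then be associated with existing tracks and smoothed with a constant-acceleration Kalman filter (position, velocity, and acceleration in the state vector), as the abstract describes.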

  1. Image computing techniques to extrapolate data for dust tracking in case of an experimental accident simulation in a nuclear fusion plant

    NASA Astrophysics Data System (ADS)

    Camplani, M.; Malizia, A.; Gelfusa, M.; Barbato, F.; Antonelli, L.; Poggi, L. A.; Ciparisse, J. F.; Salgado, L.; Richetta, M.; Gaudio, P.

    2016-01-01

    In this paper, a preliminary shadowgraph-based analysis of dust particle re-suspension due to a loss of vacuum accident (LOVA) in ITER-like nuclear fusion reactors is presented. Dust particles are produced through different mechanisms in nuclear fusion devices, and one of the main issues is that dust particles can be re-suspended by events such as a LOVA. Shadowgraphy is based on an expanded, collimated beam of light, emitted by a laser or a lamp, that propagates transversely to the flow field. In the STARDUST facility, the dust moves in the flow and causes variations of refractive index that can be detected by using a CCD camera. The STARDUST fast camera setup allows dust particles moving in the vessel to be detected and tracked, yielding information about the velocity field of the mobilized dust. In particular, the acquired images are processed so that, in each frame, the moving dust particles are detected by applying a background subtraction technique based on the mixture-of-Gaussians algorithm. The resulting foreground masks are then filtered with morphological operations. Finally, a multi-object tracking algorithm is used to track the detected particles throughout the experiment. For each particle, a Kalman filter-based tracker is applied; the particle dynamics are described by taking position, velocity, and acceleration as state variables. The results demonstrate that it is possible to obtain the dust particles' velocity field during a LOVA by automatically processing the data obtained with the shadowgraph approach.

  2. A DATA FUSION SYSTEM FOR THE NONDESTRUCTIVE EVALUATION OF NON-PIGGABLE PIPES

    SciTech Connect

    Dr. Shreekanth Mandayam; Dr. Robi Polikar; Dr. John C. Chen

    2003-06-01

    The objectives of this research project are: (1) to design sensor data fusion algorithms that can synergistically combine defect-related information from heterogeneous sensors used in gas pipeline inspection, for reliably and accurately predicting the condition of the pipe wall; and (2) to develop efficient data management techniques for signals obtained during multi-sensor interrogation of a gas pipeline. During this reporting period, Rowan University fabricated test specimens with simulated defects for nondestructive evaluation (NDE); designed and developed two versions of a test platform for performing multi-sensor interrogation of test specimens under loaded conditions simulating pressurized gas pipelines; and performed acoustic emission (AE) NDE on the test specimens. The data resulting from this work will be employed for designing multi-sensor data fusion algorithms during the next reporting period.

  3. A DATA FUSION SYSTEM FOR THE NONDESTRUCTIVE EVALUATION OF NON-PIGGABLE PIPES

    SciTech Connect

    Shreekanth Mandayam; Robi Polikar; John C. Chen

    2004-04-01

    The objectives of this research project are: (1) to design sensor data fusion algorithms that can synergistically combine defect-related information from heterogeneous sensors used in gas pipeline inspection, for reliably and accurately predicting the condition of the pipe wall; and (2) to develop efficient data management techniques for signals obtained during multi-sensor interrogation of a gas pipeline. During this reporting period, Rowan University fabricated test specimens with simulated defects for nondestructive evaluation (NDE); designed and developed two versions of a test platform for performing multi-sensor interrogation of test specimens under loaded conditions simulating pressurized gas pipelines; and performed magnetic flux leakage (MFL), ultrasonic testing (UT), thermal imaging, and acoustic emission (AE) NDE on the test specimens. The data resulting from this work will be employed for designing multi-sensor data fusion algorithms.

  4. EMISSIONS OF AIR TOXICS FROM A SIMULATED CHARCOAL KILN EQUIPPED WITH AN AFTERBURNER (PROJECT SUMMARY)

    EPA Science Inventory

    A laboratory-scale charcoal kiln simulator was constructed and tested to determine if it could be used to produce charcoal similar to that produced in Missouri-type charcoal kilns. An afterburner was added later to study conditions for oxidizing the volatile organic ...

  5. A Simulation-Based LED Design Project in Photonics Instruction Based on Industry-University Collaboration

    ERIC Educational Resources Information Center

    Chang, S. -H.; Chen, M. -L.; Kuo, Y. -K.; Shen, Y. -C.

    2011-01-01

    In response to the growing industrial demand for light-emitting diode (LED) design professionals, based on industry-university collaboration in Taiwan, this paper develops a novel instructional approach: a simulation-based learning course with peer assessment to develop students' professional skills in LED design as required by industry as well as…

  6. Multi-Scale Simulations of Past and Future Projections of Hydrology in Lake Tahoe Basin, California-Nevada (Invited)

    NASA Astrophysics Data System (ADS)

    Niswonger, R. G.; Huntington, J. L.; Dettinger, M. D.; Rajagopal, S.; Gardner, M.; Morton, C. G.; Reeves, D. M.; Pohll, G. M.

    2013-12-01

    Water resources in the Tahoe basin are susceptible to long-term climate change and extreme events because it is a middle-altitude, snow-dominated basin that experiences large inter-annual climate variations. Lake Tahoe provides critical water supply for its basin and downstream populations, but changes in water supply are obscured by complex climatic and hydrologic gradients across the high-relief, geologically complex basin. An integrated surface water and groundwater model of the Lake Tahoe basin has been developed using GSFLOW to assess the effects of climate change and extreme events on surface water and groundwater resources. Key hydrologic mechanisms identified with this model explain recent changes in the region's water resources. Critical vulnerabilities of regional water supplies, and related hazards, were also explored. Maintaining a balance between (a) accurate representation of spatial features (e.g., geology, streams, and topography) and hydrologic response (i.e., groundwater, stream, lake, and wetland flows and storages) and (b) computational efficiency is a necessity for the desired model applications. Potential climatic influences on water resources are analyzed here in simulations of long-term water availability and flood responses to selected 100-year climate-model projections. GSFLOW is also used to simulate a scenario depicting an especially extreme storm event, constructed from a combination of two historical atmospheric-river storm events as part of the USGS MultiHazards Demonstration Project. Simulated historical groundwater levels, streamflow, wetlands, and lake levels compare well with measured values for a 30-year historical simulation period. Results are consistent for both small and large model grid cell sizes, due to the model's ability to represent water table altitude, streams, and other hydrologic features at the sub-grid scale. Simulated hydrologic responses are affected by climate change, where less groundwater resources will be

  7. Viral membrane fusion

    PubMed Central

    Harrison, Stephen C.

    2015-01-01

    Membrane fusion is an essential step when enveloped viruses enter cells. Lipid bilayer fusion requires catalysis to overcome a high kinetic barrier; viral fusion proteins are the agents that fulfill this catalytic function. Despite a variety of molecular architectures, these proteins facilitate fusion by essentially the same generic mechanism. Stimulated by a signal associated with arrival at the cell to be infected (e.g., receptor or co-receptor binding, proton binding in an endosome), they undergo a series of conformational changes. A hydrophobic segment (a “fusion loop” or “fusion peptide”) engages the target-cell membrane, and collapse of the bridging intermediate thus formed draws the two membranes (virus and cell) together. We know of three structural classes for viral fusion proteins. Structures of both pre- and postfusion conformations illustrate the beginning and end points of a process that can be probed by single-virion measurements of fusion kinetics. PMID:25866377

  8. An Assessment of Artificial Compressibility and Pressure Projection Methods for Incompressible Flow Simulations

    NASA Technical Reports Server (NTRS)

    Kwak, Dochan; Kiris, C.; Smith, Charles A. (Technical Monitor)

    1998-01-01

    The performance of two commonly used numerical procedures, one based on the artificial compressibility method and the other on the pressure projection method, is compared. These formulations are selected primarily because they are designed for three-dimensional applications. The computational procedures are compared by obtaining steady-state solutions of a wake vortex and unsteady solutions of a curved duct flow. For steady computations, artificial compressibility was very efficient in terms of computing time and robustness. For an unsteady flow that requires a small physical time step, the pressure projection method was found to be computationally more efficient than the artificial compressibility method. This comparison is intended to give some basis for selecting a method or a flow solution code for large three-dimensional applications where computing resources become a critical issue.
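
    The essence of the pressure projection approach compared above is a Poisson solve that removes the divergent part of an intermediate velocity field at every time step. A minimal sketch on a periodic 2D grid, where the Poisson equation can be solved spectrally, is shown below; it is a toy illustration of the projection step, not either of the production formulations compared in the paper.

    ```python
    # Hedged sketch: spectral pressure projection on a periodic 2D grid.
    import numpy as np

    def project(u, v, dx):
        """Return the divergence-free part of the intermediate velocity (u, v)."""
        n = u.shape[0]
        k = 2 * np.pi * np.fft.fftfreq(n, d=dx)
        kx, ky = np.meshgrid(k, k, indexing="ij")
        k2 = np.where(kx**2 + ky**2 == 0.0, 1.0, kx**2 + ky**2)  # guard mean mode
        uh, vh = np.fft.fft2(u), np.fft.fft2(v)
        div_h = 1j * kx * uh + 1j * ky * vh       # spectral divergence
        phi_h = div_h / (-k2)                     # solve laplacian(phi) = div
        phi_h[0, 0] = 0.0
        return (np.fft.ifft2(uh - 1j * kx * phi_h).real,   # u - d(phi)/dx
                np.fft.ifft2(vh - 1j * ky * phi_h).real)   # v - d(phi)/dy

    # Quick check: the projected field should be (numerically) divergence-free.
    rng = np.random.default_rng(2)
    u, v = rng.normal(size=(2, 64, 64))
    up, vp = project(u, v, dx=1.0)
    ```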

  9. A higher-order projection method for the simulation of unsteady turbulent nonpremixed combustion in an industrial burner

    SciTech Connect

    Pember, R.B.; Almgren, A.S.; Bell, J.B.; Colella, P.; Howell, L.; Lai, M.

    1994-12-01

    The modeling of transient effects in burners is becoming increasingly important. The problem of ensuring the safe performance of an industrial burner, for example, is much more difficult during the startup or shutdown phases of operation. The peak formation of pollutants is also much more dependent on transient behavior, in particular on peak temperatures, than on average operating conditions. In this paper we present a new methodology for the modeling of unsteady, nonpremixed, reacting flow in industrial burners. The algorithm uses a second-order projection method for unsteady, low-Mach number reacting flow and accounts for species diffusion, convective and radiative heat transfer, viscous transport, turbulence, and chemical kinetics. The time step used by the method is restricted solely by an advective CFL condition. The methodology is applicable only in the low-Mach number regime (M < 0.3), typically met in industrial burners. The projection method for low-Mach number reacting flow is an extension of a higher-order projection method for incompressible flow [9, 5, 3, 4] to the low-Mach number equations of reacting flow. Our method is based on an approximate projection formulation. Radiative transport is modeled using the discrete ordinates method. The main goal of this work is to introduce and investigate the simulation of burners using a higher-order projection method for low-Mach number combustion. As such, the methodology is applied here only to axisymmetric flow in gas-fired burners for which the boundaries can be aligned with a rectangular grid. The perfect gas law is also assumed. In addition, we use a one-step reduced kinetics mechanism, a κ-ε model for turbulent transport, and a simple turbulent combustion model.

  10. The Dust Management Project: Characterizing Lunar Environments and Dust, Developing Regolith Mitigation Technology and Simulants

    NASA Technical Reports Server (NTRS)

    Hyatt, Mark J.; Straka, Sharon A.

    2010-01-01

    A return to the Moon to extend human presence, pursue scientific activities, use the Moon to prepare for future human missions to Mars, and expand Earth's economic sphere will require investment in developing new technologies and capabilities to achieve affordable and sustainable human exploration. From the operational experience gained and lessons learned during the Apollo missions, conducting long-term operations in the lunar environment will be a particular challenge, given the difficulties presented by the unique physical properties and other characteristics of lunar regolith, including dust. The Apollo missions and other lunar explorations have identified significant lunar dust-related problems that will challenge future mission success. Comprised of regolith particles ranging in size from tens of nanometers to microns, lunar dust is a manifestation of the complex interaction of the lunar soil with multiple mechanical, electrical, and gravitational effects. The environmental and anthropogenic factors affecting the perturbation, transport, and deposition of lunar dust must be studied in order to mitigate its potentially harmful effects on exploration systems and human explorers. The Dust Management Project (DMP) is tasked with the evaluation of lunar dust effects, assessment of the resulting risks, and development of mitigation and management strategies and technologies related to Exploration Systems architectures. To this end, the DMP supports the overall goal of the Exploration Technology Development Program (ETDP) of addressing the relevant high-priority technology needs of multiple elements within the Constellation Program (CxP) and sister ETDP projects. Project scope, plans, and accomplishments will be presented.

  11. Plasma asymmetry due to the magnetic filter in fusion-type negative ion sources: Comparisons between two and three-dimensional particle-in-cell simulations

    SciTech Connect

    Fubiani, G.; Boeuf, J. P.

    2014-07-15

    Previously reported 2D Particle-In-Cell Monte Carlo Collisions (PIC-MCC) simulations of negative ion sources under conditions similar to those of the ITER neutral beam injection system have shown that the presence of the magnetic filter tends to generate asymmetry in the plasma properties in the extraction region. In this paper, we show that these conclusions are confirmed by 3D PIC-MCC simulations and we provide quantitative comparisons between the 2D and 3D model predictions.

  12. Control Room Training for the Hyper-X Project Utilizing Aircraft Simulation

    NASA Technical Reports Server (NTRS)

    Lux-Baumann, Jesica; Dees, Ray; Fratello, David

    2006-01-01

    The NASA Dryden Flight Research Center flew two Hyper-X research vehicles and achieved hypersonic speeds over the Pacific Ocean in March and November 2004. To train the flight and mission control room crew, the NASA Dryden simulation capability was utilized to generate telemetry and radar data, which was used in nominal and emergency mission scenarios. During these control room training sessions personnel were able to evaluate and refine data displays, flight cards, mission parameter allowable limits, and emergency procedure checklists. Practice in the mission control room ensured that all primary and backup Hyper-X staff were familiar with the nominal mission and knew how to respond to anomalous conditions quickly and successfully. This report describes the technology in the simulation environment and the Mission Control Center, the need for and benefit of control room training, and the rationale and results of specific scenarios unique to the Hyper-X research missions.

  13. Can correcting feature location in simulated mean climate improve agreement on projected changes?

    NASA Astrophysics Data System (ADS)

    Levy, Adam A. L.; Ingram, William; Jenkinson, Mark; Huntingford, Chris; Hugo Lambert, F.; Allen, Myles

    2013-01-01

    To the extent that deficiencies in GCM simulations of precipitation are due to persistent errors of location and timing, correcting the spatial and seasonal distribution of features would provide a physically based improvement in inter-model agreement on future changes. We use a tool for the analysis of medical images to warp the precipitation climatologies of 14 General Circulation Models (GCMs) closer to a reanalysis of observations, rather than adjusting intensities locally as in conventional bias correction techniques. These warps are then applied to the same GCMs' simulated changes in mean climate under a CO2 quadrupling experiment. We find that the warping process not only makes GCMs' historical climatologies more closely resemble reanalysis but also reduces the disagreement between the models' response to this external forcing. Developing a tool that is tailored for the specific requirements of climate fields may provide further improvement, particularly in combination with local bias correction techniques.
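
    Mechanically, applying such a warp amounts to resampling a field at displaced coordinates; the same displacement field is applied to the historical climatology and to the simulated change pattern, which is what distinguishes the approach from local intensity adjustment. A minimal sketch follows; here the displacement field is simply taken as a given input, standing in for the output of the image-registration tool used in the paper.

    ```python
    # Hedged sketch: apply a spatial displacement field (in grid cells) to a
    # 2D climate field. Deriving the displacements (the registration step) is
    # the hard part and is not shown here.
    import numpy as np
    from scipy.ndimage import map_coordinates

    def warp_field(field, dy, dx):
        """Resample field(y, x) at (y + dy, x + dx); bilinear interpolation."""
        ny, nx = field.shape
        yy, xx = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
        return map_coordinates(field, [yy + dy, xx + dx], order=1, mode="nearest")

    # The same (dy, dx) would be applied to both the historical climatology
    # and the simulated change field of a given model.
    ```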

  14. Community Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    SciTech Connect

    Spentzouris, Panagiotis; Cary, John; Mcinnes, Lois Curfman; Mori, Warren; Ng, Cho; Ng, Esmond; Ryne, Robert; /LBL, Berkeley

    2008-07-01

    The design and performance optimization of particle accelerators is essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC1 Accelerator Science and Technology project, the SciDAC2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modeling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC1) to high-fidelity multi-physics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors.

  15. Community petascale project for accelerator science and simulation : Advancing computational science for future accelerators and accelerator technologies.

    SciTech Connect

    Spentzouris, P.; Cary, J.; McInnes, L. C.; Mori, W.; Ng, C.; Ng, E.; Ryne, R.

    2008-01-01

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modelling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R & D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors.

  16. Community Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    SciTech Connect

    Spentzouris, Panagiotis; Cary, John; Mcinnes, Lois Curfman; Mori, Warren; Ng, Cho; Ng, Esmond; Ryne, Robert; /LBL, Berkeley

    2011-10-21

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modelling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors.

  17. Simulation of Plasma Jet Merger and Liner Formation within the PLX-α Project

    NASA Astrophysics Data System (ADS)

    Samulyak, Roman; Chen, Hsin-Chiang; Shih, Wen; Hsu, Scott

    2015-11-01

    Detailed numerical studies of the propagation and merger of high-Mach-number argon plasma jets and the formation of plasma liners have been performed using the newly developed Lagrangian particle (LP) method. The LP method significantly improves the accuracy and mathematical rigor of common particle-based numerical methods such as smoothed-particle hydrodynamics while preserving their main advantages over grid-based methods. A brief overview of the LP method will be presented. The Lagrangian particle code implements the main relevant physics models, such as an equation of state for argon undergoing atomic-physics transformations, radiation losses in the optically thin limit, and heat conduction. Simulations of the merger of two plasma jets are compared with experimental data from past PLX experiments. Simulations quantify the effect of oblique shock waves, ionization, and radiation processes on the jet merger process. Results of preliminary simulations of future PLX-α experiments involving the ~π/2-solid-angle plasma-liner configuration with 9 guns will also be presented. Partially supported by ARPA-E's ALPHA program.

  18. The Living Heart Project: A robust and integrative simulator for human heart function.

    PubMed

    Baillargeon, Brian; Rebelo, Nuno; Fox, David D; Taylor, Robert L; Kuhl, Ellen

    2014-11-01

    The heart is not only our most vital, but also our most complex organ: precisely controlled by the interplay of electrical and mechanical fields, it consists of four chambers and four valves, which act in concert to regulate its filling, ejection, and overall pump function. While numerous computational models exist to study either the electrical or the mechanical response of its individual chambers, the integrative electro-mechanical response of the whole heart remains poorly understood. Here we present a proof-of-concept simulator for a four-chamber human heart model created from computed tomography and magnetic resonance images. We illustrate the governing equations of excitation-contraction coupling and discretize them using a single, unified finite element environment. To illustrate the basic features of our model, we visualize the electrical potential and the mechanical deformation across the human heart throughout its cardiac cycle. To compare our simulation against common metrics of cardiac function, we extract the pressure-volume relationship and show that it agrees well with clinical observations. Our prototype model allows us to explore and understand the key features, physics, and technologies needed to create an integrative, predictive model of the living human heart. Ultimately, our simulator will open opportunities to probe landscapes of clinical parameters and to guide device design and treatment planning in cardiac diseases such as stenosis, regurgitation, or prolapse of the aortic, pulmonary, tricuspid, or mitral valve. PMID:25267880

  19. New Capabilities for Modeling Intense Beams in Heavy Ion Fusion Drivers

    SciTech Connect

    Friedman, A; Barnard, J J; Bieniosek, F M; Celata, C M; Cohen, R H; Davidson, R C; Grote, D P; Haber, I; Henestroza, E; Lee, E P; Lund, S M; Qin, H; Sharp, W M; Startsev, E; Vay, J L

    2003-09-09

    Significant advances have been made in modeling the intense beams of heavy-ion beam-driven Inertial Fusion Energy (Heavy Ion Fusion). In this paper, a roadmap for a validated, predictive driver simulation capability, building on improved codes and experimental diagnostics, is presented, as are examples of progress. The Mesh Refinement and Particle-in-Cell methods were integrated in the WARP code; this capability supported an injector experiment that determined the achievable current rise time, in good agreement with calculations. In a complementary effort, a new injector approach based on the merging of ~100 small beamlets was simulated, its basic feasibility established, and an experimental test designed. Time-dependent 3D simulations of the High Current Experiment (HCX) were performed, yielding voltage waveforms for an upcoming study of bunch-end control. Studies of collective beam modes which must be taken into account in driver designs were carried out. The value of using experimental data to tomographically "synthesize" a 4D beam particle distribution and so initialize a simulation was established; this work motivated further development of new diagnostics which yield 3D projections of the beam phase space. Other developments, including improved modeling of ion beam focusing and transport through the fusion chamber environment and onto the target, and of stray electrons and their effects on ion beams, are briefly noted.

  20. The Numerical Tokamak Project (NTP) simulation of turbulent transport in the core plasma: A grand challenge in plasma physics

    SciTech Connect

    Not Available

    1993-12-01

    The long-range goal of the Numerical Tokamak Project (NTP) is the reliable prediction of tokamak performance using physics-based numerical tools describing tokamak physics. The NTP is accomplishing the development of the most advanced particle and extended fluid models in massively parallel processing (MPP) environments as part of a multi-institutional, multi-disciplinary numerical study of tokamak core fluctuations. The NTP is a continuing focus of the Office of Fusion Energy's theory and computation program. Near-term HPCC work concentrates on developing a predictive numerical description of the core plasma transport in tokamaks driven by low-frequency collective fluctuations. This work addresses one of the greatest intellectual challenges to our understanding of the physics of tokamak performance and needs the most advanced computational resources to progress. We are conducting detailed comparisons of kinetic and fluid numerical models of tokamak turbulence. These comparisons are stimulating the improvement of each and the development of hybrid models which embody aspects of both. The combination of emerging massively parallel processing hardware and algorithmic improvements will result in an estimated 10^2-10^6 performance increase. Development of information processing and visualization tools is accelerating our comparison of computational models to one another, to experimental data, and to analytical theory, providing a bootstrap effect in our understanding of the target physics. The measure of success is the degree to which the experimentally observed scaling of fluctuation-driven transport may be predicted numerically. The NTP is advancing the HPCC Initiative through its state-of-the-art computational work. We are pushing the capability of high-performance computing through our efforts, which are strongly leveraged by OFE support.

  1. Downscaling seasonal to centennial simulations on distributed computing infrastructures using WRF model. The WRF4G project

    NASA Astrophysics Data System (ADS)

    Cofino, A. S.; Fernández Quiruelas, V.; Blanco Real, J. C.; García Díez, M.; Fernández, J.

    2013-12-01

    Nowadays, Grid computing is a powerful computational tool that is ready to be used by the scientific community in different areas (such as biomedicine, astrophysics, climate, etc.). However, the use of these distributed computing infrastructures (DCIs) is not yet common practice in climate research, and only a few teams and applications in this area take advantage of them. Thus, the WRF4G project's objective is to popularize the use of this technology in the atmospheric sciences. To achieve this objective, one of the most widely used applications has been taken (WRF, a limited-area model and successor of the MM5 model), which has a user community of more than 8000 researchers worldwide. This community carries out its research in different areas and could benefit from the advantages of Grid resources (case-study simulations, regional hindcasts/forecasts, sensitivity studies, etc.). The WRF model is used by many groups in the climate research community to carry out downscaling simulations, so this community will also benefit. However, Grid infrastructures have some drawbacks for the execution of applications that make intensive use of CPU and memory for long periods of time, which makes it necessary to develop a specific framework (middleware). This middleware encapsulates the application and provides appropriate services for the monitoring and management of the simulations and the data. Thus, another objective of the WRF4G project is the development of a generic adaptation of WRF to DCIs. It should simplify access to the DCIs for researchers and free them from the technical and computational aspects of using these infrastructures. Finally, in order to demonstrate the ability of WRF4G to solve actual scientific challenges of interest and relevance to climate science (implying a high computational cost), we will show results from different kinds of downscaling experiments, such as ERA-Interim re-analysis and CMIP5 models

  2. Magneto-Inertial Fusion

    DOE PAGES Beta

    Wurden, G. A.; Hsu, S. C.; Intrator, T. P.; Grabowski, T. C.; Degnan, J. H.; Domonkos, M.; Turchi, P. J.; Campbell, E. M.; Sinars, D. B.; Herrmann, M. C.; et al

    2015-11-17

    In this community white paper, we describe an approach to achieving fusion which employs a hybrid of elements from the traditional magnetic and inertial fusion concepts, called magneto-inertial fusion (MIF). The status of MIF research in North America at multiple institutions is summarized, including recent progress, research opportunities, and future plans.

  3. Slow liner fusion

    SciTech Connect

    Shaffer, M.J.

    1997-08-01

    "Slow" liner fusion (~10 ms compression time) implosions are nondestructive and make repetitive (~1 Hz) pulsed liner fusion reactors possible. This paper summarizes a General Atomics physics-based fusion reactor study that showed slow liner feasibility, even with conservative open-line axial magnetic field confinement and Bohm radial transport.

  4. Cold fusion research

    SciTech Connect

    1989-11-01

    I am pleased to forward to you the Final Report of the Cold Fusion Panel. This report reviews the current status of cold fusion and includes major chapters on Calorimetry and Excess Heat, Fusion Products and Materials Characterization. In addition, the report makes a number of conclusions and recommendations, as requested by the Secretary of Energy.

  5. Cluster-impact fusion

    SciTech Connect

    Echenique, P.M.; Manson, J.R.; Ritchie, R.H.

    1990-03-19

    We present a model for the cluster-impact-fusion experiments of Buehler, Friedlander, and Friedman. Calculated fusion rates as a function of bombarding energy for constant cluster size agree well with experiment. The dependence of the fusion rate on cluster size at fixed bombarding energy is explained qualitatively. The role of correlated, coherent collisions in enhanced energy loss by clusters is emphasized.

  6. The Change of First-flowering Date over South Korea Projected from Downscaled IPCC AR5 Simulation: Peach and Pear

    NASA Astrophysics Data System (ADS)

    Ahn, J. B.; Hur, J.

    2014-12-01

    The variations in the first-flowering date (FFD) of peach (Prunus persica) and pear (Pyrus pyrifolia) under future climate change in South Korea are investigated using simulations obtained from five models of the fifth Coupled Model Intercomparison Project. For the study, daily temperature simulations under the Historical (1986-2005) and RCP4.5 and RCP8.5 (2071-2090) scenarios are statistically downscaled to 50 peach and pear FFD (FFDpeach and FFDpear, respectively) observation sites over South Korea. The number of days transformed to standard temperature (DTS) method is selected as the phenological model and applied to the simulations for estimating FFDpeach and FFDpear over South Korea, due to its superior performance on the target plants and region compared to the growing degree days (GDD) and chill days (CD) methods. In the analysis, mean temperatures for early spring (February to April) over South Korea in 2090 under the RCP4.5 and 8.5 scenarios are expected to have increased by 1.9 K and 3.3 K, respectively. Among the early spring months of February to April, February shows the largest temperature increases, of 2.1 K and 3.7 K for the RCP4.5 and 8.5 scenarios, respectively. The increased temperature during February and March accelerates plant growth and thereby advances FFDpeach by 7.0 and 12.7 days and FFDpear by 6.1 and 10.7 days, respectively. These results imply that the present mid-April flowering of peach and pear will have advanced to late March or early April by the end of this century. Acknowledgements: This work was carried out with the support of the Rural Development Administration Cooperative Research Program for Agriculture Science and Technology Development under Grant Project No. PJ009953, Republic of Korea.
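
    The DTS method referenced above converts each day's mean temperature into an equivalent number of days at a standard temperature using an Arrhenius-type weight, and predicts flowering when the accumulated DTS reaches a calibrated requirement. A minimal sketch follows; the activation energy, standard temperature, and DTS requirement below are placeholder values, not the calibrated parameters used in the study.

    ```python
    # Hedged sketch of the 'days transformed to standard temperature' (DTS)
    # phenological model. Parameter values are illustrative placeholders.
    import numpy as np

    R = 8.314  # gas constant, J mol^-1 K^-1

    def first_flowering_doy(tmean_c, ea=70_000.0, ts_c=5.0, dts_required=40.0):
        """tmean_c: daily mean temperatures (degC) from Jan 1 onward.
        Returns the predicted first-flowering day of year, or None."""
        t = np.asarray(tmean_c, float) + 273.15
        ts = ts_c + 273.15
        dts = np.exp(ea * (t - ts) / (R * t * ts))   # Arrhenius weight per day
        total = np.cumsum(dts)
        if total[-1] < dts_required:
            return None
        return int(np.argmax(total >= dts_required)) + 1
    ```

    A GDD-style model would instead accumulate max(0, T - Tbase) toward a heat-sum threshold; the abstract reports that DTS outperformed the GDD and chill-days formulations for these species and sites.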

  7. The WASCAL regional climate simulations for West Africa - how to add value to existing climate projections

    NASA Astrophysics Data System (ADS)

    Arnault, J.; Heinzeller, D.; Klein, C.; Dieng, D.; Smiatek, G.; Bliefernicht, J.; Sylla, M. B.; Kunstmann, H.

    2015-12-01

    With climate change being one of the most severe challenges to rural Africa in the 21st century, West Africa faces an urgent need to develop effective adaptation and mitigation measures to protect its constantly growing population. WASCAL (West African Science Service Center on Climate Change and Adapted Land Use) is a large-scale, research-focused program designed to enhance the resilience of human and environmental systems to climate change and increased variability. An integral part of its climate services is the provision of a new set of high-resolution, ensemble-based regional climate change scenarios for West Africa. In this contribution, we present the overall concept of the WASCAL regional climate projections and provide information on the dissemination of the data. We discuss the model performance over the validation period for two of the three regional climate models employed, the Weather Research and Forecasting (WRF) model and the Consortium for Small-scale Modeling model in Climate Mode (COSMO-CLM), and give details about a novel precipitation database used to verify the models. Particular attention is paid to the representation of the dynamics of the West African Summer Monsoon and to the added value of our high-resolution models over existing data sets. We further present results on the climate change signal obtained from the WRF model runs for the periods 2020-2050 and 2070-2100 and compare them to current state-of-the-art projections from the CORDEX project. As an example, the figure shows the different climate change signals obtained for the total annual rainfall with respect to the 1980-2010 mean (WRF-E: WASCAL 12 km high-resolution run MPI-ESM + WRFV3.5.1; CORDEX-E: 50 km medium-resolution run MPI-ESM + RCA4; CORDEX-G: 50 km medium-resolution run GFDL-ESM + RCA4).

  8. Using Discrete Event Simulation to predict KPI's at a Projected Emergency Room.

    PubMed

    Concha, Pablo; Neriz, Liliana; Parada, Danilo; Ramis, Francisco

    2015-01-01

    Discrete Event Simulation (DES) is a powerful tool in the design of clinical facilities. DES enables facilities to be built or adapted to achieve the expected Key Performance Indicators (KPIs), such as average waiting times according to acuity, average stay times, and others. Our computational model was built and validated using expert judgment and supporting statistical data. One scenario studied resulted in a 50% decrease in the average cycle time of patients compared to the original model, mainly by modifying the model of patient care. PMID:26262262
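
    As a concrete illustration of the DES approach, the sketch below models a toy emergency-room queue in SimPy and reports one of the KPIs mentioned (average waiting time). The arrival rate, service time, and number of servers are invented for illustration; a real study would fit these to hospital data and stratify by acuity.

    ```python
    # Hedged sketch: a toy emergency-room queue in SimPy (all rates invented).
    import random
    import simpy

    rng = random.Random(42)
    waits = []                                    # KPI: waiting time per patient

    def patient(env, triage):
        arrived = env.now
        with triage.request() as req:             # wait for a free triage box
            yield req
            waits.append(env.now - arrived)
            yield env.timeout(rng.expovariate(1 / 15.0))   # ~15 min service

    def arrivals(env, triage):
        while True:
            yield env.timeout(rng.expovariate(1 / 10.0))   # ~1 arrival / 10 min
            env.process(patient(env, triage))

    env = simpy.Environment()
    triage = simpy.Resource(env, capacity=2)      # two triage boxes (assumed)
    env.process(arrivals(env, triage))
    env.run(until=8 * 60)                         # one 8-hour shift, in minutes
    print(f"{len(waits)} patients, mean wait {sum(waits) / len(waits):.1f} min")
    ```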

  9. The Ohio River Valley CO2 Storage Project AEP Mountaineer Plant, West Virginia Numerical Simulation and Risk Assessment Report

    SciTech Connect

    Neeraj Gupta

    2008-03-31

    A series of numerical simulations of carbon dioxide (CO2) injection were conducted as part of a program to assess the potential for geologic sequestration in deep geologic reservoirs (the Rose Run and Copper Ridge formations) at the American Electric Power (AEP) Mountaineer Power Plant outside of New Haven, West Virginia. The simulations were executed using the H2O-CO2-NaCl operational mode of the Subsurface Transport Over Multiple Phases (STOMP) simulator (White and Oostrom, 2006). The objective of the Rose Run formation modeling was to predict CO2 injection rates using data from the core analysis conducted on the samples. A systematic screening procedure was applied to the Ohio River Valley CO2 storage site utilizing the Features, Events, and Processes (FEP) database for geological storage of CO2 (Savage et al., 2004). The objective of the screening was to identify potential risk categories for the long-term geological storage of CO2 at the Mountaineer Power Plant in New Haven, West Virginia. Over 130 FEPs in seven main classes were assessed for the project based on site characterization information gathered in a geological background study, testing in a deep well drilled on the site, and general site conditions. In evaluating the database, it was apparent that many of the items were not applicable to the Mountaineer site based on its geologic framework and environmental setting. Nine FEPs were identified for further consideration for the site. These FEPs generally fell into categories related to variations in subsurface geology, well completion materials, and the behavior of CO2 in the subsurface. Results from the screening were used to provide guidance on injection system design, developing a monitoring program, performing reservoir simulations, and other risk assessment efforts. Initial work indicates that the significant FEPs may be accounted for by focusing the storage program on these potential issues.

  10. AeroCom INSITU Project: Comparison of Aerosol Optical Properties from In-situ Surface Measurements and Model Simulations

    NASA Astrophysics Data System (ADS)

    Schmeisser, L.; Andrews, E.; Schulz, M.; Fiebig, M.; Zhang, K.; Randles, C. A.; Myhre, G.; Chin, M.; Stier, P.; Takemura, T.; Krol, M. C.; Bian, H.; Skeie, R. B.; da Silva, A. M., Jr.; Kokkola, H.; Laakso, A.; Ghan, S.; Easter, R. C.

    2015-12-01

    AeroCom, an open international collaboration of scientists seeking to improve global aerosol models, recently initiated a project comparing model output to in-situ, surface-based measurements of aerosol optical properties. The model/measurement comparison project, called INSITU, aims to evaluate the performance of a suite of AeroCom aerosol models against site-specific observational data in order to inform iterative improvements to model aerosol modules. Surface in-situ data have the unique property of being traceable to physical standards, which is a big asset in accomplishing the overarching goal of improving the accuracy of aerosol process representations and the predictive capability of global climate models. The INSITU project looks at how well models reproduce aerosol climatologies on a variety of time scales, aerosol characteristics and behaviors (e.g., aerosol persistence and the systematic relationships between aerosol optical properties), and aerosol trends. Though INSITU is a multi-year endeavor, preliminary phases of the analysis, using GOCART and other models participating in this AeroCom project, show substantial model biases in absorption and scattering coefficients compared to surface measurements, though the sign and magnitude of the bias vary with location and optical property. Spatial patterns in the biases highlight model weaknesses, e.g., the inability of models to properly simulate aerosol characteristics at sites with complex topography (see Figure 1). Additionally, differences in modeled and measured systematic variability of aerosol optical properties suggest that some models are not accurately capturing specific aerosol co-dependencies, for example, the tendency of in-situ surface single scattering albedo to decrease with decreasing aerosol extinction coefficient. This study elucidates specific problems with current aerosol models and suggests additional model runs and perturbations that could further evaluate the discrepancies between measured and modeled

  11. Viral membrane fusion

    SciTech Connect

    Harrison, Stephen C.

    2015-05-15

    Membrane fusion is an essential step when enveloped viruses enter cells. Lipid bilayer fusion requires catalysis to overcome a high kinetic barrier; viral fusion proteins are the agents that fulfill this catalytic function. Despite a variety of molecular architectures, these proteins facilitate fusion by essentially the same generic mechanism. Stimulated by a signal associated with arrival at the cell to be infected (e.g., receptor or co-receptor binding, proton binding in an endosome), they undergo a series of conformational changes. A hydrophobic segment (a “fusion loop” or “fusion peptide”) engages the target-cell membrane, and collapse of the bridging intermediate thus formed draws the two membranes (virus and cell) together. We know of three structural classes for viral fusion proteins. Structures of both pre- and postfusion conformations illustrate the beginning and end points of a process that can be probed by single-virion measurements of fusion kinetics.

    Highlights:
    • Viral fusion proteins overcome the high energy barrier to lipid bilayer merger.
    • Different molecular structures but the same catalytic mechanism.
    • The review describes properties of the three known fusion-protein structural classes.
    • Single-virion fusion experiments elucidate mechanism.
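
    Single-virion kinetics of this sort are commonly interpreted as a sequence of N rate-limited conformational steps, which predicts gamma-distributed fusion lag times. The sketch below simulates that simple picture; N and the per-step rate are arbitrary illustrative choices, not values from this review.

      # Toy single-virion fusion-kinetics model: fusion completes after N
      # sequential exponential steps, giving gamma-distributed lag times.
      # N and k are arbitrary illustrative values, not fitted parameters.
      import numpy as np

      rng = np.random.default_rng(0)
      N, k = 3, 0.5          # number of hidden steps, per-step rate (1/s)
      n_virions = 10_000

      # Each virion's fusion time is the sum of N exponential waits.
      t_fuse = rng.exponential(scale=1.0 / k, size=(n_virions, N)).sum(axis=1)

      print(f"mean lag = {t_fuse.mean():.2f} s (theory N/k = {N / k:.2f} s)")
      print(f"CV^2     = {t_fuse.var() / t_fuse.mean()**2:.2f} (theory 1/N = {1 / N:.2f})")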

  12. The immersed boundary projection method and its application to simulation and control of flows around low-aspect-ratio wings

    NASA Astrophysics Data System (ADS)

    Taira, Kunihiko

    First, we present a new formulation of the immersed boundary method that is algebraically identical to the traditional fractional-step algorithm. This method, called the immersed boundary projection method, allows for the simulation of incompressible flows over arbitrarily shaped bodies undergoing motion and/or deformation in both two and three dimensions. The no-slip condition along the immersed boundary is enforced simultaneously with the incompressibility constraint through a single projection. The boundary force is determined implicitly without any constitutive relations for the rigid-body formulation, which in turn allows the use of higher CFL numbers than past methods. Next, the immersed boundary projection method is used to analyze three-dimensional separated flows around low-aspect-ratio flat-plate wings. A number of simulations highlighting the unsteady nature of the separated flows are performed for Re = 300 and 500 with various aspect ratios, angles of attack, and planform geometries. The aspect ratio and angle of attack are found to have a large influence on the stability of the wake profile and the force experienced by the low-aspect-ratio wing. At early times, following an impulsive start, the topologies of the wake vortices are found to be the same across different aspect ratios and angles of attack. Behind low-aspect-ratio rectangular plates, leading-edge vortices form and eventually separate as hairpin vortices following the start-up. This phenomenon is found to be similar to the dynamic stall observed behind pitching plates. The detached structure then interacts with the tip vortices, reducing the downward velocity that the tip vortices induce on the leading-edge vortex. At later times, depending on the aspect ratio and angle of attack, the wake reaches one of three states: (i) a steady state, (ii) a periodic unsteady state, or (iii) an aperiodic unsteady state. We have observed that the tip effects in three
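
    Schematically, and following the standard projection-approach formulation (signs and explicit boundary terms are omitted here for brevity), the discrete momentum equation, divergence-free constraint, and no-slip constraint form a single saddle-point system:

      \[
      \begin{bmatrix} A & Q \\ Q^{T} & 0 \end{bmatrix}
      \begin{bmatrix} q^{n+1} \\ \lambda \end{bmatrix}
      =
      \begin{bmatrix} r_{1} \\ r_{2} \end{bmatrix},
      \qquad
      Q \equiv \begin{bmatrix} G & E^{T} \end{bmatrix},
      \quad
      \lambda \equiv \begin{bmatrix} \phi \\ \tilde{f} \end{bmatrix},
      \]

    which a single projection solves in three steps:

      \[
      A q^{*} = r_{1}, \qquad
      (Q^{T} A^{-1} Q)\, \lambda = Q^{T} q^{*} - r_{2}, \qquad
      q^{n+1} = q^{*} - A^{-1} Q \lambda .
      \]

    Here G is the discrete gradient, E the interpolation (regularization) operator onto the immersed boundary, \phi the pressure variable, and \tilde{f} the boundary force; solving for \lambda in one step is what enforces no-slip and incompressibility simultaneously.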

  13. Engineering Challenges in Antiproton Triggered Fusion Propulsion

    SciTech Connect

    Cassenti, Brice; Kammash, Terry

    2008-01-21

    During the last decade, antiproton-triggered fusion propulsion has been investigated as a method for achieving high specific impulse and high thrust in a nuclear pulse propulsion system. In general, the antiprotons are injected into a pellet containing fusion fuel with a small amount of fissionable material (i.e., an amount less than the critical mass), and the products from the fission are then used to trigger a fusion reaction. Initial calculations and simulations indicate that, if magnetically insulated inertial confinement fusion is used, the pellets should yield a specific impulse of between 100,000 and 300,000 seconds at high thrust. The engineering challenges associated with this propulsion system are significant. For example, the antiprotons must be precisely focused. The pellet must be designed to contain the fission and initial fusion products, which will require strong magnetic fields. The fusion fuel must be contained for a time sufficient to effectively release the fusion energy, and the payload must be shielded from the radiation, especially the excess neutrons emitted, in addition to many other particles. We review recent progress, possible engineering solutions, and the potential performance of these systems.
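
    To put the quoted specific-impulse range in perspective, effective exhaust velocity follows from the standard relation v_e = Isp × g0; the short computation below uses only that relation and the numbers in the abstract.

      # Effective exhaust velocity implied by the quoted specific-impulse range,
      # using the standard relation v_e = Isp * g0.
      g0 = 9.80665  # standard gravity, m/s^2

      for isp in (100_000, 300_000):               # seconds, from the abstract
          v_e = isp * g0
          print(f"Isp = {isp:>7,d} s  ->  v_e = {v_e / 1000:,.0f} km/s")
      # ~981 to ~2,942 km/s, i.e. roughly 0.3-1% of the speed of light.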

  14. Beam quality simulation of the Boeing photoinjector accelerator for the MCTD project

    NASA Astrophysics Data System (ADS)

    Takeda, Harunori; Davis, Keith; Delo, Lance

    1991-07-01

    We present a performance study of the photoinjector accelerator installed at Boeing Corp., Seattle, for the Modular Component Technology Development (MCTD) program. This 5 MeV injector operates at 433 MHz and is designed to produce a normalized emittance less than 100π mm mrad. This study was performed using the PARMELA simulation code. We study parametrically the dependence of the beam emittance on the magnetic fields produced by beam-guiding coils and by the gap coil located immediately after the first injector cavity. We also study the effect of phasing between cavities and the bunched electron beam. In addition to considering the parameters that determine the electron beam environment, we consider the space-charge effect on the bunched beam at higher charge.
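
    The emittance figure quoted above is a normalized RMS emittance. As a hypothetical illustration (not PARMELA's internal routine), it can be computed from a particle distribution as follows; the Gaussian sample is a placeholder for simulated macroparticles.

      # Normalized RMS emittance from a particle distribution:
      #   eps_n = beta*gamma * sqrt(<x^2><x'^2> - <x x'>^2)
      # The Gaussian sample here is a placeholder, not PARMELA output.
      import numpy as np

      rng = np.random.default_rng(1)
      x  = rng.normal(0.0, 1.0e-3, 50_000)   # transverse position, m (assumed)
      xp = rng.normal(0.0, 5.0e-3, 50_000)   # divergence x' = px/pz, rad (assumed)

      E_kin = 5.0e6                           # kinetic energy, eV (5 MeV beam)
      m_e   = 0.511e6                         # electron rest energy, eV
      gamma = 1.0 + E_kin / m_e
      beta  = np.sqrt(1.0 - 1.0 / gamma**2)

      eps_geom = np.sqrt(np.mean(x**2) * np.mean(xp**2) - np.mean(x * xp)**2)
      eps_n    = beta * gamma * eps_geom      # m*rad; 1 m*rad = 1e6 mm*mrad
      print(f"normalized rms emittance: {eps_n * 1e6:.1f} mm mrad")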

  15. The fusion breeder

    NASA Astrophysics Data System (ADS)

    Moir, Ralph W.

    1982-10-01

    The fusion breeder is a fusion reactor designed with special blankets to maximize the transmutation by 14 MeV neutrons of uranium-238 to plutonium, or of thorium to uranium-233, for use as fuel for fission reactors. Breeding fissile fuels has not been a goal of the U.S. fusion energy program. This paper suggests it is time for a policy change to make the fusion breeder a goal of the U.S. fusion program and the U.S. nuclear energy program. There is wide agreement that many approaches will work and will produce fuel for five equal-sized LWRs, some supporting as many as 20 LWRs, at electricity costs within 20% of those at today's price of uranium ($30/lb of U3O8). The blankets designed to suppress fissioning, called symbiotes, fusion fuel factories, or just fusion breeders, will have safety characteristics more like pure fusion reactors and will support as many as 15 equal-power LWRs. The blankets designed to maximize fast fission of fertile material will have safety characteristics more like fission reactors and will support 5 LWRs. This author strongly recommends development of the fission-suppressed blanket type, a point of view not shared by everyone. There is, however, wide agreement that, to meet a market price for uranium that would result in LWR electricity within 20% of today's cost, with either blanket type, fusion components can cost severalfold more than would be allowed for pure fusion to meet the goal of making electricity alone at 20% over today's fission costs. Also widely agreed is that the critical-path item for the fusion breeder is fusion development itself; however, development of fusion-breeder-specific items (blankets, fuel cycle) should be started now in order to have the fusion breeder ready by the time the rise in uranium prices forces other, more costly choices.

  16. INTRODUCTION: Status report on fusion research

    NASA Astrophysics Data System (ADS)

    Burkart, Werner

    2005-10-01

    members' personal views on the latest achievements in fusion research, including magnetic and inertial confinement scenarios. The report describes fusion fundamentals and progress in fusion science and technology, with ITER as a possible partner in the realization of self-sustainable burning plasma. The importance of the socio-economic aspects of energy production using fusion power plants is also covered. Noting that applications of plasma science are of broad interest to the Member States, the report addresses the topic of plasma physics to assist in understanding the achievements of better coatings, cheaper light sources, improved heat-resistant materials and other high-technology materials. Nuclear fusion energy production is intrinsically safe, but for ITER the full range of hazards will need to be addressed, including minimising radiation exposure, to accomplish the goal of a sustainable and environmentally acceptable production of energy. We anticipate that the role of the Agency will in future evolve from supporting scientific projects and fostering information exchange to the preparation of safety principles and guidelines for the operation of burning fusion plasmas with a Q > 1. Technical progress in inertial and magnetic confinement, as well as in alternative concepts, will lead to a further increase in international cooperation. New means of communication will be needed, utilizing the best resources of modern information technology to advance interest in fusion. However, today the basis of scientific progress is still through journal publications and, with this in mind, we trust that this report will find an interested readership. We acknowledge with thanks the support of the members of the IFRC as an advisory body to the Agency. Seven chairmen have presided over the IFRC since its first meeting in 1971 in Madison, USA, ensuring that the IAEA fusion efforts were based on the best professional advice possible, and that information on fusion developments has

  17. Community Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    SciTech Connect

    Spentzouris, P.; Cary, J.; McInnes, L.C.; Mori, W.; Ng, C.; Ng, E.; Ryne, R.

    2011-11-14

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modeling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator-simulation paradigm shift from high-fidelity single-physics-process modeling (covered under SciDAC-1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and their performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors. ComPASS is in the first year of executing its plan to develop the next-generation HPC accelerator modeling tools. ComPASS aims to develop an integrated simulation environment that will utilize existing and new accelerator physics modules with petascale capabilities, employing modern computing and solver technologies. The ComPASS vision is to deliver to accelerator scientists a virtual accelerator and virtual prototyping modeling environment with the necessary multiphysics, multiscale capabilities. The plan for this development includes delivering accelerator modeling applications appropriate for each stage of the ComPASS software evolution. Such applications are already being used to address challenging problems in accelerator design and optimization. The ComPASS organization

  18. Magnetic mirror fusion: status and prospects

    SciTech Connect

    Post, R.F.

    1980-02-11

    Two improved mirror systems, the tandem mirror (TM) and the field-reversed mirror (FRM), are being intensively studied. The twin practical aims of these studies are to improve the economic prospects for mirror fusion power plants and to reduce the size and/or complexity of such plants relative to earlier approaches to magnetic fusion. While at present the program emphasis is still strongly oriented toward answering scientific questions, the emphasis is shifting as data accumulate and as larger facilities - ones with a heavy technological and engineering orientation - are being prepared. The experimental and theoretical progress that led to the new look in mirror fusion research is briefly reviewed, the new TM and FRM ideas are outlined, and the projected future course of mirror fusion research is discussed.

  19. Use of Generalized Fluid System Simulation Program (GFSSP) for Teaching and Performing Senior Design Projects at the Educational Institutions

    NASA Technical Reports Server (NTRS)

    Majumdar, A. K.; Hedayat, A.

    2015-01-01

    This paper describes the authors' experience using the Generalized Fluid System Simulation Program (GFSSP) to teach the Design of Thermal Systems class at the University of Alabama in Huntsville. GFSSP is a finite-volume-based thermo-fluid system network analysis code, developed at NASA/Marshall Space Flight Center, and is extensively used by NASA, the Department of Defense, and the aerospace industry for propulsion system design, analysis, and performance evaluation. The educational version of GFSSP is freely available to all US higher-education institutions. The main purpose of the paper is to illustrate the use of this user-friendly code in thermal systems design and fluid engineering courses and to encourage instructors to adopt it for class assignments as well as senior design projects.
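
    As a hedged illustration of the network approach GFSSP embodies (not GFSSP's actual formulation or input format), the sketch below solves a minimal two-branch flow network by iterating an internal node pressure until mass flow in equals mass flow out.

      # Minimal flow-network solve in the spirit of a thermo-fluid network code:
      # two branches in series, one unknown internal node pressure, and a
      # mdot = C * sqrt(dP) branch resistance law. Not GFSSP's actual
      # formulation; all values are arbitrary assumptions.
      import math

      P_in, P_out = 500e3, 100e3      # boundary pressures, Pa (assumed)
      C1, C2 = 2.0e-4, 1.5e-4         # branch flow coefficients, kg/s/Pa^0.5 (assumed)

      def mdot(C, dP):
          return C * math.copysign(math.sqrt(abs(dP)), dP)

      # Bisect on the internal node pressure until mass is conserved.
      lo, hi = P_out, P_in
      for _ in range(60):
          P = 0.5 * (lo + hi)
          imbalance = mdot(C1, P_in - P) - mdot(C2, P - P_out)
          if imbalance > 0:   # too much inflow -> raise the node pressure
              lo = P
          else:
              hi = P

      print(f"node pressure ~ {P / 1e3:.1f} kPa, mdot ~ {mdot(C1, P_in - P):.4f} kg/s")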

  20. The Neurona at Home project: Simulating a large-scale cellular automata brain in a distributed computing environment

    NASA Astrophysics Data System (ADS)

    Acedo, L.; Villanueva-Oller, J.; Moraño, J. A.; Villanueva, R.-J.

    2013-01-01

    The Berkeley Open Infrastructure for Network Computing (BOINC) has become the standard open-source solution for grid computing on the Internet. Volunteers use their computers to complete a small part of a task assigned by a dedicated server. We have developed a BOINC project called Neurona@Home whose objective is to simulate a random cellular automata network with at least one million neurons. We consider a cellular automata version of the integrate-and-fire model in which excitatory and inhibitory nodes can activate or deactivate neighboring nodes according to a set of probabilistic rules. Our aim is to determine the phase diagram of the model and its behaviour, and to compare it with electroencephalographic signals measured in real brains.
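
    A minimal sketch of the kind of probabilistic cellular automaton described is given below; the network size, inhibitory fraction, threshold, and firing probability are illustrative assumptions, not the project's parameters.

      # Toy probabilistic cellular automaton in the spirit of the described model:
      # excitatory/inhibitory nodes on a random network push neighbors toward or
      # away from firing. Parameters are illustrative, not Neurona@Home's values.
      import numpy as np

      rng = np.random.default_rng(42)
      N, K, steps = 10_000, 10, 100       # neurons, in-links per neuron, time steps
      p_fire = 0.3                        # firing probability given enough net drive
      inhib = rng.random(N) < 0.2         # 20% of neurons are inhibitory (assumed)
      links = rng.integers(0, N, size=(N, K))   # random in-neighbors for each node

      state = (rng.random(N) < 0.05).astype(np.int8)  # sparse initial activity
      for t in range(steps):
          inputs = state[links]                           # (N, K) neighbor states
          weight = np.where(inhib[links], -1, +1)         # inhibitory links subtract
          drive = (inputs * weight).sum(axis=1)           # net excitation per node
          fire = (drive >= 1) & (rng.random(N) < p_fire)  # probabilistic threshold
          state = fire.astype(np.int8)

      print(f"final active fraction: {state.mean():.3f}")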