Science.gov

Sample records for fusion simulation project

  1. AI/Simulation Fusion Project at Lawrence Livermore National Laboratory

    SciTech Connect

    Erickson, S.A.

    1984-04-25

This presentation first discusses the motivation for the AI/Simulation Fusion project. After very briefly describing what expert systems are, what object-oriented languages are, and some observed features of typical combat simulations, it explains why putting together artificial intelligence and combat simulation makes sense. We then discuss the first demonstration goal for this fusion project.

  2. Fusion Simulation Project. Workshop Sponsored by the U.S. Department of Energy, Rockville, MD, May 16-18, 2007

    SciTech Connect

    Kritz, A.; Keyes, D.

    2007-05-18

    The mission of the Fusion Simulation Project is to develop a predictive capability for the integrated modeling of magnetically confined plasmas. This FSP report adds to the previous activities that defined an approach to integrated modeling in magnetic fusion. These previous activities included a Fusion Energy Sciences Advisory Committee panel that was charged to study integrated simulation in 2002. The report of that panel [Journal of Fusion Energy 20, 135 (2001)] recommended the prompt initiation of a Fusion Simulation Project. In 2003, the Office of Fusion Energy Sciences formed a steering committee that developed a project vision, roadmap, and governance concepts [Journal of Fusion Energy 23, 1 (2004)]. The current FSP planning effort involved forty-six physicists, applied mathematicians and computer scientists, from twenty-one institutions, formed into four panels and a coordinating committee. These panels were constituted to consider: Status of Physics Components, Required Computational and Applied Mathematics Tools, Integration and Management of Code Components, and Project Structure and Management. The ideas, reported here, are the products of these panels, working together over several months and culminating in a three-day workshop in May 2007.

  4. Lessons Learned from ASCI applied to the Fusion Simulation Project (FSP)

    NASA Astrophysics Data System (ADS)

    Post, Douglass

    2003-10-01

The magnetic fusion program has proposed a $20M per year project to develop a computational predictive capability for magnetic fusion experiments. The DOE NNSA launched a program in 1996, the Accelerated Strategic Computing Initiative (ASCI), to achieve the same goal for nuclear weapons, allowing certification of the stockpile without testing. We present a "lessons learned" analysis of the $3B, 7-year ASCI program with the goal of improving the FSP to maximize the likelihood of success. The major lessons from ASCI are: 1. Build on your institution's successful history; 2. Teams are the key element; 3. Sound software project management is essential; 4. Requirements, schedule and resources must be consistent; 5. Practices, not processes, are important; 6. Minimize and mitigate risks; 7. Minimize the computer science research aspect and maximize the physics elements; and 8. Verification and validation are essential. We map this experience and recommendations into the FSP.

  5. SKIDS data fusion project

    NASA Astrophysics Data System (ADS)

    Greenway, Phil

    1992-04-01

The European Community's strategic research initiative in information technology (ESPRIT) has been in place for nearly five years. An early example of the pan-European collaborative projects being conducted under this initiative is 'SKIDS': Signal and Knowledge Integration with Decisional Control for Multisensory Systems. This four-year project, which is approaching completion, aims to build a real-time multisensor perception machine. This machine will be capable of performing data fusion, interpretation, situation assessment, and resource allocation tasks, under the constraints of both time and resource availability, and in the presence of uncertain data. Of the many possible applications, the surveillance and monitoring of a semi-automated 'factory environment' has been chosen as a challenging and representative test scenario. This paper presents an overview of the goals and objectives of the project, the makeup of the consortium and the roles of the members within it, and the main technical achievements to date. In particular, the following are discussed: relevant application domains, and the generic requirements that can be inferred from them; sensor configuration, including choice, placement, etc.; control paradigms, including the possible trade-offs between centralized, hierarchical, and decentralized approaches; the corresponding hardware architectural choices, including the need for parallel processing; and the appropriate software architecture and infrastructure required to support the chosen task-oriented approach. Specific attention is paid to the functional decomposition of the system and how the requirements for control impact the organization of the identified interpretation tasks. Future work and outstanding problems are considered in some concluding remarks. Owing to limited space, this paper is descriptive rather than explanatory.

  6. Advanced fusion concepts: project summaries

    SciTech Connect

    1980-12-01

This report contains descriptions of the activities of all the projects supported by the Advanced Fusion Concepts Branch of the Office of Fusion Energy, US Department of Energy. These descriptions are project summaries of each of the individual projects, and contain the following: title, principal investigators, funding levels, purpose, approach, progress, plans, milestones, graduate students, graduates, other professional staff, and recent publications. Information is given for each of the following programs: (1) reverse-field pinch, (2) compact toroid, (3) alternate fuel/multipoles, (4) stellarator/torsatron, (5) linear magnetic fusion, (6) liners, and (7) Tormac. (MOW)

  7. Fusion Simulation Program

    SciTech Connect

    Project Staff

    2012-02-29

Under this project, General Atomics (GA) was tasked to develop the experimental validation plans for two high-priority ISAs, Boundary and Pedestal and Whole Device Modeling, in collaboration with the theory, simulation and experimental communities. The following sections have been incorporated into the final FSP Program Plan (www.pppl.gov/fsp), which was delivered to the US Department of Energy (DOE). Additional deliverables by GA include guidance for validation, development of metrics to evaluate success, and procedures for collaboration with experiments. These are also part of the final report.

  8. Integrated simulation and modeling capability for alternate magnetic fusion concepts

    SciTech Connect

    Cohen, B. I.; Hooper, E.B.; Jarboe, T. R.; LoDestro, L. L.; Pearlstein, L. D.; Prager, S. C.; Sarff, J. S.

    1998-11-03

This document summarizes a strategic study addressing the development of a comprehensive modeling and simulation capability for magnetic fusion experiments, with particular emphasis on devices that are alternatives to the mainline tokamak device. A code development project in this area supports two defined strategic thrust areas in the Magnetic Fusion Energy Program: (1) comprehensive simulation and modeling of magnetic fusion experiments and (2) development, operation, and modeling of magnetic fusion alternate-concept experiments.

  9. Fusion Simulation Program Definition. Final report

    SciTech Connect

    Cary, John R.

    2012-09-05

We have completed our contributions to the Fusion Simulation Program Definition Project. Our contributions were in the overall planning, with concentration in the definition of the area of Software Integration and Support. We contributed to the planning of multiple meetings and to multiple planning documents.

  10. Simulation of Fusion Plasmas

    ScienceCinema

    Holland, Chris [UC San Diego, San Diego, California, United States]

    2016-07-12

    The upcoming ITER experiment (www.iter.org) represents the next major milestone in realizing the promise of using nuclear fusion as a commercial energy source, by moving into the “burning plasma” regime where the dominant heat source is the internal fusion reactions. As part of its support for the ITER mission, the US fusion community is actively developing validated predictive models of the behavior of magnetically confined plasmas. In this talk, I will describe how the plasma community is using the latest high performance computing facilities to develop and refine our models of the nonlinear, multiscale plasma dynamics, and how recent advances in experimental diagnostics are allowing us to directly test and validate these models at an unprecedented level.

  11. Fusion Plasma Theory project summaries

    SciTech Connect

    Not Available

    1993-10-01

This Project Summary book is a published compilation of short descriptions of each project supported by the Fusion Plasma Theory and Computing Group of the Advanced Physics and Technology Division of the Department of Energy, Office of Fusion Energy. The summaries contained in this volume were written by the individual contractors with minimal editing by the Office of Fusion Energy. Previous summaries were published in February of 1982 and December of 1987. The Plasma Theory program is responsible for the development of concepts and models that describe and predict the behavior of a magnetically confined plasma. Emphasis is given to the modelling and understanding of the processes controlling transport of energy and particles in a toroidal plasma and to supporting the design of the International Thermonuclear Experimental Reactor (ITER). A tokamak transport initiative was begun in 1989 to improve understanding of how energy and particles are lost from the plasma by mechanisms that transport them across field lines. The Plasma Theory program has actively participated in this initiative. Recently, increased attention has been given to issues of importance to the proposed Tokamak Physics Experiment (TPX). Particular attention has been paid to containment and thermalization of fast alpha particles produced in a burning fusion plasma, as well as to control of sawteeth, current drive, impurity control, and design of improved auxiliary heating. In addition, general models of plasma behavior are developed from physics features common to different confinement geometries. This work uses both analytical and numerical techniques. The Fusion Theory program supports research projects at US government laboratories, universities and industrial contractors. Its support of theoretical work at universities contributes to the Office of Fusion Energy mission of training scientific manpower for the US Fusion Energy Program.

  12. Magnetic fusion and project ITER

    SciTech Connect

    Park, H.K.

    1992-01-01

It has already been demonstrated that our economies and international relationships are affected by energy crises. For the continuing prosperity of the human race, a new and viable energy source must be developed within the next century. It is evident that the cost will be high and that achieving this goal will require a long-term commitment, given the high degree of technological and scientific knowledge involved. Energy from controlled nuclear fusion is safe, competitive, and environmentally attractive, but it has not yet been completely mastered. Magnetic fusion is one of the most difficult technological challenges. In modern magnetic fusion devices, temperatures significantly higher than those of the sun have been achieved routinely, and the generation of tens of millions of watts at scientific break-even is expected from the deuterium-tritium experiments within the next few years. For a practical future fusion reactor, we need to develop reactor-relevant materials and technologies. The international project called "International Thermonuclear Experimental Reactor (ITER)" will fulfill this need, and the success of this project will provide the most attractive long-term energy source for mankind.

  14. SECAD-- a Schema-based Environment for Configuring, Analyzing and Documenting Integrated Fusion Simulations. Final report

    SciTech Connect

    Shasharina, Svetlana

    2012-05-23

SECAD is a project that developed a GUI for running integrated fusion simulations as implemented in the FACETS and SWIM SciDAC projects. Using the GUI, users can submit simulations locally and remotely and visualize the simulation results.

  15. ITER project and fusion technology

    NASA Astrophysics Data System (ADS)

    Takatsu, H.

    2011-09-01

    In the sessions of ITR, FTP and SEE of the 23rd IAEA Fusion Energy Conference, 159 papers were presented in total, highlighted by the remarkable progress of the ITER project: ITER baseline has been established and procurement activities have been started as planned with a target of realizing the first plasma in 2019; ITER physics basis is sound and operation scenarios and operational issues have been extensively studied in close collaboration with the worldwide physics community; the test blanket module programme has been incorporated into the ITER programme and extensive R&D works are ongoing in the member countries with a view to delivering their own modules in a timely manner according to the ITER master schedule. Good progress was also reported in the areas of a variety of complementary activities to DEMO, including Broader Approach activities and long-term technology. This paper summarizes the highlights of the papers presented in the ITR, FTP and SEE sessions with a minimum set of background information.

  16. A brief overview of the European Fusion File (EFF) Project

    NASA Astrophysics Data System (ADS)

    Kellett, M. A.; Forrest, R. A.; Batistoni, P.; EFF Project members

    2004-04-01

The European Fusion File (EFF) Project is a collaborative project with work funded by the European Fusion Development Agreement (EFDA). The emphasis is on the pooling of resources and removal of duplication of effort, leading to the efficient development of two types of nuclear data libraries for use in fusion power plant design and operation studies. The two branches consist of, on the one hand, a general purpose file for modelling and design capabilities and, on the other, an activation file for the calculation and simulation of dose rates and energy release during operation of a future power plant. Efforts are directed towards a continued improvement of the quality of the nuclear data needed for these analyses. The OECD Nuclear Energy Agency's Data Bank acts as the central repository for the files and all information discussed during twice-yearly meetings. It offers its services at no charge to the Project.

  17. Scientific and Computational Challenges of the Fusion Simulation Program (FSP)

    SciTech Connect

    William M. Tang

    2011-02-09

This paper highlights the scientific and computational challenges facing the Fusion Simulation Program (FSP), a major national initiative in the United States whose primary objective is to enable scientific discovery of important new plasma phenomena, with associated understanding that emerges only upon integration. This requires developing a predictive integrated simulation capability for magnetically-confined fusion plasmas that is properly validated against experiments in regimes relevant for producing practical fusion energy. It is expected to provide a suite of advanced modeling tools for reliably predicting fusion device behavior with comprehensive and targeted science-based simulations of nonlinearly-coupled phenomena in the core plasma, edge plasma, and wall region on time and space scales required for fusion energy production. As such, it will strive to embody the most current theoretical and experimental understanding of magnetic fusion plasmas and to provide a living framework for the simulation of such plasmas as the associated physics understanding continues to advance over the next several decades. Substantive progress on answering the outstanding scientific questions in the field will drive the FSP toward its ultimate goal of developing the ability to predict the behavior of plasma discharges in toroidal magnetic fusion devices with high physics fidelity on all relevant time and space scales. From a computational perspective, this will demand computing resources in the petascale range and beyond, together with the associated multi-core algorithmic formulation needed to address burning plasma issues relevant to ITER, a multibillion dollar collaborative experiment involving seven international partners representing over half the world's population. Even more powerful exascale platforms will be needed to meet the future challenges of designing a demonstration fusion reactor (DEMO). Analogous to other major applied physics modeling projects (e

  18. Purdue Contribution of Fusion Simulation Program

    SciTech Connect

    Jeffrey Brooks

    2011-09-30

    The overall science goal of the FSP is to develop predictive simulation capability for magnetically confined fusion plasmas at an unprecedented level of integration and fidelity. This will directly support and enable effective U.S. participation in research related to the International Thermonuclear Experimental Reactor (ITER) and the overall mission of delivering practical fusion energy. The FSP will address a rich set of scientific issues together with experimental programs, producing validated integrated physics results. This is very well aligned with the mission of the ITER Organization to coordinate with its members the integrated modeling and control of fusion plasmas, including benchmarking and validation activities. [1]. Initial FSP research will focus on two critical areas: 1) the plasma edge and 2) whole device modeling including disruption avoidance. The first of these problems involves the narrow plasma boundary layer and its complex interactions with the plasma core and the surrounding material wall. The second requires development of a computationally tractable, but comprehensive model that describes all equilibrium and dynamic processes at a sufficient level of detail to provide useful prediction of the temporal evolution of fusion plasma experiments. The initial driver for the whole device model (WDM) will be prediction and avoidance of discharge-terminating disruptions, especially at high performance, which are a critical impediment to successful operation of machines like ITER. If disruptions prove unable to be avoided, their associated dynamics and effects will be addressed in the next phase of the FSP. The FSP plan targets the needed modeling capabilities by developing Integrated Science Applications (ISAs) specific to their needs. 
The Pedestal-Boundary model will include boundary magnetic topology, cross-field transport of multi-species plasmas, parallel plasma transport, neutral transport, atomic physics and interactions with the plasma wall.

  19. Submodeling Simulations in Fusion Welds: Part II

    NASA Astrophysics Data System (ADS)

    Bonifaz, E. A.

    2013-11-01

In Part I, three-dimensional transient non-linear submodeling heat transfer simulations were performed to study the thermal histories and thermal cycles that occur during the welding process at the macro, meso and micro scales. In the present work, the corresponding non-uniform temperature changes were imposed as load conditions on structural calculations to study the evolution of localized plastic strains and residual stresses at these sub-level scales. To reach this goal, a three-dimensional finite element elastic-plastic model (ABAQUS code) was developed. The submodeling technique, proposed for coupling phase-field (and/or digital microstructure) codes with finite element codes, was used to mesh a local part of the model with a refined mesh based on interpolation of the solution from an initial, relatively coarse, macro global model. The meso submodel is the global model for the subsequent micro submodel. The strategy used to calculate temperatures, strains and residual stresses at the macro, meso and micro scale levels is flexible enough to be applied to any number of levels. The objective of this research was to initiate the development of microstructural models to identify fusion welding process parameters for preserving the single-crystal nature of gas turbine blades during repair procedures. The multi-scale submodeling approach can be used to capture weld pool features at the macro-meso scale level, and micro residual stress and secondary dendrite arm spacing features at the micro scale level.
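The global-to-local interpolation at the core of this submodeling strategy can be sketched compactly. The sketch below is illustrative only (the paper uses ABAQUS; the 1-D mesh and temperature field here are invented): a coarse global solution is interpolated onto the nodes of a refined local submodel, whose boundary is driven by the interpolated values, and the step nests so that each submodel serves as the global model for the next finer one.

```python
import numpy as np

# Illustrative sketch of submodeling (not the paper's ABAQUS workflow):
# interpolate a coarse "global" solution onto a refined local mesh.
x_global = np.linspace(0.0, 1.0, 6)                  # coarse global mesh
T_global = 300.0 + 900.0 * np.exp(-5.0 * x_global)   # invented temperature field

# Refined submodel mesh covering a local region of interest.
x_sub = np.linspace(0.1, 0.3, 21)

# Interpolated temperatures serve as the submodel's load/boundary condition.
T_sub = np.interp(x_sub, x_global, T_global)

# The same step nests: the meso submodel becomes the "global" model
# for the subsequent micro submodel.
x_micro = np.linspace(0.15, 0.2, 51)
T_micro = np.interp(x_micro, x_sub, T_sub)

print(T_sub[0] > T_sub[-1])  # the interpolated field decreases across the submodel
```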

  20. Project Icarus: Fission-Fusion Hybrid Fuel for Interstellar Propulsion

    NASA Astrophysics Data System (ADS)

    Freeland, R. M., III

    Project Icarus is a theoretical design study for an interstellar probe. The Icarus Terms of Reference [1] specify that the design must use "current or near future technology", and that the propulsion must be "mainly fusion based". The latter allows room for a propulsion system that uses hybrid fission-fusion technology. This paper explains the motivation and science behind hybrid fuel and examines the applicability of this technology to deep space propulsion. This paper is a submission of the Project Icarus Study Group.

  1. Dynamics of cell aggregates fusion: Experiments and simulations

    NASA Astrophysics Data System (ADS)

    Thomas, Gilberto L.; Mironov, Vladimir; Nagy-Mehez, Agnes; Mombach, José C. M.

    2014-02-01

Fusion of cell tissues is a ubiquitous phenomenon and has important technological applications, including tissue biofabrication. In this work we present experimental results of aggregate fusion using adipose-derived stem cells (ADSC) and a three-dimensional computer simulation of the process using the cellular Potts model, with aggregates reaching 10,000 cells. We consider fusion of round aggregates and monitor the dimensionless neck area of contact between the two aggregates to characterize the process, as done for the coalescence of liquid droplets and polymers. Both experiments and simulations show that the evolution of this quantity obeys a power law in time. We also study individual cell motion quantitatively in the simulation and find that it corresponds to anomalous diffusion.
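The power-law behavior reported for the neck area can be checked with a simple log-log fit. A minimal sketch, using synthetic data with a known exponent rather than the paper's measurements: the exponent of A(t) ∝ t^α is recovered by linear regression in log-log space.

```python
import numpy as np

def powerlaw_exponent(t, area):
    """Estimate alpha in area ~ t**alpha by linear regression in log-log space."""
    slope, _intercept = np.polyfit(np.log(t), np.log(area), 1)
    return slope

# Synthetic neck-area data with a known exponent of 0.5 and mild noise
# (illustration only; not data from the experiments above).
rng = np.random.default_rng(0)
t = np.linspace(1.0, 100.0, 50)
area = t**0.5 * np.exp(rng.normal(0.0, 0.01, t.size))

alpha = powerlaw_exponent(t, area)
print(round(alpha, 2))  # close to the true exponent 0.5
```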

  2. Secondary fusion coupled deuteron/triton transport simulation and thermal-to-fusion neutron convertor measurement

    SciTech Connect

    Wang, G. B.; Wang, K.; Liu, H. G.; Li, R. D.

    2013-07-01

A Monte Carlo tool, RSMC (Reaction Sequence Monte Carlo), was developed to simulate coupled deuteron/triton transport and reaction problems. The 'forced particle production' variance reduction technique was used to improve the simulation speed, which lets the secondary products play a major role. The monoenergetic 14 MeV fusion neutron source was employed as a validation. Then the thermal-to-fusion neutron convertor was studied with our tool. Moreover, an in-core conversion efficiency measurement experiment was performed with ⁶LiD and ⁶LiH converters. Threshold activation foils were used to indicate the fast and fusion neutron flux, and two other pivotal parameters were calculated theoretically. Finally, the conversion efficiency of ⁶LiD is obtained as 1.97×10⁻⁴, which matches the theoretical result well. (authors)
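The 'forced particle production' idea can be illustrated with a toy two-stage chain (the chain and the probabilities below are assumptions for illustration; RSMC's actual physics and implementation are not reproduced here). Rather than waiting for a rare secondary production to occur in analog sampling, every history is forced to produce the secondary and its statistical weight is scaled by the true production probability, removing the rare-event variance:

```python
import random

P_TRITON = 1e-3   # assumed per-history triton production probability
P_NEUTRON = 2e-2  # assumed fusion-neutron yield per triton

def analog_yield(n_histories, rng):
    """Analog sampling: score a hit only when the full rare chain occurs."""
    hits = sum(
        1
        for _ in range(n_histories)
        if rng.random() < P_TRITON and rng.random() < P_NEUTRON
    )
    return hits / n_histories

def forced_yield(n_histories):
    """Forced production: every history produces the chain, weighted by
    its true probability, so this trivial estimator has zero variance."""
    weight = P_TRITON * P_NEUTRON
    return sum(weight for _ in range(n_histories)) / n_histories

rng = random.Random(1)
print(forced_yield(1_000))         # ~2e-05, with no statistical noise
print(analog_yield(100_000, rng))  # noisy: only ~2 hits expected at this sample size
```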

  3. Projective simulation for artificial intelligence

    PubMed Central

    Briegel, Hans J.; De las Cuevas, Gemma

    2012-01-01

    We propose a model of a learning agent whose interaction with the environment is governed by a simulation-based projection, which allows the agent to project itself into future situations before it takes real action. Projective simulation is based on a random walk through a network of clips, which are elementary patches of episodic memory. The network of clips changes dynamically, both due to new perceptual input and due to certain compositional principles of the simulation process. During simulation, the clips are screened for specific features which trigger factual action of the agent. The scheme is different from other, computational, notions of simulation, and it provides a new element in an embodied cognitive science approach to intelligent action and learning. Our model provides a natural route for generalization to quantum-mechanical operation and connects the fields of reinforcement learning and quantum computation. PMID:22590690
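A minimal reading of the clip-network random walk described above can be sketched as a toy agent. This is a simplified sketch, not the authors' implementation: two percepts, direct percept-to-action hops, no clip composition or quantum generalization, and all names and the toy task are invented for illustration.

```python
import random

random.seed(7)

class PSAgent:
    """Toy projective-simulation agent: percept clips connect to action
    clips via h-values that set the random-walk hopping probabilities."""

    def __init__(self, percepts, actions, damping=0.01):
        self.actions = actions
        self.damping = damping  # slow relaxation of h-values back to 1
        self.h = {(p, a): 1.0 for p in percepts for a in actions}

    def act(self, percept):
        # One hop from the excited percept clip to an action clip,
        # with probability proportional to the edge's h-value.
        weights = [self.h[(percept, a)] for a in self.actions]
        return random.choices(self.actions, weights=weights)[0]

    def learn(self, percept, action, reward):
        # Damp all edges toward their initial value, then reinforce
        # the edge that led to a rewarded action.
        for edge in self.h:
            self.h[edge] -= self.damping * (self.h[edge] - 1.0)
        self.h[(percept, action)] += reward

agent = PSAgent(percepts=["left", "right"], actions=["go-left", "go-right"])
for _ in range(500):
    p = random.choice(["left", "right"])
    a = agent.act(p)
    agent.learn(p, a, reward=1.0 if a == "go-" + p else 0.0)

# The rewarded edge ends up much stronger than the unrewarded one.
print(agent.h[("left", "go-left")] > agent.h[("left", "go-right")])
```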

  4. Colorado School of Mines fusion gamma ray diagnostic project

    SciTech Connect

    Cecil, F.E.

    1992-02-14

This report summarizes the 1991 calendar year activities of the fusion gamma ray diagnostics project in the Physics Department at the Colorado School of Mines. Considerable progress has been realized in the fusion gamma ray diagnostic project in the last year. Specifically, we have achieved the two major goals of the project as outlined in last year's proposed work statement to the Office of Applied Plasma Physics in the DOE Division of Magnetic Fusion Energy. The two major goals were: (1) solution of the severe interference problem encountered during operation of the gamma ray spectrometer concurrent with high power levels of the neutral beam injectors (NBI) and the ICRH antennae; and (2) experimental determination of the absolute detection efficiency of the gamma ray spectrometer. This detection efficiency will allow the measured yields of the gamma rays to be converted to a total reaction rate. In addition to these two major accomplishments, we have continued, as permitted by the TFTR operating schedule, the observation of high energy gamma rays from the 3He(d,γ)5Li reaction during deuterium NBI heating of 3He plasmas.

  5. Tempest Neoclassical Simulation of Fusion Edge Plasmas

    NASA Astrophysics Data System (ADS)

    Xu, X. Q.; Xiong, Z.; Cohen, B. I.; Cohen, R. H.; Dorr, M.; Hittinger, J.; Kerbel, G. D.; Nevins, W. M.; Rognlien, T. D.

    2006-04-01

We are developing a continuum gyrokinetic full-F code, TEMPEST, to simulate edge plasmas. The geometry is that of a fully diverted tokamak and so includes boundary conditions for both closed magnetic flux surfaces and open field lines. The code, presently 4-dimensional (2D2V), includes kinetic ions and electrons, a gyrokinetic Poisson solver for electric field, and the nonlinear Fokker-Planck collision operator. Here we present the simulation results of neoclassical transport with Boltzmann electrons. In a large aspect ratio circular geometry, excellent agreement is found for neoclassical equilibrium with parallel flows in the banana regime without a temperature gradient. In divertor geometry, it is found that the endloss of particles and energy induces pedestal-like density and temperature profiles inside the magnetic separatrix and parallel flow stronger than the neoclassical predictions in the SOL. The impact of the X-point divertor geometry on the self-consistent electric field and geodesic acoustic oscillations will be reported. We will also discuss the status of extending TEMPEST into a 5-D code.

  6. A Motion Tracking and Sensor Fusion Module for Medical Simulation.

    PubMed

    Shen, Yunhe; Wu, Fan; Tseng, Kuo-Shih; Ye, Ding; Raymond, John; Konety, Badrinath; Sweet, Robert

    2016-01-01

    Here we introduce a motion tracking or navigation module for medical simulation systems. Our main contribution is a sensor fusion method for proximity or distance sensors integrated with inertial measurement unit (IMU). Since IMU rotation tracking has been widely studied, we focus on the position or trajectory tracking of the instrument moving freely within a given boundary. In our experiments, we have found that this module reliably tracks instrument motion. PMID:27046606

  7. SIMULATION OF CHAMBER TRANSPORT FOR HEAVY-ION FUSION DRIVERS

    SciTech Connect

    Sharp, W M; Callahan, D A; Tabak, M; Yu, S S; Peterson, P F; Rose, D V; Welch, D R

    2004-05-20

    The heavy-ion fusion (HIF) community recently developed a power-plant design that meets the various requirements of accelerators, final focus, chamber transport, and targets. The point design is intended to minimize physics risk and is certainly not optimal for the cost of electricity. Recent chamber-transport simulations, however, indicate that changes in the beam ion species, the convergence angle, and the emittance might allow more-economical designs.

  8. Simulated disparity and peripheral blur interact during binocular fusion.

    PubMed

    Maiello, Guido; Chessa, Manuela; Solari, Fabio; Bex, Peter J

    2014-01-01

    We have developed a low-cost, practical gaze-contingent display in which natural images are presented to the observer with dioptric blur and stereoscopic disparity that are dependent on the three-dimensional structure of natural scenes. Our system simulates a distribution of retinal blur and depth similar to that experienced in real-world viewing conditions by emmetropic observers. We implemented the system using light-field photographs taken with a plenoptic camera which supports digital refocusing anywhere in the images. We coupled this capability with an eye-tracking system and stereoscopic rendering. With this display, we examine how the time course of binocular fusion depends on depth cues from blur and stereoscopic disparity in naturalistic images. Our results show that disparity and peripheral blur interact to modify eye-movement behavior and facilitate binocular fusion, and the greatest benefit was gained by observers who struggled most to achieve fusion. Even though plenoptic images do not replicate an individual’s aberrations, the results demonstrate that a naturalistic distribution of depth-dependent blur may improve 3-D virtual reality, and that interruptions of this pattern (e.g., with intraocular lenses) which flatten the distribution of retinal blur may adversely affect binocular fusion. PMID:25034260

  9. Human Sensing Fusion Project for Safety and Health Society

    NASA Astrophysics Data System (ADS)

    Maenaka, Kazusuke

    This paper introduces the objectives and status of the “Human sensing fusion project” in the Exploratory Research for Advanced Technology (ERATO) scheme run by the Japan Science and Technology Agency (JST). The project was started in December 2007, and the laboratory, with 11 members, opened in April 2008. The aim of the project is to realize a human activity-monitoring device that combines many kinds of sensors in an extremely small package, so that the device can be patched directly onto the human body, and to establish the algorithm for inferring the human condition, both physical and mental, from the obtained data. Such a system can be used for the prevention of accidents and the maintenance of health. The actual research has just begun, and preparations for the project are well under way.

  10. Atomic Data and Modelling for Fusion: the ADAS Project

    NASA Astrophysics Data System (ADS)

    Summers, H. P.; O'Mullane, M. G.

    2011-05-01

    The paper is an update on the Atomic Data and Analysis Structure, ADAS, since ICAM-DATA06 and a forward look to its evolution in the next five years. ADAS is an international project supporting principally magnetic confinement fusion research. It has participant laboratories throughout the world, including ITER and all its partner countries. In parallel with ADAS, the ADAS-EU Project provides enhanced support for fusion research at Associated Laboratories and Universities in Europe and ITER. OPEN-ADAS, sponsored jointly by the ADAS Project and IAEA, is the mechanism for open access to principal ADAS atomic data classes and facilitating software for their use. EXTENDED-ADAS comprises a variety of special, integrated application software, beyond the purely atomic bounds of ADAS, tuned closely to specific diagnostic analyses and plasma models. The current scientific content and scope of these various ADAS and ADAS related activities are briefly reviewed. These span a number of themes including heavy element spectroscopy and models, charge exchange spectroscopy, beam emission spectroscopy and special features which provide a broad baseline of atomic modelling and support. Emphasis will be placed on `lifting the fundamental data baseline'—a principal ADAS task for the next few years. This will include discussion of ADAS and ADAS-EU coordinated and shared activities and some of the methods being exploited.

  11. National Fusion Collaboratory: Grid Computing for Simulations and Experiments

    NASA Astrophysics Data System (ADS)

    Greenwald, Martin

    2004-05-01

    The National Fusion Collaboratory Project is creating a computational grid designed to advance scientific understanding and innovation in magnetic fusion research by facilitating collaborations, enabling more effective integration of experiments, theory and modeling and allowing more efficient use of experimental facilities. The philosophy of FusionGrid is that data, codes, analysis routines, visualization tools, and communication tools should be thought of as network available services, easily used by the fusion scientist. In such an environment, access to services is stressed rather than portability. By building on a foundation of established computer science toolkits, deployment time can be minimized. These services all share the same basic infrastructure that allows for secure authentication and resource authorization which allows stakeholders to control their own resources such as computers, data and experiments. Code developers can control intellectual property, and fair use of shared resources can be demonstrated and controlled. A key goal is to shield scientific users from the implementation details such that transparency and ease-of-use are maximized. The first FusionGrid service deployed was the TRANSP code, a widely used tool for transport analysis. Tools for run preparation, submission, monitoring and management have been developed and shared among a wide user base. This approach saves user sites from the laborious effort of maintaining such a large and complex code while at the same time reducing the burden on the development team by avoiding the need to support a large number of heterogeneous installations. Shared visualization and A/V tools are being developed and deployed to enhance long-distance collaborations. These include desktop versions of the Access Grid, a highly capable multi-point remote conferencing tool and capabilities for sharing displays and analysis tools over local and wide-area networks.

  12. KULL: LLNL's ASCI Inertial Confinement Fusion Simulation Code

    SciTech Connect

    Rathkopf, J. A.; Miller, D. S.; Owen, J. M.; Zike, M. R.; Eltgroth, P. G.; Madsen, N. K.; McCandless, K. P.; Nowak, P. F.; Nemanic, M. K.; Gentile, N. A.; Stuart, L. M.; Keen, N. D.; Palmer, T. S.

    2000-01-10

    KULL is a three-dimensional, time-dependent radiation hydrodynamics simulation code under development at Lawrence Livermore National Laboratory. As part of the U.S. Department of Energy's Accelerated Strategic Computing Initiative (ASCI), KULL's purpose is to simulate the physical processes in Inertial Confinement Fusion (ICF) targets. The National Ignition Facility, where ICF experiments will be conducted, and ASCI are part of the experimental and computational components of DOE's Stockpile Stewardship Program. This paper provides an overview of ASCI and describes KULL, its hydrodynamic simulation capability, and its three methods of simulating radiative transfer. Particular emphasis is given to the parallelization techniques essential to obtain the performance required of the Stockpile Stewardship Program and to exploit the massively parallel processor machines that ASCI is procuring.

  13. Radiation Hydrodynamic Simulations of an Inertial Fusion Energy Reactor Chamber

    NASA Astrophysics Data System (ADS)

    Sacks, Ryan Foster

    Inertial fusion energy reactors present great promise for the future as they are capable of providing baseload power with no carbon footprint. Simulation work regarding the chamber response and first-wall insult is carried out using the 1-D BUCKY radiation hydrodynamics code for a variety of differing chamber fills, radii, chamber obstructions, and first-wall materials. Discussion of the first-wall temperature rise, the x-ray spectrum incident on the wall, shock timing, and maximum overpressure is presented. An additional discussion of the impact of different gas opacities and their effect on overall chamber dynamics, including the formation of two shock fronts, is also presented. This work is performed at the University of Wisconsin-Madison's Fusion Technology Institute in collaboration with Lawrence Livermore National Laboratory.

  14. Simulating Intense Ion Beams for Inertial Fusion Energy

    SciTech Connect

    Friedman, A

    2001-02-20

    The Heavy Ion Fusion (HIF) program's goal is the development of the body of knowledge needed for Inertial Fusion Energy (IFE) to realize its promise. The intense ion beams that will drive HIF targets are nonneutral plasmas and exhibit collective, nonlinear dynamics which must be understood using the kinetic models of plasma physics. This beam physics is both rich and subtle: a wide range in spatial and temporal scales is involved, and effects associated with both instabilities and non-ideal processes must be understood. Ion beams have a ''long memory'', and initialization of a beam at mid-system with an idealized particle distribution introduces uncertainties; thus, it will be crucial to develop, and to extensively use, an integrated and detailed ''source-to-target'' HIF beam simulation capability. We begin with an overview of major issues.

  16. Simulation of polyethylene glycol and calcium-mediated membrane fusion

    SciTech Connect

    Pannuzzo, Martina; De Jong, Djurre H.; Marrink, Siewert J.; Raudino, Antonio

    2014-03-28

    We report on the mechanism of membrane fusion mediated by polyethylene glycol (PEG) and Ca²⁺ by means of a coarse-grained molecular dynamics simulation approach. Our data provide a detailed view of the role of cations and polymer in modulating the interaction between negatively charged apposed membranes. The PEG chains cause a reduction of the inter-lamellar distance and an increase in the local concentration of divalent cations. When thermally driven fluctuations bring the membranes into close contact, a switch from cis to trans Ca²⁺-lipid complexes stabilizes a focal contact that acts as a nucleation site for further expansion of the adhesion region. Flipping of lipid tails induces subsequent stalk formation. Together, our results provide a molecular explanation for the synergistic effect of Ca²⁺ and PEG on membrane fusion.

  17. Terascale simulations for heavy ion inertial fusion energy

    SciTech Connect

    Friedman, A; Cohen, R H; Grote, D P; Sharp, W M; Celata, C M; Lee, E P; Vay, J-L; Davidson, R C; Kaganovich, I; Lee, W W; Qin, H; Welch, D R; Haber, I; Kishek, R A

    2000-06-08

    The intense ion beams in a heavy ion Inertial Fusion Energy (IFE) driver and fusion chamber are non-neutral plasmas whose dynamics are largely dominated by space charge. We propose to develop a ''source-to-target'' Heavy Ion Fusion (HIF) beam simulation capability: a description of the kinetic behavior of this complex, nonlinear system which is both integrated and detailed. We will apply this new capability to further our understanding of key scientific issues in the physics of ion beams for IFE. The simulations will entail self-consistent field descriptions that require interprocessor communication, but are scalable and will run efficiently on terascale architectures. This new capability will be based on the integration of three types of simulations, each requiring terascale computing: (1) simulations of acceleration and confinement of the space-charge-dominated ion beams through the driver (accelerator, pulse compression line, and final focusing system) which accurately describe their dynamics, including emittance growth (phase-space dilution) effects; these are particle-in-cell (PIC) models; (2) electromagnetic (EM) and magnetoinductive (Darwin) simulations which describe the beam and the fusion chamber environment, including multibeam, neutralization, stripping, beam and plasma ionization processes, and return current effects; and (3) highly detailed simulations (δf, multispecies PIC, continuum Vlasov), which can examine electron effects and collective modes in the driver and chamber, and can study halo generation with excellent statistics, to ensure that these effects do not disrupt the focusability of the beams. The code development will involve: (i) adaptation of existing codes to run efficiently on multi-SMP computers that use a hybrid of shared and distributed memory; (ii) development of new and improved numerical algorithms, e.g., averaging techniques that will afford larger timesteps; and (iii) incorporation of improved physics models (e.g., for self
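The particle-in-cell models referred to throughout follow a deposit-solve-gather-push cycle on each step. The following is a minimal 1D electrostatic sketch with unit-charge macroparticles, a periodic FFT Poisson solve, and normalized units; all names are illustrative and are not the API of WARP or any other production code:

```python
import numpy as np

def es_pic_step(x, v, qm, L, ng, dt):
    """One step of a 1D electrostatic PIC cycle: nearest-grid-point charge
    deposit, FFT Poisson solve against a neutralizing background, field
    gather, and explicit push (unit-charge macroparticles, eps0 = 1)."""
    dx = L / ng
    cells = np.floor(x / dx).astype(int) % ng
    rho = np.bincount(cells, minlength=ng) / dx      # deposit charge density
    rho = rho - rho.mean()                           # neutralizing background
    k = 2.0 * np.pi * np.fft.fftfreq(ng, d=dx)
    rho_k = np.fft.fft(rho)
    phi_k = np.zeros_like(rho_k)
    phi_k[1:] = rho_k[1:] / k[1:]**2                 # solve -phi'' = rho
    E = np.real(np.fft.ifft(-1j * k * phi_k))        # E = -dphi/dx
    v = v + qm * E[cells] * dt                       # gather and accelerate
    x = (x + v * dt) % L                             # drift in periodic box
    return x, v
```

Driver-scale simulations add the accelerator lattice, 3D geometry, and the interprocessor field communication described above; this sketch shows only the core cycle.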

  18. Bohunice Simulator Data Collection Project

    SciTech Connect

    Cillik, Ivan; Prochaska, Jan

    2002-07-01

    The paper describes the methods and results of human reliability data analysis performed as part of the Bohunice Simulator Data Collection Project (BSDCP), which was carried out by VUJE Trnava, Inc. with funding support from the U.S. DOE, National Nuclear Security Administration. The goal of the project was to create a methodology for simulator data collection and analysis to support activities in probabilistic safety assessment (PSA) and human reliability assessment for the Jaslovske Bohunice nuclear power plant, consisting of two sets of twin units: two VVER 440/V-230 (V1) and two VVER 440/V-213 (V2) reactors. During the project, training of V2 control room crews was performed at the VUJE-Trnava simulator. The simulator training and the data collection were done in parallel. The main goal of BSDCP was to collect data on human errors under simulated conditions requiring the use of symptom-based emergency operating procedures (SBEOPs). The subjects of the data collection were scenario progress time data, operator errors, and real-time technological parameters. The paper contains three main parts. The first part presents preparatory work and semi-automatic computer-based methods used to collect data and to check technological parameters in order to find hidden errors of operators, to be able to retrace the course of each scenario for purposes of further analysis, and to document the whole training process. The first part also gives an overview of the scope of the collected data, the human error taxonomy, and the state classifications for SBEOP instruction coding. The second part describes analytical work undertaken to characterize the time distributions necessary for execution of the various kinds of instructions performed by operators according to the classification for SBEOP instruction coding. It also presents the methods used to determine probability distributions for different operator errors. Results from the data evaluation are presented in the last part of the paper.

  19. Simulation of Carbon Production from Material Surfaces in Fusion Devices

    NASA Astrophysics Data System (ADS)

    Marian, J.; Verboncoeur, J.

    2005-10-01

    Impurity production at carbon surfaces by plasma bombardment is a key issue for fusion devices as modest amounts can lead to excessive radiative power loss and/or hydrogenic D-T fuel dilution. Here results of molecular dynamics (MD) simulations of physical and chemical sputtering of hydrocarbons are presented for models of graphite and amorphous carbon, the latter formed by continuous D-T impingement in conditions that mimic fusion devices. The results represent more extensive simulations than we reported last year, including incident energies in the 30-300 eV range for a variety of incident angles that yield a number of different hydrocarbon molecules. The calculated low-energy yields clarify the uncertainty in the complex chemical sputtering rate since chemical bonding and hard-core repulsion are both included in the interatomic potential. Also modeled is hydrocarbon break-up by electron-impact collisions and transport near the surface. Finally, edge transport simulations illustrate the sensitivity of the edge plasma properties arising from moderate changes in the carbon content. The models will provide the impurity background for the TEMPEST kinetic edge code.

  20. Simulation of Drift-Compression for Heavy-Ion-Fusion

    SciTech Connect

    Sharp, W M; Barnard, J J; Grote, D P; Celata, C M; Yu, S S

    2005-03-16

    Lengthwise compression of space-charge-dominated beams is needed to obtain the high input power required for heavy-ion fusion. The ''drift-compression'' scenario studied here first applies a head-to-tail velocity variation with the beam tail moving faster than the head. As the beam drifts, the longitudinal space-charge field slows compression, leaving the beam nearly monoenergetic as it enters the final-focus magnets. This paper presents initial work to model this compression scenario. Fluid and particle simulations are compared, and several strategies for setting up the compression schedule are discussed.
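Neglecting the longitudinal space-charge field that the paper models, the kinematics of drift compression reduce to ballistic slice motion under a linear head-to-tail velocity tilt, which focuses the entire pulse to a point in z. A sketch with hypothetical numbers, not the parameters of the study:

```python
import numpy as np

# Slice positions of an initially 10 m long beam; the tail (z = 0)
# is launched 2% faster than the head (z = 10 m).
L0, v0, tilt = 10.0, 1.0e7, 0.02           # m, m/s, fractional velocity tilt
z0 = np.linspace(0.0, L0, 101)
v = v0 * (1.0 + tilt * (1.0 - z0 / L0))    # linear head-to-tail tilt

# Ballistic drift: with a linear tilt every slice arrives at the same z
# when the drift time equals L0 / (v_tail - v_head).
t_focus = L0 / (v0 * tilt)
z = z0 + v * t_focus
length = z.max() - z.min()                 # ~0: ideal full compression
```

In the actual scenario the longitudinal space-charge field removes the tilt as the beam shortens, so the compression stalls at a finite length and the beam arrives at the final-focus magnets nearly monoenergetic, as the abstract describes.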

  1. Multisource report-level simulator for fusion research

    NASA Astrophysics Data System (ADS)

    Carlotto, Mark J.; Kadar, Ivan

    2003-08-01

    The Multi-source Report-level Simulator (MRS) is a tool developed by Veridian Systems as part of its Model-adaptive Multi-source Track Fusion (MMTF) effort under DARPA's DTT program. MRS generates simulated multisensor contact reports for GMTI, HUMINT, IMINT, SIGINT, UGS, and video. It contains a spatial editor for creating ground tracks along which vehicles move over the terrain. Vehicles can start, stop, speed up, or slow down. The spatial editor is also used to define the locations of fixed sensors such as UGS and HUMINT observers on the ground, and flight paths of GMTI, IMINT, SIGINT, and video sensors in the air. Observation models characterize each sensor at the report level in terms of its operating characteristics (revisit rate, resolution, etc.), measurement errors, and detection/classification performance (i.e., Pd, Nfa, Pcc, and Pid). Contact reports are linked to ground truth data to facilitate the testing of track/fusion algorithms and the validation of associated performance models.

  2. Computer modeling and simulation in inertial confinement fusion

    NASA Astrophysics Data System (ADS)

    McCrory, R. L.; Verdon, C. P.

    1989-03-01

    The complex hydrodynamic and transport processes associated with the implosion of an inertial confinement fusion (ICF) pellet place considerable demands on numerical simulation programs. Processes associated with implosion can usually be described using relatively simple models, but their complex interplay requires that programs model most of the relevant physical phenomena accurately. Most hydrodynamic codes used in ICF incorporate a one-fluid, two-temperature model. Electrons and ions are assumed to flow as one fluid (no charge separation). Due to the relatively weak coupling between the ions and electrons, each species is treated separately in terms of its temperature. In this paper we describe some of the major components associated with an ICF hydrodynamics simulation code. To serve as an example we draw heavily on a two-dimensional Lagrangian hydrodynamic code (ORCHID) written at the University of Rochester's Laboratory for Laser Energetics.
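The one-fluid, two-temperature model described here can be written schematically as follows. This is a generic form for orientation only; the exact coupling, conduction, and source terms vary from code to code, and the symbols are illustrative rather than taken from ORCHID:

```latex
\begin{aligned}
\frac{\mathrm{d}\rho}{\mathrm{d}t} &= -\rho\,\nabla\cdot\mathbf{u}, \\
\rho\,\frac{\mathrm{d}\mathbf{u}}{\mathrm{d}t} &= -\nabla\,(p_e + p_i), \\
\rho\,\frac{\mathrm{d}\varepsilon_i}{\mathrm{d}t} &= -p_i\,\nabla\cdot\mathbf{u}
   + \rho\,c_{ei}\,(T_e - T_i), \\
\rho\,\frac{\mathrm{d}\varepsilon_e}{\mathrm{d}t} &= -p_e\,\nabla\cdot\mathbf{u}
   + \nabla\cdot(\kappa_e \nabla T_e)
   - \rho\,c_{ei}\,(T_e - T_i) + S_{\mathrm{dep}},
\end{aligned}
```

where a single velocity u is shared by both species (no charge separation), ε and T are the per-species internal energies and temperatures, c_ei is the electron-ion coupling rate that equilibrates the two weakly coupled temperatures, κ_e is the electron thermal conductivity, and S_dep stands for external energy deposition such as laser absorption.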

  3. Computer modeling and simulation in inertial confinement fusion

    SciTech Connect

    McCrory, R.L.; Verdon, C.P.

    1989-03-01

    The complex hydrodynamic and transport processes associated with the implosion of an inertial confinement fusion (ICF) pellet place considerable demands on numerical simulation programs. Processes associated with implosion can usually be described using relatively simple models, but their complex interplay requires that programs model most of the relevant physical phenomena accurately. Most hydrodynamic codes used in ICF incorporate a one-fluid, two-temperature model. Electrons and ions are assumed to flow as one fluid (no charge separation). Due to the relatively weak coupling between the ions and electrons, each species is treated separately in terms of its temperature. In this paper we describe some of the major components associated with an ICF hydrodynamics simulation code. To serve as an example we draw heavily on a two-dimensional Lagrangian hydrodynamic code (ORCHID) written at the University of Rochester's Laboratory for Laser Energetics. 46 refs., 19 figs., 1 tab.

  4. Report of the Fusion Energy Sciences Advisory Committee. Panel on Integrated Simulation and Optimization of Magnetic Fusion Systems

    SciTech Connect

    Dahlburg, Jill; Corones, James; Batchelor, Donald; Bramley, Randall; Greenwald, Martin; Jardin, Stephen; Krasheninnikov, Sergei; Laub, Alan; Leboeuf, Jean-Noel; Lindl, John; Lokke, William; Rosenbluth, Marshall; Ross, David; Schnack, Dalton

    2002-11-01

    Fusion is potentially an inexhaustible energy source whose exploitation requires a basic understanding of high-temperature plasmas. The development of a science-based predictive capability for fusion-relevant plasmas is a challenge central to fusion energy science, in which numerical modeling has played a vital role for more than four decades. A combination of the very wide range in temporal and spatial scales, extreme anisotropy, the importance of geometric detail, and the requirement of causality which makes it impossible to parallelize over time, makes this problem one of the most challenging in computational physics. Sophisticated computational models are under development for many individual features of magnetically confined plasmas and increases in the scope and reliability of feasible simulations have been enabled by increased scientific understanding and improvements in computer technology. However, full predictive modeling of fusion plasmas will require qualitative improvements and innovations to enable cross coupling of a wider variety of physical processes and to allow solution over a larger range of space and time scales. The exponential growth of computer speed, coupled with the high cost of large-scale experimental facilities, makes an integrated fusion simulation initiative a timely and cost-effective opportunity. Worldwide progress in laboratory fusion experiments provides the basis for a recent FESAC recommendation to proceed with a burning plasma experiment (see FESAC Review of Burning Plasma Physics Report, September 2001). Such an experiment, at the frontier of the physics of complex systems, would be a huge step in establishing the potential of magnetic fusion energy to contribute to the world’s energy security. An integrated simulation capability would dramatically enhance the utilization of such a facility and lead to optimization of toroidal fusion plasmas in general. This science-based predictive capability, which was cited in the FESAC

  5. SIMULATION OF INTENSE BEAMS FOR HEAVY ION FUSION

    SciTech Connect

    Friedman, A

    2004-06-10

    Computer simulations of intense ion beams play a key role in the Heavy Ion Fusion research program. Along with analytic theory, they are used to develop future experiments, guide ongoing experiments, and aid in the analysis and interpretation of experimental results. They also afford access to regimes not yet accessible in the experimental program. The U.S. Heavy Ion Fusion Virtual National Laboratory and its collaborators have developed state-of-the-art computational tools, related both to codes used for stationary plasmas and to codes used for traditional accelerator applications, but necessarily differing from each in important respects. These tools model beams in varying levels of detail and at widely varying computational cost. They include moment models (envelope equations and fluid descriptions), particle-in-cell methods (electrostatic and electromagnetic), nonlinear-perturbative descriptions (''δf''), and continuum Vlasov methods. Increasingly, it is becoming clear that it is necessary to simulate not just the beams themselves, but also the environment in which they exist, be it an intentionally created plasma or an unwanted cloud of electrons and gas. In this paper, examples of the application of simulation tools to intense ion beam physics are presented, including support of present-day experiments, fundamental beam physics studies, and the development of future experiments. Throughout, new computational models are described and their utility explained. These include Mesh Refinement (and its dynamic variant, Adaptive Mesh Refinement); improved electron cloud and gas models, and an electron advance scheme that allows use of larger time steps; and moving-mesh and adaptive-mesh Vlasov methods.
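Among the moment models mentioned, the simplest is the transverse envelope equation; for an axisymmetric continuous beam in a uniform focusing channel it reads a'' = -k0²a + K/a + ε²/a³, balancing external focusing against space charge (perveance K) and emittance pressure. A minimal integration sketch with illustrative names and parameters, not HIF-VNL code:

```python
import numpy as np

def envelope_rhs(y, k0, K, eps):
    """Axisymmetric KV envelope equation, a'' = -k0^2*a + K/a + eps^2/a^3:
    external focusing versus space charge and emittance pressure."""
    a, ap = y
    return np.array([ap, -k0**2 * a + K / a + eps**2 / a**3])

def integrate_envelope(a0, ap0, k0, K, eps, s_end, ds=1e-4):
    """Fixed-step RK4 integration of the envelope over path length s_end."""
    y = np.array([a0, ap0])
    for _ in range(int(round(s_end / ds))):
        k1 = envelope_rhs(y, k0, K, eps)
        k2 = envelope_rhs(y + 0.5 * ds * k1, k0, K, eps)
        k3 = envelope_rhs(y + 0.5 * ds * k2, k0, K, eps)
        k4 = envelope_rhs(y + ds * k3, k0, K, eps)
        y = y + (ds / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
    return y

# A matched beam (k0^2*a^2 = K + eps^2/a^2) should keep a constant radius.
a0, k0, eps = 0.01, 1.0, 5e-5
K = k0**2 * a0**2 - eps**2 / a0**2
a_final, _ = integrate_envelope(a0, 0.0, k0, K, eps, s_end=1.0)
```

A mismatched initial radius instead oscillates about the matched value, which is the envelope-level picture of the phase-space dilution effects the PIC models resolve in detail.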

  6. The Mars Gravity Simulation Project

    NASA Technical Reports Server (NTRS)

    Korienek, Gene

    1998-01-01

    Human beings who make abrupt transitions between one gravitational environment and another undergo severe disruptions of their visual perception and visual-motor coordination, frequently accompanied by "space sickness." Clearly, such immediate effects of exposure to a novel gravitational condition have significant implications for human performance. For example, when astronauts first arrive in Earth orbit their attempts to move about in the spacecraft and to perform their duties are uncoordinated, inaccurate, and inefficient. Other inter-gravitational transitions for which these difficulties can be expected include going from the 0 g of the spacecraft to the 0.16 g of the Moon, from 0 g to the 0.38 g of Mars, and from 0 g back to the 1.0 g of Earth. However, after astronauts have actively interacted with their new gravitational environment for several days, these problems tend to disappear, evidence that some sort of adaptive process has taken place. It would be advantageous, therefore, if there were some way to minimize or perhaps even to eliminate this potentially hazardous adaptive transition period by allowing astronauts to adapt to the altered gravitational conditions before actually entering them. Simultaneous adaptation to both the altered and the normal gravitational environments can be achieved by repeatedly adapting to one and readapting to the other, a phenomenon known as dual adaptation. The objective of the Mars Gravity Simulator (MGS) Project is to construct a simulation of the visual and bodily effects of altered gravity. This perceptual-motor simulation is created through the use of: 1) differential body pressure to produce simulated hypo-gravity, and 2) treadmill-controlled virtual reality to create a corresponding visual effect. It is expected that this combination will produce sensory-motor perturbations in the subjects. Both the immediate and adaptive behavioral (postural and ambulatory) responses to these sensory perturbations will be assessed.

  7. Online adaptive decision fusion framework based on projections onto convex sets with application to wildfire detection in video

    NASA Astrophysics Data System (ADS)

    Günay, Osman; Töreyin, Behçet Uğur; Çetin, Ahmet Enis

    2011-07-01

    In this paper, an online adaptive decision fusion framework is developed for image analysis and computer vision applications. In this framework, it is assumed that the compound algorithm consists of several sub-algorithms, each of which yields its own decision as a real number centered around zero, representing the confidence level of that particular sub-algorithm. Decision values are linearly combined with weights that are updated online according to an active fusion method based on performing orthogonal projections onto convex sets describing sub-algorithms. It is assumed that there is an oracle, who is usually a human operator, providing feedback to the decision fusion method. A video-based wildfire detection system is developed to evaluate the performance of the algorithm in handling the problems where data arrives sequentially. In this case, the oracle is the security guard of the forest lookout tower verifying the decision of the combined algorithm. Simulation results are presented.
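The weight update in this kind of scheme can be sketched as an orthogonal projection of the current weight vector onto the hyperplane of weights that would have reproduced the oracle's decision. This is a minimal sketch assuming a simple linear fusion rule; the function and variable names are hypothetical, not the authors' API:

```python
import numpy as np

def pocs_weight_update(w, x, y):
    """Orthogonally project w onto the hyperplane {w : w @ x = y}, i.e.
    the set of weight vectors that would have matched the oracle's
    decision y given the sub-algorithm confidences x."""
    return w + (y - w @ x) * x / (x @ x)

# Toy run: three sub-algorithms; the oracle's decision tracks the first one.
rng = np.random.default_rng(0)
w = np.ones(3) / 3.0                       # start from equal weights
for _ in range(500):
    x = rng.uniform(-1.0, 1.0, size=3)     # confidences centered on zero
    y = np.sign(x[0])                      # oracle agrees with sub-algorithm 0
    w = pocs_weight_update(w, x, y)
```

Each projection is the smallest change to the weights consistent with the latest feedback, so the combination adapts online as data arrive sequentially; in the wildfire system the oracle is the guard's confirmation from the lookout tower.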

  8. Improved computational methods for simulating inertial confinement fusion

    NASA Astrophysics Data System (ADS)

    Fatenejad, Milad

    This dissertation describes the development of two multidimensional Lagrangian codes for simulating inertial confinement fusion (ICF) on structured meshes. The first is DRACO, a production code primarily developed by the Laboratory for Laser Energetics. Several significant new capabilities were implemented, including the ability to model radiative transfer using Implicit Monte Carlo [Fleck et al., JCP 8, 313 (1971)]. DRACO was also extended to operate in 3D Cartesian geometry on hexahedral meshes; originally the code was used only in 2D cylindrical geometry. This extension included implementing thermal conduction and a flux-limited multigroup diffusion model for radiative transfer. Diffusion equations are solved by extending the 2D Kershaw method [Kershaw, JCP 39, 375 (1981)] to three dimensions. The second radiation-hydrodynamics code developed as part of this thesis is Cooper, a new 3D code which operates on structured hexahedral meshes. Cooper supports the compatible hydrodynamics framework [Caramana et al., JCP 146, 227 (1998)] to obtain round-off-error levels of global energy conservation. This level of energy conservation is maintained even when two-temperature thermal conduction, ion/electron equilibration, and multigroup-diffusion-based radiative transfer are active. Cooper is parallelized using domain decomposition and photon-energy-group decomposition. The Mesh Oriented datABase (MOAB) computational library is used to exchange information between processes when domain decomposition is used. Cooper's performance is analyzed through direct comparisons with DRACO. Cooper also contains a method for preserving spherical symmetry during target implosions [Caramana et al., JCP 157, 89 (1999)]. Several deceleration-phase implosion simulations were used to compare instability growth using traditional hydrodynamics and compatible hydrodynamics with/without symmetry modification. These simulations demonstrate increased symmetry-preservation errors when traditional hydrodynamics is used.

  9. On a Primal Coarse Projective Integration Method for Multiscale Simulations

    NASA Astrophysics Data System (ADS)

    Skoric, Milos; Ishiguro, Seiji; Maluckov, Sandra

    2006-10-01

    A novel simulation framework called Equation-Free Projective Integration (EFPI) was recently applied to nonlinear plasmas by M. Shay [1] to study propagation and steepening of a 1D ion sound (IS) wave with a PIC code as a microscopic simulator. To initialize, macro plasma variables are ``lifted'' to a fine micro-representation. The PIC code is stepped forward for a short time, and the results are ``restricted'' or smoothed back to macro space. By extrapolation, the time derivative is estimated and the solution is projected forward with a large step; the process is repeated. As a simple alternative, we propose a primal EFPI scheme to simulate nonlinear plasmas including kinetic effects. The micro-simulator is a standard 1D ES PIC code. Ions are assumed inherently coarse-grained or ``smoothed'' and are tracked to extrapolate in time and project. The potential is averaged over the electron plasma period to extrapolate and project. No adiabatic approximation for electrons is used [2]; instead, we self-consistently find the non-uniform electron distribution from the Poisson equation and the ion density. Preliminary results for nonlinear IS waves as well as for the IS double-layer paradigm are presented, and some limitations of EFPI are discussed. [1] M. Shay, J. Drake, W. Dorland, J. of Comp. Phys (APS DPP 2005) [2] G. Stanchev, A. Maluckov et al., in EPS Fusion (Rome, 2006).
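The lift-step-restrict-extrapolate loop described above can be sketched generically. Here the micro-simulator is a toy forward-Euler ODE step rather than a PIC code, and all names are illustrative:

```python
import numpy as np

def projective_step(u, micro_step, dt_micro, n_micro, dt_proj):
    """One coarse step of projective integration: run the fine-scale
    simulator for n_micro short steps, estimate the coarse time
    derivative from the last two states, then leap ahead by dt_proj."""
    prev, curr = u, micro_step(u, dt_micro)
    for _ in range(n_micro - 1):
        prev, curr = curr, micro_step(curr, dt_micro)
    dudt = (curr - prev) / dt_micro        # "restricted" time derivative
    return curr + dt_proj * dudt           # projective leap

# Toy micro-simulator: forward-Euler steps of du/dt = -u.
micro = lambda u, dt: u + dt * (-u)
u = np.array([1.0])
for _ in range(10):
    u = projective_step(u, micro, dt_micro=0.01, n_micro=5, dt_proj=0.2)
```

The payoff is that each coarse step costs only n_micro fine steps yet advances the solution by a much larger dt_proj; in the plasma application the "lift" and "restrict" stages map between macroscopic moments and PIC particle data.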

  10. Mesh refinement for particle-in-cell plasma simulations: Applications to - and benefits for - heavy ion fusion

    SciTech Connect

    Vay, J.L.; Colella, P.; McCorquodale, P.; Van Straalen, B.; Friedman, A.; Grote, D.P.

    2002-05-24

    The numerical simulation of the driving beams in a heavy ion fusion power plant is a challenging task, and simulation of the power plant as a whole, or even of the driver, is not yet possible. Despite the rapid progress in computer power, past and anticipated, the most advanced numerical techniques must be considered if the goal is to be reached expeditiously. One of the difficulties of these simulations resides in the disparity of scales, in time and in space, which must be resolved. When these disparities are in distinct zones of the simulation region, a method which has proven effective in other areas (e.g., fluid dynamics simulations) is the mesh refinement technique. They discuss the challenges posed by the implementation of this technique in plasma simulations (due to the presence of particles and electromagnetic waves). They present the prospects for, and projected benefits of, its application to heavy ion fusion, in particular to the simulation of the ion source and the final beam propagation in the chamber. A collaboration project is under way at LBNL between the Applied Numerical Algorithms Group (ANAG) and the HIF group to couple the Adaptive Mesh Refinement (AMR) library CHOMBO, developed by the ANAG group, to the Particle-In-Cell accelerator code WARP, developed by the HIF-VNL. They describe their progress and present their initial findings.

  11. Evaluation of performance of select fusion experiments and projected reactors

    NASA Technical Reports Server (NTRS)

    Miley, G. H.

    1978-01-01

    The performance of NASA Lewis fusion experiments (SUMMA and Bumpy Torus) is compared with other experiments and with that necessary for a power reactor. Key parameters cited are gain (fusion power/input power) and the time average fusion power, both of which may be more significant for real fusion reactors than the commonly used Lawson parameter. The NASA devices are over 10 orders of magnitude below the required powerplant values in both gain and time average power. The best experiments elsewhere are also as much as 4 to 5 orders of magnitude too low. However, the NASA experiments compare favorably with alternative approaches that have received less funding than the mainline experiments. The steady-state character and efficiency of plasma heating are strong advantages of the NASA approach. The problem, though, is to move ahead to experiments of sufficient size to advance in gain and average power parameters.

  12. CORSICA: A comprehensive simulation of toroidal magnetic-fusion devices. Final report to the LDRD Program

    SciTech Connect

    Crotinger, J.A.; LoDestro, L.; Pearlstein, L.D.; Tarditi, A.; Casper, T.A.; Hooper, E.B.

    1997-03-21

    In 1992, our group began exploring the requirements for a comprehensive simulation code for toroidal magnetic fusion experiments. There were several motivations for taking this step. First, the new machines being designed were much larger and more expensive than current experiments. Second, these new designs called for much more sophisticated control of the plasma shape and position, as well as the distributions of energy, mass, and current within the plasma. These factors alone made it clear that a comprehensive simulation capability would be an extremely valuable tool for machine design. The final motivating factor was that the national Numerical Tokamak Project (NTP) had recently received High Performance Computing and Communications (HPCC) Grand Challenge funding to model turbulent transport in tokamaks, raising the possibility that first-principles simulations of this process might be practical in the near future. We felt that the best way to capitalize on this development was to integrate the resulting turbulence simulation codes into a comprehensive simulation. Such simulations must include the effects of many microscopic length- and time-scales. In order to do a comprehensive simulation efficiently, the length- and time-scale disparities must be exploited. We proposed to do this by coupling the average or quasistatic effects from the fast time-scales to a slow-time-scale transport code for the macroscopic plasma evolution. In FY93-FY96 we received funding to investigate algorithms for computationally coupling such disparate-scale simulations and to implement these algorithms in a prototype simulation code, dubbed CORSICA. Work on algorithms and test cases proceeded in parallel, with the algorithms being incorporated into CORSICA as they became mature. In this report we discuss the methods and algorithms, the CORSICA code, its applications, and our plans for the future.

  13. Projection-Based linear constrained estimation and fusion over long-haul links

    SciTech Connect

    Rao, Nageswara S

    2016-01-01

    We study estimation and fusion with linear dynamics in long-haul sensor networks, wherein a number of sensors are remotely deployed over a large geographical area for performing tasks such as target tracking, and a remote fusion center serves to combine the information provided by these sensors in order to improve the overall tracking accuracy. In reality, the motion of a dynamic target might be subject to certain constraints, for instance, those defined by a road network. We explore how the accuracy of projection-based constrained estimation and fusion methods is affected by information loss over the long-haul links. We use an example to compare the tracking errors under various implementations of centralized and distributed projection-based estimation and fusion methods and demonstrate the effectiveness of using projection-based methods in these settings.
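    One standard way to realize the projection step for a linear constraint (a target known to move on a road, say) is the minimum-variance projection of an estimate onto the set {x : Cx = d}. The sketch below is illustrative only: it is the textbook formula, not necessarily the paper's exact implementation, and all matrices and numbers are made up.

```python
import numpy as np

def project_estimate(x, P, C, d):
    """Minimum-variance projection of estimate (x, P) onto the set {x : C x = d}."""
    S = C @ P @ C.T
    K = P @ C.T @ np.linalg.inv(S)   # gain pulling the estimate toward the constraint
    x_c = x - K @ (C @ x - d)        # projected estimate; satisfies C x_c = d
    P_c = P - K @ C @ P              # covariance collapses along the constraint
    return x_c, P_c

# Illustrative numbers: a 2D position estimate slightly off a road x - y = 0
x = np.array([2.0, 1.6])
P = np.diag([0.5, 0.2])
C = np.array([[1.0, -1.0]])
d = np.array([0.0])

x_c, P_c = project_estimate(x, P, C, d)
print(x_c)   # both components equal: the estimate now lies exactly on the road
```

Note that the correction is covariance-weighted: the more uncertain first component moves farther toward the road than the second.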

  14. Fusion

    NASA Astrophysics Data System (ADS)

    Herman, Robin

    1990-10-01

    The book abounds with fascinating anecdotes about fusion's rocky path: the spurious claim by Argentine dictator Juan Peron in 1951 that his country had built a working fusion reactor; the rush by the United States to drop secrecy and publicize its fusion work as a propaganda offensive after the Russian success with Sputnik; the fortune Penthouse magazine publisher Bob Guccione sank into an unconventional fusion device; the skepticism that met an assertion by two University of Utah chemists in 1989 that they had created "cold fusion" in a bottle. Aimed at a general audience, the book describes the scientific basis of controlled fusion--the fusing of atomic nuclei, under conditions hotter than the sun, to release energy. Using personal recollections of scientists involved, it traces the history of this little-known international race that began during the Cold War in secret laboratories in the United States, Great Britain and the Soviet Union, and evolved into an astonishingly open collaboration between East and West.

  15. Web Interface Connecting Gyrokinetic Turbulence Simulations with Tokamak Fusion Data

    NASA Astrophysics Data System (ADS)

    Suarez, A.; Ernst, D. R.

    2005-10-01

    We are developing a comprehensive interface to connect plasma microturbulence simulation codes with experimental data in the U.S. and abroad. This website automates the preparation and launch of gyrokinetic simulations utilizing plasma profile and magnetic equilibrium data. The functionality of existing standalone interfaces, such as GS2/PREP [D. R. Ernst et al., Phys. Plasmas 11(5) 2637 (2004)], in use for several years for the GS2 code [W. Dorland et al., Phys. Rev. Lett. 85(26) 5579 (2000)], will be extended to other codes, including GYRO [J. Candy and R.E. Waltz, J. Comput. Phys. 186, 545 (2003)]. Data are read from MDSplus and TRANSP [http://w3.pppl.gov/transp] and can be viewed using a Java plotter, Webgraph, developed for this project by previous students Geoffrey Catto and Bo Feng. User sessions are tracked and saved to allow users to access their previous simulations, which can be used as templates for future work.

  16. Psychology on Computers: Simulations, Experiments and Projects.

    ERIC Educational Resources Information Center

    Belcher, Duane M.; Smith, Stephen D.

    PSYCOM is a unique mixed media package which combines high interest projects on the computer with a written text of expository material. It goes beyond most computer-assisted instruction which emphasizes drill and practice and testing of knowledge. A project might consist of a simulation or an actual experiment, or it might be a demonstration, a…

  17. Phase space structures in gyrokinetic simulations of fusion plasma turbulence

    NASA Astrophysics Data System (ADS)

    Ghendrih, Philippe; Norscini, Claudia; Cartier-Michaud, Thomas; Dif-Pradalier, Guilhem; Abiteboul, Jérémie; Dong, Yue; Garbet, Xavier; Gürcan, Ozgür; Hennequin, Pascale; Grandgirard, Virginie; Latu, Guillaume; Morel, Pierre; Sarazin, Yanick; Storelli, Alexandre; Vermare, Laure

    2014-10-01

    Gyrokinetic simulations of fusion plasmas give extensive information in 5D on turbulence and transport. This paper highlights a few of these challenging physics issues in global, flux-driven simulations using experimental inputs from Tore Supra shot TS45511. The electrostatic gyrokinetic code GYSELA is used for these simulations. The 3D structure of avalanches indicates that these structures propagate radially at localised toroidal angles and then expand along the field line at sound speed to form the filaments. Analysing the poloidal mode structure of the potential fluctuations (at a given toroidal location), one finds that the low modes m = 0 and m = 1 exhibit a global structure; the magnitude of the m = 0 mode is much larger than that of the m = 1 mode. The shear layers of the corrugation structures are thus found to be dominated by the m = 0 contribution, comparable to that of the zonal flows. This global mode seems to localise the m = 2 mode but has little effect on the localisation of the higher mode numbers. However, when analysing the pulsation of the latter modes, one finds that all modes exhibit a similar phase velocity, comparable to the local zonal flow velocity. The consequent dispersion-like relation between the mode pulsation and the mode numbers provides a means to measure the zonal flow. Temperature fluctuations and the turbulent heat flux are localised between the corrugation structures. Temperature fluctuations are found to exhibit two scales: small fluctuations that are localised by the corrugation shear layers and appear to bounce back and forth radially, and large fluctuations, also readily observed on the flux, which are associated with the disruption of the corrugations. The radial ballistic velocity of both avalanche events is of the order of 0.5ρ∗c0, where ρ∗ = ρ0/a, a being the tokamak minor radius and ρ0 the characteristic Larmor radius, ρ0 = c0/Ω0. c0 is the reference ion thermal velocity and Ω0 = qiB0/mi the reference

  18. Networking Industry and Academia: Evidence from FUSION Projects in Ireland

    ERIC Educational Resources Information Center

    Stephens, Simon; Onofrei, George

    2009-01-01

    Graduate development programmes such as FUSION continue to be seen by policy makers, higher education institutions and small and medium-sized enterprises (SMEs) as primary means of strengthening higher education-business links and in turn improving the match between graduate output and the needs of industry. This paper provides evidence from case…

  19. Projection-Based Linear Constrained Estimation and Fusion over Long-Haul Links

    SciTech Connect

    Rao, Nageswara S

    2016-01-01

    In this work, we study estimation and fusion with linear dynamics in long-haul sensor networks, wherein a number of sensors are remotely deployed over a large geographical area for performing tasks such as target tracking, and a remote fusion center serves to combine the information provided by these sensors in order to improve the overall tracking accuracy. In reality, the motion of a dynamic target might be subject to certain constraints, for instance, those defined by a road network. We explore the accuracy performance of projection-based constrained estimation and fusion methods that is affected by information loss over the long-haul links. We use a tracking example to compare the tracking errors under various implementations of centralized and distributed projection-based estimation and fusion methods.

  20. Internet and web projects for fusion plasma science and education. Final technical report

    SciTech Connect

    Eastman, Timothy E.

    1999-08-30

    The plasma web site at http://www.plasmas.org provides comprehensive coverage of all plasma science and technology with site links worldwide. Prepared to serve the general public, students, educators, researchers, and decision-makers, the site covers basic plasma physics, fusion energy, magnetic confinement fusion, high energy density physics including ICF, space physics and astrophysics, pulsed power, lighting, waste treatment, plasma technology, plasma theory, and simulations and modeling.

  1. Humanoid Flight Metabolic Simulator Project

    NASA Technical Reports Server (NTRS)

    Ross, Stuart

    2015-01-01

    NASA's Evolvable Mars Campaign (EMC) has identified several areas of technology that will require significant improvements in performance, capacity, and efficiency in order to make a manned mission to Mars possible. These include the crew vehicle Environmental Control and Life Support System (ECLSS), the EVA suit Portable Life Support System (PLSS) and information systems, autonomous environmental monitoring, radiation exposure monitoring and protection, and vehicle thermal control systems (TCS). MADMACS in a Suit can be configured to simulate human metabolism, consuming crew resources (oxygen) in the process. In addition to providing support for testing life support on unmanned flights, MADMACS will also support testing of suit thermal controls and monitor radiation exposure, body zone temperatures, moisture, and loads.

  2. Programmable AC power supply for simulating power transient expected in fusion reactor

    SciTech Connect

    Halimi, B.; Suh, K. Y.

    2012-07-01

    This paper focuses on the control engineering of a programmable AC power source capable of simulating the power transients expected in a fusion reactor. To generate the programmable power source, an AC-AC power electronics converter is adopted to control the power of a set of heaters representing the transient phenomena of the heat exchangers or heat sources of a fusion reactor. The International Thermonuclear Experimental Reactor (ITER) plasma operation scenario is used as the basic reference for producing this transient power source. (authors)

  3. Image Fusion Software in the Clearpem-Sonic Project

    NASA Astrophysics Data System (ADS)

    Pizzichemi, M.; di Vara, N.; Cucciati, G.; Ghezzi, A.; Paganoni, M.; Farina, F.; Frisch, B.; Bugalho, R.

    2012-08-01

    ClearPEM-Sonic is a mammography scanner that combines Positron Emission Tomography with 3D ultrasound echographic and elastographic imaging. It has been developed to improve early stage detection of breast cancer by combining metabolic and anatomical information. The PET system has been developed by the Crystal Clear Collaboration, while the 3D ultrasound probe has been provided by SuperSonic Imagine. In this framework, the visualization and fusion software is an essential tool for the radiologists in the diagnostic process. This contribution discusses the design choices, the issues faced during the implementation, and the commissioning of the software tools developed for ClearPEM-Sonic.

  4. One-dimensional particle simulations of Knudsen-layer effects on D-T fusion

    SciTech Connect

    Cohen, Bruce I.; Dimits, Andris M.; Zimmerman, George B.; Wilks, Scott C.

    2014-12-15

    Particle simulations are used to solve the fully nonlinear, collisional kinetic equation describing the interaction of a high-temperature, high-density, deuterium-tritium plasma with absorbing boundaries, a plasma source, and the influence of kinetic effects on fusion reaction rates. Both hydrodynamic and kinetic effects influence the end losses, and the simulations show departures of the ion velocity distributions from Maxwellian due to the reduction of the population of the highest energy ions (Knudsen-layer effects). The particle simulations show that the interplay between sources, plasma dynamics, and end losses results in temperature anisotropy, plasma cooling, and concomitant reductions in the fusion reaction rates. However, for the model problems and parameters considered, particle simulations show that Knudsen-layer modifications do not significantly affect the velocity distribution function for velocities most important in determining the fusion reaction rates, i.e., the thermal fusion reaction rates using the local densities and bulk temperatures give good estimates of the kinetic fusion reaction rates.
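    The reason tail depletion can matter so much for reactivity, while barely changing the bulk plasma, can be illustrated numerically. The sketch below is not the paper's kinetic model: it is a toy 1D energy integral with a made-up Gamow-style tunneling weight and an abrupt tail cutoff standing in for Knudsen-layer end losses; all constants are illustrative.

```python
import numpy as np

E = np.linspace(1e-3, 30.0, 200_000)      # ion energy grid, in units of T
dE = E[1] - E[0]
maxwellian = np.sqrt(E) * np.exp(-E)      # ~ Maxwellian energy distribution
E_G = 400.0                               # toy Gamow constant, in units of T
gamow = np.exp(-np.sqrt(E_G / E))         # toy tunneling (cross-section) weight

def integrals(E_cut):
    f = maxwellian * (E <= E_cut)         # deplete the tail above E_cut
    return (f * dE).sum(), (f * gamow * dE).sum()   # density, reactivity

n_full, r_full = integrals(np.inf)
n_cut, r_cut = integrals(7.0)             # Knudsen-like loss of ions above 7 T

print(n_cut / n_full)                     # density: barely reduced
print(r_cut / r_full)                     # reactivity: reduced far more
```

Because the Gamow-weighted integrand peaks several temperatures above the thermal bulk, removing the suprathermal tail costs a disproportionate share of the reactivity, which is the effect the abstract argues remains small for its particular parameters.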

  5. Simulation of transition dynamics to high confinement in fusion plasmas

    NASA Astrophysics Data System (ADS)

    Nielsen, A. H.; Xu, G. S.; Madsen, J.; Naulin, V.; Juul Rasmussen, J.; Wan, B. N.

    2015-12-01

    The transition dynamics from the low (L) to the high (H) confinement mode in magnetically confined plasmas is investigated using a first-principles four-field fluid model. Numerical results are in agreement with measurements from the Experimental Advanced Superconducting Tokamak - EAST. Particularly, the slow transition with an intermediate dithering phase is well reproduced at proper parameters. The model recovers the power threshold for the L-H transition as well as the decrease in power threshold switching from single to double null configuration observed experimentally. The results are highly relevant for developing predictive models of the transition, essential for understanding and optimizing future fusion power reactors.

  6. Overview of Theory and Simulations in the Heavy Ion Fusion Science Virtual National Laboratory

    SciTech Connect

    Friedman, A

    2006-07-03

    The Heavy Ion Fusion Science Virtual National Laboratory (HIFS-VNL) is a collaboration of Lawrence Berkeley National Laboratory, Lawrence Livermore National Laboratory, and Princeton Plasma Physics Laboratory. These laboratories, in cooperation with researchers at other institutions, are carrying out a coordinated effort to apply intense ion beams as drivers for studies of the physics of matter at extreme conditions, and ultimately for inertial fusion energy. Progress on this endeavor depends upon coordinated application of experiments, theory, and simulations. This paper describes the state of the art, with an emphasis on the coordination of modeling and experiment; developments in the simulation tools, and in the methods that underlie them, are also treated.

  7. Overview of Theory and Simulations in the Heavy Ion Fusion Science Virtual National Laboratory

    SciTech Connect

    Friedman, Alex

    2006-07-09

    The Heavy Ion Fusion Science Virtual National Laboratory (HIFS-VNL) is a collaboration of Lawrence Berkeley National Laboratory, Lawrence Livermore National Laboratory, and Princeton Plasma Physics Laboratory. These laboratories, in cooperation with researchers at other institutions, are carrying out a coordinated effort to apply intense ion beams as drivers for studies of the physics of matter at extreme conditions, and ultimately for inertial fusion energy. Progress on this endeavor depends upon coordinated application of experiments, theory, and simulations. This paper describes the state of the art, with an emphasis on the coordination of modeling and experiment; developments in the simulation tools, and in the methods that underlie them, are also treated.

  8. Numerical analysis corresponding with experiment in compact beam simulator for heavy ion inertial fusion driver

    NASA Astrophysics Data System (ADS)

    Kikuchi, T.; Sakai, Y.; Komori, T.; Sato, T.; Hasegawa, J.; Horioka, K.; Takahashi, K.; Sasaki, T.; Harada, Nob

    2016-05-01

    Tune depression in a compact beam device is estimated, and numerical simulation results are compared with experimental results for the compact beam simulator of a heavy ion inertial fusion driver. The numerical simulation with multi-particle tracking is carried out for conditions corresponding to the experiment, and the results are discussed in comparison with the experimental ones. It is expected that the numerical simulation developed in this paper is a useful tool to investigate the beam dynamics in experiments with the compact beam simulator.

  9. Graduate Training: Evidence from FUSION Projects in Ireland

    ERIC Educational Resources Information Center

    Hegarty, Cecilia; Johnston, Janet

    2008-01-01

    Purpose: This paper aims to explore graduate training through SME-based project work. The views and behaviours of graduates are examined along with the perceptions of the SMEs and academic partner institutions charged with training graduates. Design/methodology/approach: The data are largely qualitative and derived from the experiences of…

  10. Assurance management program for the 30 Nova laser fusion project

    SciTech Connect

    Levy, A.J.

    1983-11-30

    The Nova assurance management program was developed using the quality assurance (QA) approach first implemented at LLNL in early 1978. The LLNL QA program is described as an introduction to the Nova assurance management program. The Nova system is described pictorially through the Nova configuration, subsystems and major components, interjecting the QA techniques which are being pragmatically used to assure the successful completion of the project.

  11. Simulations of the performance of the Fusion-FEM, for an increased e-beam emittance

    SciTech Connect

    Tulupov, A.V.; Urbanus, W.H.; Caplan, M.

    1995-12-31

    The original design of the Fusion-FEM, which is under construction at the FOM-Institute for Plasma Physics, was based on an electron beam emittance of 50 π mm mrad. Recent measurements of the emittance of the beam emitted by the electron gun showed that the actual emittance is 80 π mm mrad. This results in a 2.5 times lower beam current density inside the undulator, which in turn changes the linear gain, the start-up time, the saturation level, and the frequency spectrum. The main goal of the FEM project is to demonstrate a stable microwave output power of at least 1 MW. The decrease of the electron beam current density has to be compensated by variations of the other FEM parameters, such as the reflection (feedback) coefficient of the microwave cavity and the length of the drift gap between the two sections of the step-tapered undulator. All basic dependencies of the linear and nonlinear gain, and of the output power, on the main FEM parameters have been simulated numerically with the CRMFEL code. Regimes of stable operation of the FEM with the increased emittance have been found; these regimes could be found because of the flexibility of the original FEM design.

  12. Agent-Based Simulations for Project Management

    NASA Technical Reports Server (NTRS)

    White, J. Chris; Sholtes, Robert M.

    2011-01-01

    Currently, the most common approach used in project planning tools is the Critical Path Method (CPM). While this method was a great improvement over the basic Gantt chart technique being used at the time, it now suffers from three primary flaws: (1) task duration is an input, (2) productivity impacts are not considered, and (3) management corrective actions are not included. Today, computers have exceptional computational power to handle complex simulations of task execution and project management activities (e.g., dynamically changing the number of resources assigned to a task when it is behind schedule). Through research under a Department of Defense contract, the author and the ViaSim team have developed a project simulation tool that enables more realistic cost and schedule estimates by using a resource-based model that literally turns the current duration-based CPM approach "on its head." The approach represents a fundamental paradigm shift in estimating projects, managing schedules, and reducing risk through innovative predictive techniques.
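    The paradigm shift can be illustrated with a deliberately tiny sketch: in the CPM view the duration is simply typed in, while in a resource-based simulation the duration emerges from work content, assigned resources, productivity, and management actions. The names, numbers, and corrective-action rule below are hypothetical, not taken from the ViaSim tool.

```python
def cpm_duration(task):
    """CPM view: duration is simply an input number."""
    return task["duration"]

def simulate_duration(work_hours, resources, productivity):
    """Resource-based view: duration emerges from executing the work day by day."""
    done, days = 0.0, 0
    while done < work_hours:
        done += resources * productivity * 8.0   # 8 work-hours per resource per day
        days += 1
        if days == 5 and done < 0.5 * work_hours:
            resources += 1    # management corrective action: add a resource
    return days

print(cpm_duration({"duration": 7}))        # 7 -- whatever the planner typed in
print(simulate_duration(200.0, 2, 1.0))     # 10 -- two resources, plus one added at day 5
```

Changing productivity or the corrective-action rule changes the simulated duration, which is exactly the feedback that a fixed-duration CPM network cannot represent.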

  13. Kinetic simulation of edge instability in fusion plasmas

    NASA Astrophysics Data System (ADS)

    Fulton, Daniel Patrick

    In this work, gyrokinetic simulations in edge plasmas of both tokamaks and field reversed configurations (FRC) have been carried out using the Gyrokinetic Toroidal Code (GTC), and A New Code (ANC) has been formulated for cross-separatrix FRC simulation. In the tokamak edge, turbulent transport in the pedestal of an H-mode DIII-D plasma is studied via simulations of electrostatic driftwaves. Annulus geometry is used, and simulations focus on two radial locations corresponding to the pedestal top with mild pressure gradient and the steep pressure gradient. A reactive trapped electron instability with typical ballooning mode structure is excited in the pedestal top. At the steep gradient, the electrostatic instability exhibits unusual mode structure, peaking at poloidal angles θ = ±π/2. Simulations find this unusual mode structure is due to steep pressure gradients in the pedestal, not to the particular DIII-D magnetic geometry. Realistic DIII-D geometry has a stabilizing effect compared to a simple circular tokamak geometry. Driftwave instability in the FRC is studied for the first time using gyrokinetic simulation. GTC is upgraded to treat realistic equilibria calculated by an MHD equilibrium code. Electrostatic local simulations in outer closed flux surfaces find that ion-scale modes are stable due to the large ion gyroradius and that electron drift-interchange modes are excited by the electron temperature gradient and bad magnetic curvature. In the scrape-off layer (SOL), ion-scale modes are excited by the density gradient and bad curvature. Collisions have weak effects on instabilities both in the core and the SOL. Simulation results are consistent with density fluctuation measurements in the C-2 experiment using Doppler backscattering (DBS). The critical density gradients measured by the DBS qualitatively agree with the linear instability threshold calculated by GTC simulations. One outstanding critical issue in the FRC is the interplay between turbulence in the FRC core

  14. Sensitivity of mix in Inertial Confinement Fusion simulations to diffusion processes

    NASA Astrophysics Data System (ADS)

    Melvin, Jeremy; Cheng, Baolian; Rana, Verinder; Lim, Hyunkyung; Glimm, James; Sharp, David H.

    2015-11-01

    We explore two themes related to the simulation of mix within an Inertial Confinement Fusion (ICF) implosion: the role of diffusion (viscosity, mass diffusion, and thermal conduction) processes, and the impact of front tracking on the growth of the hydrodynamic instabilities. Using the University of Chicago HEDP code FLASH, we study the sensitivity of post-shot simulations of a NIC cryogenic shot to the diffusion models and to front tracking of the material interfaces. Results of 1D and 2D simulations are compared to experimental quantities, and an analysis of the current state of fully integrated ICF simulations is presented.

  15. Developing models for simulation of pinched-beam dynamics in heavy ion fusion. Revision 1

    SciTech Connect

    Boyd, J.K.; Mark, J.W.K.; Sharp, W.M.; Yu, S.S.

    1984-02-22

    For heavy-ion fusion energy applications, Mark and Yu have derived hydrodynamic models for numerical simulation of energetic pinched-beams including self-pinches and external-current pinches. These pinched-beams are applicable to beam propagation in fusion chambers and to the US High Temperature Experiment. The closure of the Mark-Yu model is obtained with adiabatic assumptions mathematically analogous to those of Chew, Goldberger, and Low for MHD. Features of this hydrodynamic beam model are compared with a kinetic treatment.

  16. Simulating Halos with the Caterpillar Project

    NASA Astrophysics Data System (ADS)

    Kohler, Susanna

    2016-04-01

    The Caterpillar Project is a beautiful series of high-resolution cosmological simulations. The goal of this project is to examine the evolution of dark-matter halos like the Milky Way's, to learn about how galaxies like ours formed. This immense computational project is still in progress, but the Caterpillar team is already providing a look at some of its first results. Lessons from Dark-Matter Halos: Why simulate the dark-matter halos of galaxies? Observationally, the formation history of our galaxy is encoded in galactic fossil record clues, like the tidal debris from disrupted satellite galaxies in the outer reaches of our galaxy, or chemical abundance patterns throughout our galactic disk and stellar halo. But to interpret this information in a way that lets us learn about our galaxy's history, we need to first test galaxy formation and evolution scenarios via cosmological simulations. Then we can compare the end result of these simulations to what we observe today. [Figure: the difference that mass resolution makes; in the left panel the mass resolution is 1.5*10^7 solar masses per particle, in the right panel 3*10^4 solar masses per particle (Griffen et al. 2016).] A Computational Challenge: Due to how computationally expensive such simulations are, previous N-body simulations of the growth of Milky-Way-like halos have consisted of only one or a few halos each. But in order to establish a statistical understanding of how galaxy halos form, and to find out whether the Milky Way's halo is typical or unusual, it is necessary to simulate a larger number of halos. In addition, in order to accurately follow the formation and evolution of substructure within the dark-matter halos, these simulations must be able to resolve the smallest dwarf galaxies, which are around a million solar masses. This requires an extremely high mass resolution, which adds to the computational expense of the simulation. First Outcomes: These are the challenges faced by

  17. Digitalized Design of Extraforaminal Lumbar Interbody Fusion: A Computer-Based Simulation and Cadaveric Study

    PubMed Central

    Yang, Mingjie; Zeng, Cheng; Guo, Song; Pan, Jie; Han, Yingchao; Li, Zeqing; Li, Lijun; Tan, Jun

    2014-01-01

    Purpose: This study aims to investigate the feasibility of a novel lumbar approach named extraforaminal lumbar interbody fusion (ELIF), a newly emerging minimally invasive technique for treating degenerative lumbar disorders, using a digitalized simulation and a cadaveric study. Methods: The ELIF surgical procedure was simulated using the Mimics surgical simulator and included dissection of the superior articular process, dilation of the vertebral foramen, and placement of pedicle screws and a cage. ELIF anatomical measures were documented using a digitalized technique and subsequently validated on fresh cadavers. Results: The use of Mimics allowed for the vivid simulation of the ELIF surgical procedures, while the cadaveric study proved the feasibility of this novel approach. ELIF had a relatively lateral access approach, located 8–9 cm lateral to the median line, with an access depth of approximately 9 cm through the intermuscular space. Dissection of the superior articular processes could fully expose the target intervertebral discs and facilitate a more inclined placement of the pedicle screws and cage with robust enhancement. Conclusions: According to the computer-based simulation and cadaveric study, it is feasible to perform ELIF. Further research, including biomechanical studies, is needed to determine whether ELIF better preserves the posterior tension bands of the spinal column, with similar effects on spinal decompression, fixation, and fusion, and whether it can enhance post-fusion spinal stability and expedite postoperative recovery. PMID:25157907

  18. A hybrid model for coupling kinetic corrections of fusion reactivity to hydrodynamic implosion simulations

    NASA Astrophysics Data System (ADS)

    Tang, Xian-Zhu; McDevitt, C. J.; Guo, Zehua; Berk, H. L.

    2014-03-01

    Inertial confinement fusion requires an imploded target in which a central hot spot is surrounded by a cold and dense pusher. The hot spot/pusher interface can take complicated shape in three dimensions due to hydrodynamic mix. It is also a transition region where the Knudsen and inverse Knudsen layer effect can significantly modify the fusion reactivity in comparison with the commonly used value evaluated with background Maxwellians. Here, we describe a hybrid model that couples the kinetic correction of fusion reactivity to global hydrodynamic implosion simulations. The key ingredient is a non-perturbative treatment of the tail ions in the interface region where the Gamow ion Knudsen number approaches or surpasses order unity. The accuracy of the coupling scheme is controlled by the precise criteria for matching the non-perturbative kinetic model to perturbative solutions in both configuration space and velocity space.

  19. [Super-resolution image reconstruction algorithm based on projection onto convex sets and wavelet fusion].

    PubMed

    Cao, Yuzhen; Liu, Xiaoting; Wang, Wei; Xing, Zhanfeng

    2009-10-01

    In this paper, a new super-resolution image reconstruction algorithm is proposed. Building on an improved version of the classical projection onto convex sets (POCS) algorithm, and combining POCS with wavelet fusion, a high-resolution CT image was restored from a group of low-resolution CT images. The experimental results showed that the proposed algorithm improves the ability to fuse different information; image detail is more prominent and overall image quality is better.
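    The core POCS idea is to project the current high-resolution estimate onto the data-consistency set of each low-resolution observation. A minimal sketch, assuming a block-average decimation observation model (the paper's actual observation model and wavelet-fusion stage are not reproduced here):

    ```python
    import numpy as np

    def downsample(hr, factor):
        # block-average decimation: the assumed observation model
        h, w = hr.shape
        return hr.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

    def pocs_step(hr, lr_obs, factor, relax=1.0):
        """One projection onto the data-consistency set of a single LR frame:
        back-project the observation residual uniformly over each block."""
        residual = lr_obs - downsample(hr, factor)
        return hr + relax * np.kron(residual, np.ones((factor, factor)))

    # toy example: make an 8x8 estimate consistent with its 4x4 observation
    rng = np.random.default_rng(0)
    truth = rng.random((8, 8))
    lr = downsample(truth, 2)
    est = np.zeros((8, 8))
    for _ in range(20):
        est = pocs_step(est, lr, 2)
    # after convergence the estimate reproduces the LR observation exactly
    assert np.allclose(downsample(est, 2), lr)
    ```

    With several shifted LR frames, the same projection is applied cyclically per frame, which is what lets POCS recover detail beyond any single observation.
    
    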

  20. Colorado School of Mines fusion gamma ray diagnostic project. Technical progress report

    SciTech Connect

    Cecil, F.E.

    1992-02-14

    This report summarizes the 1991 calendar year activities of the fusion gamma ray diagnostics project in the Physics Department at the Colorado School of Mines. Considerable progress has been realized in the fusion gamma ray diagnostic project in the last year. Specifically, we have achieved the two major goals of the project as outlined in last year's proposed work statement to the Office of Applied Plasma Physics in the DOE Division of Magnetic Fusion Energy. The two major goals were: (1) Solution of the severe interference problem encountered during operation of the gamma ray spectrometer concurrent with high power levels of the neutral beam injectors (NBI) and the ICRH antennae. (2) Experimental determination of the absolute detection efficiency of the gamma ray spectrometer. This detection efficiency will allow the measured yields of the gamma rays to be converted to a total reaction rate. In addition to these two major accomplishments, we have continued, as permitted by the TFTR operating schedule, the observation of high energy gamma rays from the 3He(D,γ)5Li reaction during deuterium NBI heating of 3He plasmas.

  1. Multiple Time and Spatial Scale Plasma Simulation -Prospect Based on Current Status- 4.Prospect for Multiple Time and Spatial Scale Simulation Research of Laser Fusion Plasmas

    NASA Astrophysics Data System (ADS)

    Mima, Kunioki; Nagatomo, Hideo; Sakagami, Hitoshi

    This paper reviews the development of integrated simulation codes for laser fusion plasma research. In particular, the simulation system for describing ultra-intense laser interaction with high-density plasmas is discussed. In ultra-intense laser plasma interaction, the relativistic electron current reaches a few hundred megaamperes and generates strong magnetic fields which control the electron transport. Therefore, the simulation system should include particle-in-cell simulation for laser plasma interactions, Fokker-Planck and hybrid simulations for transport and dense plasma heating, and radiation hydrodynamic simulation for laser implosion and fusion burning. This paper reports the present status of the research regarding those simulations and how these four simulation codes are interconnected as parts of the study of multi-space-time-scale laser fusion plasma phenomena.

  2. Comparison between initial Magnetized Liner Inertial Fusion experiments and integrated simulations

    NASA Astrophysics Data System (ADS)

    Sefkow, A. B.; Gomez, M. R.; Geissel, M.; Hahn, K. D.; Hansen, S. B.; Harding, E. C.; Peterson, K. J.; Slutz, S. A.; Koning, J. M.; Marinak, M. M.

    2014-10-01

    The Magnetized Liner Inertial Fusion (MagLIF) approach to ICF has obtained thermonuclear fusion yields using the Z facility. Integrated magnetohydrodynamic simulations provided the design for the first neutron-producing experiments using capabilities that presently exist, and the initial experiments measured stagnation radii r_stag < 75 μm, temperatures around 3 keV, and isotropic neutron yields up to Y_n(DD) = 2×10^12 from imploded liners reaching peak velocities around 70 km/s over an implosion time of about 60 ns. We present comparisons between the experimental observables and post-shot degraded integrated simulations. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the National Nuclear Security Administration under Contract DE-AC04-94AL85000.

  3. Apparatus and method for simulating material damage from a fusion reactor

    DOEpatents

    Smith, Dale L.; Greenwood, Lawrence R.; Loomis, Benny A.

    1989-01-01

    An apparatus and method for simulating a fusion environment on a first wall or blanket structure. A material test specimen is contained in a capsule made of a material having a low hydrogen solubility and permeability. The capsule is partially filled with a lithium solution, such that the test specimen is encapsulated by the lithium. The capsule is irradiated by a fast fission neutron source.

  4. Apparatus and method for simulating material damage from a fusion reactor

    DOEpatents

    Smith, D.L.; Greenwood, L.R.; Loomis, B.A.

    1988-05-20

    This paper discusses an apparatus and method for simulating a fusion environment on a first wall or blanket structure. A material test specimen is contained in a capsule made of a material having a low hydrogen solubility and permeability. The capsule is partially filled with a lithium solution, such that the test specimen is encapsulated by the lithium. The capsule is irradiated by a fast fission neutron source.

  5. Apparatus and method for simulating material damage from a fusion reactor

    DOEpatents

    Smith, Dale L.; Greenwood, Lawrence R.; Loomis, Benny A.

    1989-03-07

    An apparatus and method for simulating a fusion environment on a first wall or blanket structure. A material test specimen is contained in a capsule made of a material having a low hydrogen solubility and permeability. The capsule is partially filled with a lithium solution, such that the test specimen is encapsulated by the lithium. The capsule is irradiated by a fast fission neutron source.

  6. Study of Plasma Liner Driven Magnetized Target Fusion Via Advanced Simulations

    SciTech Connect

    Samulyak, Roman V.; Parks, Paul

    2013-08-31

    The feasibility of plasma liner driven Magnetized Target Fusion (MTF) is assessed via terascale numerical simulations. In the MTF concept, a plasma liner, formed by the merging of a number (60 or more) of radial, highly supersonic plasma jets, implodes on a target in the form of two compact plasma toroids and compresses it to fusion ignition conditions. By avoiding major difficulties associated with both traditional laser driven inertial confinement fusion and solid liner driven MTF, plasma liner driven MTF potentially provides a low-cost and fast R&D path towards the demonstration of practical fusion energy. High fidelity numerical simulations of the full nonlinear models associated with plasma liner MTF, using state-of-the-art numerical algorithms and terascale computing, are necessary in order to resolve uncertainties and provide guidance for future experiments. At Stony Brook University, we have developed unique computational capabilities ideally suited to the MTF problem. The FronTier code, developed in collaboration with BNL and LANL under DOE funding (including SciDAC) for the simulation of 3D multi-material hydro and MHD flows, has been benchmarked and used for fundamental and engineering problems in energy science applications. We have performed 3D simulations of converging supersonic plasma jets, their merger and the formation of the plasma liner, and a study of the corresponding oblique shock problem. We have studied the implosion of the plasma liner on the magnetized plasma target by resolving Rayleigh-Taylor instabilities in 2D and 3D along with other relevant physics, and estimated the thermodynamic conditions of the target at the moment of maximum compression and the hydrodynamic efficiency of the method.

  7. Lipid droplets fusion in adipocyte differentiated 3T3-L1 cells: A Monte Carlo simulation

    SciTech Connect

    Boschi, Federico; Rizzatti, Vanni; Zamboni, Mauro; Sbarbati, Andrea

    2014-02-15

    Several worldwide human diseases such as obesity, type 2 diabetes, hepatic steatosis, atherosclerosis and other metabolic pathologies are related to the excessive accumulation of lipids in cells. Lipids accumulate in spherical cellular inclusions called lipid droplets (LDs), whose sizes in adipocytes range from fractions of a micrometer to about one hundred micrometers. It has been suggested that LDs can grow in size due to a fusion process by which a larger LD is obtained with spherical shape and volume equal to the sum of the progenitors' volumes. In this study, the size distribution of two populations of LDs was analyzed in immature and mature (5-days differentiated) 3T3-L1 adipocytes (first and second populations, respectively) after Oil Red O staining. A Monte Carlo simulation of interaction between LDs has been developed in order to quantify the size distribution and the number of fusion events needed to obtain the distribution of the second population size starting from the first one. Four models are presented here based on different kinds of interaction: a surface-weighted interaction (R2 Model), a volume-weighted interaction (R3 Model), a random interaction (Random Model) and an interaction related to the place where the LDs are born (Nearest Model). The last two models mimic quite well the behavior found in the experimental data. This work represents a first step in developing numerical simulations of the LD growth process. Due to the complex phenomena involving LDs (absorption, growth through additional neutral lipid deposition in existing droplets, de novo formation and catabolism) the study focuses on the fusion process. The results suggest that, to obtain the observed size distribution, a number of fusion events comparable with the number of LDs themselves is needed. Moreover, the MC approach proves to be a powerful tool for investigating the LD growth process. Highlights: • We evaluated the role of the fusion process in the synthesis of the lipid droplets. • We compared the
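    The volume-conserving fusion step at the heart of such a model can be sketched in a few lines. The pair-selection weights below (radius-cubed for a volume-weighted model, uniform for a random one) are assumptions meant to mimic the abstract's R3 and Random Models, not the authors' actual code:

    ```python
    import numpy as np

    def fuse_step(radii, rng, model="R3"):
        """Merge one pair of droplets, conserving total volume.
        model='R3' picks partners with probability ~ r^3 (volume-weighted);
        model='Random' picks them uniformly."""
        if model == "R3":
            w = radii ** 3
            p = w / w.sum()
        else:
            p = np.full(radii.size, 1.0 / radii.size)
        i, j = rng.choice(radii.size, size=2, replace=False, p=p)
        merged = (radii[i] ** 3 + radii[j] ** 3) ** (1.0 / 3.0)  # volume conserved
        return np.append(np.delete(radii, [i, j]), merged)

    rng = np.random.default_rng(42)
    pop = rng.lognormal(mean=0.0, sigma=0.5, size=200)  # toy initial sizes
    v0 = (pop ** 3).sum()
    for _ in range(100):                                # 100 fusion events
        pop = fuse_step(pop, rng)
    # each event removes exactly one droplet and conserves total lipid volume
    assert pop.size == 100
    assert np.isclose((pop ** 3).sum(), v0)
    ```

    Running many such events and comparing the resulting histogram to the mature-adipocyte distribution is the kind of fit the study performs to count the fusion events needed.
    
    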

  8. Ion Beam Heated Target Simulations for Warm Dense Matter Physics and Inertial Fusion Energy

    SciTech Connect

    Barnard, J J; Armijo, J; Bailey, D S; Friedman, A; Bieniosek, F M; Henestroza, E; Kaganovich, I; Leung, P T; Logan, B G; Marinak, M M; More, R M; Ng, S F; Penn, G E; Perkins, L J; Veitzer, S; Wurtele, J S; Yu, S S; Zylstra, A B

    2008-08-12

    Hydrodynamic simulations have been carried out using the multi-physics radiation hydrodynamics code HYDRA and the simplified one-dimensional hydrodynamics code DISH. We simulate possible targets for a near-term experiment at LBNL (the Neutralized Drift Compression Experiment, NDCX) and possible later experiments on a proposed facility (NDCX-II) for studies of warm dense matter and inertial fusion energy related beam-target coupling. Simulations of various target materials (including solids and foams) are presented. Experimental configurations include single pulse planar metallic solid and foam foils. Concepts for double-pulsed and ramped-energy pulses on cryogenic targets and foams have been simulated for exploring direct drive beam target coupling, and concepts and simulations for collapsing cylindrical and spherical bubbles to enhance temperature and pressure for warm dense matter studies are described.

  9. ION BEAM HEATED TARGET SIMULATIONS FOR WARM DENSE MATTER PHYSICS AND INERTIAL FUSION ENERGY

    SciTech Connect

    Barnard, J.J.; Armijo, J.; Bailey, D.S.; Friedman, A.; Bieniosek, F.M.; Henestroza, E.; Kaganovich, I.; Leung, P.T.; Logan, B.G.; Marinak, M.M.; More, R.M.; Ng, S.F.; Penn, G.E.; Perkins, L.J.; Veitzer, S.; Wurtele, J.S.; Yu, S.S.; Zylstra, A.B.

    2008-08-01

    Hydrodynamic simulations have been carried out using the multi-physics radiation hydrodynamics code HYDRA and the simplified one-dimensional hydrodynamics code DISH. We simulate possible targets for a near-term experiment at LBNL (the Neutralized Drift Compression Experiment, NDCX) and possible later experiments on a proposed facility (NDCX-II) for studies of warm dense matter and inertial fusion energy related beam-target coupling. Simulations of various target materials (including solids and foams) are presented. Experimental configurations include single pulse planar metallic solid and foam foils. Concepts for double-pulsed and ramped-energy pulses on cryogenic targets and foams have been simulated for exploring direct drive beam target coupling, and concepts and simulations for collapsing cylindrical and spherical bubbles to enhance temperature and pressure for warm dense matter studies are described.

  10. The Mercury Project: A High Average Power, Gas-Cooled Laser For Inertial Fusion Energy Development

    SciTech Connect

    Bayramian, A; Armstrong, P; Ault, E; Beach, R; Bibeau, C; Caird, J; Campbell, R; Chai, B; Dawson, J; Ebbers, C; Erlandson, A; Fei, Y; Freitas, B; Kent, R; Liao, Z; Ladran, T; Menapace, J; Molander, B; Payne, S; Peterson, N; Randles, M; Schaffers, K; Sutton, S; Tassano, J; Telford, S; Utterback, E

    2006-11-03

    Hundred-joule, kilowatt-class lasers based on diode-pumped solid-state technologies are being developed worldwide for laser-plasma interactions and as prototypes for fusion energy drivers. The goal of the Mercury Laser Project is to develop key technologies within an architectural framework that demonstrates basic building blocks for scaling to larger multi-kilojoule systems for inertial fusion energy (IFE) applications. Mercury's requirements include scalability to IFE beamlines, a 10 Hz repetition rate, high efficiency, and 10^9-shot reliability. The Mercury laser has operated continuously for several hours at 55 J and 10 Hz with fourteen 4 x 6 cm^2 ytterbium-doped strontium fluoroapatite (Yb:S-FAP) amplifier slabs pumped by eight 100 kW diode arrays. The 1047 nm fundamental wavelength was converted to 523 nm at 160 W average power with 73% conversion efficiency using yttrium calcium oxy-borate (YCOB).

  11. Multi-Megawatt MPD Plasma Source Operation and Modeling for Fusion Propulsion Simulations

    NASA Astrophysics Data System (ADS)

    Gilland, James; Williams, Craig; Mikellides, Ioannis; Mikellides, Pavlos; Marriott, Darin

    2004-02-01

    The expansion of a high temperature fusion plasma through an expanding magnetic field is a process common to most fusion propulsion concepts. The efficiency of this process has a strong bearing on the overall performance of fusion propulsion. In order to simulate the expansion of a fusion plasma, a concept has been developed in which a high velocity plasma is first stagnated in a converging magnetic field to high (hundreds of eV) temperatures, then expanded through a converging/diverging magnetic nozzle. A magnetoplasmadynamic (MPD) plasma accelerator has been constructed to generate the initial high velocity plasma and is currently undergoing characterization at the Ohio State University. The device has been operated with currents up to 300 kA and power levels up to 200 MWe. The source is powered by a 1.6 MJ, 1.6 ms pulse-forming network. In addition to experimental tests of the accelerator, computational and theoretical modeling of both the accelerator and the plasma stagnation have been performed using the MACH2 MHD code. Insights into plasma compression and attachment to magnetic field lines have led to recommended design improvements in the facility and to preliminary predictions of nozzle performance.

  12. Beam dynamics analysis in pulse compression using electron beam compact simulator for Heavy Ion Fusion

    NASA Astrophysics Data System (ADS)

    Kikuchi, Takashi; Horioka, Kazuhiko; Sasaki, Toru; Harada, Nob.

    2013-11-01

    In the final stage of an accelerator system for heavy ion inertial fusion (HIF), pulse shaping and an increase in beam current by bunch compression are required for effective pellet implosion. A compact simulator with an electron beam was constructed to understand the beam dynamics. In this study, we investigate theoretically and numerically the beam dynamics during the extreme bunch compression in the final stage of the HIF accelerator complex. The theoretical and numerical results implied that the compact experimental device simulates the beam dynamics around the stagnation point for an initial low-temperature condition.

  13. Online Simulation of Radiation Track Structure Project

    NASA Technical Reports Server (NTRS)

    Plante, Ianik

    2015-01-01

    Space radiation comprises protons, helium ions, and high charge and energy (HZE) particles. High-energy particles are a concern for human space flight because there are no known options for shielding astronauts from them. When these ions interact with matter, they damage molecules and create radiolytic species. The pattern of energy deposition and the positions of the radiolytic species, called the radiation track structure, is highly dependent on the charge and energy of the ion. The radiolytic species damage biological molecules, which may lead to several long-term health effects such as cancer. Because of the importance of heavy ions, the radiation community is very interested in the interaction of HZE particles with DNA, notably with regard to the track structure. A desktop program named RITRACKS was developed to simulate radiation track structure. The goal of this project is to create a web interface to allow registered internal users to use RITRACKS remotely.

  14. A Simulation Environment for Benchmarking Sensor Fusion-Based Pose Estimators

    PubMed Central

    Ligorio, Gabriele; Sabatini, Angelo Maria

    2015-01-01

    In-depth analysis and performance evaluation of sensor fusion-based estimators may be critical when performed using real-world sensor data. For this reason, simulation is widely recognized as one of the most powerful tools for algorithm benchmarking. In this paper, we present a simulation framework suitable for assessing the performance of sensor fusion-based pose estimators. The systems used for implementing the framework were magnetic/inertial measurement units (MIMUs) and a camera, although the addition of further sensing modalities is straightforward. Typical nuisance factors were also included for each sensor. The proposed simulation environment was validated using real-life sensor data employed for motion tracking. The higher mismatch between real and simulated sensors was about 5% of the measured quantity (for the camera simulation), whereas a lower correlation was found for an axis of the gyroscope (0.90). In addition, a real benchmarking example of an extended Kalman filter for pose estimation from MIMU and camera data is presented. PMID:26703603

  15. A Simulation Environment for Benchmarking Sensor Fusion-Based Pose Estimators.

    PubMed

    Ligorio, Gabriele; Sabatini, Angelo Maria

    2015-01-01

    In-depth analysis and performance evaluation of sensor fusion-based estimators may be critical when performed using real-world sensor data. For this reason, simulation is widely recognized as one of the most powerful tools for algorithm benchmarking. In this paper, we present a simulation framework suitable for assessing the performance of sensor fusion-based pose estimators. The systems used for implementing the framework were magnetic/inertial measurement units (MIMUs) and a camera, although the addition of further sensing modalities is straightforward. Typical nuisance factors were also included for each sensor. The proposed simulation environment was validated using real-life sensor data employed for motion tracking. The higher mismatch between real and simulated sensors was about 5% of the measured quantity (for the camera simulation), whereas a lower correlation was found for an axis of the gyroscope (0.90). In addition, a real benchmarking example of an extended Kalman filter for pose estimation from MIMU and camera data is presented. PMID:26703603
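    As a minimal illustration of this kind of benchmark, the sketch below generates a synthetic one-axis ground truth, corrupts a simulated gyroscope with bias and white noise (typical nuisance factors), and checks that a simple complementary filter, standing in for the estimator under test and not the authors' EKF, beats gyro-only dead reckoning. All signal parameters are illustrative assumptions:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    dt, n = 0.01, 2000
    t = np.arange(n) * dt
    truth = 0.5 * np.sin(2 * np.pi * 0.3 * t)        # true angle [rad]
    rate = np.gradient(truth, dt)                    # true angular rate
    gyro = rate + 0.05 + rng.normal(0, 0.02, n)      # bias + white noise
    ref = truth + rng.normal(0, 0.05, n)             # noisy absolute reference

    est = np.zeros(n)    # complementary-filter estimate
    dead = np.zeros(n)   # gyro-only dead reckoning, for comparison
    alpha = 0.98
    for k in range(1, n):
        dead[k] = dead[k - 1] + gyro[k] * dt
        est[k] = alpha * (est[k - 1] + gyro[k] * dt) + (1 - alpha) * ref[k]

    rmse = lambda x: np.sqrt(np.mean((x - truth) ** 2))
    # fusing the absolute reference bounds the drift of the biased gyro
    assert rmse(est) < rmse(dead)
    ```

    A framework like the one in the paper would wrap this pattern with realistic sensor models (MIMU, camera) and compare candidate estimators against the known ground truth.
    
    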

  16. Project Icarus: Analysis of Plasma jet driven Magneto-Inertial Fusion as potential primary propulsion driver for the Icarus probe

    NASA Astrophysics Data System (ADS)

    Stanic, M.; Cassibry, J. T.; Adams, R. B.

    2013-05-01

    Hopes of sending probes to a star other than the Sun are currently limited by the maturity of advanced propulsion technologies. One of the few candidate propulsion systems for providing interstellar flight capabilities is nuclear fusion. In the past, many fusion propulsion concepts have been proposed and some have even been explored in detail, Project Daedalus for example. However, as scientific progress in this field has advanced, new fusion concepts have emerged that merit evaluation as potential drivers for interstellar missions. Plasma jet driven Magneto-Inertial Fusion (PJMIF) is one of those concepts. PJMIF involves a salvo of converging plasma jets that form a uniform liner, which compresses a magnetized target to fusion conditions. It is an Inertial Confinement Fusion (ICF)-Magnetic Confinement Fusion (MCF) hybrid approach that has the potential for a multitude of benefits over both ICF and MCF, such as lower system mass and significantly lower cost. This paper concentrates on a thermodynamic assessment of the basic performance parameters necessary for utilization of PJMIF as a candidate propulsion system for the Project Icarus mission. These parameters include specific impulse, thrust, exhaust velocity, mass of the engine system, mass of the fuel required, etc. This is a submission of the Project Icarus Study Group.

  17. SciDAC Fusiongrid Project--A National Collaboratory to Advance the Science of High Temperature Plasma Physics for Magnetic Fusion

    SciTech Connect

    SCHISSEL, D.P.; ABLA, G.; BURRUSS, J.R.; FEIBUSH, E.; FREDIAN, T.W.; GOODE, M.M.; GREENWALD, M.J.; KEAHEY, K.; LEGGETT, T.; LI, K.; McCUNE, D.C.; PAPKA, M.E.; RANDERSON, L.; SANDERSON, A.; STILLERMAN, J.; THOMPSON, M.R.; URAM, T.; WALLACE, G.

    2006-08-31

    This report summarizes the work of the National Fusion Collaboratory (NFC) Project funded by the United States Department of Energy (DOE) under the Scientific Discovery through Advanced Computing Program (SciDAC) to develop a persistent infrastructure to enable scientific collaboration for magnetic fusion research. A five year project that was initiated in 2001, it built on the past collaborative work performed within the U.S. fusion community and added the component of computer science research done with the USDOE Office of Science, Office of Advanced Scientific Computer Research. The project was a collaboration itself uniting fusion scientists from General Atomics, MIT, and PPPL and computer scientists from ANL, LBNL, Princeton University, and the University of Utah to form a coordinated team. The group leveraged existing computer science technology where possible and extended or created new capabilities where required. Developing a reliable energy system that is economically and environmentally sustainable is the long-term goal of Fusion Energy Science (FES) research. In the U.S., FES experimental research is centered at three large facilities with a replacement value of over $1B. As these experiments have increased in size and complexity, there has been a concurrent growth in the number and importance of collaborations among large groups at the experimental sites and smaller groups located nationwide. Teaming with the experimental community is a theoretical and simulation community whose efforts range from applied analysis of experimental data to fundamental theory (e.g., realistic nonlinear 3D plasma models) that run on massively parallel computers. Looking toward the future, the large-scale experiments needed for FES research are staffed by correspondingly large, globally dispersed teams. 
The fusion program will be increasingly oriented toward the International Thermonuclear Experimental Reactor (ITER) where even now, a decade before operation begins, a large

  18. Characterizing the Conformational Landscape of Flavivirus Fusion Peptides via Simulation and Experiment

    PubMed Central

    Marzinek, Jan K.; Lakshminarayanan, Rajamani; Goh, Eunice; Huber, Roland G.; Panzade, Sadhana; Verma, Chandra; Bond, Peter J.

    2016-01-01

    Conformational changes in the envelope proteins of flaviviruses help to expose the highly conserved fusion peptide (FP), a region which is critical to membrane fusion and host cell infection, and which represents a significant target for antiviral drugs and antibodies. In principle, extended timescale atomic-resolution simulations may be used to characterize the dynamics of such peptides. However, the resultant accuracy is critically dependent upon both the underlying force field and sufficient conformational sampling. In the present study, we report a comprehensive comparison of three simulation methods and four force fields comprising a total of more than 40 μs of sampling. Additionally, we describe the conformational landscape of the FP fold across all flavivirus family members. All investigated methods sampled conformations close to available X-ray structures, but exhibited differently populated ensembles. The best force field / sampling combination was sufficiently accurate to predict that the solvated peptide fold is less ordered than in the crystallographic state, which was subsequently confirmed via circular dichroism and spectrofluorometric measurements. Finally, the conformational landscape of a mutant incapable of membrane fusion was significantly shallower than wild-type variants, suggesting that dynamics should be considered when therapeutically targeting FP epitopes. PMID:26785994

  19. Characterizing the Conformational Landscape of Flavivirus Fusion Peptides via Simulation and Experiment.

    PubMed

    Marzinek, Jan K; Lakshminarayanan, Rajamani; Goh, Eunice; Huber, Roland G; Panzade, Sadhana; Verma, Chandra; Bond, Peter J

    2016-01-01

    Conformational changes in the envelope proteins of flaviviruses help to expose the highly conserved fusion peptide (FP), a region which is critical to membrane fusion and host cell infection, and which represents a significant target for antiviral drugs and antibodies. In principle, extended timescale atomic-resolution simulations may be used to characterize the dynamics of such peptides. However, the resultant accuracy is critically dependent upon both the underlying force field and sufficient conformational sampling. In the present study, we report a comprehensive comparison of three simulation methods and four force fields comprising a total of more than 40 μs of sampling. Additionally, we describe the conformational landscape of the FP fold across all flavivirus family members. All investigated methods sampled conformations close to available X-ray structures, but exhibited differently populated ensembles. The best force field / sampling combination was sufficiently accurate to predict that the solvated peptide fold is less ordered than in the crystallographic state, which was subsequently confirmed via circular dichroism and spectrofluorometric measurements. Finally, the conformational landscape of a mutant incapable of membrane fusion was significantly shallower than wild-type variants, suggesting that dynamics should be considered when therapeutically targeting FP epitopes. PMID:26785994

  20. Assembly of Influenza Hemagglutinin Fusion Peptides in a Phospholipid Bilayer by Coarse-grained Computer Simulations

    PubMed Central

    Collu, Francesca; Spiga, Enrico; Lorenz, Christian D.; Fraternali, Franca

    2015-01-01

    Membrane fusion is critical to eukaryotic cellular function and crucial to the entry of enveloped viruses such as influenza and human immunodeficiency virus. Influenza viral entry in the host cell is mediated by a 20–23 amino acid long sequence, called the fusion peptide (FP). Recently, possible structures for the fusion peptide (ranging from an inverted V shaped α-helical structure to an α-helical hairpin, or to a complete α-helix) and their implication in the membrane fusion initiation have been proposed. Despite the large number of studies devoted to the structure of the FP, the mechanism of action of this peptide remains unclear with several mechanisms having been suggested, including the induction of local disorder, promoting membrane curvature, and/or altering local membrane composition. In recent years, several research groups have employed atomistic and/or coarse-grained molecular dynamics (MD) simulations to investigate the matter. In all previous works, the behavior of a single FP monomer was studied, while in this manuscript, we use a simplified model of a tripeptide (TP) monomer of FP (TFP) instead of a single FP monomer because each Influenza Hemagglutinin contains three FP molecules in the biological system. In this manuscript we report findings targeted at understanding the fusogenic properties and the collective behavior of these trimers of FP peptides on a 1-palmitoyl-2-oleoyl-sn-glycero-3-phosphocholine model membrane. Here we show how the TFP monomers self-assemble into differently sized oligomers in the presence of the membrane. We measure the perturbation to the structure of the phospholipid membrane caused by the presence of these TFP oligomers. Our work (i) shows how self-assembly of TFP in the presence of the membrane induces non negligible deformation to the membrane and (ii) could be a useful starting point to stimulate discussion and further work targeted to fusion pore formation. PMID:26636093

  1. Quasi-spherical direct drive fusion simulations for the Z machine and future accelerators.

    SciTech Connect

    VanDevender, J. Pace; McDaniel, Dillon Heirman; Roderick, Norman Frederick; Nash, Thomas J.

    2007-11-01

    We explored the potential of Quasi-Spherical Direct Drive (QSDD) to reduce the cost and risk of a future fusion driver for Inertial Confinement Fusion (ICF) and to produce megajoule thermonuclear yield on the renovated Z Machine with a pulse-shortening Magnetically Insulated Current Amplifier (MICA). Analytic relationships for constant implosion velocity and constant pusher stability have been derived and show that the required current scales as the implosion time. Therefore, a MICA is necessary to drive QSDD capsules with hot-spot ignition on Z. We have optimized the LASNEX parameters for QSDD with realistic walls and mitigated many of the risks. Although the mix-degraded 1D yield is computed to be ~30 MJ on Z, unmitigated wall expansion under the >100 gigabar pressure just before burn prevents ignition in the 2D simulations. A squeezer system of adjacent implosions may mitigate the wall expansion and permit the plasma to burn.

  2. Simulation of Neural Firing Dynamics: A Student Project.

    ERIC Educational Resources Information Center

    Kletsky, E. J.

    This paper describes a student project in digital simulation techniques that is part of a graduate systems analysis course entitled Biosimulation. The students chose different simulation techniques to solve a problem related to the neuron model. (MLH)

  3. Equations of State for Ablator Materials in Inertial Confinement Fusion Simulations

    NASA Astrophysics Data System (ADS)

    Sterne, P. A.; Benedict, L. X.; Hamel, S.; Correa, A. A.; Milovich, J. L.; Marinak, M. M.; Celliers, P. M.; Fratanduono, D. E.

    2016-05-01

    We discuss the development of the tabular equation of state (EOS) models for ablator materials in current use at Lawrence Livermore National Laboratory in simulations of inertial confinement fusion (ICF) experiments at the National Ignition Facility. We illustrate the methods with a review of current models for ablator materials and discuss some of the challenges in performing hydrocode simulations with high-fidelity multiphase models. We stress the importance of experimental data, as well as the utility of ab initio electronic structure calculations, in regions where data is not currently available. We illustrate why Hugoniot data alone is not sufficient to constrain the EOS models. These cases illustrate the importance of experimental EOS data in multi-megabar regimes, and the vital role they play in the development and validation of EOS models for ICF simulations.
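    In a hydrocode, a tabular EOS of this kind is typically stored on a (density, temperature) grid and queried by interpolation at every step. A generic bilinear-lookup sketch follows; the grid bounds and the ideal-gas-like fill P ~ rho*T are assumptions for illustration, not an actual ablator table:

    ```python
    import numpy as np

    rho = np.geomspace(1e-3, 1e2, 64)    # density grid [g/cm^3], assumed
    T = np.geomspace(1e-2, 1e3, 64)      # temperature grid [eV], assumed
    # ideal-gas-like fill P[i, j] = rho[i] * T[j] so the lookup is testable
    P = np.outer(rho, T)

    def eos_lookup(rho_q, T_q):
        """Bilinear interpolation of the pressure table at (rho_q, T_q)."""
        i = np.clip(np.searchsorted(rho, rho_q) - 1, 0, rho.size - 2)
        j = np.clip(np.searchsorted(T, T_q) - 1, 0, T.size - 2)
        fx = (rho_q - rho[i]) / (rho[i + 1] - rho[i])
        fy = (T_q - T[j]) / (T[j + 1] - T[j])
        return ((1 - fx) * (1 - fy) * P[i, j] + fx * (1 - fy) * P[i + 1, j]
                + (1 - fx) * fy * P[i, j + 1] + fx * fy * P[i + 1, j + 1])

    # bilinear interpolation is exact for the product form P = rho * T
    assert np.isclose(eos_lookup(0.5, 7.0), 0.5 * 7.0)
    ```

    Real multiphase tables add phase boundaries and thermodynamic-consistency constraints, which is part of why Hugoniot data alone cannot pin down the model.
    
    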

  4. Simulations of longitudinal beam dynamics of space-charge dominated beams for heavy ion fusion

    SciTech Connect

    Miller, D.A.C.

    1994-12-01

    The longitudinal instability has potentially disastrous effects on the ion beams used for heavy ion driven inertial confinement fusion. This instability is a "resistive wall" instability with the impedance coming from the induction modules in the accelerator used as a driver. This instability can greatly amplify perturbations launched from the beam head and can prevent focusing of the beam onto the small spot necessary for fusion. This instability has been studied using the WARPrz particle-in-cell code. WARPrz is a 2-1/2 dimensional electrostatic axisymmetric code. This code includes a model for the impedance of the induction modules. Simulations with resistances similar to that expected in a driver show moderate amounts of growth from the instability as a perturbation travels from beam head to tail, as predicted by cold beam fluid theory. The perturbation reflects off the beam tail and decays as it travels toward the beam head. Nonlinear effects cause the perturbation to steepen during reflection. Including the capacitive component of the module impedance has a partially stabilizing effect on the longitudinal instability. This reduction in the growth rate is seen in both cold beam fluid theory and in simulations with WARPrz. Instability growth rates for warm beams measured from WARPrz are lower than cold beam fluid theory predicts. Longitudinal thermal spread cannot account for this decrease in the growth rate. A mechanism for coupling the transverse thermal spread to decay of the longitudinal waves is presented. The longitudinal instability is no longer a threat to the heavy ion fusion program. The simulations in this thesis have shown that the growth rate for this instability will not be as large as earlier calculations predicted.

  5. Simulation of fusion-mediated nanoemulsion interactions with model lipid bilayers

    PubMed Central

    Lee, Sun-Joo; Schlesinger, Paul H.; Wickline, Samuel A.; Lanza, Gregory M.; Baker, Nathan A.

    2012-01-01

    Perfluorocarbon-based nanoemulsion particles have become promising platforms for the delivery of therapeutic and diagnostic agents to specific target cells in a non-invasive manner. A “contact-facilitated” delivery mechanism has been proposed wherein the emulsifying phospholipid monolayer on the nanoemulsion surface contacts and forms a lipid complex with the outer monolayer of target cell plasma membrane, allowing cargo to diffuse to the surface of target cell. While this mechanism is supported by experimental evidence, its molecular details are unknown. The present study develops a coarse-grained model of nanoemulsion particles that are compatible with the MARTINI force field. Simulations using this coarse-grained model have demonstrated multiple fusion events between the particles and a model vesicular lipid bilayer. The fusion proceeds in the following sequence: dehydration at the interface, close apposition of the particles, protrusion of hydrophobic molecules to the particle surface, transient lipid complex formation, absorption of nanoemulsion into the liposome. The initial monolayer disruption acts as a rate-limiting step and is strongly influenced by particle size as well as by the presence of phospholipids supporting negative spontaneous curvature. The core-forming perfluorocarbons play critical roles in initiating the fusion process by facilitating protrusion of hydrophobic moieties into the interface between the two particles. This study directly supports the hypothesized nanoemulsion delivery mechanism and provides the underlying molecular details that enable engineering of nanoemulsions for a variety of medical applications. PMID:22712024

  6. Improving Project Management with Simulation and Completion Distribution Functions

    NASA Technical Reports Server (NTRS)

    Cates, Grant R.

    2004-01-01

    Despite the critical importance of project completion timeliness, management practices in place today remain inadequate for addressing the persistent problem of project completion tardiness. A major culprit in late projects is uncertainty, which most, if not all, projects are inherently subject to. This uncertainty resides in the estimates for activity durations, the occurrence of unplanned and unforeseen events, and the availability of critical resources. In response to this problem, this research developed a comprehensive simulation-based methodology for conducting quantitative project completion time risk analysis. It is called the Project Assessment by Simulation Technique (PAST). This new tool enables project stakeholders to visualize uncertainty or risk, i.e., the likelihood of their project completing late and the magnitude of the lateness, by providing them with a completion time distribution function of their project. Discrete event simulation is used within PAST to determine the completion distribution function for the project of interest. The simulation is populated with both deterministic and stochastic elements. The deterministic inputs include planned project activities, precedence requirements, and resource requirements. The stochastic inputs include activity duration growth distributions, probabilities for events that can impact the project, and other dynamic constraints that may be placed upon project activities and milestones. These stochastic inputs are based upon past data from similar projects. The time for an entity to complete the simulation network, subject to both the deterministic and stochastic factors, represents the time to complete the project. Repeating the simulation hundreds or thousands of times allows one to create the project completion distribution function. The Project Assessment by Simulation Technique was demonstrated to be effective for the on-going NASA project to assemble the International Space Station.
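    The core of such a completion-time analysis can be sketched in a few lines. The activity network and triangular duration distributions below are hypothetical (not the ISS assembly model): each run samples activity durations, respects precedence, and the sorted completion times approximate the completion distribution function.

```python
import random

# Hypothetical mini-network: name -> (predecessors, (min, mode, max) days).
ACTIVITIES = {
    "A": ([],         (4, 5, 8)),
    "B": (["A"],      (2, 3, 6)),
    "C": (["A"],      (5, 6, 10)),
    "D": (["B", "C"], (1, 2, 4)),
}

def sample_completion(acts):
    finish = {}
    for name in acts:                      # dict order is topological here
        preds, (lo, mode, hi) = acts[name]
        start = max((finish[p] for p in preds), default=0.0)
        finish[name] = start + random.triangular(lo, hi, mode)
    return max(finish.values())

def completion_distribution(acts, runs=10000, seed=1):
    random.seed(seed)
    return sorted(sample_completion(acts) for _ in range(runs))

times = completion_distribution(ACTIVITIES)
p80 = times[int(0.8 * len(times))]         # 80th-percentile completion time
```

    Reading percentiles off `times` gives stakeholders exactly the "likelihood of lateness" view the abstract describes.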

  7. Final Report for Project "Framework Application for Core-Edge Transport Simulations (FACETS)"

    SciTech Connect

    Estep, Donald

    2014-01-17

    This is the final report for the Colorado State University Component of the FACETS Project. FACETS was focused on the development of a multiphysics, parallel framework application that could provide the capability to enable whole-device fusion reactor modeling and, in the process, the development of the modeling infrastructure and computational understanding needed for ITER. It was intended that FACETS be highly flexible, through the use of modern computational methods, including component technology and object-oriented design, to facilitate switching from one model to another for a given aspect of the physics, and to make it possible to use simplified models for rapid turnaround or high-fidelity models that can take advantage of the largest supercomputer hardware. FACETS was designed in a heterogeneous parallel context, where different parts of the application can exploit parallelism based on task farming, domain decomposition, and/or pipelining as needed and applicable. As with all fusion simulations, an integral part of the FACETS project was treatment of the coupling of different physical processes at different scales interacting closely. A primary example for the FACETS project is the coupling of existing core and edge simulations, with the transport and wall interactions described by reduced models. However, core and edge simulations themselves involve significant coupling of different processes with large scale differences. Numerical treatment of coupling is impacted by a number of factors including scale differences, the form of information transferred between processes, the implementation of solvers for different codes, and high performance computing concerns. Operator decomposition, computing the individual processes separately using appropriate simulation codes and then linking/synchronizing the component simulations at regular points in space and time, is the de facto approach to high performance simulation of multiphysics systems.
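    The operator-decomposition coupling described in the abstract, each process advanced by its own solver with synchronization at regular points, can be illustrated on a toy model problem (a sketch under stated assumptions, not FACETS code): du/dt = -a*u + s, with the decay term "owned" by one component solver and the source term by another, coupled by Lie splitting.

```python
import math

def step_decay(u, a, dt):
    """Component solver 1: exact update for du/dt = -a*u."""
    return u * math.exp(-a * dt)

def step_source(u, s, dt):
    """Component solver 2: exact update for du/dt = s."""
    return u + s * dt

def coupled_run(u0, a, s, dt, t_end):
    u, t = u0, 0.0
    while t < t_end - 1e-12:
        u = step_decay(u, a, dt)    # each "code" advances independently...
        u = step_source(u, s, dt)   # ...then they synchronize at t + dt
        t += dt
    return u

# The split scheme converges to the true steady state s/a as dt -> 0.
u = coupled_run(1.0, a=2.0, s=4.0, dt=1e-3, t_end=10.0)
```

    The O(dt) splitting error visible in this toy (the steady state is offset by roughly s*dt/2) is one instance of the coupling-accuracy concerns the report raises for operator decomposition.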

  8. Structure and Dynamics of a Fusion Peptide Helical Hairpin on the Membrane Surface: Comparison of Molecular Simulations and NMR

    PubMed Central

    2015-01-01

    The conserved N-terminal residues of the HA2 subunit of influenza hemagglutinin (fusion peptide) are essential for membrane fusion and viral entry. Recent NMR studies showed that the 23-residue fusion peptide forms a helical hairpin that undergoes rocking motion relative to the membrane surface on a nanosecond time scale. To compare with NMR and to obtain a detailed molecular picture of the peptide–membrane interaction, we performed molecular dynamics simulations of the fusion peptide in explicit dimyristoylphosphatidylcholine and with the IMM1 implicit membrane model. To account for low and neutral pH conditions, simulations were performed with acidic groups (E11 and D19) protonated and unprotonated, respectively. The hairpin structure was stable in the simulations, with the N-terminal helix buried more deeply into the hydrophobic membrane interior than the C-terminal helix. Interactions between the tryptophans in the fusion peptide and phospholipid residues contribute to peptide orientation. Higher flexibility of the hairpin was observed in the implicit membrane simulations. Internal correlation functions of backbone N–H vectors were fit to the extended Lipari–Szabo model-free approach to obtain order parameters and correlation times. Good agreement with the NMR results was obtained for orientational fluctuations around the hairpin axis (rotation), but those around the perpendicular axis (tilting) were more limited in the simulations than inferred from the NMR experiments. PMID:24712538

  9. Simulation of System Error Tolerances of a High Current Transport Experiment for Heavy-Ion Fusion

    NASA Astrophysics Data System (ADS)

    Lund, Steven M.; Bangerter, Roger O.; Freidman, Alex; Grote, Dave P.; Seidl, Peter A.

    2000-10-01

    A driver-scale, intense ion beam transport experiment (HCX) is being designed to test issues for Heavy Ion Fusion (HIF) [1]. Here we present detailed particle-in-cell simulations of HCX to parametrically explore how various system errors can impact machine performance. The simulations are transverse and include the full 3D fields of the quadrupole focusing magnets, spreads in axial momentum, conducting pipe boundary conditions, etc. System imperfections such as applied focusing field errors (magnet strength, field nonlinearities, etc.), alignment errors (magnet offsets and rotations), beam envelope mismatches to the focusing lattice, induced beam image charges, and beam distribution errors (beam nonuniformities, collective modes, and other distortions) are all analyzed in turn and in combination. The influence of these errors on the degradation of beam quality (emittance growth), halo production, and loss of beam control is evaluated. Evaluations of practical machine apertures and centroid steering corrections that can mitigate particle loss and degradation of beam quality are carried out. 1. P.A. Seidl, L.E. Ahle, R.O. Bangerter, V.P. Karpenko, S.M. Lund, A. Faltens, R.M. Franks, D.B. Shuman, and H.K. Springer, Design of a Proof-of-Principle High Current Transport Experiment for Heavy-Ion Fusion, these proceedings.

  10. Verification of particle simulation of radio frequency waves in fusion plasmas

    NASA Astrophysics Data System (ADS)

    Kuley, Animesh; Wang, Z. X.; Lin, Z.; Wessel, F.

    2013-10-01

    Radio frequency (RF) waves can provide heating, current and flow drive, as well as instability control for steady state operations of fusion experiments. A particle simulation model has been developed in this work to provide a first-principles tool for studying the RF nonlinear interactions with plasmas. In this model, ions are considered as fully kinetic particles using the Vlasov equation and electrons are treated as guiding centers using the drift kinetic equation. This model has been implemented in a global gyrokinetic toroidal code using real electron-to-ion mass ratio. To verify the model, linear simulations of ion plasma oscillation, ion Bernstein wave, and lower hybrid wave are carried out in cylindrical geometry and found to agree well with analytic predictions.

  11. Verification of particle simulation of radio frequency waves in fusion plasmas

    SciTech Connect

    Kuley, Animesh; Lin, Z.; Wang, Z. X.; Wessel, F.

    2013-10-15

    Radio frequency (RF) waves can provide heating, current and flow drive, as well as instability control for steady state operations of fusion experiments. A particle simulation model has been developed in this work to provide a first-principles tool for studying the RF nonlinear interactions with plasmas. In this model, ions are considered as fully kinetic particles using the Vlasov equation and electrons are treated as guiding centers using the drift kinetic equation. This model has been implemented in a global gyrokinetic toroidal code using real electron-to-ion mass ratio. To verify the model, linear simulations of ion plasma oscillation, ion Bernstein wave, and lower hybrid wave are carried out in cylindrical geometry and found to agree well with analytic predictions.
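    The flavor of such a verification exercise, measuring an oscillation frequency in a simulation and comparing it with the analytic prediction, can be shown with a minimal stand-in (a toy sketch, not the gyrokinetic code): a displaced cold-ion slab oscillates at the ion plasma frequency, and a leapfrog integration should recover its period.

```python
import math

# Ion plasma frequency w_pi = sqrt(n e^2 / (eps0 m_i)) for protons, SI units.
e, eps0, m_i = 1.602e-19, 8.854e-12, 1.673e-27
n = 1.0e19                        # assumed ion number density, m^-3
w_pi = math.sqrt(n * e * e / (eps0 * m_i))

# Leapfrog-integrate the slab equation of motion x'' = -w_pi^2 x and
# measure the period from successive zero crossings of x(t).
dt = 1.0e-3 / w_pi                # resolve the oscillation well
x, v = 1.0, 0.0                   # normalized initial displacement
v -= 0.5 * dt * w_pi ** 2 * x     # shift velocity back half a step
t, crossings = 0.0, []
while len(crossings) < 3:
    x_old = x
    v += dt * (-w_pi ** 2 * x)
    x += dt * v
    t += dt
    if (x_old > 0.0 >= x) or (x_old < 0.0 <= x):
        crossings.append(t)

period = 2.0 * (crossings[2] - crossings[1])
rel_err = abs(period - 2.0 * math.pi / w_pi) / (2.0 * math.pi / w_pi)
```

    The measured period agrees with 2*pi/w_pi to well under a percent, which is the same kind of check the authors apply to ion plasma oscillations, ion Bernstein waves, and lower hybrid waves.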

  12. Using Geostatistical Data Fusion Techniques and MODIS Data to Upscale Simulated Wheat Yield

    NASA Astrophysics Data System (ADS)

    Castrignano, A.; Buttafuoco, G.; Matese, A.; Toscano, P.

    2014-12-01

    Population growth increases food demand. Assessing food demand and predicting the actual supply for a given location are critical components of strategic food security planning at regional scale. Crop yield can be simulated using crop models because it is site-specific and determined by weather, management, length of growing season and soil properties. Crop models require reliable location-specific data that are not generally available. Obtaining these data at a large number of locations is time-consuming, costly and sometimes simply not feasible. An upscaling method to extend coverage of sparse estimates of crop yield to an appropriate extrapolation domain is required. This work investigates the applicability of a geostatistical data fusion approach for merging remote sensing data with the predictions of a simulation model of wheat growth and production using ground-based data. The study area is the Capitanata plain (4000 km2) located in Apulia Region, mostly cropped with durum wheat. The MODIS EVI/NDVI data products for the Capitanata plain were downloaded from the Land Processes Distributed Active Archive Center (LPDAAC) for the whole crop cycle of durum wheat. Phenological development, biomass growth and grain quantity of durum wheat were simulated by the Delphi system, based on a crop simulation model linked to a database including soil properties, agronomical and meteorological data. Multicollocated cokriging was used to integrate secondary exhaustive information (multi-spectral MODIS data) with the primary variable (sparsely distributed biomass/yield model predictions of durum wheat). The model estimates were strongly spatially correlated with the radiance data (red and NIR bands), and the data fusion approach proved quite suitable and flexible for integrating data of different type and support.

  13. Detailed high-resolution three-dimensional simulations of OMEGA separated reactants inertial confinement fusion experiments

    NASA Astrophysics Data System (ADS)

    Haines, Brian M.; Grim, Gary P.; Fincke, James R.; Shah, Rahul C.; Forrest, Chad J.; Silverstein, Kevin; Marshall, Frederic J.; Boswell, Melissa; Fowler, Malcolm M.; Gore, Robert A.; Hayes-Sterbenz, Anna C.; Jungman, Gerard; Klein, Andreas; Rundberg, Robert S.; Steinkamp, Michael J.; Wilhelmy, Jerry B.

    2016-07-01

    We present results from the comparison of high-resolution three-dimensional (3D) simulations with data from the implosions of inertial confinement fusion capsules with separated reactants performed on the OMEGA laser facility. Each capsule, referred to as a "CD Mixcap," is filled with tritium and has a polystyrene (CH) shell with a deuterated polystyrene (CD) layer whose burial depth is varied. In these implosions, fusion reactions between deuterium and tritium ions can occur only in the presence of atomic mix between the gas fill and shell material. The simulations feature accurate models for all known experimental asymmetries and do not employ any adjustable parameters to improve agreement with experimental data. Simulations are performed with the RAGE radiation-hydrodynamics code using an Implicit Large Eddy Simulation (ILES) strategy for the hydrodynamics. We obtain good agreement with the experimental data, including the DT/TT neutron yield ratios used to diagnose mix, for all burial depths of the deuterated shell layer. Additionally, simulations demonstrate good agreement with converged simulations employing explicit models for plasma diffusion and viscosity, suggesting that the implicit sub-grid model used in ILES is sufficient to model these processes in these experiments. In our simulations, mixing is driven by short-wavelength asymmetries and longer-wavelength features are responsible for developing flows that transport mixed material towards the center of the hot spot. Mix material transported by this process is responsible for most of the mix (DT) yield even for the capsule with a CD layer adjacent to the tritium fuel. Consistent with our previous results, mix does not play a significant role in TT neutron yield degradation; instead, this is dominated by the displacement of fuel from the center of the implosion due to the development of turbulent instabilities seeded by long-wavelength asymmetries. Through these processes, the long

  14. Detailed high-resolution three-dimensional simulations of OMEGA separated reactants inertial confinement fusion experiments

    DOE PAGES Beta

    Haines, Brian Michael; Grim, Gary P.; Fincke, James R.; Shah, Rahul C.; Forrest, Chad J.; Silverstein, Kevin; Marshall, Frederic J.; Boswell, Melissa; Fowler, Malcolm M.; Gore, Robert A.; et al

    2016-07-29

    Here, we present results from the comparison of high-resolution three-dimensional (3D) simulations with data from the implosions of inertial confinement fusion capsules with separated reactants performed on the OMEGA laser facility. Each capsule, referred to as a “CD Mixcap,” is filled with tritium and has a polystyrene (CH) shell with a deuterated polystyrene (CD) layer whose burial depth is varied. In these implosions, fusion reactions between deuterium and tritium ions can occur only in the presence of atomic mix between the gas fill and shell material. The simulations feature accurate models for all known experimental asymmetries and do not employ any adjustable parameters to improve agreement with experimental data. Simulations are performed with the RAGE radiation-hydrodynamics code using an Implicit Large Eddy Simulation (ILES) strategy for the hydrodynamics. We obtain good agreement with the experimental data, including the DT/TT neutron yield ratios used to diagnose mix, for all burial depths of the deuterated shell layer. Additionally, simulations demonstrate good agreement with converged simulations employing explicit models for plasma diffusion and viscosity, suggesting that the implicit sub-grid model used in ILES is sufficient to model these processes in these experiments. In our simulations, mixing is driven by short-wavelength asymmetries and longer-wavelength features are responsible for developing flows that transport mixed material towards the center of the hot spot. Mix material transported by this process is responsible for most of the mix (DT) yield even for the capsule with a CD layer adjacent to the tritium fuel. Consistent with our previous results, mix does not play a significant role in TT neutron yield degradation; instead, this is dominated by the displacement of fuel from the center of the implosion due to the development of turbulent instabilities seeded by long-wavelength asymmetries. Through these processes, the long

  15. Fusion core start-up, ignition and burn simulations of reversed-field pinch (RFP) reactors

    SciTech Connect

    Chu, Yuh-Yi

    1988-01-01

    A transient reactor simulation model is developed to investigate and simulate the start-up, ignition and burn of a reversed-field pinch reactor. The simulation is based upon a spatially averaged plasma balance model with field profiles obtained from MHD quasi-equilibrium analysis. Alpha particle heating is estimated from Fokker-Planck calculations. The instantaneous plasma current is derived from a self-consistent circuit analysis for plasma/coil/eddy current interactions. The simulation code is applied to the TITAN RFP reactor design which features a compact, high-power-density reversed-field pinch fusion system. A contour analysis is performed using the steady-state global plasma balance. The results are presented with contours of constant plasma current. A saddle point is identified in the contour plot which determines the minimum value of plasma current required to achieve ignition. An optimized path from start-up through ignition and burn can be obtained by passing through the saddle point. The simulation code is used to study and optimize the start-up scenario. In the simulations of the TITAN RFP reactor, the OH-driven superconducting EF coils are found to deviate from the required equilibrium values as the induced plasma current increases. This results in the modification of the superconducting EF coils and the addition of a set of EF trim coils. The design of the EF coil system is performed with the simulation code subject to the optimization of trim-coil power and current. In addition, the trim-coil design is subject to the constraints of vertical-field stability index and maintenance access. A power crowbar is also needed to prevent the superconducting EF coils from generating excessive vertical field. A set of basic results from the simulation of the TITAN RFP reactor yields a picture of RFP plasma operation in a reactor. Investigations of eddy currents are also presented. 145 refs., 37 figs., 2 tabs.

  16. In-Service Design & Performance Prediction of Advanced Fusion Material Systems by Computational Modeling and Simulation

    SciTech Connect

    G. R. Odette; G. E. Lucas

    2005-11-15

    This final report on "In-Service Design & Performance Prediction of Advanced Fusion Material Systems by Computational Modeling and Simulation" (DE-FG03-01ER54632) consists of a series of summaries of work that has been published, or presented at meetings, or both. It briefly describes results on the following topics: 1) A Transport and Fate Model for Helium and Helium Management; 2) Atomistic Studies of Point Defect Energetics, Dynamics and Interactions; 3) Multiscale Modeling of Fracture consisting of: 3a) A Micromechanical Model of the Master Curve (MC) Universal Fracture Toughness-Temperature Curve Relation, KJc(T - To), 3b) An Embrittlement ΔTo Prediction Model for the Irradiation Hardening Dominated Regime, 3c) Non-hardening Irradiation Assisted Thermal and Helium Embrittlement of 8Cr Tempered Martensitic Steels: Compilation and Analysis of Existing Data, 3d) A Model for the KJc(T) of a High Strength NFA MA957, 3e) Cracked Body Size and Geometry Effects of Measured and Effective Fracture Toughness-Model Based MC and To Evaluations of F82H and Eurofer 97, 3f) Size and Geometry Effects on the Effective Toughness of Cracked Fusion Structures; 4) Modeling the Multiscale Mechanics of Flow Localization-Ductility Loss in Irradiation Damaged BCC Alloys; and 5) A Universal Relation Between Indentation Hardness and True Stress-Strain Constitutive Behavior. Further details can be found in the cited references or presentations that generally can be accessed on the internet, or provided upon request to the authors. Finally, it is noted that this effort was integrated with our base program in fusion materials, also funded by the DOE OFES.

  17. A new paradigm for variable-fidelity stochastic simulation and information fusion in fluid mechanics

    NASA Astrophysics Data System (ADS)

    Venturi, Daniele; Parussini, Lucia; Perdikaris, Paris; Karniadakis, George

    2015-11-01

    Predicting the statistical properties of fluid systems based on stochastic simulations and experimental data is a problem of major interest across many disciplines. Even with recent theoretical and computational advancements, no broadly applicable techniques exist that could deal effectively with uncertainty propagation and model inadequacy in high dimensions. To address these problems, we propose a new paradigm for variable-fidelity stochastic modeling, simulation and information fusion in fluid mechanics. The key idea is to employ recursive Bayesian networks and multi-fidelity information sources (e.g., stochastic simulations at different resolutions) to construct optimal predictors for quantities of interest, e.g., the random temperature field in stochastic Rayleigh-Bénard convection. The object of inference is the quantity of interest at the highest possible level of fidelity, for which we can usually afford only a few simulations. To compute the optimal predictors, we developed a multivariate recursive co-kriging approach that simultaneously takes into account variable fidelity in the space of models (e.g., DNS vs. potential flow solvers), as well as variable fidelity in probability space. Numerical applications are presented and discussed. This research was supported by AFOSR and DARPA.
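    A minimal two-fidelity sketch conveys the recursive idea (an illustrative autoregressive scheme, not the authors' multivariate co-kriging code): model the high-fidelity quantity as a scale factor times a Gaussian-process fit of the cheap model, plus a second Gaussian process fit to the few high-fidelity residuals. The model functions and sample locations below are assumptions for illustration.

```python
import numpy as np

def rbf(a, b, ell=0.15):
    """Squared-exponential kernel between two 1-D sample sets."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

def gp_fit_predict(x, y, xs, noise=1e-6):
    """Posterior mean of a zero-mean GP with a small jitter term."""
    K = rbf(x, x) + noise * np.eye(len(x))
    return rbf(xs, x) @ np.linalg.solve(K, y)

f_lo = lambda x: np.sin(8.0 * x)                    # cheap, biased model
f_hi = lambda x: 1.2 * np.sin(8.0 * x) + 0.3 * x    # scarce "truth"

x_lo = np.linspace(0.0, 1.0, 25)                    # many cheap runs
x_hi = np.array([0.1, 0.4, 0.6, 0.9])               # few expensive runs
xs = np.linspace(0.0, 1.0, 101)

lo_at_hi = gp_fit_predict(x_lo, f_lo(x_lo), x_hi)
rho = np.polyfit(lo_at_hi, f_hi(x_hi), 1)[0]        # inter-fidelity scale
resid = f_hi(x_hi) - rho * lo_at_hi
pred = (rho * gp_fit_predict(x_lo, f_lo(x_lo), xs)  # recursive predictor
        + gp_fit_predict(x_hi, resid, xs))
```

    The predictor interpolates the expensive data exactly while borrowing shape from the cheap model elsewhere, which is the essence of recursive multi-fidelity fusion.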

  18. Simulation of plume dispersion from single release in Fusion Field Trial-07 experiment

    NASA Astrophysics Data System (ADS)

    Singh, Sarvesh Kumar; Sharan, Maithili

    2013-12-01

    Accurate description of the source-receptor relationship is required for efficient source reconstruction. This is examined by simulating the dispersion of plumes resulting from the available ten trials of single releases conducted at the Fusion Field Trials, Dugway Proving Ground, Utah. The simulation is addressed with an earlier developed IIT (Indian Institute of Technology) dispersion model using dispersion parameters derived from measurements of turbulent velocity fluctuations. The simulation is described separately for stable and unstable conditions, characterizing the peak as well as the overall observed concentration distribution. Simulated results are compared with those obtained using AERMOD. With the IIT model, peak concentrations are predicted within a factor of two in all the trials. The higher concentrations (>5 × 10-4 g m-3) are well predicted in stable conditions and under-predicted (within a factor of two) in unstable conditions, whereas relatively smaller concentrations (<5 × 10-4 g m-3) are severely under-predicted in stable conditions and over-predicted in unstable conditions. AERMOD exhibits predictions similar to those of the IIT model in most of the trials. Overall, both models predict 70-80% of concentrations in stable conditions and 85-95% in unstable conditions within a factor of six. The statistical measures for both models are found to be in good agreement with the observations.
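    The "within a factor of N" comparison used throughout such evaluations is easy to state precisely. The sketch below (with made-up concentration values, not the trial data) computes the fraction of predictions within a factor of two of the observations, the standard FAC2 measure.

```python
# Fraction of prediction/observation pairs within a factor of n (FAC2: n=2).
def frac_within_factor(obs, pred, n=2.0):
    """Counts pairs with 1/n <= pred/obs <= n over strictly positive pairs."""
    pairs = [(o, p) for o, p in zip(obs, pred) if o > 0.0 and p > 0.0]
    hits = sum(1 for o, p in pairs if 1.0 / n <= p / o <= n)
    return hits / len(pairs)

observed  = [5e-4, 2e-4, 8e-5, 1e-3]    # g m^-3, illustrative values only
predicted = [4e-4, 9e-5, 3e-4, 8e-4]
fac2 = frac_within_factor(observed, predicted)    # -> 0.5
fac6 = frac_within_factor(observed, predicted, n=6.0)
```

    Widening the factor from two to six, as in the abstract's 70-95% figures, admits correspondingly more pairs.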

  19. Angular radiation temperature simulation for time-dependent capsule drive prediction in inertial confinement fusion

    SciTech Connect

    Jing, Longfei; Yang, Dong; Li, Hang; Zhang, Lu; Lin, Zhiwei; Li, Liling; Kuang, Longyu; Jiang, Shaoen; Ding, Yongkun; Huang, Yunbao

    2015-02-15

    The x-ray drive on a capsule in an inertial confinement fusion setup is crucial for ignition. Unfortunately, a direct measurement has not been possible so far. We propose an angular radiation temperature simulation to predict the time-dependent drive on the capsule. A simple model, based on the view-factor method for the simulation of the radiation temperature, is presented and compared with the experimental data obtained using the OMEGA laser facility and the simulation results acquired with VISRAD code. We found a good agreement between the time-dependent measurements and the simulation results obtained using this model. The validated model was then used to analyze the experimental results from the Shenguang-III prototype laser facility. More specifically, the variations of the peak radiation temperatures at different view angles with the albedo of the hohlraum, the motion of the laser spots, the closure of the laser entrance holes, and the deviation of the laser power were investigated. Furthermore, the time-dependent radiation temperature at different orientations and the drive history on the capsule were calculated. The results indicate that the radiation temperature from “U20W112” (named according to the diagnostic hole ID on the target chamber) can be used to approximately predict the drive temperature on the capsule. In addition, the influence of the capsule on the peak radiation temperature is also presented.
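    The heart of a view-factor estimate of the seen radiation temperature can be sketched very simply (a toy calculation with assumed numbers, not OMEGA or Shenguang-III data): the temperature observed from a given direction is a view-factor-weighted average of the fourth powers of the visible patch temperatures.

```python
# View-factor model of the radiation temperature seen from one direction:
# T_seen^4 = sum_i F_i * T_i^4, with the view factors F_i summing to 1.
def seen_temperature(patches):
    """patches: list of (view_factor, temperature_eV) pairs."""
    total_f = sum(f for f, _ in patches)
    assert abs(total_f - 1.0) < 1e-6      # factors must partition the view
    return sum(f * t ** 4 for f, t in patches) ** 0.25

# Illustrative hohlraum view (hypothetical factors and temperatures):
patches = [(0.70, 180.0),    # re-emitting wall
           (0.20, 250.0),    # laser spots
           (0.10,   0.0)]    # laser entrance holes (cold)
t_seen = seen_temperature(patches)
```

    Because the hot laser spots and cold holes enter with T^4 weighting, modest changes in the visible fractions (spot motion, hole closure) shift the seen temperature noticeably, which is why the peak temperature varies with view angle in the abstract.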

  20. Application of proton boron fusion reaction to radiation therapy: A Monte Carlo simulation study

    SciTech Connect

    Yoon, Do-Kun; Jung, Joo-Young; Suh, Tae Suk

    2014-12-01

    Three alpha particles are emitted from the point of reaction between a proton and a boron nucleus. The alpha particles are effective in inducing the death of a tumor cell. After boron has accumulated in the tumor region, protons delivered from outside the body can react with the boron in the tumor region. The boron increases the proton's maximum dose level, so that the tumor cells are damaged more selectively. In addition, a prompt gamma ray is emitted from the proton-boron reaction point. Here, we show that the effectiveness of proton boron fusion therapy was verified using Monte Carlo simulations. We found that a dramatic increase, by more than half, in the proton's maximum dose level was induced by the boron in the tumor region. This increase occurred only when the proton's maximum dose point was located within the boron uptake region. In addition, the 719 keV prompt gamma ray peak produced by the proton boron fusion reaction was clearly detected. This therapy method offers advantages such as the application of the Bragg peak to therapy, accurate targeting of the tumor, improved therapeutic effect, and monitoring of the treated region during treatment.

  1. Three-Dimensional Simulations of the Deceleration Phase of Inertial Fusion Implosions

    NASA Astrophysics Data System (ADS)

    Woo, K. M.; Betti, R.; Bose, A.; Epstein, R.; Delettrez, J. A.; Anderson, K. S.; Yan, R.; Chang, P.-Y.; Jonathan, D.; Charissis, M.

    2015-11-01

    The three-dimensional radiation-hydrodynamics code DEC3D has been developed to model the deceleration phase of direct-drive inertial confinement fusion implosions. The code uses an approximate Riemann solver on a moving mesh to achieve high resolution near discontinuities. A domain-decomposition parallelization strategy, implemented through the Message Passing Interface (MPI), maintains high computational efficiency for the 3-D calculation. The implicit thermal diffusion is solved by a parallel successive over-relaxation (SOR) iteration. Results from 3-D simulations of low-mode Rayleigh-Taylor instability are presented and compared with 2-D results. A systematic comparison of yields, pressures, temperatures, and areal densities between 2-D and 3-D is carried out to determine the additional degradation in target performance caused by the three-dimensionality of the nonuniformities. This material is based upon work supported by the Department of Energy National Nuclear Security Administration under Award Numbers DE-NA0001944 and DE-FC02-04ER54789 (Fusion Science Center).
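    The implicit-diffusion-by-SOR ingredient can be illustrated on a small serial example (a hedged sketch of the standard method, not DEC3D itself): one backward-Euler step of the heat equation, (I - a*Laplacian) T_new = T_old, solved by successive over-relaxation on a 2-D grid with fixed cold boundaries.

```python
# One implicit (backward-Euler) diffusion step solved by SOR.
# Interior equation: (1 + 4a) T[j][i] - a * (4 neighbors) = T_old[j][i].
def sor_diffusion_step(t_old, a, omega=1.7, tol=1e-10, max_iter=20000):
    ny, nx = len(t_old), len(t_old[0])
    t = [row[:] for row in t_old]          # initial guess = old field
    diag = 1.0 + 4.0 * a
    for _ in range(max_iter):
        resid = 0.0
        for j in range(1, ny - 1):
            for i in range(1, nx - 1):
                rhs = t_old[j][i] + a * (t[j][i - 1] + t[j][i + 1]
                                         + t[j - 1][i] + t[j + 1][i])
                new = (1.0 - omega) * t[j][i] + omega * rhs / diag
                resid = max(resid, abs(new - t[j][i]))
                t[j][i] = new              # Gauss-Seidel-style in-place sweep
        if resid < tol:
            break
    return t

# A hot spot on a cold 9x9 grid diffuses outward after one implicit step.
grid = [[0.0] * 9 for _ in range(9)]
grid[4][4] = 1.0
out = sor_diffusion_step(grid, a=0.5)
```

    The over-relaxation parameter omega between 1 and 2 accelerates convergence over plain Gauss-Seidel; a parallel variant would sweep domain-decomposed blocks with halo exchange, as the abstract describes.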

  2. A fusion feature and its improvement based on locality preserving projections for rolling element bearing fault classification

    NASA Astrophysics Data System (ADS)

    Ding, Xiaoxi; He, Qingbo; Luo, Nianwu

    2015-01-01

    Extracting sensitive features from vibration signals remains a great challenge for effective fault classification of rolling element bearings. Current fault classification generally depends on feature-pattern differences between fault classes. This paper explores the active role of the healthy pattern in fault classification and proposes a new fusion feature extraction method based on locality preserving projections (LPP). The study aims to discover the local feature-pattern difference between each bearing status and the healthy condition in order to characterize and discriminate different bearing statuses. Specifically, the proposed fusion feature is achieved in two main steps. In the first step, a two-class model is constructed for each class by using that class of signals together with healthy-condition signals; a fusion mapping is then generated by mathematically combining the mappings of the LPP, or its improvement, for all two-class models. In the second step, LPP is applied again to reduce the fusion mapping dimension, in order to find more sensitive low-dimensional information hidden in the high-dimensional fusion feature structure. The final fusion feature enhances the discrimination between all classes by improving the between-class and within-class scatter for fault classification. Experimental results using different bearing fault types and severities under different loads show that the proposed method is well-suited and effective for bearing fault classification.
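    The LPP mapping at the heart of the method can be sketched generically. This is a standard implementation of locality preserving projections (He and Niyogi), not the paper's code; the neighborhood size, kernel width, and regularization below are our illustrative choices:

```python
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import cdist

def lpp(X, n_components=1, k=5, sigma=1.0):
    """Basic Locality Preserving Projections.

    Returns a (d, n_components) projection matrix whose columns minimise
    sum_ij W_ij (a^T x_i - a^T x_j)^2 subject to a^T X^T D X a = 1.
    """
    n = X.shape[0]
    d2 = cdist(X, X, "sqeuclidean")
    # Symmetric k-nearest-neighbour adjacency with heat-kernel weights.
    W = np.zeros((n, n))
    idx = np.argsort(d2, axis=1)[:, 1:k + 1]   # skip self (index 0)
    for i in range(n):
        W[i, idx[i]] = np.exp(-d2[i, idx[i]] / (2 * sigma ** 2))
    W = np.maximum(W, W.T)
    D = np.diag(W.sum(axis=1))
    L = D - W
    A = X.T @ L @ X
    B = X.T @ D @ X + 1e-9 * np.eye(X.shape[1])  # regularise for stability
    _, vecs = eigh(A, B)                          # ascending eigenvalues
    return vecs[:, :n_components]

# Usage: project 2-D points lying near a line down to a 1-D feature.
rng = np.random.default_rng(0)
t = rng.uniform(-1, 1, 100)
X = np.c_[t, 2 * t + 0.01 * rng.standard_normal(100)]
P = lpp(X, n_components=1)
Y = X @ P   # low-dimensional feature preserving local structure
```

    The paper's fusion feature stacks such mappings from the per-class two-class models before applying LPP again for the final reduction.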

  3. An Overview of the Distributed Space Exploration Simulation (DSES) Project

    NASA Technical Reports Server (NTRS)

    Crues, Edwin Z.; Chung, Victoria I.; Blum, Michael G.; Bowman, James D.

    2007-01-01

    This paper describes the Distributed Space Exploration Simulation (DSES) Project, a research and development collaboration between NASA centers that investigates technologies and processes related to integrated, distributed simulation of complex space systems in support of NASA's Exploration Initiative. In particular, it describes the three major components of DSES: network infrastructure, software infrastructure, and simulation development. With regard to network infrastructure, DSES is developing a Distributed Simulation Network for use by all NASA centers. With regard to software, DSES is developing software models, tools, and procedures that streamline distributed simulation development and provide an interoperable infrastructure for agency-wide integrated simulation. Finally, with regard to simulation development, DSES is developing an integrated end-to-end simulation capability to support NASA development of new exploration spacecraft and missions. This paper presents the current status and plans for these three areas, including examples of specific simulations.

  4. Project ITCH: Interactive Digital Simulation in Electrical Engineering Education.

    ERIC Educational Resources Information Center

    Bailey, F. N.; Kain, R. Y.

    A two-stage project is investigating the educational potential of a low-cost time-sharing system used as a simulation tool in Electrical Engineering (EE) education. Phase I involves a pilot study and Phase II a full integration. The system employs interactive computer simulation to teach engineering concepts which are not well handled by…

  5. The PLX-α project: Radiation-MHD Simulations of Imploding Plasma Liners Using USim

    NASA Astrophysics Data System (ADS)

    Beckwith, Kristian; Stoltz, Peter; Kundrapu, Madhusudhan; Hsu, Scott; PLX-α Team

    2015-11-01

    USim is a tool for modeling high energy density plasmas using multi-fluid models coupled to electromagnetics using fully-implicit iterative solvers, combined with finite volume discretizations on unstructured meshes. Prior work has demonstrated application of USim models and algorithms to simulation of supersonic plasma jets relevant to the Plasma Liner Experiment (PLX) and compared synthetic interferometry to that gathered from the experiment. Here, we give an overview of the models and algorithms included in USim; review results from prior modeling campaigns for the PLX; and describe plans for radiation magnetohydrodynamic (MHD) simulation efforts focusing on integrated plasma-liner implosion and target compression in a fusion-relevant regime using USim for the PLX-α project. Supported by ARPA-E's ALPHA program. Original PLX construction supported by OFES. USim development supported in part by Air Force Office of Scientific Research.

  6. Multilevel fusion exploitation

    NASA Astrophysics Data System (ADS)

    Lindberg, Perry C.; Dasarathy, Belur V.; McCullough, Claire L.

    1996-06-01

    This paper describes a project that was sponsored by the U.S. Army Space and Strategic Defense Command (USASSDC) to develop, test, and demonstrate sensor fusion algorithms for target recognition. The purpose of the project was to exploit the use of sensor fusion at all levels (signal, feature, and decision levels) and all combinations to improve target recognition capability against tactical ballistic missile (TBM) targets. These algorithms were trained with simulated radar signatures to accurately recognize selected TBM targets. The simulated signatures represent measurements made by two radars (S-band and X-band) with the targets at a variety of aspect and roll angles. Two tests were conducted: one with simulated signatures collected at angles different from those in the training database and one using actual test data. The test results demonstrate a high degree of recognition accuracy. This paper describes the training and testing techniques used; shows the fusion strategy employed; and illustrates the advantages of exploiting multi-level fusion.

  7. Final Report for the "Fusion Application for Core-Edge Transport Simulations (FACETS)"

    SciTech Connect

    Cary, John R; Kruger, Scott

    2014-10-02

    Over its lifetime, the FACETS project developed the first self-consistent core-edge coupled capability, a new solver for modeling core transport in tokamaks, and a new code for modeling wall physics over long time scales, and it significantly improved the capabilities and performance of the legacy components UEDGE, NUBEAM, GLF23, GYRO, and BOUT++. These improved capabilities leveraged the team's expertise in applied mathematics (solvers and algorithms) and computer science (performance improvements and language interoperability). The project pioneered new methods for tackling the complexity of simulating tokamak experiments.

  8. FY2014 FES (Fusion Energy Sciences) Theory & Simulation Performance Target, Final Report

    SciTech Connect

    Fu, Guoyong; Budny, Robert; Gorelenkov, Nikolai; Poli, Francesca; Chen, Yang; McClenaghan, Joseph; Lin, Zhihong; Spong, Don; Bass, Eric; Waltz, Ron

    2014-10-14

    We report here the work done for the FY14 OFES Theory Performance Target, as given below: "Understanding alpha particle confinement in ITER, the world's first burning plasma experiment, is a key priority for the fusion program. In FY 2014, determine linear instability trends and thresholds of energetic particle-driven shear Alfven eigenmodes in ITER for a range of parameters and profiles using a set of complementary simulation models (gyrokinetic, hybrid, and gyrofluid). Carry out initial nonlinear simulations to assess the effects of the unstable modes on energetic particle transport." In the past year (FY14), a systematic study of the alpha-driven Alfven modes in ITER was carried out jointly by researchers from six institutions using seven codes: the transport simulation code TRANSP (R. Budny and F. Poli, PPPL); three gyrokinetic codes, GEM (Y. Chen, Univ. of Colorado), GTC (J. McClenaghan, Z. Lin, UCI), and GYRO (E. Bass, R. Waltz, UCSD/GA); the hybrid code M3D-K (G.Y. Fu, PPPL); the gyro-fluid code TAEFL (D. Spong, ORNL); and the linear kinetic stability code NOVA-K (N. Gorelenkov, PPPL). A range of ITER parameters and profiles was specified by TRANSP simulations of a hybrid scenario case and a steady-state scenario case. Based on the specified ITER equilibria, linear stability calculations were performed to determine the stability boundary of alpha-driven high-n TAEs using the five initial value codes (GEM, GTC, GYRO, M3D-K, and TAEFL) and the kinetic stability code (NOVA-K). The effects of both alpha particles and beam ions have been considered. Finally, the effects of the unstable modes on energetic particle transport have been explored using GEM and M3D-K.

  9. Simulation of plume dispersion of multiple releases in Fusion Field Trial-07 experiment

    NASA Astrophysics Data System (ADS)

    Pandey, Gavendra; Sharan, Maithili

    2015-12-01

    For efficient source term estimation, it is important to use an accurate dispersion model with appropriate dispersion parameters. This is examined by simulating the dispersion of plumes resulting from the multiple releases conducted at the Fusion Field Trials, Dugway Proving Ground, Utah. The simulation is carried out with the previously developed IIT (Indian Institute of Technology) dispersion model, using dispersion parameters derived from measurements of turbulent velocity fluctuations. The simulation is discussed separately for stable and unstable conditions in light of (i) the behavior of observed and predicted concentrations in the form of isopleths, (ii) peak/maximum concentrations, and (iii) the overall concentration distribution. Simulated results from the IIT model are compared with those obtained using AERMOD. Both models predicted peak concentrations within a factor of two in all the releases, and tracer transport is mostly along the mean wind direction. With the IIT model, the higher concentrations are predicted close to observations in all trials under stable conditions and within a factor of two in trials under unstable conditions. However, the relatively smaller concentrations are severely under-predicted in stable conditions and over-predicted in unstable conditions. AERMOD exhibits similar behavior, except for slight over-prediction in stable conditions and under-prediction in unstable conditions. The statistical measures for both models are in good agreement with the observations, and a quantitative F-test analysis shows that the two models perform similarly at the 5% significance level.
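    The "within a factor of two" criterion used above is commonly quantified by the FAC2 statistic. A minimal sketch (the function name and sample values are ours):

```python
import numpy as np

def fac2(observed, predicted):
    """Fraction of predictions within a factor of two of the observations,
    a standard dispersion-model acceptance metric (FAC2 >= 0.5 is 'good')."""
    obs = np.asarray(observed, dtype=float)
    pred = np.asarray(predicted, dtype=float)
    ratio = pred / obs
    return np.mean((ratio >= 0.5) & (ratio <= 2.0))

# Usage with hypothetical paired concentrations (arbitrary units):
obs = np.array([1.0, 2.0, 4.0, 8.0])
pred = np.array([1.5, 5.0, 3.0, 0.5])
score = fac2(obs, pred)   # ratios 1.5, 2.5, 0.75, 0.0625 -> 2 of 4 pass
```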

  10. Revealing Surface Waters on an Antifreeze Protein by Fusion Protein Crystallography Combined with Molecular Dynamic Simulations.

    PubMed

    Sun, Tianjun; Gauthier, Sherry Y; Campbell, Robert L; Davies, Peter L

    2015-10-01

    Antifreeze proteins (AFPs) adsorb to ice through an extensive, flat, relatively hydrophobic surface. It has been suggested that this ice-binding site (IBS) organizes surface waters into an ice-like clathrate arrangement that matches and fuses to the quasi-liquid layer on the ice surface. On cooling, these waters join the ice lattice and freeze the AFP to its ligand. Evidence for the generality of this binding mechanism is limited because AFPs tend to crystallize with their IBS as a preferred protein-protein contact surface, which displaces some bound waters. Type III AFP is a 7 kDa globular protein with an IBS made up of two adjacent surfaces. In the crystal structure of the most active isoform (QAE1), the part of the IBS that docks to the primary prism plane of ice is partially exposed to solvent and has clathrate waters present that match this plane of ice. The adjacent IBS, which matches the pyramidal plane of ice, is involved in protein-protein crystal contacts with few surface waters. Here we have changed the protein-protein contacts in the ice-binding region by crystallizing a fusion of QAE1 to maltose-binding protein. In this 1.9 Å structure, the IBS that fits the pyramidal plane of ice is exposed to solvent. By combining crystallography data with MD simulations, the surface waters on both sides of the IBS were revealed and match well with the target ice planes. The waters on the pyramidal plane IBS were loosely constrained, which might explain why other isoforms of type III AFP that lack the prism plane IBS are less active than QAE1. The AFP fusion crystallization method can potentially be used to force the exposure to solvent of the IBS on other AFPs to reveal the locations of key surface waters.

  11. A Project Management Approach to Using Simulation for Cost Estimation on Large, Complex Software Development Projects

    NASA Technical Reports Server (NTRS)

    Mizell, Carolyn; Malone, Linda

    2007-01-01

    It is very difficult for project managers to develop accurate cost and schedule estimates for large, complex software development projects. None of the approaches or tools available today can estimate the true cost of software with a high degree of accuracy early in a project. This paper provides an approach that utilizes a software development process simulation model that considers and conveys the level of uncertainty that exists when developing an initial estimate. A NASA project will be analyzed using simulation and data from the Software Engineering Laboratory to show the benefits of such an approach.

  12. Using a Scientific Process for Curriculum Development and Formative Evaluation: Project FUSION

    ERIC Educational Resources Information Center

    Doabler, Christian; Cary, Mari Strand; Clarke, Benjamin; Fien, Hank; Baker, Scott; Jungjohann, Kathy

    2011-01-01

    Given the vital importance of using a scientific approach for curriculum development, the authors employed a design experiment methodology (Brown, 1992; Shavelson et al., 2003) to develop and evaluate FUSION, a first-grade mathematics intervention intended for students with, or at risk for, mathematics disabilities. FUSION, funded through IES…

  13. Using gaming engines and editors to construct simulations of fusion algorithms for situation management

    NASA Astrophysics Data System (ADS)

    Lewis, Lundy M.; DiStasio, Nolan; Wright, Christopher

    2010-04-01

    In this paper we discuss issues in testing various cognitive fusion algorithms for situation management. We provide a proof-of-principle discussion and demo showing how gaming technologies and platforms could be used to devise and test various fusion algorithms, including input, processing, and output, and we look at how the proof-of-principle could lead to more advanced test beds and methods for high-level fusion in support of situation management. We develop four simple fusion scenarios and one more complex scenario in which a simple rule-based system is scripted to govern the behavior of battlespace entities.
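    The scripted rule-based governance of battlespace entities described above can be sketched as a condition-action table; the entity fields, rules, and behaviors below are hypothetical, not taken from the paper:

```python
# A minimal rule-based system: the first rule whose condition matches fires.
def apply_rules(entity, rules):
    """Return the entity state after firing the first matching rule."""
    for condition, action in rules:
        if condition(entity):
            return action(entity)
    return entity  # no rule fired; state unchanged

# Hypothetical rules governing an entity in a fusion test scenario.
rules = [
    (lambda e: e["threat_level"] > 0.8, lambda e: {**e, "behavior": "evade"}),
    (lambda e: e["fuel"] < 0.2,         lambda e: {**e, "behavior": "return_to_base"}),
    (lambda e: True,                    lambda e: {**e, "behavior": "patrol"}),
]

vehicle = {"threat_level": 0.9, "fuel": 0.7}
updated = apply_rules(vehicle, rules)   # high threat triggers "evade"
```

    In a gaming-engine test bed, a loop over all entities would apply such rules each tick, with the fusion algorithm under test supplying the estimated `threat_level` inputs.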

  14. A web-based repository of surgical simulator projects.

    PubMed

    Leskovský, Peter; Harders, Matthias; Székely, Gábor

    2006-01-01

    The use of computer-based surgical simulators for training of prospective surgeons has been a topic of research for more than a decade. As a result, a large number of academic projects have been carried out, and a growing number of commercial products are available on the market. Keeping track of all these endeavors for established groups as well as for newly started projects can be quite arduous. Gathering information on existing methods, already traveled research paths, and problems encountered is a time-consuming task. To alleviate this situation, we have established a modifiable online repository of existing projects. It contains detailed information about a large number of simulator projects gathered from web pages, papers and personal communication. The database is modifiable (with password protected sections) and also allows for a simple statistical analysis of the collected data. For further information, the surgical repository web page can be found at www.virtualsurgery.vision.ee.ethz.ch. PMID:16404068

  15. Modeling and simulation support for ICRF heating of fusion plasmas. Annual report, 1990

    SciTech Connect

    1990-03-15

    Recent experimental, theoretical, and computational results have shown the need for and usefulness of a combined approach to the design, analysis, and evaluation of ICH antenna configurations. The work at the University of Wisconsin (UW) in particular has shown that much-needed information on the vacuum operation of ICH antennas can be obtained with a modest experimental and computational effort. These model experiments at UW and the SAIC simulations have dramatically demonstrated the potential for positive impact on the ICRF program. Results of the UW-SAIC joint ICRF antenna analysis effort have been presented at several international meetings and numerous meetings in the United States. The PPPL bay M antenna has been modeled using the ARGUS code; the results of this effort are shown in Appendix C. SAIC has recently begun a collaboration with the ICRF antenna design and analysis group at ORNL. At present there are two separate projects underway. The first is associated with simulating and determining the effect of adding slots in the antenna septum and side walls. The second project concerns the modeling and simulation of the ORNL folded waveguide (FWG) concept.

  16. Simulation of normal and pathological gaits using a fusion knowledge strategy

    PubMed Central

    2013-01-01

    Gait distortion is the first clinical manifestation of many pathological disorders. Traditionally, the gait laboratory has been the only available tool for supporting both diagnosis and prognosis, with the limitation that any clinical interpretation depends entirely on the physician's expertise. This work presents a novel human gait model which fuses two important sources of gait information: an estimated Center of Gravity (CoG) trajectory and learned heel paths, thereby allowing it to reproduce normal and pathological kinematic patterns. The CoG trajectory is approximated with a physical compass-pendulum representation that has been extended by introducing energy-accumulator elements between the pendulum ends, thereby emulating the role of the leg joints and obtaining a complete global gait description. Likewise, heel paths captured from actual data are learned to improve the performance of the physical model, while the most relevant joint trajectories are estimated using a classical inverse kinematic rule. The model is compared with standard gait patterns, obtaining a correlation coefficient of 0.96. Additionally, the model simulates neuromuscular diseases such as Parkinson's (stages 2, 3 and 4) and clinical signs such as crouch gait, for which the averaged correlation coefficient is 0.92. PMID:23844901

  17. Simulation and Analysis of Isotope Separation System for Fusion Fuel Recovery System

    NASA Astrophysics Data System (ADS)

    Senevirathna, Bathiya; Gentile, Charles

    2011-10-01

    This paper presents results of a simulation of the Fuel Recovery System (FRS) for the Laser Inertial Fusion Engine (LIFE) reactor. The LIFE reaction will produce exhaust gases that will need to be recycled in the FRS along with xenon, the chamber's intervention gas. Solids and liquids will first be removed, and vapor traps will then remove large gas molecules such as lead. The gas will be reacted with lithium at high temperatures to extract the hydrogen isotopes protium, deuterium, and tritium in hydride form. The hydrogen isotopes will be recovered using a lithium blanket processing system already in place, and this product will be sent to the Isotope Separation System (ISS). The ISS is modeled in software to analyze its effectiveness; Aspen HYSYS was chosen for this purpose because of its widespread use in industrial gas-processing systems. Reactants and the corresponding chemical reactions were initialized in the software. The ISS primarily consists of four cryogenic distillation columns, which were modeled in HYSYS based on the design requirements. Fractional compositions of the distillate and liquid products were analyzed and used to optimize the overall system.
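    As a rough illustration of the staging calculations behind such distillation-column models (not the HYSYS model itself), the Fenske equation gives the minimum number of ideal stages at total reflux for a binary split with constant relative volatility; the compositions and volatility below are illustrative assumptions, not values from the paper:

```python
import math

def fenske_min_stages(x_dist, x_bott, alpha):
    """Fenske equation: minimum ideal stages at total reflux for a binary
    separation, given light-key mole fractions in distillate and bottoms
    and a constant relative volatility alpha."""
    separation = (x_dist / (1 - x_dist)) * ((1 - x_bott) / x_bott)
    return math.log(separation) / math.log(alpha)

# Hypothetical isotope split: 99% light key overhead, 1% in the bottoms,
# with an assumed relative volatility of 1.5.
n_min = fenske_min_stages(0.99, 0.01, 1.5)
```

    Real column sizing, as done in HYSYS, then applies a finite reflux ratio and stage-efficiency corrections on top of this lower bound.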

  18. Fast discontinuous Galerkin lattice-Boltzmann simulations on GPUs via maximal kernel fusion

    NASA Astrophysics Data System (ADS)

    Mazzeo, Marco D.

    2013-03-01

    A GPU implementation of the discontinuous Galerkin lattice-Boltzmann method with square spectral elements, highly optimised for speed and precision of calculations, is presented. An extensive analysis of the numerous variants of the fluid solver reveals that the best performance is obtained by maximising CUDA kernel fusion and by arranging the resulting kernel tasks so as to trigger memory-coherent and scattered loads in a specific manner, albeit at the cost of introducing cross-thread load unbalancing. Surprisingly, any attempt to eliminate this imbalance, to maximise thread occupancy, or to adopt conventional work tiling or distinct custom kernels highly tuned via ad hoc data and computation layouts invariably deteriorates performance. As such, this work sheds light on the possibility of hiding fetch latencies of workloads involving heterogeneous loads in a way that is more effective than frequently suggested techniques. When simulating the lid-driven cavity on an NVIDIA GeForce GTX 480 via a 5-stage 4th-order Runge-Kutta (RK) scheme, the first four digits (or more) of the obtained centreline velocity values converge to those of the state-of-the-art literature data at a simulation speed of 7.0G primitive-variable updates per second during the collision stage and 4.4G during each RK step of the advection, employing double-precision arithmetic (DPA) and a computational grid of only 64² 4×4-point elements. The new programming engine leads to about 2× performance w.r.t. the best programming guidelines in the field. The new fluid solver on the above GPU is also 20-30 times faster than a highly optimised version running on a single core of an Intel Xeon X5650 2.66 GHz.
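    The kernel-fusion idea can be illustrated on a toy collide-then-stream update. This is not the paper's DG lattice-Boltzmann code: on a GPU, the fused form saves a full write and re-read of the intermediate field, whereas NumPy still materializes temporaries, so the sketch only demonstrates the algebraic equivalence that a fused CUDA kernel exploits:

```python
import numpy as np

def two_kernels(f, tau, feq):
    """Two separate passes: collide (writes an intermediate field),
    then stream (re-reads it)."""
    post_collision = f + (feq - f) / tau   # kernel 1: BGK-style relaxation
    return np.roll(post_collision, 1)      # kernel 2: shift by one site

def fused_kernel(f, tau, feq):
    """Single pass combining both updates; on a GPU this removes one
    round trip of the intermediate field through device memory."""
    return np.roll(f + (feq - f) / tau, 1)

rng = np.random.default_rng(1)
f = rng.random(1024)
feq = rng.random(1024)
a = two_kernels(f, 0.6, feq)
b = fused_kernel(f, 0.6, feq)
```

    For memory-bandwidth-bound solvers such as lattice-Boltzmann codes, eliminating that intermediate traffic is typically the dominant win, which is consistent with the gains the paper reports from maximal fusion.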

  19. M3D project for simulation studies of plasmas

    SciTech Connect

    Park, W.; Belova, E.V.; Fu, G.Y.; Strauss, H.R.; Sugiyama, L.E.

    1998-12-31

    The M3D (Multi-level 3D) project carries out simulation studies of plasmas of various regimes using multiple levels of physics, geometry, and mesh schemes in one code package. This paper and papers by Strauss, Sugiyama, and Belova in this workshop describe the project and present examples of current applications. The currently available physics models of the M3D project are MHD, two-fluids, gyrokinetic hot particle/MHD hybrid, and gyrokinetic particle ion/two-fluid hybrid models. The code can be run with both structured and unstructured meshes.

  20. Fusion studies with low-intensity radioactive ion beams using an active-target time projection chamber

    NASA Astrophysics Data System (ADS)

    Kolata, J. J.; Howard, A. M.; Mittig, W.; Ahn, T.; Bazin, D.; Becchetti, F. D.; Beceiro-Novo, S.; Chajecki, Z.; Febbrarro, M.; Fritsch, A.; Lynch, W. G.; Roberts, A.; Shore, A.; Torres-Isea, R. O.

    2016-09-01

    The total fusion excitation function for 10Be+40Ar has been measured over the center-of-momentum (c.m.) energy range from 12 to 24 MeV using a time-projection chamber (TPC). The main purpose of this experiment, which was carried out in a single run of duration 90 h using a ≈100 particle per second (pps) 10Be beam, was to demonstrate the capability of an active-target TPC to determine fusion excitation functions for extremely weak radioactive ion beams. Cross sections as low as 12 mb were measured with acceptable (50%) statistical accuracy. It also proved to be possible to separate events in which charged particles were emitted from the fusion residue from those in which only neutrons were evaporated. The method permits simultaneous measurement of incomplete fusion, break-up, scattering, and transfer reactions, and therefore fully exploits the opportunities presented by the very exotic beams that will be available from the new generation of radioactive beam facilities.

  1. Mechanisms of Plastic and Fracture Instabilities for Alloy Development of Fusion Materials. Final Project Report for period July 15, 1998 - July 14, 2003

    SciTech Connect

    Ghoniem, N. M.

    2003-07-14

    The main objective of this research was to develop new computational tools for the simulation and analysis of plasticity and fracture mechanisms of fusion materials, and to assist in planning and assessment of corresponding radiation experiments.

  2. Exploring International Investment through a Classroom Portfolio Simulation Project

    ERIC Educational Resources Information Center

    Chen, Xiaoying; Yur-Austin, Jasmine

    2013-01-01

    A rapid integration of financial markets has prevailed during the last three decades. Investors are able to diversify investment beyond national markets to mitigate return volatility of a "pure domestic portfolio." This article discusses a simulation project through which students learn the role of international investment by managing…

  3. The SIMRAND methodology - Simulation of Research and Development Projects

    NASA Technical Reports Server (NTRS)

    Miles, R. F., Jr.

    1984-01-01

    In research and development projects, a commonly occurring management decision is concerned with the optimum allocation of resources to achieve the project goals. Because of resource constraints, management has to make a decision regarding the set of proposed systems or tasks which should be undertaken. SIMRAND (Simulation of Research and Development Projects) is a methodology which was developed for aiding management in this decision. Attention is given to a problem description, aspects of model formulation, the reduction phase of the model solution, the simulation phase, and the evaluation phase. The implementation of the considered approach is illustrated with the aid of an example which involves a simplified network of the type used to determine the price of silicon solar cells.
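    The SIMRAND approach of comparing candidate task sets under uncertainty can be sketched as a Monte Carlo estimate of expected payoff; the task payoff distributions and set definitions below are hypothetical, not from the SIMRAND report:

```python
import random

def simulate_task_set(tasks, n_trials=10000, seed=0):
    """Estimate the expected total payoff of a candidate set of tasks,
    each with an uncertain (here triangular) payoff distribution."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_trials):
        total += sum(rng.triangular(lo, hi, mode) for lo, mode, hi in tasks)
    return total / n_trials

# Two hypothetical candidate task sets: (low, mode, high) payoffs.
set_a = [(1, 3, 6), (2, 4, 9)]
set_b = [(0, 5, 7), (1, 2, 4)]
best = max([set_a, set_b], key=simulate_task_set)   # pick the better set
```

    A fuller SIMRAND-style model would also carry per-task costs and resource constraints, ranking only the feasible sets.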

  4. Response to FESAC survey, non-fusion connections to Fusion Energy Sciences. Applications of the FES-supported beam and plasma simulation code, Warp

    SciTech Connect

    Friedman, A.; Grote, D. P.; Vay, J. L.

    2015-05-29

    The Fusion Energy Sciences Advisory Committee’s subcommittee on non-fusion applications (FESAC NFA) is conducting a survey to obtain information from the fusion community about non-fusion work that has resulted from their DOE-funded fusion research. The subcommittee has requested that members of the community describe recent developments connected to the activities of the DOE Office of Fusion Energy Sciences. Two questions in particular were posed by the subcommittee. This document contains the authors’ responses to those questions.

  5. Projected profile similarity in gyrokinetic simulations of Bohm and gyro-Bohm scaled DIII-D L and H modes

    SciTech Connect

    Waltz, R. E.; Candy, J.; Petty, C. C.

    2006-07-15

    Global gyrokinetic simulations of DIII-D [M. A. Mahdavi and J. L. Luxon, in 'DIII-D Tokamak Special Issue', Fusion Sci. Technol. 48, 2 (2005)] L- and H-mode dimensionally similar discharge pairs are treated in detail. The simulations confirm the Bohm scaling of the well-matched L-mode pair. The paradoxical but experimentally apparent gyro-Bohm scaling of the H-mode pair at larger relative gyroradius (ρ*) and lower transport levels is due to poor profile similarity. Simulations of projected experimental plasma profiles with perfect similarity show both the L- and H-mode pairs to have Bohm scaling. A ρ* stabilization rule for predicting the breakdown of gyro-Bohm scaling from simulations of a single discharge is presented.
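    For reference, the standard Bohm and gyro-Bohm diffusivity scalings underlying the comparison above are (textbook definitions, not taken from this abstract):

```latex
\chi_{B} \sim \frac{T}{eB},
\qquad
\chi_{gB} \sim \rho_* \, \chi_{B},
\qquad
\rho_* \equiv \frac{\rho_i}{a},
```

    where T is the temperature, B the magnetic field, ρ_i the ion gyroradius, and a the minor radius. The two scalings differ only by the factor ρ*, which is precisely the parameter varied between dimensionally similar discharge pairs.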

  6. The GeantV project: Preparing the future of simulation

    DOE PAGESBeta

    Amadio, G.; Apostolakis, J.; Bandieramonte, M.; Bhattacharyya, A.; Bianchini, C.; Brun, R.; Canal, Ph.; Carminati, F.; Duhem, L.; Elvira, D.; et al.

    2015-12-23

    Detector simulation is consuming at least half of the HEP computing cycles, and even so, experiments have to take hard decisions on what to simulate, as their needs greatly surpass the availability of computing resources. New experiments still in the design phase such as FCC, CLIC and ILC as well as upgraded versions of the existing LHC detectors will push further the simulation requirements. Since the increase in computing resources is not likely to keep pace with our needs, it is therefore necessary to explore innovative ways of speeding up simulation in order to sustain the progress of High Energy Physics. The GeantV project aims at developing a high performance detector simulation system integrating fast and full simulation that can be ported on different computing architectures, including CPU accelerators. After more than two years of R&D the project has produced a prototype capable of transporting particles in complex geometries exploiting micro-parallelism, SIMD and multithreading. Portability is obtained via C++ template techniques that allow the development of machine-independent computational kernels. Furthermore, a set of tables derived from Geant4 for cross sections and final states provides a realistic shower development and, having been ported into a Geant4 physics list, can be used as a basis for a direct performance comparison.

  7. The GeantV project: Preparing the future of simulation

    SciTech Connect

    Amadio, G.; Apostolakis, J.; Bandieramonte, M.; Bhattacharyya, A.; Bianchini, C.; Brun, R.; Canal, Ph.; Carminati, F.; Duhem, L.; Elvira, D.; de Fine Licht, J.; Gheata, A.; Iope, R. L.; Lima, G.; Mohanty, A.; Nikitina, T.; Novak, M.; Pokorski, W.; Seghal, R.; Shadura, O.; Vallecorsa, S.; Wenzel, S.

    2015-12-23

    Detector simulation is consuming at least half of the HEP computing cycles, and even so, experiments have to take hard decisions on what to simulate, as their needs greatly surpass the availability of computing resources. New experiments still in the design phase such as FCC, CLIC and ILC as well as upgraded versions of the existing LHC detectors will push further the simulation requirements. Since the increase in computing resources is not likely to keep pace with our needs, it is therefore necessary to explore innovative ways of speeding up simulation in order to sustain the progress of High Energy Physics. The GeantV project aims at developing a high performance detector simulation system integrating fast and full simulation that can be ported on different computing architectures, including CPU accelerators. After more than two years of R&D the project has produced a prototype capable of transporting particles in complex geometries exploiting micro-parallelism, SIMD and multithreading. Portability is obtained via C++ template techniques that allow the development of machine-independent computational kernels. Furthermore, a set of tables derived from Geant4 for cross sections and final states provides a realistic shower development and, having been ported into a Geant4 physics list, can be used as a basis for a direct performance comparison.

  8. The GeantV project: preparing the future of simulation

    NASA Astrophysics Data System (ADS)

    Amadio, G.; Apostolakis, J.; Bandieramonte, M.; Bhattacharyya, A.; Bianchini, C.; Brun, R.; Canal, Ph; Carminati, F.; Duhem, L.; Elvira, D.; de Fine Licht, J.; Gheata, A.; Iope, R. L.; Lima, G.; Mohanty, A.; Nikitina, T.; Novak, M.; Pokorski, W.; Seghal, R.; Shadura, O.; Vallecorsa, S.; Wenzel, S.

    2015-12-01

    Detector simulation is consuming at least half of the HEP computing cycles, and even so, experiments have to take hard decisions on what to simulate, as their needs greatly surpass the availability of computing resources. New experiments still in the design phase such as FCC, CLIC and ILC as well as upgraded versions of the existing LHC detectors will push further the simulation requirements. Since the increase in computing resources is not likely to keep pace with our needs, it is therefore necessary to explore innovative ways of speeding up simulation in order to sustain the progress of High Energy Physics. The GeantV project aims at developing a high performance detector simulation system integrating fast and full simulation that can be ported on different computing architectures, including CPU accelerators. After more than two years of R&D the project has produced a prototype capable of transporting particles in complex geometries exploiting micro-parallelism, SIMD and multithreading. Portability is obtained via C++ template techniques that allow the development of machine-independent computational kernels. A set of tables derived from Geant4 for cross sections and final states provides a realistic shower development and, having been ported into a Geant4 physics list, can be used as a basis for a direct performance comparison.

  9. Tooth model reconstruction based upon data fusion for orthodontic treatment simulation.

    PubMed

    Yau, Hong-Tzong; Yang, Tsan-Jui; Chen, Yi-Chen

    2014-05-01

    This paper proposes a full tooth reconstruction method by integrating 3D scanner data and computed tomography (CT) image sets. In traditional dental treatment, plaster models are used to record the patient's oral information and assist dentists with diagnoses. However, plaster models only save surface information, and are therefore unable to provide further information for clinical treatment. With the rapid development of medical imaging technology, computed tomography images have become very popular in dental treatment. Computed tomography images with complete internal information can assist the clinical diagnosis for dental implants or orthodontic treatment, and a digital dental model can be used to simulate and predict results before treatment. However, a method of producing a high quality and precise dental model has yet to be developed. To this end, this paper presents a tooth reconstruction method based on the data fusion concept, integrating external scanned data and CT-based medical images. First, a plaster model is digitized with a 3D scanner. Then, each crown can be separated from the base according to the characteristics of the tooth. CT images must be processed for feature enhancement and noise reduction, and to define the tooth axis direction which will be used for root slicing. The outline of each slice of the dental root can then be determined by the level set algorithm, and converted to point cloud data. Finally, the crown and root data can be registered by the iterative closest point (ICP) algorithm. With this information, a complete digital dental model can be reconstructed by the Delaunay-based region-growing (DBRG) algorithm. The main contribution of this paper is to reconstruct a high quality customized dental model with root information that can offer significant help to the planning of dental implants and orthodontic treatment. PMID:24631784
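
    The crown-root registration step above uses the iterative closest point algorithm. A minimal sketch of ICP for two point clouds follows, assuming NumPy and a brute-force nearest-neighbour search; the function names and the closed-form Kabsch/SVD sub-step are illustrative, not details taken from the paper.

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Closed-form (Kabsch/SVD) rotation R and translation t minimizing
    the squared distance between paired points R @ src + t and dst."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, c_dst - R @ c_src

def icp(crown_pts, root_pts, n_iter=30):
    """Align crown_pts onto root_pts by alternating nearest-neighbour
    matching with the closed-form rigid transform."""
    pts = crown_pts.copy()
    for _ in range(n_iter):
        # brute-force nearest neighbours (adequate for a sketch)
        d = np.linalg.norm(pts[:, None, :] - root_pts[None, :, :], axis=2)
        matched = root_pts[d.argmin(axis=1)]
        R, t = best_rigid_transform(pts, matched)
        pts = pts @ R.T + t
    return pts
```

    In practice the two scans only partially overlap, so production ICP variants add outlier rejection and k-d tree search; the alternation above is the core of the method.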

  10. The simulation model of teleradiology in telemedicine project.

    PubMed

    Goodini, Azadeh; Torabi, Mashallah; Goodarzi, Maryam; Safdari, Reza; Darayi, Mohamad; Tavassoli, Mahdieh; Shabani, MohammadMehdi

    2015-01-01

    Telemedicine projects are aimed at offering medical services to people who do not have access to direct diagnosis and treatment services. As a powerful tool for analyzing the performance of complex systems and taking probable events into consideration, system simulation can facilitate the analysis of implementation processes of telemedicine projects in realistic situations. The aim of the present study was to propose a model for planning resource capacities and allocating human and operational resources to promote the efficiency of a telemedicine project by investigating the process of teleradiology. In this article, after verification of the conceptual model by experts in the field, a computerized simulation model was developed using the Arena simulation software. After the required data were specified and fed into the software, and the model was validated and verified, different improvement scenarios were run with the computerized model. With the input data of the system held fixed, such as the number of patients, their waiting times, and the process time of each function (for example, magnetic resonance imaging or scanning), the teleradiology model was compared with the current radiology process. Implementing the teleradiology model reduced the time patients spend in the system (current: 1.84 ± 0.00, tele: 0.81 ± 0.00). Furthermore, the process allows fewer resources to be allocated while staff perform their functions more effectively. The use of computerized simulation is essential for designing processes, optimal allocation of resources, planning, and making appropriate decisions for providing timely services to patients. PMID:25627857
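
    The kind of Arena model described, patients flowing through a reporting process, can be caricatured as a single-server discrete-event queue. The sketch below only illustrates the mechanics of comparing a conventional process against a faster teleradiology one; all rates and times are made-up illustrative numbers, not values from the study.

```python
import random

def simulate_clinic(n_patients, interarrival, service, rng):
    """Single-server FIFO queue: patients arrive with exponential
    interarrival times and are processed one at a time; returns the
    mean time a patient spends in the system (waiting plus service)."""
    t = 0.0            # arrival clock
    server_free = 0.0  # time at which the server next becomes idle
    total = 0.0
    for _ in range(n_patients):
        t += rng.expovariate(1.0 / interarrival)
        start = max(t, server_free)            # wait if the server is busy
        server_free = start + rng.expovariate(1.0 / service)
        total += server_free - t               # departure minus arrival
    return total / n_patients
```

    Running the same queue with a shorter mean reporting time standing in for teleradiology drops the simulated time-in-system, mirroring the direction of the reported result.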

  11. Introduction to SIMRAND: Simulation of research and development project

    NASA Technical Reports Server (NTRS)

    Miles, R. F., Jr.

    1982-01-01

    SIMRAND (SIMulation of Research ANd Development Projects) is a methodology developed to aid the engineering and management decision process in selecting the optimal set of systems or tasks to be funded on a research and development project. A project may have a set of systems or tasks under consideration whose total cost exceeds the allocated budget. Other factors, such as personnel and facilities, may also enter as constraints. Thus the project's management must select, from among the complete set of systems or tasks under consideration, a partial set that satisfies all project constraints. The SIMRAND methodology uses the analytical techniques of probability theory, the decision analysis of management science, and computer simulation to select this optimal partial set. The SIMRAND methodology is truly a management tool: it specifies up front the information that must be generated by the engineers, thereby providing a basis for management direction of the engineering effort, and it ranks the alternatives according to the preferences of the decision makers.
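
    The core loop described above, enumerating the feasible subsets of tasks under a budget constraint and ranking the alternatives by simulated outcome, can be sketched as follows. The task data, uniform outcome ranges, and additive utility are illustrative assumptions, not details from the report.

```python
import itertools
import random

# Hypothetical candidate tasks: (name, cost, (low, high) outcome range).
TASKS = [("A", 4, (2, 10)), ("B", 3, (4, 6)), ("C", 5, (1, 12)), ("D", 2, (3, 5))]
BUDGET = 9

def expected_utility(subset, rng, n_draws=2000):
    """Monte-Carlo estimate of the mean total outcome of a task set,
    drawing each task's uncertain outcome uniformly from its range."""
    total = 0.0
    for _ in range(n_draws):
        total += sum(rng.uniform(lo, hi) for _, _, (lo, hi) in subset)
    return total / n_draws

def rank_task_sets(tasks, budget, rng):
    """Enumerate every subset satisfying the budget constraint and rank
    the alternatives by simulated expected utility, best first."""
    feasible = [s for r in range(1, len(tasks) + 1)
                for s in itertools.combinations(tasks, r)
                if sum(cost for _, cost, _ in s) <= budget]
    return sorted(feasible, key=lambda s: expected_utility(s, rng), reverse=True)
```

    A real SIMRAND run would replace the additive mean outcome with the decision makers' utility function and add personnel and facility constraints alongside cost.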

  12. Community Petascale Project for Accelerator Science and Simulation

    SciTech Connect

    Warren B. Mori

    2013-02-01

    The UCLA Plasma Simulation Group is a major partner of the "Community Petascale Project for Accelerator Science and Simulation." This is the final technical report. We include an overall summary, a list of publications, and individual progress reports for each year. During the past five years we have made tremendous progress in enhancing the capabilities of OSIRIS and QuickPIC, in developing new algorithms and data structures for PIC codes to run on GPUs and future many-core architectures, and in using these codes to model experiments and make new scientific discoveries. Here we summarize some highlights to which SciDAC was a major contributor.

  13. Adaptive quantum computation in changing environments using projective simulation

    PubMed Central

    Tiersch, M.; Ganahl, E. J.; Briegel, H. J.

    2015-01-01

    Quantum information processing devices need to be robust and stable against external noise and internal imperfections to ensure correct operation. In a setting of measurement-based quantum computation, we explore how an intelligent agent endowed with a projective simulator can act as controller to adapt measurement directions to an external stray field of unknown magnitude in a fixed direction. We assess the agent’s learning behavior in static and time-varying fields and explore composition strategies in the projective simulator to improve the agent’s performance. We demonstrate the applicability by correcting for stray fields in a measurement-based algorithm for Grover’s search. Thereby, we lay out a path for adaptive controllers based on intelligent agents for quantum information tasks. PMID:26260263
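
    The projective simulator at the heart of this scheme is, in its simplest two-layer form, a reinforcement-learned hopping process from percept clips to action clips. The toy sketch below shows only that bare update rule; the task and reward scheme are stand-ins, not the stray-field setup of the paper.

```python
import random

class PSAgent:
    """Minimal two-layer projective simulator: each percept clip links to
    every action clip through an h-value; an action is sampled with
    probability proportional to h, rewarded transitions are strengthened,
    and a damping rate gamma slowly forgets toward h = 1."""

    def __init__(self, n_percepts, n_actions, gamma=0.0):
        self.h = [[1.0] * n_actions for _ in range(n_percepts)]
        self.gamma = gamma

    def act(self, percept, rng):
        """Sample an action with probability proportional to its h-value."""
        weights = self.h[percept]
        r = rng.uniform(0.0, sum(weights))
        acc = 0.0
        for action, w in enumerate(weights):
            acc += w
            if r <= acc:
                return action
        return len(weights) - 1

    def learn(self, percept, action, reward):
        # damping (forgetting) applied to the whole h-matrix ...
        for row in self.h:
            for a in range(len(row)):
                row[a] -= self.gamma * (row[a] - 1.0)
        # ... then the traversed percept-action edge is reinforced
        self.h[percept][action] += reward
```

    Trained on a toy task where each "field setting" percept has one correct compensation action, the agent's hopping probabilities concentrate on the rewarded edges. The composition strategies explored in the paper extend this basic update.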

  14. Ocular vergence measurement in projected and collimated simulator displays.

    PubMed

    Morahan, P; Meehan, J W; Patterson, J; Hughes, P K

    1998-09-01

    The purpose of this study was to investigate electrooculography (EOG) as a measure of ocular vergence in both collimated and projected simulator environments. The task required participants to shift their gaze between a central fixation point and a target appearing at one of three eccentricities. EOG was effective in recording ocular vergence. The EOG results were similar between collimated and projected displays, except for differences in vergence changes during lateral movement of the eyes; ocular excursions downward elicited a greater EOG response than the reverse upward movement. The computer-based technique of recording vergence produced measurable traces from a majority of participants. The technique has potential for further development as a tool for measuring ocular vergence in virtual environments, where methods that track ocular structures (e.g., the pupil) with head-mounted apparatus are unsuitable because such apparatus cannot be worn at the same time as a flight or flight-simulator helmet.

  15. Adaptive quantum computation in changing environments using projective simulation

    NASA Astrophysics Data System (ADS)

    Tiersch, M.; Ganahl, E. J.; Briegel, H. J.

    2015-08-01

    Quantum information processing devices need to be robust and stable against external noise and internal imperfections to ensure correct operation. In a setting of measurement-based quantum computation, we explore how an intelligent agent endowed with a projective simulator can act as controller to adapt measurement directions to an external stray field of unknown magnitude in a fixed direction. We assess the agent’s learning behavior in static and time-varying fields and explore composition strategies in the projective simulator to improve the agent’s performance. We demonstrate the applicability by correcting for stray fields in a measurement-based algorithm for Grover’s search. Thereby, we lay out a path for adaptive controllers based on intelligent agents for quantum information tasks.

  16. The HiPER project for inertial confinement fusion and some experimental results on advanced ignition schemes

    NASA Astrophysics Data System (ADS)

    Batani, D.; Koenig, M.; Baton, S.; Perez, F.; Gizzi, L. A.; Koester, P.; Labate, L.; Honrubia, J.; Antonelli, L.; Morace, A.; Volpe, L.; Santos, J.; Schurtz, G.; Hulin, S.; Ribeyre, X.; Fourment, C.; Nicolai, P.; Vauzour, B.; Gremillet, L.; Nazarov, W.; Pasley, J.; Richetta, M.; Lancaster, K.; Spindloe, Ch; Tolley, M.; Neely, D.; Kozlová, M.; Nejdl, J.; Rus, B.; Wolowski, J.; Badziak, J.; Dorchies, F.

    2011-12-01

    This paper presents the goals and some of the results of experiments conducted within the Working Package 10 (Fusion Experimental Programme) of the HiPER Project. These experiments concern the study of the physics connected to 'advanced ignition schemes', i.e. the fast ignition and the shock ignition approaches to inertial fusion. Such schemes are aimed at achieving a higher gain, as compared with the classical approach used at NIF, as required for future reactors, and at making fusion possible with smaller facilities. In particular, a series of experiments related to fast ignition were performed at the RAL (UK) and LULI (France) Laboratories and studied the propagation of fast electrons (created by a short-pulse ultra-high-intensity beam) in compressed matter, created either by cylindrical implosions or by compression of planar targets by (planar) laser-driven shock waves. A more recent experiment was performed at PALS and investigated the laser-plasma coupling in the 10^16 W cm^-2 intensity regime of interest for shock ignition.

  17. The AGORA High-resolution Galaxy Simulations Comparison Project

    NASA Astrophysics Data System (ADS)

    Kim, Ji-hoon; Abel, Tom; Agertz, Oscar; Bryan, Greg L.; Ceverino, Daniel; Christensen, Charlotte; Conroy, Charlie; Dekel, Avishai; Gnedin, Nickolay Y.; Goldbaum, Nathan J.; Guedes, Javiera; Hahn, Oliver; Hobbs, Alexander; Hopkins, Philip F.; Hummels, Cameron B.; Iannuzzi, Francesca; Keres, Dusan; Klypin, Anatoly; Kravtsov, Andrey V.; Krumholz, Mark R.; Kuhlen, Michael; Leitner, Samuel N.; Madau, Piero; Mayer, Lucio; Moody, Christopher E.; Nagamine, Kentaro; Norman, Michael L.; Onorbe, Jose; O'Shea, Brian W.; Pillepich, Annalisa; Primack, Joel R.; Quinn, Thomas; Read, Justin I.; Robertson, Brant E.; Rocha, Miguel; Rudd, Douglas H.; Shen, Sijing; Smith, Britton D.; Szalay, Alexander S.; Teyssier, Romain; Thompson, Robert; Todoroki, Keita; Turk, Matthew J.; Wadsley, James W.; Wise, John H.; Zolotov, Adi; the AGORA Collaboration

    2014-01-01

    We introduce the Assembling Galaxies Of Resolved Anatomy (AGORA) project, a comprehensive numerical study of well-resolved galaxies within the ΛCDM cosmology. Cosmological hydrodynamic simulations with force resolutions of ~100 proper pc or better will be run with a variety of code platforms to follow the hierarchical growth, star formation history, morphological transformation, and the cycle of baryons in and out of eight galaxies with halo masses M_vir ≈ 10^10, 10^11, 10^12, and 10^13 M_⊙ at z = 0 and two different ("violent" and "quiescent") assembly histories. The numerical techniques and implementations used in this project include the smoothed particle hydrodynamics codes GADGET and GASOLINE, and the adaptive mesh refinement codes ART, ENZO, and RAMSES. The codes share common initial conditions and common astrophysics packages including UV background, metal-dependent radiative cooling, metal and energy yields of supernovae, and stellar initial mass function. These are described in detail in the present paper. Subgrid star formation and feedback prescriptions will be tuned to provide a realistic interstellar and circumgalactic medium using a non-cosmological disk galaxy simulation. Cosmological runs will be systematically compared with each other using a common analysis toolkit and validated against observations to verify that the solutions are robust—i.e., that the astrophysical assumptions are responsible for any success, rather than artifacts of particular implementations. The goals of the AGORA project are, broadly speaking, to raise the realism and predictive power of galaxy simulations and the understanding of the feedback processes that regulate galaxy "metabolism." The initial conditions for the AGORA galaxies as well as simulation outputs at various epochs will be made publicly available to the community. The proof-of-concept dark-matter-only test of the formation of a galactic halo with a z = 0 mass of M_vir ≈ 1.7 × 10^11 M_⊙ by nine different

  18. Dynamical downscaling simulation and future projection of precipitation over China

    NASA Astrophysics Data System (ADS)

    Bao, Jiawei; Feng, Jinming; Wang, Yongli

    2015-08-01

    This study assesses present-day and future precipitation changes over China by using the Weather Research and Forecasting (WRF) model version 3.5.1. The WRF model was driven by the Geophysical Fluid Dynamics Laboratory Earth System Model with the Generalized Ocean Layer Dynamics component (GFDL-ESM2G) output over China at the resolution of 30 km for the present day (1976-2005) and near future (2031-2050) under the Representative Concentration Pathways 4.5 (RCP4.5) scenario. The results demonstrate that with improved resolution and better representation of finer-scale physical processes, dynamical downscaling adds value to the regional precipitation simulation. WRF downscaling generally simulates more reliable spatial distributions of total precipitation and extreme precipitation in China, with higher spatial pattern correlations and closer magnitude. It successfully eliminates the artificial precipitation maximum area simulated by GFDL-ESM2G over the west of the Sichuan Basin, along the eastern edge of the Tibetan Plateau, in both summer and winter. In addition, the regional annual cycle and frequencies of precipitation intensity are also well depicted by WRF. In the future projections, under the RCP4.5 scenario, both models project that summer precipitation over most parts of China will increase, especially in western and northern China, and that precipitation over some southern regions is projected to decrease. The projected increase of future extreme precipitation makes great contributions to the total precipitation increase. In southern regions, the projected larger extreme precipitation amounts accompanied with fewer extreme precipitation frequencies suggest that future daily extreme precipitation intensity is likely to increase in these regions.
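
    The "spatial pattern correlation" used above to quantify added value is, in its centered form, a Pearson correlation of the two fields' anomalies on a common grid. A small sketch, assuming the model output has already been interpolated onto the observation grid:

```python
import numpy as np

def pattern_correlation(model, obs):
    """Centered spatial pattern correlation between two fields on the
    same grid: Pearson correlation of the flattened anomalies."""
    m = model.ravel() - model.mean()
    o = obs.ravel() - obs.mean()
    return float(m @ o / np.sqrt((m @ m) * (o @ o)))

def mean_bias(model, obs):
    """Domain-mean difference, a simple measure of magnitude agreement."""
    return float(model.mean() - obs.mean())
```

    Applied to a downscaled field and its coarse driving field against the same observations, a higher pattern correlation and smaller bias for the downscaled field is the sense in which downscaling "adds value."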

  19. Preliminary Results from SCEC Earthquake Simulator Comparison Project

    NASA Astrophysics Data System (ADS)

    Tullis, T. E.; Barall, M.; Richards-Dinger, K. B.; Ward, S. N.; Heien, E.; Zielke, O.; Pollitz, F. F.; Dieterich, J. H.; Rundle, J. B.; Yikilmaz, M. B.; Turcotte, D. L.; Kellogg, L. H.; Field, E. H.

    2010-12-01

    Earthquake simulators are computer programs that simulate long sequences of earthquakes. If such simulators could be shown to produce synthetic earthquake histories that are good approximations to actual earthquake histories, they could be of great value in helping to anticipate the probabilities of future earthquakes and so could play an important role in helping to make public policy decisions. Consequently it is important to discover how realistic the earthquake histories produced by these simulators are. One way to do this is to compare their behavior with the limited knowledge we have from the instrumental, historic, and paleoseismic records of past earthquakes. Another way, though a slow one for large events, is to use them to make predictions about future earthquake occurrence and to evaluate how well the predictions match what occurs. A final approach is to compare the results of many varied earthquake simulators to determine the extent to which the results depend on the details of the approaches and assumptions made by each simulator. Five independently developed simulators, capable of running simulations on complicated geometries containing multiple faults, are in use by some of the authors of this abstract. Although similar in their overall purpose and design, these simulators differ from one another widely in their details in many important ways. They require as input for each fault element a value for the average slip rate as well as a value for friction parameters or stress reduction due to slip. They share the use of the boundary element method to compute stress transfer between elements. None use dynamic stress transfer by seismic waves. A notable difference is the assumption different simulators make about the constitutive properties of the faults. The earthquake simulator comparison project is designed to allow comparisons among the simulators and between the simulators and past earthquake history. The project uses sets of increasingly detailed

  20. Unilateral spectral and temporal compression reduces binaural fusion for normal hearing listeners with cochlear implant simulations.

    PubMed

    Aronoff, Justin M; Shayman, Corey; Prasad, Akila; Suneel, Deepa; Stelmach, Julia

    2015-02-01

    Patients with single sided deafness have recently begun receiving cochlear implants in their deaf ear. These patients gain a significant benefit from having a cochlear implant. However, despite this benefit, they are considerably slower to develop binaural abilities such as summation compared to bilateral cochlear implant patients. This suggests that these patients have difficulty fusing electric and acoustic signals. Although this may reflect inherent differences between electric and acoustic stimulation, it may also reflect properties of the processor and fitting system, which result in spectral and temporal compression. To examine the possibility that unilateral spectral and temporal compression can adversely affect binaural fusion, this study tested normal hearing listeners' binaural fusion through the use of vocoded speech with unilateral spectral and temporal compression. The results indicate that unilateral spectral and temporal compression can each hinder binaural fusion and thus may adversely affect binaural abilities in patients with single sided deafness who use a cochlear implant in their deaf ear. PMID:25549574

  1. Progress of the NASA/USGS Lunar Regolith Simulant Project

    NASA Technical Reports Server (NTRS)

    Rickman, Douglas; McLemore, C.; Stoeser, D.; Schrader, C.; Fikes, J.; Street, K.

    2009-01-01

    Beginning in 2004 personnel at MSFC began serious efforts to develop a new generation of lunar simulants. The first two products were a replication of the previous JSC-1 simulant under a contract to Orbitec and a major workshop in 2005 on future simulant development. It was recognized in early 2006 that there were serious limitations with the standard approach of simply taking a single terrestrial rock and grinding it. To a geologist, even a cursory examination of the Lunar Sourcebook shows that matching lunar heterogeneity, crystal size, relative mineral abundances, lack of H2O, plagioclase chemistry and glass abundance simply cannot be done with any simple combination of terrestrial rocks. Thus the project refocused its efforts and approached simulant development in a new and more comprehensive manner, examining new approaches in simulant development and ways to more accurately compare simulants to actual lunar materials. This led to a multi-year effort with five major tasks running in parallel. The five tasks are Requirements, Lunar Analysis, Process Development, Feed Stocks, and Standards.

  2. SciDAC - Center for Plasma Edge Simulation - Project Summary

    SciTech Connect

    Parker, Scott

    2014-11-03

    Final Technical Report: Center for Plasma Edge Simulation (CPES) Principal Investigator: Scott Parker, University of Colorado, Boulder Description/Abstract First-principle simulations of edge pedestal micro-turbulence are performed with the global gyrokinetic turbulence code GEM for both low and high confinement tokamak plasmas. The high confinement plasmas show a larger growth rate, but nonlinearly a lower particle and heat flux. Numerical profiles are obtained from the XGC0 neoclassical code. XGC0/GEM code coupling is implemented under the EFFIS (“End-to-end Framework for Fusion Integrated Simulation”) framework. Investigations are underway to clearly identify the micro-instabilities in the edge pedestal using global and flux-tube gyrokinetic simulation with realistic experimental high confinement profiles. We use both experimental profiles and those obtained using the EFFIS XGC0/GEM coupled code framework. We find there are three types of instabilities at the edge: a low-n, high frequency electron mode, a high-n, low frequency ion mode, and possibly an ion mode like the kinetic ballooning mode (KBM). Investigations are under way into the effects of the radial electric field. Finally, we have been investigating how, in plasmas dominated by ion-temperature-gradient (ITG) driven turbulence, cold deuterium and tritium ions near the edge naturally pinch radially inward toward the core. We call this mechanism “natural fueling.” It is due to the quasi-neutral, heat-flux-dominated nature of the turbulence and still applies when trapped and passing kinetic electron effects are included. To understand this mechanism, consider the situation where the electrons are adiabatic and there is an ion heat flux. In such a case, lower-energy particles move inward and higher-energy particles move outward. If a trace amount of cold particles is added, they will move inward.

  3. Three-dimensional gyrokinetic particle-in-cell simulation of plasmas on a massively parallel computer: Final report on LDRD Core Competency Project, FY 1991--FY 1993

    SciTech Connect

    Byers, J.A.; Williams, T.J.; Cohen, B.I.; Dimits, A.M.

    1994-04-27

    One of the programs of the Magnetic Fusion Energy (MFE) Theory and Computations Program is studying the anomalous transport of thermal energy across the field lines in the core of a tokamak. We use the method of gyrokinetic particle-in-cell simulation in this study. For this LDRD project we employed massively parallel processing, new algorithms, and new formal techniques to improve this research. Specifically, we sought to take steps toward: researching experimentally relevant parameters in our simulations, learning parallel computing to have as a resource for our group, and achieving a 100× speedup over the performance of our starting-point Cray-2 simulation code.

  4. Fusing simulation and experiment: The effect of mutations on the structure and activity of the influenza fusion peptide

    PubMed Central

    Lousa, Diana; Pinto, Antónia R. T.; Victor, Bruno L.; Laio, Alessandro; Veiga, Ana S.; Castanho, Miguel A. R. B.; Soares, Cláudio M.

    2016-01-01

    During the infection process, the influenza fusion peptide (FP) inserts into the host membrane, playing a crucial role in the fusion process between the viral and host membranes. In this work we used a combination of simulation and experimental techniques to analyse the molecular details of this process, which are largely unknown. Although the FP structure has been obtained by NMR in detergent micelles, there is no atomic-level structural information in membranes. To address this, we performed bias-exchange metadynamics (BE-META) simulations, which showed that the lowest energy states of the membrane-inserted FP correspond to helical-hairpin conformations similar to that observed in micelles. BE-META simulations of the G1V, W14A, G12A/G13A and G4A/G8A/G16A/G20A mutants revealed that all the mutations affect the peptide’s free energy landscape. A FRET-based analysis showed that all the mutants had a reduced fusogenic activity relative to the WT, in particular the mutants G12A/G13A and G4A/G8A/G16A/G20A. According to our results, one of the major causes of the lower activity of these mutants is their lower membrane affinity, which results in a lower concentration of peptide in the bilayer. These findings contribute to a better understanding of the influenza fusion process and open new routes for future studies. PMID:27302370

  5. A Particle-in-Cell Simulation for the Traveling Wave Direct Energy Converter (TWDEC) for Fusion Propulsion

    NASA Technical Reports Server (NTRS)

    Chap, Andrew; Tarditi, Alfonso G.; Scott, John H.

    2013-01-01

    A Particle-in-cell simulation model has been developed to study the physics of the Traveling Wave Direct Energy Converter (TWDEC) applied to the conversion of charged fusion products into electricity. In this model the availability of a beam of collimated fusion products is assumed; the simulation is focused on the conversion of the beam kinetic energy into alternating current (AC) electric power. The model is electrostatic, as the electro-dynamics of the relatively slow ions can be treated in the quasistatic approximation. A two-dimensional, axisymmetric (radial-axial coordinates) geometry is considered. Ion beam particles are injected on one end and travel along the axis through ring-shaped electrodes with externally applied time-varying voltages, thus modulating the beam by forming a sinusoidal pattern in the beam density. Further downstream, the modulated beam passes through another set of ring electrodes, now electrically floating. The modulated beam induces a time-alternating potential difference between adjacent electrodes. Power can be drawn from the electrodes by connecting a resistive load. As energy is dissipated in the load, a corresponding drop in beam energy is measured. The simulation encapsulates the TWDEC process by reproducing the time-dependent transfer of energy and the particle deceleration due to the electric field phase time variations.
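
    A one-dimensional caricature of the deceleration stage (not the paper's 2D axisymmetric electrostatic model) conveys the energy bookkeeping: a pre-bunched ion beam, injected on the decelerating phase of a traveling sinusoidal wave, loses kinetic energy to the wave. All parameters below are arbitrary illustrative choices in normalized units with charge/mass = 1.

```python
import math

def run_twdec_stage(n_ions=200, e0=0.02, v_ph=1.0, t_end=5.0, dt=0.001):
    """Push a tightly bunched ion beam through a traveling wave
    E(x, t) = e0 * sin(k*x - omega*t) with a simple semi-implicit
    (leapfrog-style) integrator.  The bunch starts at the phase where the
    force is maximally decelerating, so kinetic energy is transferred
    from the beam to the wave, as it is to the floating electrodes of a
    TWDEC.  Returns (initial, final) mean kinetic energy per ion."""
    k = 2.0 * math.pi
    omega = k * v_ph
    # bunch centred at k*x = -pi/2, where sin(k*x - omega*0) = -1
    xs = [-0.25 + 0.01 * (i / n_ions - 0.5) for i in range(n_ions)]
    vs = [1.0] * n_ions
    mean_ke = lambda: sum(0.5 * v * v for v in vs) / n_ions
    ke_initial = mean_ke()
    t = 0.0
    while t < t_end:
        for i in range(n_ions):
            # velocity kick from the wave field, then position drift
            vs[i] += e0 * math.sin(k * xs[i] - omega * (t + 0.5 * dt)) * dt
            xs[i] += vs[i] * dt
        t += dt
    return ke_initial, mean_ke()
```

    The difference between the two returned energies is the energy extracted from the beam; in the full model this appears as AC power dissipated in the resistive load.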

  6. Progress of the NASA/USGS Lunar Regolith Simulant Project

    NASA Technical Reports Server (NTRS)

    Rickman, Doug; MLemore, Carole; Wilson, Steve; Stoeser, Doug; Schrader, Christian; Fikes, John; Street, Kenneth

    2009-01-01

    Beginning in 2004 personnel at MSFC began serious efforts to develop a new generation of lunar simulants. The first two products were a replication of the previous JSC-1 simulant under a contract to Orbitec and a major workshop in 2005 on future simulant development. Beginning in 2006 the project refocused its efforts and approached simulant development in a new and more comprehensive manner, examining new approaches in simulant development and ways to more accurately compare simulants to actual lunar materials. This led to a multi-year effort with five major tasks running in parallel. The five tasks are Requirements, Lunar Analysis, Process Development, Feed Stocks, and Standards. Major progress has been made in all five areas. A substantial draft of a formal requirements document now exists and has been largely stable since 2007. It does evolve as specific details of the Standards and Lunar Analysis efforts proceed. Lunar Analysis has turned out to be vastly more difficult than anticipated. After great effort to mine the existing published and gray literature, the team has realized the necessity of making new measurements of the Apollo samples, an effort that is currently in progress. Process development is substantially ahead of expectations in 2006. It is now practical to synthesize glasses of appropriate composition and purity. It is also possible to make agglutinate particles in significant quantities. A series of minerals commonly found on the Moon has been synthesized. Separation of mineral constituents from starting rock material is also proceeding. Customized grinding and mixing processes have been developed and tested, and are now being documented. Identification and development of appropriate feedstocks has been both easier and more difficult than anticipated.
The Stillwater Mining Company, operating in the Stillwater layered mafic intrusive complex of Montana, has been an amazing resource for the project, but finding adequate sources for some of the components

  7. Three-dimensional simulation strategy to determine the effects of turbulent mixing on inertial-confinement-fusion capsule performance.

    PubMed

    Haines, Brian M; Grinstein, Fernando F; Fincke, James R

    2014-05-01

    In this paper, we present and justify an effective strategy for performing three-dimensional (3D) inertial-confinement-fusion (ICF) capsule simulations. We have evaluated a frequently used strategy in which two-dimensional (2D) simulations are rotated to 3D once sufficient relevant 2D flow physics has been captured and fine resolution requirements can be restricted to relatively small regions. This addresses situations typical of ICF capsules, which are otherwise computationally prohibitive. We tested this approach for our previously reported fully 3D simulations of laser-driven reshock experiments where we can use the available 3D data as reference. Our studies indicate that simulations that begin as purely 2D lead to significant underprediction of mixing and turbulent kinetic energy production at later time when compared to the fully 3D simulations. If, however, additional suitable nonuniform perturbations are applied at the time of rotation to 3D, we show that one can obtain good agreement with the purely 3D simulation data, as measured by vorticity distributions as well as integrated mixing and turbulent kinetic energy measurements. Next, we present results of simulations of a simple OMEGA-type ICF capsule using the developed strategy. These simulations are in good agreement with available experimental data and suggest that the dominant mechanism for yield degradation in ICF implosions is hydrodynamic instability growth seeded by long-wavelength surface defects. This effect is compounded by drive asymmetries and amplified by repeated shock interactions with an increasingly distorted shell, which results in further yield reduction. Our simulations are performed with and without drive asymmetries in order to compare the importance of these effects to those of surface defects; our simulations indicate that long-wavelength surface defects degrade yield by approximately 60% and short-wavelength drive asymmetry degrades yield by a further 30%. PMID

  8. Integrated fusion simulation with self-consistent core-pedestal coupling

    SciTech Connect

    Meneghini, Orso; Snyder, P. B.; Smith, S. P.; Candy, J.; Staebler, G. M.; Belli, E. A.; Lao, L. L.; Park, J. M.; Green, David L; Elwasif, Wael R; Grierson, Brian A.; Holland, C.

    2016-01-01

    Accurate prediction of fusion performance in present and future tokamaks requires taking into account the strong interplay between core transport, pedestal structure, current profile and plasma equilibrium. An integrated modeling workflow capable of calculating the steady-state self-consistent solution to this strongly-coupled problem has been developed. The workflow leverages state-of-the-art components for collisional and turbulent core transport, equilibrium and pedestal stability. Validation against DIII-D discharges shows that the workflow is capable of robustly predicting the kinetic profiles (electron and ion temperature and electron density) from the axis to the separatrix in good agreement with the experiments. An example application is presented, showing self-consistent optimization for the fusion performance of the 15 MA D-T ITER baseline scenario as functions of the pedestal density and ion effective charge Z_eff.

  9. Integrated fusion simulation with self-consistent core-pedestal coupling

    DOE PAGES Beta

    Meneghini, O.; Snyder, P. B.; Smith, S. P.; Candy, J.; Staebler, G. M.; Belli, E. A.; Lao, L. L.; Park, J. M.; Green, D. L.; Elwasif, W.; et al

    2016-04-20

    In this study, accurate prediction of fusion performance in present and future tokamaks requires taking into account the strong interplay between core transport, pedestal structure, current profile and plasma equilibrium. An integrated modeling workflow capable of calculating the steady-state self-consistent solution to this strongly-coupled problem has been developed. The workflow leverages state-of-the-art components for collisional and turbulent core transport, equilibrium and pedestal stability. Validation against DIII-D discharges shows that the workflow is capable of robustly predicting the kinetic profiles (electron and ion temperature and electron density) from the axis to the separatrix in good agreement with the experiments. An example application is presented, showing self-consistent optimization for the fusion performance of the 15 MA D-T ITER baseline scenario as functions of the pedestal density and ion effective charge Zeff.

  10. Integrated fusion simulation with self-consistent core-pedestal coupling

    NASA Astrophysics Data System (ADS)

    Meneghini, O.; Snyder, P. B.; Smith, S. P.; Candy, J.; Staebler, G. M.; Belli, E. A.; Lao, L. L.; Park, J. M.; Green, D. L.; Elwasif, W.; Grierson, B. A.; Holland, C.

    2016-04-01

    Accurate prediction of fusion performance in present and future tokamaks requires taking into account the strong interplay between core transport, pedestal structure, current profile, and plasma equilibrium. An integrated modeling workflow capable of calculating the steady-state self-consistent solution to this strongly coupled problem has been developed. The workflow leverages state-of-the-art components for collisional and turbulent core transport, equilibrium and pedestal stability. Testing against a DIII-D discharge shows that the workflow is capable of robustly predicting the kinetic profiles (electron and ion temperature and electron density) from the axis to the separatrix in good agreement with the experiments. An example application is presented, showing self-consistent optimization for the fusion performance of the 15 MA D-T ITER baseline scenario as functions of the pedestal density and ion effective charge Zeff.

  11. Simulating the magnetized liner inertial fusion plasma confinement with smaller-scale experiments

    SciTech Connect

    Ryutov, D. D.; Cuneo, M. E.; Herrmann, M. C.; Sinars, D. B.; Slutz, S. A.

    2012-06-15

    The recently proposed magnetized liner inertial fusion approach to Z-pinch driven fusion [Slutz et al., Phys. Plasmas 17, 056303 (2010)] is based on the use of an axial magnetic field to provide plasma thermal insulation from the walls of the imploding liner. The characteristic plasma transport regimes in the proposed approach cover parameter domains that have not yet been studied in either magnetic confinement or inertial confinement experiments. In this article, an analysis is presented of the scalability of the key physical processes that determine the plasma confinement. The dimensionless scaling parameters are identified and the conclusion is drawn that the plasma behavior in scaled-down experiments can correctly represent the full-scale plasma, provided these parameters are approximately the same in the two systems. This observation is important in that smaller-scale experiments typically have better diagnostic access, and more experiments per year are possible.

  12. Simulations for experimental study of warm dense matter and inertial fusion energy applications on NDCX-II

    SciTech Connect

    Barnard, J J; Armijo, J; Bieniosek, F M; Friedman, A; Hay, M J; Henestroza, E; Logan, B G; More, R M; Ni, P A; Perkins, L J; Ng, S; Wurtele, J S; Yu, S S; Zylstra, A B

    2010-03-19

    The Neutralized Drift Compression Experiment II (NDCX II) is an induction accelerator planned for initial commissioning in 2012. The final design calls for a ~3 MeV, ~30 A Li+ ion beam, delivered in a bunch with characteristic pulse duration of 1 ns, and transverse dimension of order 1 mm. The purpose of NDCX II is to carry out experimental studies of material in the warm dense matter regime, and ion beam/hydrodynamic coupling experiments relevant to heavy ion based inertial fusion energy. In preparation for this new machine, we have carried out hydrodynamic simulations of ion-beam-heated, metallic solid targets, connecting quantities related to observables, such as brightness temperature and expansion velocity at the critical frequency, with the simulated fluid density, temperature, and velocity. We examine how these quantities depend on two commonly used equations of state.

  13. Simulations for experimental study of warm dense matter and inertial fusion energy applications on NDCX-II

    SciTech Connect

    Barnard, J.J.; Armijo, J.; Bieniosek, F.M.; Friedman, A.; Hay, M.; Henestroza, E.; Logan, B.G.; More, R.M.; Ni, P.A.; Perkins, L. J.; Ng, S-F.; Wurtele, J.S.; Yu, S.S.; Zylstra, A.B.

    2009-09-01

    The Neutralized Drift Compression Experiment II (NDCX II) is an induction accelerator planned for initial commissioning in 2012. The final design calls for a ~3 MeV, ~30 A Li+ ion beam, delivered in a bunch with characteristic pulse duration of 1 ns, and transverse dimension of order 1 mm. The purpose of NDCX II is to carry out experimental studies of material in the warm dense matter regime, and ion beam/hydrodynamic coupling experiments relevant to heavy ion based inertial fusion energy. In preparation for this new machine, we have carried out hydrodynamic simulations of ion-beam-heated, metallic solid targets, connecting quantities related to observables, such as brightness temperature and expansion velocity at the critical frequency, with the simulated fluid density, temperature, and velocity. We examine how these quantities depend on two commonly used equations of state.

  14. The Jefferson Project: Large-eddy simulations of a watershed

    NASA Astrophysics Data System (ADS)

    Watson, C.; Cipriani, J.; Praino, A. P.; Treinish, L. A.; Tewari, M.; Kolar, H.

    2015-12-01

    The Jefferson Project is a new endeavor at Lake George, NY by IBM Research, Rensselaer Polytechnic Institute (RPI) and The Fund for Lake George. Lake George is an oligotrophic lake - one of low nutrients - and a 30-year study recently published by RPI's Darrin Fresh Water Institute highlighted that its renowned water quality is declining from the injection of salt (from runoff), algae, and invasive species. In response, the Jefferson Project is developing a system to provide extensive data on relevant physical, chemical and biological parameters that drive ecosystem function. The system will be capable of real-time observations and interactive modeling of the atmosphere, watershed hydrology, lake circulation and food web dynamics. In this presentation, we describe the development of the operational forecast system used to simulate the atmosphere in the model stack, Deep Thunder™ (a configuration of the ARW-WRF model). The model performs 48-hr forecasts twice daily in a nested configuration, and in this study we present results from ongoing tests where the innermost domains are dx = 333 m and 111 m. We discuss the model's ability to simulate boundary layer processes, lake surface conditions (an input into the lake model), and precipitation (an input into the hydrology model) during different weather regimes, and the challenges of data assimilation and validation at this scale. We also explore the potential for additional nests over select regions of the watershed to better capture turbulent boundary layer motions.

  15. Spinal fusion

    MedlinePlus

    ... Anterior spinal fusion; Spine surgery - spinal fusion; Low back pain - fusion; Herniated disk - fusion ... If you had chronic back pain before surgery, you will likely still have some pain afterward. Spinal fusion is unlikely to take away all your pain ...

  16. NASA GRC UAS Project: Communications Modeling and Simulation Status

    NASA Technical Reports Server (NTRS)

    Kubat, Greg

    2013-01-01

    The integration of Unmanned Aircraft Systems (UAS) in the National Airspace represents new operational concepts required in civil aviation. These new concepts are evolving as the nation moves toward the Next Generation Air Transportation System (NextGen) under the leadership of the Joint Planning and Development Office (JPDO), and through ongoing work by the Federal Aviation Administration (FAA). The desire and ability to fly UAS in the National Air Space (NAS) in the near term has increased dramatically, and this multi-agency effort to develop and implement a national plan to successfully address the challenges of UAS access to the NAS in a safe and timely manner is well underway. As part of the effort to integrate UAS in the National Airspace, NASA Glenn Research Center is currently involved with providing research into Communications systems and Communication system operations in order to assist with developing requirements for this implementation. In order to provide data and information regarding communication systems performance that will be necessary, NASA GRC is tasked with developing and executing plans for simulations of candidate future UAS command and control communications, in line with architectures and communications technologies being developed and/or proposed by NASA and relevant aviation organizations (in particular, RTCA SC-203). The simulations and related analyses will provide insight into the ability of proposed communications technologies and system architectures to enable safe operation of UAS, meeting UAS in the NAS project goals (including performance requirements, scalability, and interoperability), and ultimately leading to a determination of the ability of NextGen communication systems to accommodate UAS. 
This presentation, compiled by the NASA GRC team, will provide a view of the overall planned simulation effort and objectives, a description of the simulation concept and status of the design and development that has occurred to date.

  17. Laser fusion

    SciTech Connect

    Smit, W.A.; Boskma, P.

    1980-12-01

    Unrestricted laser fusion offers nations an opportunity to circumvent arms control agreements and develop thermonuclear weapons. Early laser weapons research sought a clean radiation-free bomb to replace the fission bomb, but this was deceptive because a fission bomb was needed to trigger the fusion reaction and additional radioactivity was induced by generating fast neutrons. As laser-implosion experiments focused on weapons physics, simulating weapons effects, and applications for new weapons, the military interest shifted from developing a laser-ignited hydrogen bomb to more sophisticated weapons and civilian applications for power generation. Civilian and military research now overlap, making it possible for several countries to continue weapons activities and permitting proliferation of nuclear weapons. These countries are reluctant to include inertial confinement fusion research in the Non-Proliferation Treaty.

  18. GUMICS4 Synthetic and Dynamic Simulations of the ECLAT Project

    NASA Astrophysics Data System (ADS)

    Facsko, G.; Palmroth, M. M.; Gordeev, E.; Hakkinen, L. V.; Honkonen, I. J.; Janhunen, P.; Sergeev, V. A.; Kauristie, K.; Milan, S. E.

    2012-12-01

    The European Commission funded the European Cluster Assimilation Techniques (ECLAT) project as a collaboration of five leading European universities and research institutes. A main contribution of the Finnish Meteorological Institute (FMI) is to provide a wide range of global MHD runs with the Grand Unified Magnetosphere Ionosphere Coupling simulation (GUMICS). The runs are divided into two categories: synthetic runs investigating the extent of solar wind drivers that can influence magnetospheric dynamics, and dynamic runs using measured solar wind data as input. Here we consider the first set of runs, with synthetic solar wind input. The solar wind density, velocity and the interplanetary magnetic field had different magnitudes and orientations; furthermore, two F10.7 flux values were selected for solar radiation minimum and maximum. The solar wind parameter values were held constant such that a stable, constant solution was achieved. All configurations were run several times with three different tilt angles (-15°, 0°, +15°) in the GSE X-Z plane. The Cray XT supercomputer of the FMI provides a unique opportunity in global magnetohydrodynamic simulation: running GUMICS-4 on one year of real solar wind data. Solar wind magnetic field, density, temperature and velocity data based on Advanced Composition Explorer (ACE) and WIND measurements are downloaded from the OMNIWeb open database and a special input file is created for each Cluster orbit. All data gaps are replaced with linear interpolations between the last and first valid data values before and after the gap. A minimum variance transformation is applied to the interplanetary magnetic field data to clean it and to avoid divergence errors in the code. The Cluster orbits are divided into slices allowing parallel computation, and each slice has an average tilt angle value. The file timestamps start one hour before perigee to provide time for building up a magnetosphere in the simulation space. The real

  19. NASA GRC UAS Project - Communications Modeling and Simulation Development Status

    NASA Technical Reports Server (NTRS)

    Apaza, Rafael; Bretmersky, Steven; Dailey, Justin; Satapathy, Goutam; Ditzenberger, David; Ye, Chris; Kubat, Greg; Chevalier, Christine; Nguyen, Thanh

    2014-01-01

    The integration of Unmanned Aircraft Systems (UAS) in the National Airspace represents new operational concepts required in civil aviation. These new concepts are evolving as the nation moves toward the Next Generation Air Transportation System (NextGen) under the leadership of the Joint Planning and Development Office (JPDO), and through ongoing work by the Federal Aviation Administration (FAA). The desire and ability to fly UAS in the National Air Space (NAS) in the near term has increased dramatically, and this multi-agency effort to develop and implement a national plan to successfully address the challenges of UAS access to the NAS in a safe and timely manner is well underway. As part of the effort to integrate UAS in the National Airspace, NASA Glenn Research Center is currently involved with providing research into Communications systems and Communication system operations in order to assist with developing requirements for this implementation. In order to provide data and information regarding communication systems performance that will be necessary, NASA GRC is tasked with developing and executing plans for simulations of candidate future UAS command and control communications, in line with architectures and communications technologies being developed and or proposed by NASA and relevant aviation organizations (in particular, RTCA SC-203). The simulations and related analyses will provide insight into the ability of proposed communications technologies and system architectures to enable safe operation of UAS, meeting UAS in the NAS project goals (including performance requirements, scalability, and interoperability), and ultimately leading to a determination of the ability of NextGen communication systems to accommodate UAS. This presentation, compiled by the NASA GRC Modeling and Simulation team, will provide an update to this ongoing effort at NASA GRC as follow-up to the overview of the planned simulation effort presented at ICNS in 2013. 
The objective

  20. A fully non-linear multi-species Fokker-Planck-Landau collision operator for simulation of fusion plasma

    NASA Astrophysics Data System (ADS)

    Hager, Robert; Yoon, E. S.; Ku, S.; D'Azevedo, E. F.; Worley, P. H.; Chang, C. S.

    2016-06-01

    Fusion edge plasmas can be far from thermal equilibrium and require the use of a non-linear collision operator for accurate numerical simulations. In this article, the non-linear single-species Fokker-Planck-Landau collision operator developed by Yoon and Chang (2014) [9] is generalized to include multiple particle species. The finite volume discretization used in this work naturally yields exact conservation of mass, momentum, and energy. The implementation of this new non-linear Fokker-Planck-Landau operator in the gyrokinetic particle-in-cell codes XGC1 and XGCa is described and results of a verification study are discussed. Finally, the numerical techniques that make our non-linear collision operator viable on high-performance computing systems are described, including specialized load balancing algorithms and nested OpenMP parallelization. The collision operator's good weak and strong scaling behavior is shown.
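    The exact conservation cited above is a generic property of finite volume (flux-form) discretizations: interior fluxes telescope when summed over cells, so the total can change only through the domain boundaries. A one-dimensional sketch of this property, using a simple upwind advection step rather than the actual Fokker-Planck-Landau operator:

```python
import numpy as np

def fv_step(u, dt, dx, vel=1.0):
    """One flux-form finite-volume update; zero flux at the domain boundaries."""
    flux = np.zeros(len(u) + 1)          # fluxes live on cell faces
    flux[1:-1] = vel * u[:-1]            # upwind numerical flux (vel > 0)
    # Each cell gains flux through its left face and loses it through its right
    # face, so interior fluxes cancel pairwise in the sum over all cells.
    return u - dt / dx * (flux[1:] - flux[:-1])

rng = np.random.default_rng(0)
u = rng.random(64)
mass0 = u.sum()
for _ in range(100):
    u = fv_step(u, dt=0.01, dx=0.1)
mass_err = abs(u.sum() - mass0)          # conserved to rounding error
```

    The same telescoping argument underlies the conservation of mass, momentum, and energy in the velocity-space discretization of the collision operator, where the conserved sums are moments of the distribution function.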

  1. Accuracy and convergence of coupled finite-volume/Monte Carlo codes for plasma edge simulations of nuclear fusion reactors

    NASA Astrophysics Data System (ADS)

    Ghoos, K.; Dekeyser, W.; Samaey, G.; Börner, P.; Baelmans, M.

    2016-10-01

    The plasma and neutral transport in the plasma edge of a nuclear fusion reactor is usually simulated using coupled finite volume (FV)/Monte Carlo (MC) codes. However, under conditions of future reactors like ITER and DEMO, convergence issues become apparent. This paper examines the convergence behaviour and the numerical error contributions with a simplified FV/MC model for three coupling techniques: Correlated Sampling, Random Noise and Robbins Monro. Also, practical procedures to estimate the errors in complex codes are proposed. Moreover, first results with more complex models show that an order of magnitude speedup can be achieved without any loss in accuracy by making use of averaging in the Random Noise coupling technique.
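    The Robbins-Monro coupling named above is a stochastic-approximation technique: decaying relaxation factors average out the Monte Carlo noise so that the coupled iteration converges to the noise-free fixed point. A toy sketch on a scalar model problem (the contraction map and the noise level are illustrative assumptions, not the plasma edge model):

```python
import random

def noisy_map(x, rng):
    """Stand-in for a Monte Carlo evaluation: a contraction plus sampling noise.
    The noise-free fixed point of x = 0.2*x + 1.6 is x = 2."""
    return 0.2 * x + 1.6 + rng.gauss(0.0, 0.5)

rng = random.Random(42)
x = 0.0
for n in range(1, 20001):
    a_n = 1.0 / n                 # Robbins-Monro steps: sum a_n diverges, sum a_n^2 converges
    x += a_n * (noisy_map(x, rng) - x)
# x now approximates the noise-free fixed point despite the noisy evaluations.
```

    With a fixed relaxation factor the iterate would keep fluctuating at the noise level; the 1/n schedule trades convergence speed for asymptotically vanishing statistical error, which is the trade-off the paper's error analysis quantifies.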

  2. Transition from Beam-Target to Thermonuclear Fusion in High-Current Deuterium Z-Pinch Simulations.

    PubMed

    Offermann, Dustin T; Welch, Dale R; Rose, Dave V; Thoma, Carsten; Clark, Robert E; Mostrom, Chris B; Schmidt, Andrea E W; Link, Anthony J

    2016-05-13

    Fusion yields from dense, Z-pinch plasmas are known to scale with the drive current, which is favorable for many potential applications. Decades of experimental studies, however, show an unexplained drop in yield for currents above a few mega-ampere (MA). In this work, simulations of DD Z-Pinch plasmas have been performed in 1D and 2D for a constant pinch time and initial radius using the code Lsp, and observations of a shift in scaling are presented. The results show that yields below 3 MA are enhanced relative to pure thermonuclear scaling by beamlike particles accelerated in the Rayleigh-Taylor induced electric fields, while yields above 3 MA are reduced because of energy lost by the instability and the inability of the beamlike ions to enter the pinch region.

  3. A fully non-linear multi-species Fokker–Planck–Landau collision operator for simulation of fusion plasma

    DOE PAGES Beta

    Hager, Robert; Yoon, E. S.; Ku, S.; D'Azevedo, E. F.; Worley, P. H.; Chang, C. S.

    2016-04-04

    Fusion edge plasmas can be far from thermal equilibrium and require the use of a non-linear collision operator for accurate numerical simulations. The non-linear single-species Fokker–Planck–Landau collision operator developed by Yoon and Chang (2014) [9] is generalized to include multiple particle species. Moreover, the finite volume discretization used in this work naturally yields exact conservation of mass, momentum, and energy. The implementation of this new non-linear Fokker–Planck–Landau operator in the gyrokinetic particle-in-cell codes XGC1 and XGCa is described and results of a verification study are discussed. Finally, the numerical techniques that make our non-linear collision operator viable on high-performance computing systems are described, including specialized load balancing algorithms and nested OpenMP parallelization. As a result, the collision operator's good weak and strong scaling behavior is shown.

  4. Susitna Hydroelectric Project: terrestrial environmental workshop and preliminary simulation model

    USGS Publications Warehouse

    Everitt, Robert R.; Sonntag, Nicholas C.; Auble, Gregory T.; Roelle, James E.; Gazey, William

    1982-01-01

    The technical feasibility, economic viability, and environmental impacts of a hydroelectric development project in the Susitna River Basin are being studied by Acres American, Inc. on behalf of the Alaska Power Authority. As part of these studies, Acres American recently contracted LGL Alaska Research Associates, Inc. to coordinate the terrestrial environmental studies being performed by the Alaska Department of Fish and Game and, as subcontractors to LGL, several University of Alaska research groups. LGL is responsible for further quantifying the potential impacts of the project on terrestrial wildlife and vegetation, and for developing a plan to mitigate adverse impacts on the terrestrial environment. The impact assessment and mitigation plan will be included as part of a license application to the Federal Energy Regulatory Commission (FERC) scheduled for the first quarter of 1983. The quantification of impacts, mitigation planning, and design of future research is being organized using a computer simulation modelling approach. Through a series of workshops attended by researchers, resource managers, and policy-makers, a computer model is being developed and refined for use in the quantification of impacts on terrestrial wildlife and vegetation, and for evaluating different mitigation measures such as habitat enhancement and the designation of replacement lands to be managed as wildlife habitat. This report describes the preliminary model developed at the first workshop, held August 23-27, 1982 in Anchorage.

  5. Computer simulations for minds-on learning with ``Project Spectra!''

    NASA Astrophysics Data System (ADS)

    Wood, E. L.; Renfrow, S.; Marks, N.; Christofferson, R.

    2010-12-01

    How do we gain information about the Sun? How do we know Mars has CO2 or that Titan has a nitrogen-rich atmosphere? How do we use light in astronomy? These concepts are something education professionals generally struggle with because they are abstract. Making use of visualizations and presenting material so it can be manipulated is the easiest way to conquer abstractions and bring them home to students. Using simulations and computer interactives (games) where students experience and manipulate the information makes concepts accessible. “Project Spectra!” is a science and engineering program that uses computer-based Flash interactives to expose students to astronomical spectroscopy and actual data in a way that is not possible with traditional in-class activities. Visualizing lessons with multimedia is a way to solidify understanding and retention of knowledge and is completely unlike its paper-and-pencil counterpart. To engage students in “Project Spectra!”, students are given a mission, which connects them with the research at hand. Missions include exploring remote planetary atmospheres and surfaces, experimenting with the Sun using different filters, and comparing spectroscopic atmospheric features between different bodies. Additionally, students have an opportunity to learn about NASA missions, view movies, and see images connected with their mission. In the end, students are asked critical thinking questions and conduct web-based research. These interactives complement the in-class activities where students engineer spectrographs and explore the electromagnetic spectrum.

  6. MULTI-IFE-A one-dimensional computer code for Inertial Fusion Energy (IFE) target simulations

    NASA Astrophysics Data System (ADS)

    Ramis, R.; Meyer-ter-Vehn, J.

    2016-06-01

    The code MULTI-IFE is a numerical tool devoted to the study of Inertial Fusion Energy (IFE) microcapsules. It includes the relevant physics for the implosion and thermonuclear ignition and burning: hydrodynamics of two-component plasmas (ions and electrons), three-dimensional laser light ray-tracing, thermal diffusion, multigroup radiation transport, deuterium-tritium burning, and alpha particle diffusion. The corresponding differential equations are discretized in spherical one-dimensional Lagrangian coordinates. Two typical application examples, a high gain laser driven capsule and a low gain radiation driven marginally igniting capsule, are discussed. In addition to phenomena relevant for IFE, the code also includes components (planar and cylindrical geometries, transport coefficients at low temperature, explicit treatment of Maxwell's equations) that extend its range of applicability to laser-matter interaction at moderate intensities (<10^16 W cm^-2). The source code design has been kept simple and structured with the aim of encouraging users' modifications for specialized purposes.

  7. Mixed Waste Treatment Project: Computer simulations of integrated flowsheets

    SciTech Connect

    Dietsche, L.J.

    1993-12-01

    The disposal of mixed waste, that is, waste containing both hazardous and radioactive components, is a challenging waste management problem of particular concern to DOE sites throughout the United States. Traditional technologies used for the destruction of hazardous wastes need to be re-evaluated for their ability to handle mixed wastes, and in some cases new technologies need to be developed. The Mixed Waste Treatment Project (MWTP) was set up by DOE's Waste Operations Program (EM30) to provide guidance on mixed waste treatment options. One of MWTP's charters is to develop flowsheets for prototype integrated mixed waste treatment facilities which can serve as models for sites developing their own treatment strategies. Evaluation of these flowsheets is being facilitated through the use of computer modelling. The objective of the flowsheet simulations is to provide mass and energy balances, product compositions, and equipment sizing (leading to cost) information. The modelled flowsheets need to be easily modified to examine how alternative technologies and varying feed streams affect the overall integrated process. One such commercially available simulation program is Aspen Plus. This report contains details of the Aspen Plus program.

  8. Enhancing chemical identification efficiency by SAW sensor transients through a data enrichment and information fusion strategy—a simulation study

    NASA Astrophysics Data System (ADS)

    Singh, Prashant; Yadava, R. D. S.

    2013-05-01

    The paper proposes a new approach for improving the odor recognition efficiency of a surface acoustic wave (SAW) transient sensor system based on a single polymer coating. The vapor identity information is hidden in transient response shapes through dependences on specific vapor solvation and diffusion parameters in the polymer coating. Variations in the vapor exposure and purge durations and in the sensor operating frequency are used to create diversity in transient shapes by terminating the vapor-polymer equilibration process at different stages. The transient signals were analyzed by the discrete wavelet transform using the Daubechies-4 mother wavelet basis. The wavelet approximation coefficients were then processed by principal component analysis to create the feature space. The set of principal components defines the vapor identity information. In an attempt to enhance vapor class separability, we analyze two types of information fusion methods. In one, the sensor operating frequency is fixed and the sensing and purge durations are varied; in the second, the sensing and purge durations are fixed and the sensor operating frequency is varied. The fusion is achieved by concatenation of the discrete wavelet coefficients corresponding to the various transients prior to the principal component analysis. Simulation experiments with a polyisobutylene SAW sensor coating for operating frequencies over [55-160] MHz and sensing durations over [5-60] s were analyzed. The target vapors are seven volatile organics: chloroform, chlorobenzene, o-dichlorobenzene, n-heptane, toluene, n-hexane and n-octane, whose concentrations were varied over [10-100] ppm. The simulation data were generated using a SAW sensor transient response model that incorporates the viscoelastic effects due to the polymer coating and an additive noise source in the output. The analysis reveals that: (i) in single transient analysis the class separability increases with sensing duration for a given
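    The feature-construction pipeline described here (wavelet decomposition of each transient, concatenation across operating conditions, then principal component analysis) can be sketched as follows. A one-level Haar transform and synthetic exponential transients stand in for the Daubechies-4 analysis and the SAW response model, so every signal below is an illustrative assumption:

```python
import numpy as np

def haar_approx(signal):
    """One-level Haar approximation coefficients: scaled pairwise averages."""
    s = np.asarray(signal, dtype=float)
    return (s[0::2] + s[1::2]) / np.sqrt(2.0)

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 64)
# 20 synthetic transients per operating condition (two conditions of the
# same samples), each a noisy exponential relaxation.
cond_a = np.array([np.exp(-t / 0.3) + 0.05 * rng.standard_normal(64) for _ in range(20)])
cond_b = np.array([np.exp(-t / 0.5) + 0.05 * rng.standard_normal(64) for _ in range(20)])

# Fusion by concatenation: each sample's wavelet features from the two
# conditions are joined into a single feature vector.
features = np.hstack([
    np.array([haar_approx(x) for x in cond_a]),
    np.array([haar_approx(x) for x in cond_b]),
])                                      # shape (20, 64)

# PCA via SVD of the mean-centered feature matrix; keep two components.
centered = features - features.mean(axis=0)
_, svals, vt = np.linalg.svd(centered, full_matrices=False)
scores = centered @ vt[:2].T            # leading principal-component scores
```

    The point of concatenating before PCA, rather than analyzing each condition separately, is that the principal components can then exploit correlations between the conditions, which is where the class-separability gain reported in the paper comes from.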

  9. Models and Simulations of C60-Fullerene Plasma Jets for Disruption Mitigation and Magneto-Inertial Fusion

    NASA Astrophysics Data System (ADS)

    Bogatu, Ioan-Niculae; Galkin, Sergei A.; Kim, Jin-Soo

    2009-11-01

    We present models and simulation results for C60-fullerene plasma jets proposed for disruption mitigation on ITER and for magneto-inertial fusion (MIF). The model describing the fast production of a large mass of C60 molecular gas in the pulsed power source by explosive sublimation of C60 micro-grains is detailed. Several aspects of the magnetic "piston" model and the 2D interchange (magnetic Rayleigh-Taylor) instability in the rail gun arc dynamics are described. A plasma jet adiabatic expansion model is used to investigate the in-flight three-body recombination during jet transport to the plasma boundary. Our LSP PIC code 3D simulations show that a heavy C60 plasmoid penetrates deeply through a transverse magnetic barrier, demonstrating self-polarization and magnetic field expulsion effects. LSP 3D simulations of the head-on injection of two plasma jets along magnetic field lines for MIF are also discussed.

  10. Appreciating the Complexity of Project Management Execution: Using Simulation in the Classroom

    ERIC Educational Resources Information Center

    Hartman, Nathan S.; Watts, Charles A.; Treleven, Mark D.

    2013-01-01

    As the popularity and importance of project management increase, so does the need for well-prepared project managers. This article discusses our experiences using a project management simulation in undergraduate and MBA classes to help students better grasp the complexity of project management. This approach gives students hands-on experience with…

  11. Plasma viscosity with mass transport in spherical inertial confinement fusion implosion simulations

    SciTech Connect

    Vold, E. L.; Molvig, K.; Joglekar, A. S.; Ortega, M. I.; Moll, R.; Fenn, D.

    2015-11-15

    The effects of viscosity and small-scale atomic-level mixing on plasmas in inertial confinement fusion (ICF) currently represent challenges in ICF research. Many current ICF hydrodynamic codes ignore the effects of viscosity, though recent research indicates viscosity and mixing by classical transport processes may have a substantial impact on implosion dynamics. We have implemented a Lagrangian hydrodynamic code in one-dimensional spherical geometry with plasma viscosity and mass transport, including a three-temperature model for ions, electrons, and radiation treated in a gray radiation diffusion approximation. The code is used to study ICF implosion differences with and without plasma viscosity and to determine the impacts of viscosity on temperature histories and neutron yield. It was found that plasma viscosity has substantial impacts on ICF shock dynamics, characterized by shock burn timing, maximum burn temperatures, convergence ratio, and the time history of neutron production rates. Plasma viscosity reduces the need for artificial viscosity to maintain numerical stability in the Lagrangian formulation and also modifies the flux limiting needed for electron thermal conduction.

  12. Plasma viscosity with mass transport in spherical inertial confinement fusion implosion simulations

    NASA Astrophysics Data System (ADS)

    Vold, E. L.; Joglekar, A. S.; Ortega, M. I.; Moll, R.; Fenn, D.; Molvig, K.

    2015-11-01

    The effects of viscosity and small-scale atomic-level mixing on plasmas in inertial confinement fusion (ICF) currently represent challenges in ICF research. Many current ICF hydrodynamic codes ignore the effects of viscosity, though recent research indicates that viscosity and mixing by classical transport processes may have a substantial impact on implosion dynamics. We have implemented a Lagrangian hydrodynamic code in one-dimensional spherical geometry with plasma viscosity and mass transport, including a three-temperature model for ions, electrons, and radiation, with radiation treated in a gray diffusion approximation. The code is used to study ICF implosion differences with and without plasma viscosity and to determine the impacts of viscosity on temperature histories and neutron yield. It was found that plasma viscosity has substantial impacts on ICF shock dynamics, characterized by shock burn timing, maximum burn temperatures, convergence ratio, and the time history of neutron production rates. Plasma viscosity reduces the need for artificial viscosity to maintain numerical stability in the Lagrangian formulation and also modifies the flux limiting needed for electron thermal conduction.

  13. Heavy ion beam illumination and implosion simulation in inertial confinement fusion

    NASA Astrophysics Data System (ADS)

    Kawata, Shigeo; Ogoyski, A. I.

    2005-10-01

    In direct-driven pellet implosion, heavy ion beams (HIBs) illuminate a spherical target and deposit their energy on the target after the HIB final transport. In our study, we develop a three-dimensional HIB illumination code [1] and a target hydrodynamic implosion code for heavy ion fusion (HIF). The main objectives of our study are to clarify the dependence of multi-HIB illumination non-uniformity on the HIB illumination parameter values in HIF and to calculate the target hydrodynamics during the HIB pulse using our HIB illumination and implosion codes. In our illumination code, we calculate the HIB energy deposition. The target nuclei, target bound electrons, free electrons and target ions contribute to the HIB energy deposition. The HIB ions impinge on the target surface, penetrate relatively deep into the deposition layer and deposit their energy in a rather wide region of the deposition layer: this HIB deposition feature influences the beam illumination non-uniformity. Therefore we calculate the target implosion using the coupled hydrodynamic code in order to investigate the influence of the beam illumination non-uniformity on fuel ignition. [1] T. Someya et al., Phys. Rev. ST Accel. Beams 7, 044701 (2004).
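
    A toy version of the multi-HIB illumination non-uniformity metric can be sketched as follows. The Gaussian angular deposition profile, the beam width, and the Fibonacci-lattice surface sampling are all assumptions made here for illustration; the paper's code computes the actual stopping-power deposition of the beam ions instead.

```python
import math

def sphere_points(n):
    """Quasi-uniform sample points on the unit sphere (Fibonacci lattice)."""
    golden = math.pi * (3.0 - math.sqrt(5.0))
    pts = []
    for k in range(n):
        z = 1.0 - 2.0 * (k + 0.5) / n
        r = math.sqrt(max(0.0, 1.0 - z * z))
        pts.append((r * math.cos(golden * k), r * math.sin(golden * k), z))
    return pts

def illumination_rms(beam_dirs, width=0.8, n_surf=2000):
    """RMS non-uniformity (sigma/mean) of summed beam deposition over the
    target surface.  Each beam is modelled as a Gaussian in the angle
    between the surface point and the beam source direction -- a toy
    angular model, not a stopping-power calculation."""
    dep = []
    for s in sphere_points(n_surf):
        total = 0.0
        for b in beam_dirs:
            c = max(-1.0, min(1.0, s[0]*b[0] + s[1]*b[1] + s[2]*b[2]))
            ang = math.acos(c)
            total += math.exp(-(ang / width) ** 2)
        dep.append(total)
    mean = sum(dep) / len(dep)
    var = sum((d - mean) ** 2 for d in dep) / len(dep)
    return math.sqrt(var) / mean

# octahedral 6-beam arrangement versus 2 opposing beams
six_beams = [(1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)]
two_beams = [(0,0,1), (0,0,-1)]
```

    Even this crude model reproduces the qualitative trend studied in the paper: more beams, better arranged, give a lower RMS non-uniformity.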

  14. Plasma viscosity with mass transport in spherical inertial confinement fusion implosion simulations

    DOE PAGES Beta

    Vold, Erik Lehman; Joglekar, Archis S.; Ortega, Mario I.; Moll, Ryan; Fenn, Daniel; Molvig, Kim

    2015-11-20

    The effects of viscosity and small-scale atomic-level mixing on plasmas in inertial confinement fusion (ICF) currently represent challenges in ICF research. Many current ICF hydrodynamic codes ignore the effects of viscosity, though recent research indicates that viscosity and mixing by classical transport processes may have a substantial impact on implosion dynamics. In this paper, we have implemented a Lagrangian hydrodynamic code in one-dimensional spherical geometry with plasma viscosity and mass transport, including a three-temperature model for ions, electrons, and radiation, with radiation treated in a gray diffusion approximation. The code is used to study ICF implosion differences with and without plasma viscosity and to determine the impacts of viscosity on temperature histories and neutron yield. It was found that plasma viscosity has substantial impacts on ICF shock dynamics characterized by shock burn timing, maximum burn temperatures, convergence ratio, and time history of neutron production rates. Finally, plasma viscosity reduces the need for artificial viscosity to maintain numerical stability in the Lagrangian formulation and also modifies the flux-limiting needed for electron thermal conduction.

  15. Plasma viscosity with mass transport in spherical inertial confinement fusion implosion simulations

    SciTech Connect

    Vold, Erik Lehman; Joglekar, Archis S.; Ortega, Mario I.; Moll, Ryan; Fenn, Daniel; Molvig, Kim

    2015-11-20

    The effects of viscosity and small-scale atomic-level mixing on plasmas in inertial confinement fusion (ICF) currently represent challenges in ICF research. Many current ICF hydrodynamic codes ignore the effects of viscosity, though recent research indicates that viscosity and mixing by classical transport processes may have a substantial impact on implosion dynamics. In this paper, we have implemented a Lagrangian hydrodynamic code in one-dimensional spherical geometry with plasma viscosity and mass transport, including a three-temperature model for ions, electrons, and radiation, with radiation treated in a gray diffusion approximation. The code is used to study ICF implosion differences with and without plasma viscosity and to determine the impacts of viscosity on temperature histories and neutron yield. It was found that plasma viscosity has substantial impacts on ICF shock dynamics characterized by shock burn timing, maximum burn temperatures, convergence ratio, and time history of neutron production rates. Finally, plasma viscosity reduces the need for artificial viscosity to maintain numerical stability in the Lagrangian formulation and also modifies the flux-limiting needed for electron thermal conduction.

  16. Experiments with Memory-to-Memory Coupling for End-to-End fusion Simulation Workflows

    SciTech Connect

    Docan, Ciprian; Zhang, Fan; Parashar, Manish; Cummings, Julian; Podhorszki, Norbert; Klasky, Scott A

    2010-01-01

    Scientific applications are striving to accurately simulate multiple interacting physical processes that comprise complex phenomena being modeled. Efficient and scalable parallel implementations of these coupled simulations present challenging interaction and coordination requirements, especially when the coupled physical processes are computationally heterogeneous and progress at different speeds. In this paper, we present the design, implementation and evaluation of a memory-to-memory coupling framework for coupled scientific simulations on high-performance parallel computing platforms. The framework is driven by the coupling requirements of the Center for Plasma Edge Simulation, and it provides simple coupling abstractions as well as efficient asynchronous (RDMA-based) memory-to-memory data transport mechanisms that complement existing parallel programming systems and data sharing frameworks. The framework enables flexible coupling behaviors that are asynchronous in time and space, and it supports dynamic coupling between heterogeneous simulation processes without enforcing any synchronization constraints. We evaluate the performance and scalability of the coupling framework using a specific coupling scenario, on the Jaguar Cray XT5 system at Oak Ridge National Laboratory.
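
    The asynchronous, buffer-mediated coupling idea described above can be caricatured with two threads and a bounded in-memory queue. This is a local stand-in only: the actual framework moves data between compute nodes with RDMA transfers, and the field names and buffer sizes here are invented.

```python
import queue
import threading

def coupled_run(n_steps=20):
    """Toy asynchronous coupling: a fast 'edge' code publishes field
    slices into a bounded in-memory buffer while a slower 'core' code
    consumes them whenever it is ready -- no lockstep synchronization
    between the two components."""
    buf = queue.Queue(maxsize=4)       # bounded staging buffer
    received = []

    def producer():
        for step in range(n_steps):
            buf.put(("edge_density", step, [0.1 * step] * 8))
        buf.put(None)                  # end-of-run marker

    def consumer():
        while True:
            item = buf.get()
            if item is None:
                break
            received.append(item)

    threads = [threading.Thread(target=producer),
               threading.Thread(target=consumer)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return received

transferred = coupled_run()
```

    The bounded queue is the essential design choice: the producer can run ahead of the consumer by a limited amount, which decouples the two codes in time without letting the faster one exhaust memory.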

  17. Projections of African drought extremes in CORDEX regional climate simulations

    NASA Astrophysics Data System (ADS)

    Gbobaniyi, Emiola; Nikulin, Grigory; Jones, Colin; Kjellström, Erik

    2013-04-01

    We investigate trends in drought extremes for different climate regions of the African continent over a combined historical and future period 1951-2100. Eight CMIP5 coupled atmospheric global climate models (CanESM2, CNRM-CM5, HadGEM2-ES, NorESM1-M, EC-EARTH, MIROC5, GFDL-ESM2M and MPI-ESM-LR) under two forcing scenarios, the representative concentration pathways (RCPs) 4.5 and 8.5, with spatial resolution varying from about 1° to 3°, are downscaled to 0.44° resolution by the Rossby Centre (SMHI) regional climate model RCA4. We use data from the ensuing ensembles of CORDEX-Africa regional climate simulations to explore three drought indices, namely: the standardized precipitation index (SPI), the moisture index (MI) and the difference in precipitation and evaporation (P-E). Meteorological and agricultural drought conditions are assessed in our analyses, and a climate change signal is obtained for the SPI by calculating gamma functions for future SPI with respect to a baseline present climate. Results for the RCP4.5 and RCP8.5 scenarios are inter-compared to assess uncertainties in the future projections. We show that there is a pronounced sensitivity to the choice of forcing GCM, which indicates that assessments of future drought conditions in Africa would benefit from large model ensembles. We also note that the results are sensitive to the choice of drought index. We discuss both spatial and temporal variability of drought extremes for different climate zones of Africa and the importance of the ensemble mean. Our study highlights the usefulness of CORDEX simulations in identifying possible future impacts of climate at local and regional scales.
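
    Of the three drought indices, the SPI is the easiest to sketch. Below is a deliberately simplified, dependency-free version using plain z-scores of running precipitation sums; the operational SPI instead fits a gamma distribution to the sums and maps the fitted CDF through the standard normal quantile, as the abstract notes. The window length and precipitation values are illustrative.

```python
import statistics

def spi_zscore(precip, window=3):
    """Simplified SPI: z-scores of running precipitation sums.  A plain
    standardization replaces the gamma-distribution fit of the real
    index, to keep the sketch dependency-free."""
    sums = [sum(precip[i:i + window]) for i in range(len(precip) - window + 1)]
    mu = statistics.mean(sums)
    sd = statistics.stdev(sums)
    return [(s - mu) / sd for s in sums]

monthly_precip = [30.0, 55.0, 20.0, 80.0, 45.0, 60.0, 25.0, 70.0, 35.0, 50.0]
spi3 = spi_zscore(monthly_precip)   # negative values flag dry spells
```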

  18. A simulation-based and analytic analysis of the off-Hugoniot response of alternative inertial confinement fusion ablator materials

    NASA Astrophysics Data System (ADS)

    Moore, Alastair S.; Prisbrey, Shon; Baker, Kevin L.; Celliers, Peter M.; Fry, Jonathan; Dittrich, Thomas R.; Wu, Kuang-Jen J.; Kervin, Margaret L.; Schoff, Michael E.; Farrell, Mike; Nikroo, Abbas; Hurricane, Omar A.

    2016-09-01

    The attainment of self-propagating fusion burn in an inertial confinement target at the National Ignition Facility will require the use of an ablator with high rocket efficiency and ablation pressure. The ablation material used during the National Ignition Campaign (Lindl et al. 2014) [1], a glow-discharge polymer (GDP), does not couple to the multiple-shock-inducing radiation drive environment created by the laser power profile (Robey et al., 2012) as efficiently as simulations indicated. We investigate the performance of two other ablators, boron carbide (B4C) and high-density carbon (HDC), compared to the performance of GDP under the same hohlraum conditions. Ablation performance is determined through measurement of the shock speed produced in planar samples of the ablator material subjected to identical multiple-shock-inducing radiation drive environments that are similar to a generic three-shock ignition drive. Simulations are in better agreement with the off-Hugoniot performance of B4C than with either HDC or GDP, and analytic estimates of the ablation pressure indicate that while the pressures produced by B4C and GDP are similar when the ablator is allowed to release, the pressure reached by B4C seems to exceed that of HDC when backed by a Au/quartz layer.

  19. The SIMRI project: a versatile and interactive MRI simulator.

    PubMed

    Benoit-Cattin, H; Collewet, G; Belaroussi, B; Saint-Jalmes, H; Odet, C

    2005-03-01

    This paper gives an overview of SIMRI, a new 3D MRI simulator based on the Bloch equation. This simulator offers efficient management of the T2* effect and integrates in a single simulator most of the simulation features that are offered separately in other simulators. It takes into account the main static field value and enables realistic simulations of the chemical shift artifact, including off-resonance phenomena. It also simulates the artifacts linked to static field inhomogeneity, like those induced by susceptibility variation within an object. It is implemented in the C language, and MRI sequence programming is done using high-level C functions with a simple programming interface. To manage large simulations, the magnetization kernel is implemented in a parallelized way that enables simulation on PC grid architectures. Furthermore, this simulator includes a 1D interactive interface, for pedagogic purposes, illustrating the magnetization vector motion as well as the MRI contrasts.
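
    For constant fields, the Bloch-equation core of such a simulator reduces to an exact per-step update: precession about z, T2 decay of the transverse magnetization, and T1 recovery of the longitudinal component. A minimal sketch follows; this is not SIMRI's C implementation, and the relaxation times and frequency are illustrative values only.

```python
import math

def bloch_step(mx, my, mz, dt, t1, t2, omega, m0=1.0):
    """Advance the magnetization one step: free precession at angular
    frequency omega about z, T2 decay of the transverse part, T1
    recovery of the longitudinal part (exact solution of the Bloch
    equation for constant fields, operator-split form)."""
    c, s = math.cos(omega * dt), math.sin(omega * dt)
    e2 = math.exp(-dt / t2)
    mx, my = e2 * (c * mx - s * my), e2 * (s * mx + c * my)
    mz = m0 + (mz - m0) * math.exp(-dt / t1)
    return mx, my, mz

# start tipped into the transverse plane (i.e. just after a 90-degree pulse)
mx, my, mz = 1.0, 0.0, 0.0
for _ in range(5000):
    mx, my, mz = bloch_step(mx, my, mz, dt=1e-3, t1=0.8, t2=0.1,
                            omega=2 * math.pi * 42.0)
```

    After five seconds the transverse signal has fully decayed and the longitudinal magnetization has recovered to its equilibrium value, which is the behaviour any Bloch-based simulator must reproduce.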

  20. Model-data fusion across ecosystems: from multi-site optimizations to global simulations

    NASA Astrophysics Data System (ADS)

    Kuppel, S.; Peylin, P.; Maignan, F.; Chevallier, F.; Kiely, G.; Montagnani, L.; Cescatti, A.

    2014-05-01

    This study uses a variational data assimilation framework to simultaneously constrain a global ecosystem model with eddy covariance measurements of daily net carbon (NEE) and latent heat (LE) fluxes from a large number of sites grouped in seven plant functional types (PFTs). It is an attempt to bridge the gap between the numerous site-specific parameter optimization works found in the literature and the generic parameterization used by most land surface models within each PFT. The present multi-site approach allows deriving PFT-generic sets of optimized parameters enhancing the agreement between measured and simulated fluxes at most of the sites considered, with performances often comparable to those of the corresponding site-specific optimizations. Besides reducing the PFT-averaged model-data root-mean-square difference (RMSD) and the associated daily output uncertainty, the optimization improves the simulated CO2 balance at tropical and temperate forest sites. The major site-level NEE adjustments at the seasonal scale are: reduced amplitude in C3 grasslands and boreal forests, increased seasonality in temperate evergreen forests, and better model-data phasing in temperate deciduous broadleaf forests. Conversely, the poorer performances in tropical evergreen broadleaf forests point to deficiencies regarding the modeling of phenology and soil water stress for this PFT. An evaluation with data-oriented estimates of photosynthesis (GPP) and ecosystem respiration (Reco) rates indicates distinctively improved simulations of both gross fluxes. The multi-site parameter sets are then tested against CO2 concentrations measured at 53 locations around the globe, showing significant adjustments of the modeled seasonality of atmospheric CO2 concentration, whose relevance seems PFT-dependent, along with an improved interannual variability.
Lastly, a global scale evaluation with remote sensing NDVI measurements indicates an improvement of the simulated seasonal variations of
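
    The parameter-optimization step can be illustrated with a toy fit of an annual-cycle model to synthetic flux data by minimizing the RMSD cost. The coordinate-descent search, the sinusoidal "NEE seasonality" model, and all numbers below are invented stand-ins for the paper's variational assimilation.

```python
import math

def rmsd(params, times, obs, model):
    """Root-mean-square difference between model predictions and data."""
    pred = [model(t, params) for t in times]
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

def optimize(times, obs, model, p0, step=0.1, iters=200):
    """Crude coordinate descent on the RMSD cost: try +/- step on each
    parameter, accept improvements, halve the step when stuck."""
    p = list(p0)
    best = rmsd(p, times, obs, model)
    for _ in range(iters):
        improved = False
        for i in range(len(p)):
            for d in (step, -step):
                trial = list(p)
                trial[i] += d
                c = rmsd(trial, times, obs, model)
                if c < best:
                    p, best, improved = trial, c, True
        if not improved:
            step *= 0.5
    return p, best

# toy annual cycle: amplitude and phase of a sinusoidal flux seasonality
model = lambda t, p: p[0] * math.sin(2 * math.pi * t / 365.0 + p[1])
times = list(range(0, 365, 5))
obs = [2.0 * math.sin(2 * math.pi * t / 365.0 + 0.3) for t in times]
p_opt, cost = optimize(times, obs, model, p0=[1.0, 0.0])
```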

  1. Model-data fusion across ecosystems: from multisite optimizations to global simulations

    NASA Astrophysics Data System (ADS)

    Kuppel, S.; Peylin, P.; Maignan, F.; Chevallier, F.; Kiely, G.; Montagnani, L.; Cescatti, A.

    2014-11-01

    This study uses a variational data assimilation framework to simultaneously constrain a global ecosystem model with eddy covariance measurements of daily net ecosystem exchange (NEE) and latent heat (LE) fluxes from a large number of sites grouped in seven plant functional types (PFTs). It is an attempt to bridge the gap between the numerous site-specific parameter optimization works found in the literature and the generic parameterization used by most land surface models within each PFT. The present multisite approach allows deriving PFT-generic sets of optimized parameters enhancing the agreement between measured and simulated fluxes at most of the sites considered, with performances often comparable to those of the corresponding site-specific optimizations. Besides reducing the PFT-averaged model-data root-mean-square difference (RMSD) and the associated daily output uncertainty, the optimization improves the simulated CO2 balance at tropical and temperate forest sites. The major site-level NEE adjustments at the seasonal scale are reduced amplitude in C3 grasslands and boreal forests, increased seasonality in temperate evergreen forests, and better model-data phasing in temperate deciduous broadleaf forests. Conversely, the poorer performances in tropical evergreen broadleaf forests point to deficiencies regarding the modelling of phenology and soil water stress for this PFT. An evaluation with data-oriented estimates of photosynthesis (GPP - gross primary productivity) and ecosystem respiration (Reco) rates indicates distinctively improved simulations of both gross fluxes. The multisite parameter sets are then tested against CO2 concentrations measured at 53 locations around the globe, showing significant adjustments of the modelled seasonality of atmospheric CO2 concentration, whose relevance seems PFT-dependent, along with an improved interannual variability.
Lastly, a global-scale evaluation with remote sensing NDVI (normalized difference vegetation index

  2. Fusion facility siting considerations

    NASA Astrophysics Data System (ADS)

    Bussell, G. T.

    1985-02-01

    Inherent in the fusion program's transition from hydrogen devices to commercial power machines is a general increase in the size and scope of succeeding projects. This growth will lead to increased emphasis on safety, environmental impact, and the external effects of fusion in general, and of each new device in particular. An important consideration in this regard is site selection. Major siting issues that may affect the economics, safety, and environmental impact of fusion are examined.

  3. Teaching Engineering Statistics with Technology, Group Learning, Contextual Projects, Simulation Models and Student Presentations

    ERIC Educational Resources Information Center

    Romeu, Jorge Luis

    2008-01-01

    This article discusses our teaching approach in graduate level Engineering Statistics. It is based on the use of modern technology, learning groups, contextual projects, simulation models, and statistical and simulation software to entice student motivation. The use of technology to facilitate group projects and presentations, and to generate,…

  4. Reflex Project: Using Model-Data Fusion to Characterize Confidence in Analyzes and Forecasts of Terrestrial C Dynamics

    NASA Astrophysics Data System (ADS)

    Fox, A. M.; Williams, M.; Richardson, A.; Cameron, D.; Gove, J. H.; Ricciuto, D. M.; Tomalleri, E.; Trudinger, C.; van Wijk, M.; Quaife, T.; Li, Z.

    2008-12-01

    The Regional Flux Estimation Experiment, REFLEX, is a model-data fusion inter-comparison project, aimed at comparing the strengths and weaknesses of various model-data fusion techniques for estimating carbon model parameters and predicting carbon fluxes and states. The key question addressed here is: what are the confidence intervals on (a) model parameters calibrated from eddy covariance (EC) and leaf area index (LAI) data and (b) model analyses and predictions of net ecosystem C exchange (NEE) and carbon stocks? The experiment has an explicit focus on how different algorithms and protocols quantify the confidence intervals on parameter estimates and model forecasts, given the same model and data. Nine participants contributed results using Metropolis algorithms, Kalman filters and a genetic algorithm. Both observed daily NEE data from FluxNet sites and synthetic NEE data, generated by a model, were used to estimate the parameters and states of a simple C dynamics model. The results of the analyses supported the hypothesis that parameters linked to fast-response processes that mostly determine net ecosystem exchange of CO2 (NEE) were well constrained and well characterised. Parameters associated with turnover of wood and allocation to roots, only indirectly related to NEE, were poorly characterised. There was only weak agreement on estimations of uncertainty on NEE and its components, photosynthesis and ecosystem respiration, with some algorithms successfully locating the true values of these fluxes from synthetic experiments within relatively narrow 90% confidence intervals. This exercise has demonstrated that a range of techniques exist that can generate useful estimates of parameter probability density functions for C models from eddy covariance time series data. When these parameter PDFs are propagated to generate estimates of annual C fluxes there was a wide variation in size of the 90% confidence intervals. However, some algorithms were able to make
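
    One of the algorithm families compared in REFLEX, the Metropolis sampler, can be sketched in a few lines for a single-parameter model with a Gaussian likelihood and a flat prior. The synthetic data, proposal width, and chain length below are invented for illustration; this is not the REFLEX protocol itself.

```python
import math, random, statistics

def metropolis(logpost, x0, n_samples=4000, prop_sd=0.2, burn=500, seed=1):
    """Random-walk Metropolis sampler: propose a Gaussian step, accept
    with probability min(1, posterior ratio), keep post-burn-in draws."""
    rng = random.Random(seed)
    x, lp = x0, logpost(x0)
    chain = []
    for k in range(n_samples + burn):
        xp = x + rng.gauss(0.0, prop_sd)
        lpp = logpost(xp)
        if math.log(rng.random()) < lpp - lp:
            x, lp = xp, lpp
        if k >= burn:
            chain.append(x)
    return chain

# synthetic "flux" observations with one unknown model parameter (the mean)
rng = random.Random(42)
data = [2.0 + rng.gauss(0.0, 1.0) for _ in range(200)]
logpost = lambda mu: -0.5 * sum((d - mu) ** 2 for d in data)  # flat prior
chain = metropolis(logpost, x0=0.0)
posterior_mean = statistics.mean(chain)
ci_halfwidth = 1.645 * statistics.stdev(chain)   # ~90% interval half-width
```

    The chain's spread is exactly the quantity the inter-comparison scrutinized: different algorithms given the same model and data can report quite different 90% confidence intervals.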

  5. Cyclokinetic models and simulations for high-frequency turbulence in fusion plasmas

    NASA Astrophysics Data System (ADS)

    Deng, Zhao; Waltz, R. E.; Wang, Xiaogang

    2016-10-01

    Gyrokinetics is widely applied in plasma physics. However, this framework is limited to weak turbulence levels and low drift-wave frequencies because high-frequency gyro-motion is reduced by the gyro-phase averaging. In order to test where gyrokinetics breaks down, Waltz and Zhao developed a new theory, called cyclokinetics [R. E. Waltz and Zhao Deng, Phys. Plasmas 20, 012507 (2013)]. Cyclokinetics dynamically follows the high-frequency ion gyro-motion which is nonlinearly coupled to the low-frequency drift-waves interrupting and suppressing gyro-averaging. Cyclokinetics is valid in the high-frequency (ion cyclotron frequency) regime or for high turbulence levels. The ratio of the cyclokinetic perturbed distribution function over equilibrium distribution function δf/F can approach 1. This work presents, for the first time, a numerical simulation of nonlinear cyclokinetic theory for ions, and describes the first attempt to completely solve the ion gyro-phase motion in a nonlinear turbulence system. Simulations are performed [Zhao Deng and R. E. Waltz, Phys. Plasmas 22(5), 056101 (2015)] in a local flux-tube geometry with the parallel motion and variation suppressed by using a newly developed code named rCYCLO, which is executed in parallel by using an implicit time-advanced Eulerian (or continuum) scheme [Zhao Deng and R. E. Waltz, Comput. Phys. Commun. 195, 23 (2015)]. A novel numerical treatment of the magnetic moment velocity space derivative operator guarantees accurate conservation of incremental entropy. By comparing the more fundamental cyclokinetic simulations with the corresponding gyrokinetic simulations, the gyrokinetics breakdown condition is quantitatively tested. Gyrokinetic transport and turbulence level recover those of cyclokinetics at high relative ion cyclotron frequencies and low turbulence levels, as required. Cyclokinetic transport and turbulence level are found to be lower than those of gyrokinetics at high turbulence levels and low Ω* values.

  6. Subcascade formation in displacement cascade simulations: Implications for fusion reactor materials

    SciTech Connect

    Stoller, R.E.; Greenwood, L.R.

    1998-03-01

    Primary radiation damage formation in iron has been investigated by the method of molecular dynamics (MD) for cascade energies up to 40 keV. The initial energy E_MD given to the simulated PKA is approximately equivalent to the damage energy in the standard secondary displacement model by Norgett, Robinson, and Torrens (NRT); hence, E_MD is less than the corresponding PKA energy. Using the values of E_MD in Table 1, the corresponding E_PKA and the NRT defects in iron have been calculated using the procedure described in Ref. 1 with the recommended 40 eV displacement threshold. These values are also listed in Table 1. Note that the difference between E_MD and the PKA energy increases as the PKA energy increases, and that the highest simulated PKA energy of 61.3 keV is the average for a collision with a 1.77 MeV neutron. Thus, these simulations have reached well into the fast neutron energy regime. For purposes of comparison, the parameters for the maximum DT neutron energy of 14.1 MeV are also included in Table 1. Although the primary damage parameters derived from the MD cascades exhibited a strong dependence on cascade energy up to 10 keV, this dependence was diminished and slightly reversed between 20 and 40 keV, apparently due to the formation of well-defined subcascades in this energy region. Such an explanation is only qualitative at this time, and additional analysis of the high energy cascades is underway in an attempt to obtain a quantitative measure of the relationship between cascade morphology and defect survival.
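
    The NRT displacement count referenced in the abstract is a simple piecewise formula, nu(T_dam) = 0.8 T_dam / (2 E_d) above roughly 2.5 E_d, which can be written directly:

```python
def nrt_displacements(damage_energy_ev, e_d_ev=40.0):
    """Norgett-Robinson-Torrens (NRT) standard displacement count as a
    function of damage energy T_dam and displacement threshold E_d."""
    t = damage_energy_ev
    if t < e_d_ev:
        return 0.0                      # below threshold: no stable defect
    if t < 2.5 * e_d_ev:
        return 1.0                      # single Frenkel pair regime
    return 0.8 * t / (2.0 * e_d_ev)     # nu = 0.8 * T_dam / (2 * E_d)

# a 40 keV damage-energy cascade in iron with the recommended 40 eV threshold
defects_40kev = nrt_displacements(40.0e3)
```

    The MD result discussed in the abstract is precisely that the actual surviving defect fraction deviates from this linear NRT estimate once subcascades form.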

  7. Four-Dimensional Continuum Gyrokinetic Code: Neoclassical Simulation of Fusion Edge Plasmas

    NASA Astrophysics Data System (ADS)

    Xu, X. Q.

    2005-10-01

    We are developing a continuum gyrokinetic code, TEMPEST, to simulate edge plasmas. Our code represents velocity space via a grid in equilibrium energy and magnetic moment variables, and configuration space via poloidal magnetic flux and poloidal angle. The geometry is that of a fully diverted tokamak (single or double null) and so includes boundary conditions for both closed magnetic flux surfaces and open field lines. The 4-dimensional code includes kinetic electrons and ions, and electrostatic field-solver options, and simulates neoclassical transport. The present implementation is a Method of Lines approach where spatial finite-differences (higher order upwinding) and implicit time advancement are used. We present results of initial verification and validation studies: transition from collisional to collisionless limits of parallel end-loss in the scrape-off layer, self-consistent electric field, and the effect of the real X-point geometry and edge plasma conditions on the standard neoclassical theory, including a comparison of our 4D code with other kinetic neoclassical codes and experiments.
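
    The Method of Lines approach with upwind spatial differences can be illustrated on the 1D advection equation du/dt = -c du/dx. First-order upwinding and explicit forward Euler are used here for brevity; TEMPEST itself uses higher-order upwinding and implicit time advancement, and the grid and profile below are invented.

```python
def upwind_rhs(u, c, dx):
    """Semi-discrete right-hand side of du/dt = -c du/dx using
    first-order upwind differences (c > 0) with periodic boundaries."""
    return [-c * (u[i] - u[i - 1]) / dx for i in range(len(u))]

def advance(u, c, dx, dt, steps):
    """Method of Lines: apply the spatial operator above, then step the
    resulting ODE system in time with forward Euler."""
    for _ in range(steps):
        r = upwind_rhs(u, c, dx)
        u = [u[i] + dt * r[i] for i in range(len(u))]
    return u

n, c = 64, 1.0
dx = 1.0 / n
u0 = [1.0 if 16 <= i < 32 else 0.0 for i in range(n)]
u1 = advance(list(u0), c, dx, dt=0.5 * dx / c, steps=64)   # CFL = 0.5
```

    At CFL 0.5 each update is a convex combination of neighbouring values, so the scheme is monotone: it transports the pulse without creating new extrema, at the cost of some numerical diffusion.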

  8. Simulation of plasma–surface interactions in a fusion reactor by means of QSPA plasma streams: recent results and prospects

    NASA Astrophysics Data System (ADS)

    Garkusha, I. E.; Aksenov, N. N.; Byrka, O. V.; Makhlaj, V. A.; Herashchenko, S. S.; Malykhin, S. V.; Petrov, Yu V.; Staltsov, V. V.; Surovitskiy, S. V.; Wirtz, M.; Linke, J.; Sadowski, M. J.; Skladnik-Sadowska, E.

    2016-09-01

    This paper is devoted to plasma–surface interaction issues at the high heat loads which are typical for fusion reactors. For the International Thermonuclear Experimental Reactor (ITER), which is now under construction, knowledge of erosion processes and the behaviour of various constructional materials under extreme conditions is a critical issue that will determine the successful realization of the project. The most important plasma–surface interaction (PSI) effects in 3D geometry have been studied using a QSPA Kh-50 powerful quasi-stationary plasma accelerator. Mechanisms of droplet and dust generation have been investigated in detail. It was found that droplet emission from castellated surfaces has a threshold character and a cyclic nature. It begins only after a certain number of the irradiating plasma pulses, when molten and shifted material has accumulated at the edges of the castellated structure. This new erosion mechanism, connected with the edge effects, results in an increase in the size of the emitted droplets (as compared with those emitted from a flat surface). This mechanism can even induce the ejection of sub-mm particles. A concept of a new-generation QSPA facility, the current status of this device, and prospects for further experiments are also presented.

  9. Simulation of plasma-surface interactions in a fusion reactor by means of QSPA plasma streams: recent results and prospects

    NASA Astrophysics Data System (ADS)

    Garkusha, I. E.; Aksenov, N. N.; Byrka, O. V.; Makhlaj, V. A.; Herashchenko, S. S.; Malykhin, S. V.; Petrov, Yu V.; Staltsov, V. V.; Surovitskiy, S. V.; Wirtz, M.; Linke, J.; Sadowski, M. J.; Skladnik-Sadowska, E.

    2016-09-01

    This paper is devoted to plasma-surface interaction issues at the high heat loads which are typical for fusion reactors. For the International Thermonuclear Experimental Reactor (ITER), which is now under construction, knowledge of erosion processes and the behaviour of various constructional materials under extreme conditions is a critical issue that will determine the successful realization of the project. The most important plasma-surface interaction (PSI) effects in 3D geometry have been studied using a QSPA Kh-50 powerful quasi-stationary plasma accelerator. Mechanisms of droplet and dust generation have been investigated in detail. It was found that droplet emission from castellated surfaces has a threshold character and a cyclic nature. It begins only after a certain number of the irradiating plasma pulses, when molten and shifted material has accumulated at the edges of the castellated structure. This new erosion mechanism, connected with the edge effects, results in an increase in the size of the emitted droplets (as compared with those emitted from a flat surface). This mechanism can even induce the ejection of sub-mm particles. A concept of a new-generation QSPA facility, the current status of this device, and prospects for further experiments are also presented.

  10. Particle-in-cell simulations of an alpha channeling scenario: electron current drive arising from lower hybrid drift instability of fusion-born ions

    NASA Astrophysics Data System (ADS)

    Cook, James; Chapman, Sandra; Dendy, Richard

    2010-11-01

    Particle-in-cell (PIC) simulations of fusion-born protons in deuterium plasmas demonstrate a key alpha channeling phenomenon for tokamak fusion plasmas. We focus on obliquely propagating modes at the plasma edge, excited by centrally born fusion products on banana orbits, known to be responsible for observations of ion cyclotron emission in JET and TFTR. A fully self-consistent electromagnetic 1D3V PIC code evolves a ring-beam distribution of 3 MeV protons in a 10 keV thermal deuterium-electron plasma with realistic mass ratio. A collective instability occurs, giving rise to electromagnetic field activity in the lower hybrid range of frequencies. Waves spontaneously excited by this lower hybrid drift instability undergo Landau damping on resonant electrons, drawing out an asymmetric tail in the distribution of electron parallel velocities, which constitutes a net current. These simulations demonstrate a key building block of some alpha channeling scenarios: the direct collisionless coupling of fusion product energy into a form which can help sustain the equilibrium of the tokamak.

  11. Kinetic simulations of stimulated Raman backscattering and related processes for the shock-ignition approach to inertial confinement fusion

    SciTech Connect

    Riconda, C.; Weber, S.; Tikhonchuk, V. T.; Heron, A.

    2011-09-15

    A detailed description of stimulated Raman backscattering and related processes for the purpose of inertial confinement fusion requires multi-dimensional kinetic simulations of a full speckle in a high-temperature, large-scale, inhomogeneous plasma. In particular for the shock-ignition scheme operating at high laser intensities, kinetic aspects are predominant. High-intensity (Iλo^2 ≈ 5×10^15 W μm^2/cm^2) as well as low-intensity (Iλo^2 ≈ 10^15 W μm^2/cm^2) cases show the predominance of collisionless, collective processes for the interaction. While the two-plasmon decay instability and the cavitation scenario are hardly affected by intensity variation, inflationary Raman backscattering proves to be very sensitive. Brillouin backscattering evolves on longer time scales and dominates the reflectivities, although it is sensitive to the intensity. Filamentation and self-focusing do occur for all cases but on time scales too long to affect Raman backscattering.

  12. Exponential yield sensitivity to long-wavelength asymmetries in three-dimensional simulations of inertial confinement fusion capsule implosions

    SciTech Connect

    Haines, Brian M.

    2015-08-15

    In this paper, we perform a series of high-resolution 3D simulations of an OMEGA-type inertial confinement fusion (ICF) capsule implosion with varying levels of initial long-wavelength asymmetries in order to establish the physical energy loss mechanism for observed yield degradation due to long-wavelength asymmetries in symcap (gas-filled capsule) implosions. These simulations demonstrate that, as the magnitude of the initial asymmetries is increased, shell kinetic energy is increasingly retained in the shell instead of being converted to fuel internal energy. This is caused by the displacement of fuel mass away from and shell material into the center of the implosion due to complex vortical flows seeded by the long-wavelength asymmetries. These flows are not fully turbulent, but demonstrate mode coupling through non-linear instability development during shell stagnation and late-time shock interactions with the shell interface. We quantify this effect by defining a separation lengthscale between the fuel mass and internal energy and show that this is correlated with yield degradation. The yield degradation shows an exponential sensitivity to the RMS magnitude of the long-wavelength asymmetries. This strong dependence may explain the lack of repeatability frequently observed in OMEGA ICF experiments. In contrast to previously reported mechanisms for yield degradation due to turbulent instability growth, yield degradation is not correlated with mixing between shell and fuel material. Indeed, an integrated measure of mixing decreases with increasing initial asymmetry magnitude due to delayed shock interactions caused by growth of the long-wavelength asymmetries without a corresponding delay in disassembly.
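
    The separation lengthscale between fuel mass and internal energy can be sketched as the distance between two weighted centroids. This 1D radial stand-in (with invented profiles) is a guess at the spirit of the paper's 3D diagnostic, not its actual definition.

```python
def centroid(r, w):
    """Weight-averaged radius of a 1D radial profile."""
    return sum(ri * wi for ri, wi in zip(r, w)) / sum(w)

def separation_lengthscale(r, mass, energy):
    """Distance between the fuel-mass centroid and the internal-energy
    centroid: zero for a perfectly symmetric implosion, growing as
    vortical flows move fuel mass outward and shell material inward."""
    return abs(centroid(r, mass) - centroid(r, energy))

radii = [0.1 * i for i in range(1, 11)]
symmetric = separation_lengthscale(radii, [1.0] * 10, [1.0] * 10)
# asymmetric case: internal energy peaked at the centre, mass displaced outward
displaced = separation_lengthscale(radii,
                                   [float(i) for i in range(1, 11)],
                                   [float(11 - i) for i in range(1, 11)])
```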

  13. Fusion Studies in Japan

    NASA Astrophysics Data System (ADS)

    Ogawa, Yuichi

    2016-05-01

    A new strategic energy plan adopted by the Japanese Cabinet in 2014 strongly supports the steady promotion of nuclear fusion development activities, including the ITER project and the Broader Approach activities, from a long-term viewpoint. The Atomic Energy Commission (AEC) of Japan formulated the Third Phase Basic Program to promote an experimental fusion reactor project. In 2005 the AEC reviewed this program and discussed selection and concentration among the many fusion reactor development projects. In addition to the ITER project, advanced tokamak research with JT-60SA, helical plasma experiments with LHD, the FIREX project in laser fusion research, and fusion engineering with IFMIF were highly prioritized. Although the basic concepts of tokamak, helical and laser fusion research are quite different, they share many common features, such as plasma physics in 3-D magnetic geometry and high power heat loads on plasma-facing components. A synergetic scenario for fusion reactor development spanning the various plasma confinement concepts would therefore be important.

  14. Big fusion, little fusion

    NASA Astrophysics Data System (ADS)

    Chen, Frank

    2016-08-01

    In reply to correspondence from George Scott and Adam Costley about the Physics World focus issue on nuclear energy, and to news of construction delays at ITER, the fusion reactor being built in France.

  15. Fission thrust sail as booster for high Δv fusion based propulsion

    NASA Astrophysics Data System (ADS)

    Ceyssens, Frederik; Wouters, Kristof; Driesen, Maarten

    2015-12-01

    The fission thrust sail as a booster for nuclear-fusion-based rocket propulsion for future starships is introduced and studied. First-order calculations are used together with Monte Carlo simulations to assess system performance. If a D-D fusion rocket, such as that considered in Project Icarus, has relatively low efficiency (~30%) in converting fusion fuel to a directed exhaust, adding a fission sail is shown to be beneficial for the obtainable delta-v. In addition, this type of fission-fusion hybrid propulsion has the potential to improve acceleration and to act as a micrometeorite shield.
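    The benefit of a more efficient exhaust can be sketched with the classical Tsiolkovsky rocket equation. The D-D energy fraction, the efficiencies, and the mass ratio below are illustrative assumptions, and the fission sail is loosely modeled as simply raising the effective conversion efficiency, not as the paper's actual propulsion model:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def delta_v(v_exhaust, mass_ratio):
    """Classical Tsiolkovsky rocket equation (adequate well below c)."""
    return v_exhaust * math.log(mass_ratio)

# Illustrative numbers only: D-D fusion releases roughly 0.1% of the fuel rest
# mass as energy (eps); with a fraction eta of that energy directed into the
# exhaust jet, the effective exhaust velocity is v_e = sqrt(2 * eta * eps) * c.
eps = 1e-3
mass_ratio = 50.0

v_e_low = math.sqrt(2 * 0.30 * eps) * C   # ~30% conversion efficiency
dv_low = delta_v(v_e_low, mass_ratio)

# A hypothetical booster stage that raises the effective efficiency to 90%:
v_e_high = math.sqrt(2 * 0.90 * eps) * C
dv_high = delta_v(v_e_high, mass_ratio)
print(dv_low / C, dv_high / C)
```

    Because delta-v scales with the exhaust velocity, which scales as the square root of the conversion efficiency, tripling the efficiency raises the obtainable delta-v by about 73% at a fixed mass ratio.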

  16. Secretarial Administration: Project In/Vest: Insurance Simulation Insures Learning

    ERIC Educational Resources Information Center

    Geier, Charlene

    1978-01-01

    Describes a simulated model office, set up in Greenfield High School, Wisconsin, to replicate various insurance occupations. Local insurance agents and students from other disciplines, such as distributive education, are involved in the simulation. The training is applicable to other business office positions, as it models not only an insurance…

  17. Project Shuttle simulation math model coordination catalog, revision 1

    NASA Technical Reports Server (NTRS)

    1974-01-01

    A catalog is presented of subsystem and environment math models used or planned for space shuttle simulations. Its purpose is to facilitate the sharing of similar math models between shuttle simulations. It provides information on math model requirements, formulations, schedules, and contact persons for further information.

  18. Numerical analysis of applied magnetic field dependence in Malmberg-Penning Trap for compact simulator of energy driver in heavy ion fusion

    NASA Astrophysics Data System (ADS)

    Sato, T.; Park, Y.; Soga, Y.; Takahashi, K.; Sasaki, T.; Kikuchi, T.; Harada, Nob

    2016-05-01

    To simulate the pulse compression process of space-charge-dominated beams in heavy ion fusion, we have carried out a multi-particle numerical simulation of an equivalent beam using the Malmberg-Penning trap device. The results show that both the transverse and longitudinal velocities, as functions of the external magnetic field strength, increase during longitudinal compression. The influence of the space-charge effect, which is related to the external magnetic field, was observed as an increase in high-velocity particles at weak external magnetic field strength.

  19. The Maya Project: Numerical Simulations of Black Hole Collisions

    NASA Astrophysics Data System (ADS)

    Smith, Kenneth; Calabrese, Gioel; Garrison, David; Kelly, Bernard; Laguna, Pablo; Lockitch, Keith; Pullin, Jorge; Shoemaker, Deirdre; Tiglio, Manuel

    2001-04-01

    The main objective of the MAYA project is the development of a numerical code to solve the vacuum Einstein's field equations for spacetimes containing multiple black hole singularities. Incorporating knowledge gained from previous similar efforts (Binary Black Holes Alliance and the AGAVE project) as well as one-dimensional numerical studies, MAYA has been built from the ground up within the architecture of Cactus 4.0, with particular attention paid to the software engineering aspects of code development. The goal of this new effort is to ultimately have a robust, efficient, readable, and stable numerical code for black hole evolution. This poster presents an overview of the project, focusing on the innovative aspects of the project as well as its current development status.

  20. A Student Project to use Geant4 Simulations for a TMS-PET combination

    NASA Astrophysics Data System (ADS)

    Altamirano, A.; Chamorro, A.; Hurtado, K.; Romero, C.; Rueda, A.; Solano Salinas, C. J.; Wahl, D.; Zamudio, A.

    2007-10-01

    Geant4 is one of the most powerful tools for MC simulation of detectors and their applications. We present a student project to simulate a combined Transcranial Magnetic Stimulation-Positron Emission Tomography (TMS-PET) system using Geant4. This project aims to study PET-TMS systems by implementing a model for the brain response to the TMS pulse and studying the simulated PET response. In order to increase the speed of the simulations we parallelise our programs and investigate the possibility of using GRID computing.

  1. A Student Project to use Geant4 Simulations for a TMS-PET combination

    SciTech Connect

    Altamirano, A.; Chamorro, A.; Hurtado, K.; Romero, C.; Wahl, D.; Zamudio, A.; Rueda, A.; Solano Salinas, C. J.

    2007-10-26

    Geant4 is one of the most powerful tools for MC simulation of detectors and their applications. We present a student project to simulate a combined Transcranial Magnetic Stimulation-Positron Emission Tomography (TMS-PET) system using Geant4. This project aims to study PET-TMS systems by implementing a model for the brain response to the TMS pulse and studying the simulated PET response. In order to increase the speed of the simulations we parallelise our programs and investigate the possibility of using GRID computing.

  2. Computer Simulation Performed for Columbia Project Cooling System

    NASA Technical Reports Server (NTRS)

    Ahmad, Jasim

    2005-01-01

    This demo shows a high-fidelity simulation of the air flow in the main computer room housing the Columbia system (10,240 Intel Itanium processors). The simulation assessed the performance of the cooling system, identified deficiencies, and recommended modifications to eliminate them. It used two in-house software packages on NAS supercomputers: Chimera Grid Tools, to generate a geometric model of the computer room, and the OVERFLOW-2 code, for fluid and thermal simulation. This state-of-the-art technology can easily be extended to provide a general air flow analysis capability for any modern computer room.

  3. Warm starting the projected Gauss-Seidel algorithm for granular matter simulation

    NASA Astrophysics Data System (ADS)

    Wang, Da; Servin, Martin; Berglund, Tomas

    2016-03-01

    The effect of warm starting the projected Gauss-Seidel solver on its convergence in nonsmooth discrete element simulation of granular matter is investigated. It is found that the computational performance can be increased by a factor of 2-5.
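    The idea can be sketched with a minimal projected Gauss-Seidel solver for a small complementarity problem, warm started from the previous "time step". The random symmetric positive-definite system below is a stand-in for actual contact constraints, not the paper's formulation:

```python
import numpy as np

def projected_gauss_seidel(A, b, x0, tol=1e-10, max_iter=10_000):
    """Solve the LCP  0 <= x  perp  A x - b >= 0  by projected Gauss-Seidel:
    sweep the unknowns, updating each and clamping it to the feasible set."""
    x = x0.copy()
    for it in range(max_iter):
        delta = 0.0
        for i in range(len(b)):
            xi_new = max(0.0, x[i] - (A[i] @ x - b[i]) / A[i, i])
            delta = max(delta, abs(xi_new - x[i]))
            x[i] = xi_new
        if delta < tol:
            return x, it + 1
    return x, max_iter

# A small symmetric positive-definite system standing in for contact constraints.
rng = np.random.default_rng(0)
M = rng.standard_normal((20, 20))
A = M @ M.T + 20 * np.eye(20)
b = rng.standard_normal(20)

x_cold, iters_cold = projected_gauss_seidel(A, b, np.zeros(20))

# "Next time step": perturb b slightly; warm start from the previous solution
# and compare the iteration count against a cold start from zero.
b2 = b + 0.01 * rng.standard_normal(20)
_, iters_warm = projected_gauss_seidel(A, b2, x_cold)
_, iters_cold2 = projected_gauss_seidel(A, b2, np.zeros(20))
print(iters_warm, iters_cold2)
```

    Because successive time steps of a granular simulation change the contact problem only slightly, the previous solution is already close to the new one, and the warm-started sweep needs far fewer iterations to reach the same tolerance.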

  4. How historic simulation-observation discrepancy affects future warming projections in a very large model ensemble

    NASA Astrophysics Data System (ADS)

    Goodwin, Philip

    2016-10-01

    Projections of future climate made by model ensembles have credibility because the historic simulations by these models are consistent, or nearly consistent, with historic observations. However, it is not known how small inconsistencies between the ranges of observed and simulated historic climate change affect the future projections made by a model ensemble. Here, the impact of historical simulation-observation inconsistencies on future warming projections is quantified in a 4-million-member Monte Carlo ensemble from a new efficient Earth System Model (ESM). Of the 4 million ensemble members, a subset of 182,500 are consistent with the historic ranges of warming, heat uptake and carbon uptake simulated by the Coupled Model Intercomparison Project Phase 5 (CMIP5) ensemble. This simulation-consistent subset projects similar future warming ranges to the CMIP5 ensemble for all four RCP scenarios, indicating the new ESM represents an efficient tool to explore parameter space for future warming projections based on historic performance. A second subset of 14,500 ensemble members are consistent with historic observations of warming, heat uptake and carbon uptake. This observation-consistent subset projects a narrower range for future warming, with the lower bounds of projected warming still similar to CMIP5, but the upper warming bounds reduced by 20-35%. These findings suggest that part of the upper range of twenty-first-century CMIP5 warming projections may reflect historical simulation-observation inconsistencies. However, the agreement of the lower bounds for projected warming implies that the likelihood of warming exceeding dangerous levels over the twenty-first century is unaffected by small discrepancies between CMIP5 models and observations.
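    The subsetting procedure can be sketched with a toy one-parameter "climate model". The forcings, parameter ranges, and "observed" window below are invented for illustration and do not correspond to the paper's ESM or its observational constraints:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Toy model: historic warming = sensitivity * historic forcing + offset;
# future warming = sensitivity * future forcing + offset. All values are
# illustrative stand-ins, not calibrated climate parameters.
sensitivity = rng.uniform(0.2, 1.2, n)   # K per (W/m^2)
offset = rng.normal(0.0, 0.05, n)
historic = sensitivity * 1.0 + offset    # assumed historic forcing: 1 W/m^2
future = sensitivity * 4.0 + offset      # assumed scenario forcing: 4 W/m^2

# Keep only members whose simulated historic warming falls inside an
# assumed "observed" range, then compare upper bounds of future warming.
consistent = (historic > 0.6) & (historic < 0.9)
full_upper = np.percentile(future, 95)
subset_upper = np.percentile(future[consistent], 95)
print(full_upper, subset_upper)
```

    Discarding members that overshoot the historic range removes the high-sensitivity tail, so the observation-consistent subset's upper warming bound is lower than the full ensemble's, mirroring the 20-35% reduction reported in the abstract.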

  5. Non-Gaussian fluctuations and non-Markovian effects in the nuclear fusion process: Langevin dynamics emerging from quantum molecular dynamics simulations.

    PubMed

    Wen, Kai; Sakata, Fumihiko; Li, Zhu-Xia; Wu, Xi-Zhen; Zhang, Ying-Xun; Zhou, Shan-Gui

    2013-07-01

    Macroscopic parameters as well as precise information on the random force characterizing the Langevin-type description of the nuclear fusion process around the Coulomb barrier are extracted from the microscopic dynamics of individual nucleons by exploiting the numerical simulation of the improved quantum molecular dynamics. It turns out that the dissipation dynamics of the relative motion between two fusing nuclei is caused by a non-Gaussian distribution of the random force. We find that the friction coefficient as well as the time correlation function of the random force takes particularly large values in a region a little bit inside of the Coulomb barrier. A clear non-Markovian effect is observed in the time correlation function of the random force. It is further shown that an emergent dynamics of the fusion process can be described by the generalized Langevin equation with memory effects by appropriately incorporating the microscopic information of individual nucleons through the random force and its time correlation function.

  6. Simulation of a Forensic Chemistry Problem: A Multidisciplinary Project for Secondary School Chemistry Students

    NASA Astrophysics Data System (ADS)

    Long, G. A.

    1995-09-01

    A multidisciplinary chemistry project for high school students is presented based upon a forensic theme and team problem solving approach. The project involves data collection and interpretation using FTIR, HPLC, NMR, and TLC. Simulated evidence sample formulations and a sample assignment scheme are presented.

  7. Improving Faculty Perceptions of and Intent to Use Simulation: An Intervention Project

    ERIC Educational Resources Information Center

    Tucker, Charles

    2013-01-01

    Human patient simulation is an innovative teaching strategy that can facilitate practice development and preparation for entry into today's healthcare environment for nursing students. Unfortunately, the use of human patient simulation has been limited due to the perceptions of nursing faculty members. This project sought to explore those…

  8. The Multi-SAG project: filling the MultiDark simulations with semi-analytic galaxies

    NASA Astrophysics Data System (ADS)

    Vega-Martínez, C. A.; Cora, S. A.; Padilla, N. D.; Muñoz Arancibia, A. M.; Orsi, A. A.; Ruiz, A. N.

    2016-08-01

    The semi-analytical model sag is a code for galaxy formation and evolution which is applied to halo catalogs and merger trees extracted from cosmological N-body simulations of dark matter. This contribution describes the project of constructing a catalog of simulated galaxies by adapting and applying the sag model to two publicly available dark matter simulations of the Spanish MultiDark Project. The simulation boxes have sizes of 1000 Mpc and 400 Mpc, respectively, with Planck cosmological parameters. They cover a large range of halo masses, and each simulation is able to produce more than 150 million simulated galaxies. A detailed description of the method is given, and the first statistical results are shown.

  9. The BOUT Project: Validation and Benchmark of BOUT Code and Experimental Diagnostic Tools for Fusion Boundary Turbulence

    SciTech Connect

    Xu, X Q

    2001-08-09

    A boundary plasma turbulence code, BOUT, is presented. Preliminary, encouraging results have been obtained in comparisons with probe measurements for a typical Ohmic discharge in the HT-7 tokamak. Validation and benchmarking of the BOUT code and experimental diagnostic tools for fusion boundary plasma turbulence are proposed.

  10. The BOUT Project: Validation and Benchmark of BOUT Code and Experimental Diagnostic Tools for Fusion Boundary Turbulence

    NASA Astrophysics Data System (ADS)

    Xu, Xue-qiao

    2001-10-01

    A boundary plasma turbulence code, BOUT, is presented. Preliminary, encouraging results have been obtained in comparisons with probe measurements for a typical Ohmic discharge in the HT-7 tokamak. Validation and benchmarking of the BOUT code and experimental diagnostic tools for fusion boundary plasma turbulence are proposed.

  11. The Tokamak Fusion Test Reactor decontamination and decommissioning project and the Tokamak Physics Experiment at the Princeton Plasma Physics Laboratory. Environmental Assessment

    SciTech Connect

    1994-05-27

    If the US is to meet the energy needs of the future, it is essential that new technologies emerge to compensate for dwindling supplies of fossil fuels and the eventual depletion of fissionable uranium used in present-day nuclear reactors. Fusion energy has the potential to become a major source of energy for the future. Power from fusion energy would provide a substantially reduced environmental impact as compared with other forms of energy generation. Since fusion utilizes no fossil fuels, there would be no release of chemical combustion products to the atmosphere. Additionally, there are no fission products formed to present handling and disposal problems, and runaway fuel reactions are impossible due to the small amounts of deuterium and tritium present. The purpose of the TPX Project is to support the development of the physics and technology to extend tokamak operation into the continuously operating (steady-state) regime, and to demonstrate advances in fundamental tokamak performance. The purpose of TFTR D&D is to ensure compliance with DOE Order 5820.2A, "Radioactive Waste Management," and to remove environmental and health hazards posed by the TFTR in a non-operational mode. There are two proposed actions evaluated in this environmental assessment (EA). The actions are related because one must take place before the other can proceed. The proposed actions assessed in this EA are: the decontamination and decommissioning (D&D) of the Tokamak Fusion Test Reactor (TFTR); to be followed by the construction and operation of the Tokamak Physics Experiment (TPX). Both of these proposed actions would take place primarily within the TFTR Test Cell Complex at the Princeton Plasma Physics Laboratory (PPPL). The TFTR is located on "D-site" at the James Forrestal Campus of Princeton University in Plainsboro Township, Middlesex County, New Jersey, and is operated by PPPL under contract with the United States Department of Energy (DOE).

  12. A system simulation development project: Leveraging resources through partnerships

    NASA Technical Reports Server (NTRS)

    Drummond, Colin K.; Owen, A. Karl; Davis, Milt W.

    1995-01-01

    Partnerships between government agencies are an intellectually attractive method of conducting scientific research; the goal is to establish mutually beneficial participant roles for technology exchange that ultimately pay off in a stronger R&D program for each partner. Anticipated and current aerospace research budgetary pressures through the 1990s provide additional impetus for government research agencies to candidly assess their R&D for those simulation activities no longer unique enough to warrant 'going it alone,' or for those elements where partnerships or teams can offset development costs. This paper describes a specific inter-agency system simulation activity that leverages the development cost of mutually beneficial R&D. While the direct positive influence of partnerships on complex technology developments is our main thesis, we also address ongoing teaming issues and hope to impart to the reader the immense indirect (sometimes immeasurable) benefits that meaningful interagency partnerships can produce.

  13. Final Technical Report for Center for Plasma Edge Simulation Research

    SciTech Connect

    Pankin, Alexei Y.; Bateman, Glenn; Kritz, Arnold H.

    2012-02-29

    The CPES research carried out by the Lehigh fusion group has sought to satisfy the evolving requirements of the CPES project. Overall, the Lehigh group has focused on verification and validation of the codes developed and/or integrated in the CPES project. Consequently, contacts and interaction with experimentalists have been maintained during the course of the project. Prof. Arnold Kritz, the leader of the Lehigh Fusion Group, has participated in the executive management of the CPES project. The code development and simulation studies carried out by the Lehigh fusion group are described in more detail in the sections below.

  14. Simulation of slag control for the Plasma Hearth Project

    SciTech Connect

    Power, M.A.; Carney, K.P.; Peters, G.G.

    1996-12-31

    The goal of the Plasma Hearth Project is to stabilize alpha-emitting radionuclides in a vitreous slag and to reduce the effective storage volume of actinide-containing waste for long-term burial. The actinides have been shown to partition into the vitreous slag phase of the melt. The slag composition may be changed by adding glass-former elements to ensure that this removable slag has the most desired physical and chemical properties for long-term burial. A data acquisition and control system has been designed to regulate the composition of five elements in the slag.

  15. Fast Simulation of X-ray Projections of Spline-based Surfaces using an Append Buffer

    PubMed Central

    Maier, Andreas; Hofmann, Hannes G.; Schwemmer, Chris; Hornegger, Joachim; Keil, Andreas; Fahrig, Rebecca

    2012-01-01

    Many scientists in the field of x-ray imaging rely on the simulation of x-ray images. As phantom models become more and more realistic, their projection requires high computational effort. Since x-ray images are based on transmission, many standard graphics acceleration algorithms cannot be applied to this task. However, if adapted properly, simulation speed can be increased dramatically using state-of-the-art graphics hardware. A custom graphics pipeline that simulates transmission projections for tomographic reconstruction was implemented based on moving spline surface models. All steps, from tessellation of the splines to projection onto the detector and drawing, are implemented in OpenCL. For increased performance, we introduced a special append buffer that stores, for every ray, its intersections with the scene. Intersections are then sorted and resolved to materials. Lastly, an absorption model is evaluated to yield an absorption value for each projection pixel. Projection of a moving spline structure is fast and accurate: projections of size 640×480 can be generated within 254 ms, and reconstructions using the projections show errors below 1 HU with a sharp reconstruction kernel. Traditional GPU-based acceleration schemes are not suitable for our reconstruction task. Even in the absence of noise, they result in errors up to 9 HU on average, although the projection images appear correct under visual examination. Projections generated with our new method are suitable for the validation of novel CT reconstruction algorithms. For complex simulations, such as the evaluation of motion-compensated reconstruction algorithms, this kind of x-ray simulation will reduce the computation time dramatically. Source code is available at http://conrad.stanford.edu/ PMID:22975431
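    The sort-and-resolve step followed by the absorption model can be sketched as follows. The append buffer is simplified here to hold per-ray (entry, exit, material) intervals, and the attenuation coefficients are hypothetical values, not the paper's material data:

```python
import math

# Hypothetical linear attenuation coefficients (1/mm); illustrative only.
MU = {"tissue": 0.02, "bone": 0.5}

def resolve_ray(segments):
    """segments: unordered (entry_mm, exit_mm, material) intervals collected
    in an append buffer for one ray. Sort them by depth, accumulate the line
    integral of attenuation, and apply the Beer-Lambert absorption model."""
    integral = sum(MU[mat] * (d_out - d_in)
                   for d_in, d_out, mat in sorted(segments))
    return math.exp(-integral)

# A ray crossing 3-6 mm of tissue, 6-10 mm of bone, and 10-13 mm of tissue,
# recorded out of order, as appends from parallel threads would arrive.
ray = [(10.0, 13.0, "tissue"), (3.0, 6.0, "tissue"), (6.0, 10.0, "bone")]
intensity = resolve_ray(ray)
print(intensity)
```

    The need for this per-ray sort is exactly why standard depth-buffered rasterization does not fit transmission imaging: every surface crossing along the ray contributes to the final pixel, not just the nearest one.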

  16. Label fusion strategy selection.

    PubMed

    Robitaille, Nicolas; Duchesne, Simon

    2012-01-01

    Label fusion is used in medical image segmentation to combine several different labels of the same entity into a single discrete label that is potentially more accurate, with respect to the exact, sought segmentation, than the best input element. Using simulated data, we compared three existing label fusion techniques, STAPLE, Voting, and Shape-Based Averaging (SBA), and observed that none could be considered superior in all cases: which performed best depended on the dissimilarity between the input elements. We thus developed an empirical, hybrid technique called SVS, which selects the most appropriate technique to apply based on this dissimilarity. We evaluated the label fusion strategies on two- and three-dimensional simulated data and showed that SVS is superior to any of the three existing methods examined. On real data, we used SVS to perform fusions of 10 segmentations of the hippocampus and amygdala in 78 subjects from the ICBM dataset. SVS selected SBA in almost all cases, which was the most appropriate method overall. PMID:22518113
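    As a minimal example of one of the compared strategies, majority voting can be sketched as follows; the 1-D "segmentations" below are invented for illustration:

```python
import numpy as np

def vote_fusion(labels):
    """Majority-vote label fusion: labels is a (raters, voxels) integer
    array; each fused voxel takes the most frequent label across raters."""
    fused = np.empty(labels.shape[1], dtype=labels.dtype)
    for v in range(labels.shape[1]):
        values, counts = np.unique(labels[:, v], return_counts=True)
        fused[v] = values[np.argmax(counts)]
    return fused

# Three noisy 1-D "segmentations" of a known ground truth (labels 0, 1, 2).
truth = np.array([0, 0, 1, 1, 1, 2, 2, 0])
raters = np.array([
    [0, 0, 1, 1, 1, 2, 2, 0],   # perfect
    [0, 1, 1, 1, 1, 2, 0, 0],   # two errors
    [0, 0, 1, 2, 1, 2, 2, 0],   # one error
])
fused = vote_fusion(raters)
print(fused)
```

    With independent errors, voting recovers the ground truth here; the abstract's point is that when input segmentations disagree systematically rather than randomly, other strategies (SBA, STAPLE) can be preferable, motivating a selection rule such as SVS.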

  17. Scenario Based Education as a Framework for Understanding Students Engagement and Learning in a Project Management Simulation Game

    ERIC Educational Resources Information Center

    Misfeldt, Morten

    2015-01-01

    In this paper I describe how students use a project management simulation game based on an attack-defense mechanism where two teams of players compete by challenging each other's projects. The project management simulation game is intended to be played by pre-service construction workers and engineers. The gameplay has two parts: a planning part,…

  18. Model and simulation of fringe projection measurements as part of an assistance system for multi-component fringe projection sensors

    NASA Astrophysics Data System (ADS)

    Weckenmann, Albert; Hartmann, Wito; Weickmann, Johannes

    2008-09-01

    Multi-component fringe projection sensors allow fast, holistic, exact, robust, contact-free sampling of a workpiece surface. The success of an inspection relies on the skills, diligence and experience of the inspection planner, and no standardized method for setting up an inspection has been established yet. There is therefore a need for assistance systems to support the operator. A prototype of such an assistance system for multi-component fringe projection sensors is introduced. The assistance system supports the inspection planner in determining the ideal sighting and positioning strategy. As its key element, the result of a planned inspection is simulated. First, the optical performance of the designated fringe projection sensor is calculated using ray-tracing software. Then the measurement result and the measurement uncertainty for specific measurement tasks and a chosen measuring pose are simulated. The foundation of this simulation is a complete mathematical-physical model of the measurement. Building on this, and on knowledge of influences previously entered in input masks, the measurement uncertainty can be estimated and displayed individually for each point of a workpiece surface. Thus the inspection planner can easily evaluate the quality of the planned inspection setup. Additional optimization algorithms were implemented. The aim of the multi-criteria optimization is to determine the best configuration of the measurement device and the ideal sighting and positioning strategy. The measure of quality is the reduction of the measurement uncertainty.

  19. Nuclear Fusion

    NASA Astrophysics Data System (ADS)

    Veres, G.

    This chapter is devoted to the fundamental concepts of nuclear fusion. To be more precise, it is devoted to the theoretical basics of fusion reactions between light nuclei such as hydrogen, helium, boron, and lithium. The discussion is limited because our purpose is to focus on laboratory-scale fusion experiments that aim at gaining energy from the fusion process. After discussing the methods of calculating the fusion cross section, it will be shown that sustained fusion reactions with energy gain must happen in a thermal medium because, in beam-target experiments, the energy of the beam is randomized faster than the fusion rate. Following a brief introduction to the elements of plasma physics, the chapter is concluded with the introduction of the most prominent fusion reactions ongoing in the Sun.

  20. Project ARGO: Gas phase formation in simulated microgravity

    NASA Technical Reports Server (NTRS)

    Powell, Michael R.; Waligora, James M.; Norfleet, William T.; Kumar, K. Vasantha

    1993-01-01

    The ARGO study investigated the reduced incidence of joint-pain decompression sickness (DCS) encountered in microgravity as compared with the expected incidence of joint-pain DCS experienced by test subjects in Earth-based laboratories (unit gravity) under similar protocols. Individuals who are decompressed from saturated conditions usually acquire joint-pain DCS in the lower extremities. Our hypothesis is that the incidence of joint-pain DCS can be limited by a significant reduction in the tissue gas micronuclei formed by stress-assisted nucleation; reductions in dynamic and kinetic stresses in vivo are linked to the hypokinetic and adynamic conditions of individuals in zero g. We employed the Doppler ultrasound bubble detection technique in simulated microgravity studies to determine quantitatively the degree of gas phase formation in the upper and lower extremities of test subjects during decompression. We found no evidence of right-to-left shunting through the pulmonary vasculature. The volume of gas bubbles following decompression was examined and compared with that following saline contrast injection. From this, we predict a reduced incidence of DCS on orbit, although the predicted incidence of mild DCS still remains larger than that encountered on orbit.

  1. Modeling and Simulation Optimization and Feasibility Studies for the Neutron Detection without Helium-3 Project

    SciTech Connect

    Ely, James H.; Siciliano, Edward R.; Swinhoe, Martyn T.; Lintereur, Azaree T.

    2013-01-01

    This report details the results of the modeling and simulation work accomplished for the ‘Neutron Detection without Helium-3’ project during the 2011 and 2012 fiscal years. The primary focus of the project is to investigate commercially available technologies that might be used in safeguards applications in the relatively near term. Other technologies that are being developed may be more applicable in the future, but are outside the scope of this study.

  2. Particle-in-cell simulations of the excitation mechanism for fusion-product-driven ion cyclotron emission from tokamaks

    NASA Astrophysics Data System (ADS)

    Dendy, Richard; Cook, James; Chapman, Sandra

    2009-11-01

    Suprathermal ion cyclotron emission (ICE) was the first collective radiative instability driven by fusion products to be observed on JET and TFTR. Strong emission occurs at sequential cyclotron harmonics of the energetic ion population at the outer mid-plane. Its intensity scales linearly with fusion reactivity, including its time evolution during a discharge. The emission mechanism is probably the magnetoacoustic cyclotron instability (MCI), involving resonance between: fast Alfvén waves; cyclotron harmonic waves supported by the energetic particle population and by the background thermal plasma; and a subset of the centrally born fusion products, just inside the trapped-passing boundary, whose drift orbits make large radial excursions. The linear growth rate of the MCI has been intensively studied analytically and yields good agreement with several key observational features of ICE. To address outstanding issues in the nonlinear ICE regime, we have developed a particle-in-cell code which self-consistently evolves electron and multi-species ion macroparticles and the electromagnetic field. We focus on the growth rate of the MCI as it evolves from the linear into the nonlinear regime for JET-like parameters.
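    The sequential-harmonic structure of ICE can be illustrated by computing cyclotron harmonics of fusion-born alpha particles at the outer mid-plane. The B = 2.5 T field value is an assumed JET-like number for illustration, not taken from the abstract:

```python
import math

# Ion cyclotron frequency Omega = qB/m for fusion-born alpha particles
# (charge 2e). The mid-plane field B = 2.5 T is an assumed JET-like value.
Q_E = 1.602176634e-19       # elementary charge, C
M_ALPHA = 6.6446573450e-27  # alpha particle mass, kg

B = 2.5                                   # T, illustrative
omega_c = 2 * Q_E * B / M_ALPHA           # rad/s
f_c = omega_c / (2 * math.pi)             # Hz

# ICE spectra show peaks at sequential harmonics l * f_c (here in MHz).
harmonics = [l * f_c / 1e6 for l in range(1, 6)]
print(harmonics)
```

    The fundamental lands in the tens-of-MHz range, which is why ICE diagnostics on large tokamaks operate at radio frequencies, and why peak spacing directly measures the field at the emission location.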

  3. Advances in POST2 End-to-End Descent and Landing Simulation for the ALHAT Project

    NASA Technical Reports Server (NTRS)

    Davis, Jody L.; Striepe, Scott A.; Maddock, Robert W.; Hines, Glenn D.; Paschall, Stephen, II; Cohanim, Babak E.; Fill, Thomas; Johnson, Michael C.; Bishop, Robert H.; DeMars, Kyle J.; Sostaric, Ronald R.; Johnson, Andrew E.

    2008-01-01

    Program to Optimize Simulated Trajectories II (POST2) is used as a basis for an end-to-end descent and landing trajectory simulation that is essential in determining design and integration capability and system performance of the lunar descent and landing system and environment models for the Autonomous Landing and Hazard Avoidance Technology (ALHAT) project. The POST2 simulation provides a six degree-of-freedom capability necessary to test, design and operate a descent and landing system for successful lunar landing. This paper presents advances in the development and model-implementation of the POST2 simulation, as well as preliminary system performance analysis, used for the testing and evaluation of ALHAT project system models.

  4. A Strategy for Autogeneration of Space Shuttle Ground Processing Simulation Models for Project Makespan Estimations

    NASA Technical Reports Server (NTRS)

    Madden, Michael G.; Wyrick, Roberta; O'Neill, Dale E.

    2005-01-01

    Space Shuttle processing is a complicated and highly variable project. The planning and scheduling problem, categorized as a Resource-Constrained Stochastic Project Scheduling Problem (RC-SPSP), has a great deal of variability in the Orbiter Processing Facility (OPF) process flow from one flight to the next. Simulation modeling is a useful tool for estimating the makespan of the overall process. However, simulation requires a model to be developed, which is itself a labor- and time-consuming effort. With such a dynamic process, the model would often be out of synchronization with the actual process, limiting the applicability of the simulation answers in solving the actual estimation problem. The basis of our solution is the integration of TEAMS model-enabling software with our existing scheduling software. This paper explains the approach used to develop an auto-generated simulation model from planning and scheduling efforts and available data.

  5. Ten years of computer visual simulations on large scale projects in the western United States

    SciTech Connect

    Ellsworth, J.C.

    1999-07-01

    Computer visual simulations are used to portray proposed landscape changes with true color, photo-realistic quality, and high levels of accuracy and credibility. This sophisticated technology is a valuable tool for planners, landscape architects, architects, engineers, environmental consultants, government agencies, and private operators in the design and planning of surface mining operations. This paper presents examples of the application of computer visual simulations on large-scale projects in the western United States, including those that generally require an environmental impact statement under the National Environmental Policy Act of 1969 (e.g., open-pit coal mines, gold surface mines, highways and bridges, oil and gas development, and alpine ski areas). This presentation describes the development criteria, process, and use of computer visual simulations for these types of projects. The issues of computer visual simulation accuracy, bias, credibility, ethics, and realism are discussed, with emphasis on application in real-world situations. The use of computer visual simulations as a tool in the planning and design of these types of projects is presented, along with discussion of their use in project permitting and public involvement.

  6. Final Report for LDRD Project on Rapid Problem Setup for Mesh-Based Simulation (Rapsodi)

    SciTech Connect

    Brown, D L; Henshaw, W; Petersson, N A; Fast, P; Chand, K

    2003-02-07

    Under LLNL Exploratory Research LDRD funding, the Rapsodi project developed rapid setup technology for computational physics and engineering problems that require computational representations of complex geometry. Many simulation projects at LLNL involve the solution of partial differential equations in complex 3-D geometries. A significant bottleneck in carrying out these simulations arises in converting some specification of a geometry, such as a computer-aided design (CAD) drawing to a computationally appropriate 3-D mesh that can be used for simulation and analysis. Even using state-of-the-art mesh generation software, this problem setup step typically has required weeks or months, which is often much longer than required to carry out the computational simulation itself. The Rapsodi project built computational tools and designed algorithms that help to significantly reduce this setup time to less than a day for many realistic problems. The project targeted rapid setup technology for computational physics and engineering problems that use mixed-element unstructured meshes, overset meshes or Cartesian-embedded boundary (EB) meshes to represent complex geometry. It also built tools that aid in constructing computational representations of geometry for problems that do not require a mesh. While completely automatic mesh generation is extremely difficult, the amount of manual labor required can be significantly reduced. By developing novel, automated, component-based mesh construction procedures and automated CAD geometry repair and cleanup tools, Rapsodi has significantly reduced the amount of hand crafting required to generate geometry and meshes for scientific simulation codes.

  7. Fusion in diffusion MRI for improved fibre orientation estimation: An application to the 3T and 7T data of the Human Connectome Project.

    PubMed

    Sotiropoulos, Stamatios N; Hernández-Fernández, Moisés; Vu, An T; Andersson, Jesper L; Moeller, Steen; Yacoub, Essa; Lenglet, Christophe; Ugurbil, Kamil; Behrens, Timothy E J; Jbabdi, Saad

    2016-07-01

    Determining the acquisition parameters in diffusion magnetic resonance imaging (dMRI) is governed by a series of trade-offs. Images of lower resolution have less spatial specificity but higher signal-to-noise ratio (SNR). At the same time, higher angular contrast, important for resolving complex fibre patterns, also yields lower SNR. Considering these trade-offs, the Human Connectome Project (HCP) acquires high-quality dMRI data for the same subjects at different field strengths (3T and 7T), which are publicly released. Due to differences in the signal behavior and in the underlying scanner hardware, the HCP 3T and 7T data have complementary features in k- and q-space. The 3T dMRI has higher angular contrast and resolution, while the 7T dMRI has higher spatial resolution. Given the availability of these datasets, we explore the idea of fusing them together with the aim of combining their benefits. We extend a previously proposed data-fusion framework and apply it to integrate both datasets from the same subject into a single joint analysis. We use a generative model for performing parametric spherical deconvolution and estimate fibre orientations by simultaneously using data acquired under different protocols. We illustrate unique features from each dataset and how they are retained after fusion. We further show that this allows us to complement benefits and improve brain connectivity analysis compared to analyzing each of the datasets individually. PMID:27071694
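
    The intuition behind fusing complementary datasets can be seen in the simplest possible case: inverse-variance weighting of two independent, noisy estimates of the same quantity. This toy sketch is not the paper's generative spherical-deconvolution model, only an illustration of why a joint analysis can outperform either dataset alone:

```python
def fuse_estimates(x1, var1, x2, var2):
    """Inverse-variance (precision) weighting of two independent noisy
    measurements of the same quantity. The fused variance is never worse
    than that of the better input, which is the basic reason joint
    analysis of 3T and 7T data can beat either dataset on its own."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    fused = (w1 * x1 + w2 * x2) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)
    return fused, fused_var
```

    For example, two equally uncertain estimates (variance 1.0) of values 1.0 and 3.0 fuse to 2.0 with variance 0.5, halving the uncertainty.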

  8. Simulating the magnetized liner inertial fusion plasma confinement with smaller-scale experiments [Simulating the MagLIF plasma confinement with smaller-scale experiments

    SciTech Connect

    Ryutov, D. D.; Cuneo, M. E.; Herrmann, M. C.; Sinars, D. B.; Slutz, S. A.

    2012-06-20

    The recently proposed magnetized liner inertial fusion approach to Z-pinch driven fusion [Slutz et al., Phys. Plasmas 17, 056303 (2010)] is based on the use of an axial magnetic field to provide plasma thermal insulation from the walls of the imploding liner. The characteristic plasma transport regimes in the proposed approach cover parameter domains that have not yet been studied in either magnetic confinement or inertial confinement experiments. In this article, an analysis is presented of the scalability of the key physical processes that determine the plasma confinement. The dimensionless scaling parameters are identified, and the conclusion is drawn that the plasma behavior in scaled-down experiments can correctly represent the full-scale plasma, provided these parameters are approximately the same in the two systems. This observation is important because smaller-scale experiments typically have better diagnostic access and allow more experiments per year.

  9. Retrieval process development and enhancements project Fiscal year 1995: Simulant development technology task progress report

    SciTech Connect

    Golcar, G.R.; Bontha, J.R.; Darab, J.G.

    1997-01-01

    The mission of the Retrieval Process Development and Enhancements (RPD&E) project is to develop an understanding of retrieval processes, including emerging and existing technologies, gather data on these technologies, and relate the data to specific tank problems such that end-users have the requisite technical bases to make retrieval and closure decisions. The development of waste simulants is an integral part of this effort. The work of the RPD&E simulant-development task is described in this document. The key FY95 accomplishments of the RPD&E simulant-development task are summarized below.

  10. Final Report for "Community Petascale Project for Accelerator Science and Simulations".

    SciTech Connect

    Cary, J. R.; Bruhwiler, D. L.; Stoltz, P. H.; Cormier-Michel, E.; Cowan, B.; Schwartz, B. T.; Bell, G.; Paul, K.; Veitzer, S.

    2013-04-19

    This final report describes the work accomplished over the past five years under the Community Petascale Project for Accelerator Science and Simulations (ComPASS) at Tech-X Corporation. Tech-X has been involved in the full range of ComPASS activities: simulation of laser-plasma accelerator concepts, mainly in collaboration with the LOASIS program at LBNL; simulation of coherent electron cooling in collaboration with BNL; modeling of electron clouds in high-intensity accelerators in collaboration with researchers at Fermilab; and accurate modeling of superconducting RF cavities in collaboration with Fermilab, JLab, and the Cockcroft Institute in the UK.

  11. Sensor fusion for synthetic vision

    NASA Technical Reports Server (NTRS)

    Pavel, M.; Larimer, J.; Ahumada, A.

    1991-01-01

    Display methodologies are explored for fusing images gathered by millimeter wave sensors with images rendered from an on-board terrain data base to facilitate visually guided flight and ground operations in low visibility conditions. An approach to fusion based on multiresolution image representation and processing is described which facilitates fusion of images differing in resolution within and between images. To investigate possible fusion methods, a workstation-based simulation environment is being developed.
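
    A minimal sketch of the multiresolution idea, assuming two registered, same-size images and a simple box-filter pyramid (the actual sensor-fusion pipeline described above is far more elaborate): build a detail pyramid for each image, keep the stronger coefficient at each level, and reconstruct:

```python
import numpy as np

def downsample(img):
    """2x2 box-filter downsample (dimensions assumed divisible by 2)."""
    return img.reshape(img.shape[0] // 2, 2, img.shape[1] // 2, 2).mean(axis=(1, 3))

def upsample(img):
    """Nearest-neighbour 2x upsample."""
    return img.repeat(2, axis=0).repeat(2, axis=1)

def laplacian_pyramid(img, levels):
    """Detail bands at each level plus a coarsest approximation."""
    pyr = []
    for _ in range(levels):
        low = downsample(img)
        pyr.append(img - upsample(low))  # detail band
        img = low
    pyr.append(img)  # coarsest approximation
    return pyr

def fuse(img_a, img_b, levels=2):
    """Fuse two registered images: keep the larger-magnitude detail
    coefficient at each level, average the coarsest approximations,
    then reconstruct from the coarsest level upward."""
    pa = laplacian_pyramid(img_a, levels)
    pb = laplacian_pyramid(img_b, levels)
    fused = [np.where(np.abs(a) >= np.abs(b), a, b) for a, b in zip(pa[:-1], pb[:-1])]
    fused.append(0.5 * (pa[-1] + pb[-1]))
    img = fused[-1]
    for detail in reversed(fused[:-1]):
        img = upsample(img) + detail
    return img
```

    Fusing an image with itself reconstructs the original exactly, a useful sanity check on the pyramid round trip.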

  12. The Virtual Liver Project: Simulating Tissue Injury Through Molecular and Cellular Processes

    EPA Science Inventory

    Efficiently and humanely testing the safety of thousands of environmental chemicals is a challenge. The US EPA Virtual Liver Project (v-Liver™) is aimed at simulating the effects of environmental chemicals computationally in order to estimate the risk of toxic outcomes in humans...

  13. The Virtual Liver Project: Modeling Tissue Response To Chemicals Through Multiscale Simulation

    EPA Science Inventory

    The US EPA Virtual Liver Project is aimed at simulating the risk of toxic effects from environmental chemicals in silico. The computational systems model of organ injury due to chronic chemical exposure is based on: (i) the dynamics of perturbed molecular pathways, (ii) their lin...

  14. Simulating Limb Formation in the U.S. EPA Virtual Embryo - Risk Assessment Project

    EPA Science Inventory

    The U.S. EPA’s Virtual Embryo project (v-Embryo™) is a computer model simulation of morphogenesis that integrates cell and molecular level data from mechanistic and in vitro assays with knowledge about normal development processes to assess in silico the effects of chemicals on d...

  15. Verification of gyrokinetic particle simulation of current-driven instability in fusion plasmas. I. Internal kink mode

    SciTech Connect

    McClenaghan, J.; Lin, Z.; Holod, I.; Deng, W.; Wang, Z.

    2014-12-15

    The gyrokinetic toroidal code (GTC) capability has been extended for simulating internal kink instability with kinetic effects in toroidal geometry. The global simulation domain covers the magnetic axis, which is necessary for simulating current-driven instabilities. GTC simulation in the fluid limit of the kink modes in cylindrical geometry is verified by benchmarking with a magnetohydrodynamic eigenvalue code. Gyrokinetic simulations of the kink modes in the toroidal geometry find that ion kinetic effects significantly reduce the growth rate even when the banana orbit width is much smaller than the radial width of the perturbed current layer at the mode rational surface.

  16. Progress in the study of mesh refinement for particle-in-cell plasma simulations and its application to heavy ion fusion

    SciTech Connect

    Vay, J.-L.; Friedman, A.; Grote, D.P.

    2002-09-15

    The numerical simulation of the driving beams in a heavy ion fusion power plant is a challenging task, and, despite rapid progress in computer power, one must consider the use of the most advanced numerical techniques. One of the difficulties of these simulations resides in the disparity of scales in time and in space which must be resolved. When these disparities are in distinctive zones of the simulation region, a method which has proven to be effective in other areas (e.g. fluid dynamics simulations) is the Adaptive-Mesh-Refinement (AMR) technique. We follow in this article the progress accomplished in the last few months in the merging of the AMR technique with Particle-In-Cell (PIC) method. This includes a detailed modeling of the Lampel-Tiefenback solution for the one-dimensional diode using novel techniques to suppress undesirable numerical oscillations and an AMR patch to follow the head of the particle distribution. We also report new results concerning the modeling of ion sources using the axisymmetric WARPRZ-AMR prototype showing the utility of an AMR patch resolving the emitter vicinity and the beam edge.

  17. Toward the credibility of Northeast United States summer precipitation projections in CMIP5 and NARCCAP simulations

    NASA Astrophysics Data System (ADS)

    Thibeault, Jeanne M.; Seth, A.

    2015-10-01

    Precipitation projections for the northeast United States and nearby Canada (Northeast) are examined for 15 Fifth Phase of the Coupled Model Intercomparison Project (CMIP5) models. A process-based evaluation of atmospheric circulation features associated with wet Northeast summers is performed to examine whether credibility can be differentiated within the multimodel ensemble. Based on these evaluations, and an analysis of the interannual statistical properties of area-averaged precipitation, model subsets were formed. Multimodel precipitation projections from each subset were compared to the multimodel projection from all of the models. Higher-resolution North American Regional Climate Change Assessment Program (NARCCAP) regional climate models (RCMs) were subjected to a similar evaluation, grouping into subsets, and examination of future projections. CMIP5 models adequately simulate most large-scale circulation features associated with wet Northeast summers, though all have errors in simulating observed sea level pressure and moisture divergence anomalies in the western tropical Atlantic/Gulf of Mexico. Relevant large-scale processes simulated by the RCMs resemble those of their driving global climate models (GCMs), which are not always realistic. Future RCM studies could benefit from a process analysis of potential driving GCMs prior to dynamical downscaling. No CMIP5 or NARCCAP models were identified as clearly more credible, but six GCMs and four RCMs performed consistently better. Among the "Better" models, there is no consistency in the direction of future summer precipitation change. CMIP5 projections suggest that the Northeast precipitation response depends on the dynamics of the North Atlantic anticyclone and associated circulation and moisture convergence patterns, which vary among "Better" models. Even when model credibility cannot be clearly differentiated, examination of simulated processes provides important insights into their evolution under

  18. Improved Arctic Sea Ice Thickness Projections Using Bias Corrected CMIP5 Simulations

    NASA Astrophysics Data System (ADS)

    Melia, N.; Hawkins, E.; Haines, K.

    2015-12-01

    Projections of Arctic sea ice thickness (SIT) have the potential to inform stakeholders about accessibility to the region, but are currently rather uncertain. The latest suite of CMIP5 Global Climate Models (GCMs) produce a wide range of simulated SIT in the historical period (1979-2014) and exhibit various spatial and temporal biases when compared with the Pan-Arctic Ice Ocean Modelling and Assimilation System (PIOMAS) sea ice reanalysis. We present a new method to constrain such GCM simulations of SIT to narrow projection uncertainty via a statistical bias correction technique. The bias correction successfully constrains the spatial SIT distribution and temporal variability in the CMIP5 projections whilst retaining the climatic fluctuations from individual ensemble members. The bias correction acts to reduce the uncertainty in projections of SIT and reveals the significant contributions of sea ice internal variability in the first half of the century and of scenario uncertainty from mid-century onwards. The projected date of ice-free conditions in the Arctic under the RCP8.5 high emission scenario occurs in the 2050s, which is a decade earlier than without the bias correction, with potentially significant implications for stakeholders in the Arctic such as the shipping industry. The bias correction methodology developed could be similarly applied to other variables to narrow uncertainty in climate projections more generally.
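
    The paper's bias-correction technique is more sophisticated, but the core idea can be sketched as a mean-and-variance adjustment fitted against a reanalysis over the historical overlap period and then applied to the projection, preserving each ensemble member's own climatic fluctuations up to a linear rescaling (an assumed simplification, not the authors' exact method):

```python
import numpy as np

def fit_bias_correction(model_hist, reference):
    """Fit a simple mean-and-variance correction over the historical
    overlap period between a model time series and a reference
    (e.g. reanalysis) time series."""
    scale = reference.std() / model_hist.std()
    offset = reference.mean() - scale * model_hist.mean()
    return scale, offset

def apply_bias_correction(model_future, scale, offset):
    """Apply the fitted linear correction to a projection; the member's
    own anomalies are retained, only rescaled and shifted."""
    return scale * model_future + offset
```

    A model that is uniformly too thick and too variable in the historical period is mapped back onto the reference statistics, and the same mapping is then applied to its future values.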

  19. Comparisons of Simulated Hydrodynamics and Water Quality for Projected Demands in 2046, Pueblo Reservoir, Southeastern Colorado

    USGS Publications Warehouse

    Ortiz, Roderick F.; Galloway, Joel M.; Miller, Lisa D.; Mau, David P.

    2008-01-01

    Pueblo Reservoir is one of southeastern Colorado's most valuable water resources. The reservoir provides irrigation, municipal, and industrial water to various entities throughout the region. The reservoir also provides flood control, recreational activities, sport fishing, and wildlife enhancement to the region. The Bureau of Reclamation is working to meet its goal to issue a Final Environmental Impact Statement (EIS) on the Southern Delivery System project (SDS). SDS is a regional water-delivery project that has been proposed to provide a safe, reliable, and sustainable water supply through the foreseeable future (2046) for Colorado Springs, Fountain, Security, and Pueblo West. Discussions with the Bureau of Reclamation and the U.S. Geological Survey led to a cooperative agreement to simulate the hydrodynamics and water quality of Pueblo Reservoir. This work has been completed and described in a previously published report, U.S. Geological Survey Scientific Investigations Report 2008-5056. Additionally, there was a need to make comparisons of simulated hydrodynamics and water quality for projected demands associated with the various EIS alternatives and plans by Pueblo West to discharge treated water into the reservoir. Plans by Pueblo West are fully independent of the SDS project. This report compares simulated hydrodynamics and water quality for projected demands in Pueblo Reservoir resulting from changes in inflow and water quality entering the reservoir, and from changes to withdrawals from the reservoir as projected for the year 2046. Four of the seven EIS alternatives were selected for scenario simulations. The four U.S. Geological Survey simulation scenarios were the No Action scenario (EIS Alternative 1), the Downstream Diversion scenario (EIS Alternative 2), the Upstream Return-Flow scenario (EIS Alternative 4), and the Upstream Diversion scenario (EIS Alternative 7). 
Additionally, the results of an Existing Conditions scenario (water years 2000 through

  20. Description of convective-scale numerical weather simulation use in a flight simulator within the Flysafe project

    NASA Astrophysics Data System (ADS)

    Pradier-Vabre, S.; Forster, C.; Heesbeen, W. W. M.; Pagé, C.; Sénési, S.; Tafferner, A.; Bernard-Bouissières, I.; Caumont, O.; Drouin, A.; Ducrocq, V.; Guillou, Y.; Josse, P.

    2009-03-01

    Within the framework of the Flysafe project, dedicated tools aiming at improving flight safety are developed. In particular, efforts are directed towards the development of the Next Generation-Integrated Surveillance System (NG-ISS), i.e. a combination of new on-board systems and ground-based tools which provides the pilot with integrated information on three risks playing a major role in aircraft accidents: collision with another aircraft, collision with terrain, and adverse weather conditions. For the latter, Weather Information Management Systems (WIMSs) based on nowcasts of atmospheric hazards are developed. This paper describes the set-up of a test-bed for the NG-ISS incorporating two types of WIMS data, those related to aircraft in-flight icing and thunderstorm risks. The test-bed is based on convective-scale numerical simulations of a particular weather scenario with thunderstorms and icing in the area of the Innsbruck airport. Raw simulated fields as well as more elaborate diagnostics (synthetic reflectivity and satellite brightness temperature) feed both the flight simulator including the NG-ISS and the algorithms in charge of producing WIMS data. WIMS outputs based on the synthetic data are discussed, and it is indicated that the high-resolution simulated fields are beneficial for the NG-ISS test-bed purposes and its technical feasibility.

  1. Information integration for data fusion

    SciTech Connect

    Bray, O.H.

    1997-01-01

    Data fusion has been identified by the Department of Defense as a critical technology for the U.S. defense industry. Data fusion requires combining expertise in two areas - sensors and information integration. Although data fusion is a rapidly growing area, there is little synergy and use of common, reusable, and/or tailorable objects and models, especially across different disciplines. The Laboratory-Directed Research and Development project had two purposes: to see if a natural language-based information modeling methodology could be used for data fusion problems, and if so, to determine whether this methodology would help identify commonalities across areas and achieve greater synergy. The project confirmed both of the initial hypotheses: that the natural language-based information modeling methodology could be used effectively in data fusion areas and that commonalities could be found that would allow synergy across various data fusion areas. The project found five common objects that are the basis for all of the data fusion areas examined: targets, behaviors, environments, signatures, and sensors. Many of the objects and the specific facts related to these objects were common across several areas and could easily be reused. In some cases, even the terminology remained the same. In other cases, different areas had their own terminology, but the concepts were the same. This commonality is important with the growing use of multisensor data fusion. Data fusion is much more difficult if each type of sensor uses its own objects and models rather than building on a common set. This report introduces data fusion, discusses how the synergy generated by this LDRD would have benefited an earlier successful project and contains a summary information model from that project, describes a preliminary management information model, and explains how information integration can facilitate cross-treaty synergy for various arms control treaties.

  2. Fusion Implementation

    SciTech Connect

    J.A. Schmidt

    2002-02-20

    If a fusion DEMO reactor can be brought into operation during the first half of this century, fusion power production can have a significant impact on carbon dioxide production during the latter half of the century. An assessment of fusion implementation scenarios shows that the resource demands and waste production associated with these scenarios are manageable factors. If fusion is implemented during the latter half of this century it will be one element of a portfolio of (hopefully) carbon dioxide limiting sources of electrical power. It is time to assess the regional implications of fusion power implementation. An important attribute of fusion power is the wide range of possible regions of the country, or countries in the world, where power plants can be located. Unlike most renewable energy options, fusion energy will function within a local distribution system and not require costly, and difficult, long distance transmission systems. For example, the East Coast of the United States is a prime candidate for fusion power deployment by virtue of its distance from renewable energy sources. As fossil fuels become less and less available as an energy option, the transmission of energy across bodies of water will become very expensive. On a global scale, fusion power will be particularly attractive for regions separated from sources of renewable energy by oceans.

  3. A fusion of minds

    NASA Astrophysics Data System (ADS)

    Corfield, Richard

    2013-02-01

    Mystery still surrounds the visit of the astronomer Sir Bernard Lovell to the Soviet Union in 1963. But his collaboration - and that of other British scientists - eased geopolitical tensions at the height of the Cold War and paved the way for today's global ITER fusion project, as Richard Corfield explains.

  4. Post-processing of multi-hydrologic model simulations for improved streamflow projections

    NASA Astrophysics Data System (ADS)

    khajehei, sepideh; Ahmadalipour, Ali; Moradkhani, Hamid

    2016-04-01

    Hydrologic model outputs are prone to bias and uncertainty due to knowledge deficiencies in the model and data. Uncertainty in hydroclimatic projections arises from uncertainty in the hydrologic model as well as the epistemic or aleatory uncertainties in GCM parameterization and development. This study is conducted to: 1) evaluate the recently developed multi-variate post-processing method for historical simulations, and 2) assess the effect of post-processing on the uncertainty and reliability of future streamflow projections in both high-flow and low-flow conditions. The first objective is performed for the historical period of 1970-1999. Future streamflow projections are generated for 10 statistically downscaled GCMs from two widely used downscaling methods, Bias Corrected Statistically Downscaled (BCSD) and Multivariate Adaptive Constructed Analogs (MACA), over the period 2010-2099 for two representative concentration pathways, RCP4.5 and RCP8.5. Three semi-distributed hydrologic models were employed and calibrated at 1/16-degree latitude-longitude resolution for over 100 points across the Columbia River Basin (CRB) in the Pacific Northwest USA. Streamflow outputs are post-processed through a Bayesian framework based on copula functions. The post-processing approach relies on a transfer function developed from the bivariate joint distribution between observation and simulation in the historical period. Results show that application of the post-processing technique leads to considerably higher accuracy in historical simulations and reduces model uncertainty in future streamflow projections.
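
    As a rough stand-in for the copula-based Bayesian transfer function described above, empirical quantile mapping illustrates how a paired historical observation/simulation record can correct new simulated streamflow (this is a much simpler technique than the paper's, shown only to fix the idea of a historically fitted transfer function):

```python
import numpy as np

def quantile_map(sim_hist, obs_hist, sim_new):
    """Empirical quantile mapping: each new simulated value is mapped to
    the observed value at the same quantile of the historical simulated
    distribution. sim_hist/obs_hist are the paired historical records;
    sim_new holds the values to be corrected."""
    sim_sorted = np.sort(sim_hist)
    obs_sorted = np.sort(obs_hist)
    # Quantile of each new value within the historical simulation CDF.
    q = np.searchsorted(sim_sorted, sim_new, side="right") / len(sim_sorted)
    q = np.clip(q, 0.0, 1.0)
    # Evaluate the observed quantile function at the same probabilities.
    idx = np.minimum((q * len(obs_sorted)).astype(int), len(obs_sorted) - 1)
    return obs_sorted[idx]
```

    With a simulation that is uniformly biased high by 10 units, the mapping removes the offset to within the resolution of the empirical quantiles.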

  5. A Hybrid Parachute Simulation Environment for the Orion Parachute Development Project

    NASA Technical Reports Server (NTRS)

    Moore, James W.

    2011-01-01

    A parachute simulation environment (PSE) has been developed that aims to take advantage of legacy parachute simulation codes and modern object-oriented programming techniques. This hybrid simulation environment provides the parachute analyst with a natural and intuitive way to construct simulation tasks while preserving the pedigree and authority of established parachute simulations. NASA currently employs four simulation tools for developing and analyzing air-drop tests performed by the CEV Parachute Assembly System (CPAS) Project. These tools were developed at different times, in different languages, and with different capabilities in mind. As a result, each tool has a distinct interface and set of inputs and outputs. However, regardless of the simulation code that is most appropriate for the type of test, engineers typically perform similar tasks for each drop test, such as prediction of loads, assessment of altitude, and sequencing of disreefs or cut-aways. An object-oriented approach to simulation configuration allows the analyst to choose models of real physical test articles (parachutes, vehicles, etc.) and sequence them to achieve the desired test conditions. Once configured, these objects are translated into traditional input lists and processed by the legacy simulation codes. This approach minimizes the number of simulation inputs that the engineer must track while configuring an input file. An object-oriented approach to simulation output allows a common set of post-processing functions to perform routine tasks, such as plotting and timeline generation, with minimal sensitivity to the simulation that generated the data. Flight test data may also be translated into the common output class to simplify test reconstruction and analysis.
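
    The object-to-legacy-input translation might look like the following sketch. The class names, fields, and keyword format here are purely illustrative assumptions, not the actual CPAS or legacy-simulation interfaces:

```python
from dataclasses import dataclass, field

@dataclass
class Parachute:
    """Hypothetical test-article object (names are illustrative)."""
    name: str
    drag_area_ft2: float
    reef_stages: list = field(default_factory=list)  # fractional reefed areas

@dataclass
class Vehicle:
    """Hypothetical vehicle object."""
    name: str
    mass_lbm: float

def to_legacy_input(vehicle, chutes):
    """Translate the object graph into the flat keyword/value lines a
    namelist-style legacy simulation might consume. The analyst works
    with objects; only this layer knows the legacy input format."""
    lines = [f"VEHICLE  = {vehicle.name}", f"MASS     = {vehicle.mass_lbm}"]
    for i, chute in enumerate(chutes, start=1):
        lines.append(f"CHUTE{i}_CDS = {chute.drag_area_ft2}")
        for j, frac in enumerate(chute.reef_stages, start=1):
            lines.append(f"CHUTE{i}_REEF{j} = {frac}")
    return "\n".join(lines)
```

    One translation layer per legacy code lets the same object graph drive all four tools without the analyst tracking each tool's input list by hand.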

  6. ITER Fusion Energy

    ScienceCinema

    Dr. Norbert Holtkamp

    2016-07-12

    ITER (in Latin, "the way") is designed to demonstrate the scientific and technological feasibility of fusion energy. Fusion is the process by which two light atomic nuclei combine to form a heavier one and thus release energy. In the fusion process, two isotopes of hydrogen – deuterium and tritium – fuse together to form a helium atom and a neutron. Thus fusion could provide large-scale energy production without greenhouse effects; essentially limitless fuel would be available all over the world. The principal goals of ITER are to generate 500 megawatts of fusion power for periods of 300 to 500 seconds with a fusion power multiplication factor, Q, of at least 10, i.e. Q ≥ 10 (500 MW of fusion power from 50 MW of input power). The ITER Organization was officially established in Cadarache, France, on 24 October 2007. The seven members engaged in the project – China, the European Union, India, Japan, Korea, Russia and the United States – represent more than half the world's population. The costs for ITER are shared by the seven members. The cost of construction will be approximately 5.5 billion Euros; a similar amount is foreseen for the twenty-year phase of operation and the subsequent decommissioning.

  7. Project Report on DOE Young Investigator Grant (Contract No. DE-FG02-02ER25525) Dynamic Scheduling and Fusion of Irregular Computation (August 15, 2002 to August 14, 2005)

    SciTech Connect

    Ding, Chen

    2005-08-16

    Computer simulation has become increasingly important in many scientific disciplines, but its performance and scalability are severely limited by the memory throughput on today's computer systems. With the support of this grant, we first designed training-based prediction, which accurately predicts the memory performance of large applications before their execution. Then we developed optimization techniques using dynamic computation fusion and large-scale data transformation. The research work has three major components. The first is modeling and prediction of cache behavior. We have developed a new technique, which uses reuse distance information from training inputs and then extracts a parameterized model of the program's cache miss rates for any input size and for any size of fully associative cache. Using the model we have built a web-based tool using three-dimensional visualization. The new model can help to build cost-effective computer systems, design better benchmark suites, and improve task scheduling on heterogeneous systems. The second component is global computation for improving cache performance. We have developed an algorithm for dynamic data partitioning using sampling theory and probability distribution. Recent work from a number of groups shows that manual or semi-manual computation fusion has significant benefits in physical, mechanical, and biological simulations as well as information retrieval and machine verification. We have developed an automatic tool that measures the potential of computation fusion. The new system can be used by high-performance application programmers to estimate the potential of locality improvement for a program before trying complex transformations for a specific cache system. The last component studies models of spatial locality and the problem of data layout. In scientific programs, most data are stored in arrays. Grand challenge problems such as hydrodynamics simulation and data mining may use
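
    The reuse-distance idea underlying the cache model can be sketched directly: an access's reuse distance is the number of distinct addresses touched since the previous access to the same address, and a fully associative LRU cache of C lines misses exactly the accesses whose reuse distance is at least C. A naive O(N·M) sketch (production tools use tree structures for O(N log M)):

```python
def reuse_distances(trace):
    """Reuse distance per access: distinct addresses touched since the
    previous access to the same address; inf for first-time accesses."""
    distances = []
    last_seen = {}
    for i, addr in enumerate(trace):
        if addr in last_seen:
            distances.append(len(set(trace[last_seen[addr] + 1:i])))
        else:
            distances.append(float("inf"))
        last_seen[addr] = i
    return distances

def miss_rate(distances, cache_size):
    """A fully associative LRU cache of cache_size lines misses exactly
    the accesses whose reuse distance is >= cache_size."""
    return sum(d >= cache_size for d in distances) / len(distances)
```

    Because the reuse-distance histogram is a property of the program, not of the cache, one profiling run yields the miss rate for every cache size at once, which is what makes the parameterized model in the report possible.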

  8. Laser fusion monthly -- August 1980

    SciTech Connect

    Ahlstrom, H.G.

    1980-08-01

    This report documents the monthly progress of laser fusion research at Lawrence Livermore National Laboratory. It first gives facility reports for both the Shiva and Argus projects. Topics discussed include: the laser system for the Nova Project; the fusion experiments analysis facility; the optical/x-ray streak camera; the Shiva Dante system temporal response; the 2{omega}{sub 0} experiment; and planning for an ICF engineering test facility.

  9. Hardware Accelerated Simulated Radiography

    SciTech Connect

    Laney, D; Callahan, S; Max, N; Silva, C; Langer, S; Frank, R

    2005-04-12

    We present the application of hardware accelerated volume rendering algorithms to the simulation of radiographs as an aid to scientists designing experiments, validating simulation codes, and understanding experimental data. The techniques presented take advantage of 32 bit floating point texture capabilities to obtain validated solutions to the radiative transport equation for X-rays. An unsorted hexahedron projection algorithm is presented for curvilinear hexahedra that produces simulated radiographs in the absorption-only regime. A sorted tetrahedral projection algorithm is presented that simulates radiographs of emissive materials. We apply the tetrahedral projection algorithm to the simulation of experimental diagnostics for inertial confinement fusion experiments on a laser at the University of Rochester. We show that the hardware accelerated solution is faster than the current technique used by scientists.
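In the absorption-only regime described above, the radiative transport equation along a ray reduces to Beer-Lambert attenuation. A minimal CPU sketch of that accumulation (not the paper's hardware-accelerated hexahedron projection; the opacities and path lengths below are illustrative):

```python
import math

def radiograph_intensity(i0, segments):
    """Attenuate a ray of initial intensity i0 through a sequence of
    (opacity, path_length) cells: I = I0 * exp(-sum(mu_i * dl_i))."""
    optical_depth = sum(mu * dl for mu, dl in segments)
    return i0 * math.exp(-optical_depth)

# A ray crossing three cells (values are illustrative, not experimental)
ray = [(0.2, 1.0), (0.5, 0.5), (0.1, 2.0)]
print(radiograph_intensity(100.0, ray))
```

The GPU versions in the paper evaluate the same per-ray sum, but accumulate it via projected cell faces in floating-point textures.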

  10. Improved Arctic sea ice thickness projections using bias-corrected CMIP5 simulations

    NASA Astrophysics Data System (ADS)

    Melia, N.; Haines, K.; Hawkins, E.

    2015-12-01

    Projections of Arctic sea ice thickness (SIT) have the potential to inform stakeholders about accessibility to the region, but are currently rather uncertain. The latest suite of CMIP5 global climate models (GCMs) produce a wide range of simulated SIT in the historical period (1979-2014) and exhibit various biases when compared with the Pan-Arctic Ice-Ocean Modelling and Assimilation System (PIOMAS) sea ice reanalysis. We present a new method to constrain such GCM simulations of SIT via a statistical bias correction technique. The bias correction successfully constrains the spatial SIT distribution and temporal variability in the CMIP5 projections whilst retaining the climatic fluctuations from individual ensemble members. The bias correction acts to reduce the spread in projections of SIT and reveals the significant contributions of climate internal variability in the first half of the century and of scenario uncertainty from the mid-century onwards. The projected date of ice-free conditions in the Arctic under the RCP8.5 high emission scenario occurs in the 2050s, which is a decade earlier than without the bias correction, with potentially significant implications for stakeholders in the Arctic such as the shipping industry. The bias correction methodology developed could be similarly applied to other variables to reduce spread in climate projections more generally.
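The abstract does not give the exact bias-correction formula; a common, minimal variant is a linear correction that matches the historical model mean and standard deviation to the reanalysis and applies the same transform to the projections. A sketch under that assumption:

```python
import statistics

def bias_correct(model_hist, obs, model_future):
    """Shift and rescale model output so its historical mean and standard
    deviation match observations, then apply the same linear transform to
    the projected values (a simple stand-in for the paper's method)."""
    mu_m = statistics.mean(model_hist)
    sd_m = statistics.stdev(model_hist)
    mu_o = statistics.mean(obs)
    sd_o = statistics.stdev(obs)
    return [mu_o + (x - mu_m) * sd_o / sd_m for x in model_future]

# Toy example: the model is biased thin and under-dispersed vs. observations
print(bias_correct([1.0, 2.0, 3.0], [2.0, 4.0, 6.0], [2.0, 3.5]))
```

Because the transform is fitted on the historical overlap only, each ensemble member keeps its own internal variability in the projection period.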

  11. The SIMRAND methodology: Theory and application for the simulation of research and development projects

    NASA Technical Reports Server (NTRS)

    Miles, R. F., Jr.

    1986-01-01

    A research and development (R&D) project often involves a number of decisions that must be made concerning which subset of systems or tasks are to be undertaken to achieve the goal of the R&D project. To help in this decision making, SIMRAND (SIMulation of Research ANd Development Projects) is a methodology for the selection of the optimal subset of systems or tasks to be undertaken on an R&D project. Using alternative networks, the SIMRAND methodology models the alternative subsets of systems or tasks under consideration. Each path through an alternative network represents one way of satisfying the project goals. Equations are developed that relate the system or task variables to the measure of preference. Uncertainty is incorporated by treating the variables of the equations probabilistically as random variables, with cumulative distribution functions assessed by technical experts. Analytical techniques of probability theory are used to reduce the complexity of the alternative networks. Cardinal utility functions over the measure of preference are assessed for the decision makers. A run of the SIMRAND Computer I Program combines, in a Monte Carlo simulation model, the network structure, the equations, the cumulative distribution functions, and the utility functions.
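The Monte Carlo core of such a methodology can be sketched as follows: draw each task's cost from its assessed distribution, score each network path by its expected utility, and keep the best path. The distributions, utility function, and paths below are illustrative, not taken from SIMRAND itself:

```python
import random

def expected_utility(cost_samplers, utility, n=20000, seed=1):
    """Monte Carlo estimate of the expected utility of one network path;
    each task's cost is drawn from its own assessed distribution."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        cost = sum(sample(rng) for sample in cost_samplers)
        total += utility(cost)
    return total / n

# Two alternative paths through the network (hypothetical distributions)
path_a = [lambda r: r.gauss(10.0, 2.0), lambda r: r.gauss(5.0, 1.0)]
path_b = [lambda r: r.gauss(12.0, 0.5)]
utility = lambda cost: -cost  # risk-neutral: prefer the cheaper path
best = max([("A", path_a), ("B", path_b)],
           key=lambda p: expected_utility(p[1], utility))
print(best[0])  # prints "B": expected cost 12 beats 10 + 5
```

With a risk-averse (concave) utility instead of the linear one above, the lower-variance path would be favored even more strongly.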

  12. Cosmic rays Monte Carlo simulations for the Extreme Energy Events Project

    NASA Astrophysics Data System (ADS)

    Abbrescia, M.; Agocs, A.; Aiola, S.; Antolini, R.; Avanzini, C.; Baldini Ferroli, R.; Bencivenni, G.; Bossini, E.; Bressan, E.; Chiavassa, A.; Cicalò, C.; Cifarelli, L.; Coccia, E.; De Gruttola, D.; De Pasquale, S.; Di Giovanni, A.; D'Incecco, M.; Dreucci, M.; Fabbri, F. L.; Frolov, V.; Garbini, M.; Gemme, G.; Gnesi, I.; Gustavino, C.; Hatzifotiadou, D.; La Rocca, P.; Li, S.; Librizzi, F.; Maggiora, A.; Massai, M.; Miozzi, S.; Panareo, M.; Paoletti, R.; Perasso, L.; Pilo, F.; Piragino, G.; Regano, A.; Riggi, F.; Righini, G. C.; Sartorelli, G.; Scapparone, E.; Scribano, A.; Selvi, M.; Serci, S.; Siddi, E.; Spandre, G.; Squarcia, S.; Taiuti, M.; Tosello, F.; Votano, L.; Williams, M. C. S.; Yánez, G.; Zichichi, A.; Zuyeuski, R.

    2014-08-01

    The Extreme Energy Events Project (EEE Project) is an innovative experiment to study very high energy cosmic rays by means of the detection of the associated air shower muon component. It consists of a network of tracking detectors installed inside Italian High Schools. Each tracking detector, called EEE telescope, is composed of three Multigap Resistive Plate Chambers (MRPCs). At present, 43 telescopes are installed and taking data, opening the way for the detection of far away coincidences over a total area of about 3 × 10⁵ km². In this paper we present the Monte Carlo simulations that have been performed to predict the expected coincidence rate between distant EEE telescopes.
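Separate from the air-shower Monte Carlo described above, a standard back-of-the-envelope check in such coincidence searches is the accidental (chance) coincidence rate between two independent counters; the singles rates and time window below are illustrative, not EEE Project figures:

```python
def accidental_rate(r1, r2, window):
    """Chance coincidences per second between two independent counters
    with singles rates r1, r2 (Hz) and coincidence window `window` (s):
    R_acc = 2 * r1 * r2 * tau."""
    return 2.0 * r1 * r2 * window

# Two telescopes with ~30 Hz singles rates and a 100 ns window
print(accidental_rate(30.0, 30.0, 100e-9))  # 1.8e-04 Hz
```

Any genuine shower-correlated coincidence signal must stand out above this accidental background.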

  13. Beam dynamics simulations and measurements at the Project X Test Facility

    SciTech Connect

    Gianfelice-Wendt, E.; Scarpine, V.E.; Webber, R.C.; /Fermilab

    2011-03-01

    Project X, under study at Fermilab, is a multitask high-power superconducting RF proton beam facility, aiming to provide high intensity protons for rare processes experiments and nuclear physics at low energy, and simultaneously for the production of neutrinos, as well as muon beams in the long term. A beam test facility - formerly known as the High Intensity Neutrino Source (HINS) - is under commissioning for testing critical components of the project, e.g. dynamics and diagnostics at low beam energies, broadband beam chopping, and RF power generation and distribution. In this paper we describe the layout of the test facility and present beam dynamics simulations and measurements.

  14. Magnetized Target Fusion

    NASA Technical Reports Server (NTRS)

    Griffin, Steven T.

    2002-01-01

    Magnetized target fusion (MTF) is under consideration as a means of building a low mass, high specific impulse, and high thrust propulsion system for interplanetary travel. This unique combination is the result of the generation of a high temperature plasma by the nuclear fusion process. This plasma can then be deflected by magnetic fields to provide thrust. Fusion is initiated by a small fraction of the energy generated in the magnetic coils due to the plasma's compression of the magnetic field. The power gain from a fusion reaction is such that inefficiencies due to thermal neutrons and coil losses can be overcome. Since the fusion reaction products are directly used for propulsion and the power to initiate the reaction is directly obtained from the thrust generation, no massive power supply for energy conversion is required. The result should be a low engine mass, high specific impulse and high thrust system. The key is to successfully initiate fusion as a proof-of-principle for this application. Currently MSFC is implementing MTF proof-of-principle experiments. This involves many technical details and ancillary investigations. Of these, selected pertinent issues include the properties, orientation and timing of the plasma guns and the convergence and interface development of the "pusher" plasma. Computer simulations of the target plasma's behavior under compression and of the convergence and mixing of the gun plasma are under investigation. This work is to focus on gun characterization and development as it relates to plasma initiation and repeatability.

  15. Integrated Vehicle Health Management Project-Modeling and Simulation for Wireless Sensor Applications

    NASA Technical Reports Server (NTRS)

    Wallett, Thomas M.; Mueller, Carl H.; Griner, James H., Jr.

    2009-01-01

    This paper describes the efforts in modeling and simulating electromagnetic transmission and reception as in a wireless sensor network through a realistic wing model for the Integrated Vehicle Health Management project at the Glenn Research Center. A computer model in a standard format for an S-3 Viking aircraft was obtained, converted to a Microwave Studio software format, and scaled to proper dimensions in Microwave Studio. The left wing portion of the model was used with two antenna models, one transmitting and one receiving, to simulate radio frequency transmission through the wing. Transmission and reception results were inconclusive.

  16. Overview of High-Fidelity Modeling Activities in the Numerical Propulsion System Simulations (NPSS) Project

    NASA Technical Reports Server (NTRS)

    Veres, Joseph P.

    2002-01-01

    A high-fidelity simulation of a commercial turbofan engine has been created as part of the Numerical Propulsion System Simulation Project. The high-fidelity computer simulation utilizes computer models that were developed at NASA Glenn Research Center in cooperation with turbofan engine manufacturers. The average-passage (APNASA) Navier-Stokes based viscous flow computer code is used to simulate the 3D flow in the compressors and turbines of the advanced commercial turbofan engine. The 3D National Combustion Code (NCC) is used to simulate the flow and chemistry in the advanced aircraft combustor. The APNASA turbomachinery code and the NCC combustor code exchange boundary conditions at the interface planes at the combustor inlet and exit. This computer simulation technique can evaluate engine performance at steady operating conditions. The 3D flow models provide detailed knowledge of the airflow within the fan and compressor, the high and low pressure turbines, and the flow and chemistry within the combustor. The models simulate the performance of the engine at operating conditions that include sea level takeoff and the altitude cruise condition.

  17. Image fusion

    NASA Technical Reports Server (NTRS)

    Pavel, M.

    1993-01-01

    The topics covered include the following: a system overview of the basic components of a system designed to improve the ability of a pilot to fly through low-visibility conditions such as fog; the role of visual sciences; fusion issues; sensor characterization; sources of information; image processing; and image fusion.

  18. Modelling and Simulation of National Electronic Product Code Network Demonstrator Project

    NASA Astrophysics Data System (ADS)

    Mo, John P. T.

    The National Electronic Product Code (EPC) Network Demonstrator Project (NDP) was the first large scale consumer goods track and trace investigation in the world using a full EPC protocol system for applying RFID technology in supply chains. The NDP demonstrated the methods of sharing information securely using the EPC Network, providing authentication to interacting parties, and enhancing the ability to track and trace movement of goods within the entire supply chain involving transactions among multiple enterprises. Due to project constraints, the actual run of the NDP lasted only 3 months, which was not enough to consolidate quantitative results. This paper discusses the modelling and simulation of activities in the NDP in a discrete event simulation environment and provides an estimation of the potential benefits that could have been derived from the NDP if it had continued for one whole year.

  19. Hanford Waste Simulants Created to Support the Research and Development on the River Protection Project - Waste Treatment Plant

    SciTech Connect

    Eibling, R.E.

    2001-07-26

    The development of nonradioactive waste simulants to support the River Protection Project - Waste Treatment Plant bench and pilot-scale testing is crucial to the design of the facility. This report documents the development of the simulants to support the SRTC programs and the strategies used to produce them.

  20. Haughton-Mars Project (HMP)/NASA 2006 Lunar Medical Contingency Simulation: An Overview

    NASA Technical Reports Server (NTRS)

    Scheuring, R. A.; Jones, J. A.; Lee, P.; Comtois, J. M.; Chappell, S.; Rafiq, A.; Braham, S.; Hodgson, E.; Sullivan, P.; Wilkinson, N.

    2006-01-01

    Medical requirements are currently being developed for NASA's space exploration program. Lunar surface operations for crews returning to the moon will be performed on a daily basis to conduct scientific research and construct a lunar habitat. Inherent to aggressive surface activities is the potential risk of injury to crew members. To develop an evidence base for handling medical contingencies on the lunar surface, a simulation project was conducted using the moon-Mars analog environment at Devon Island, Nunavut, in the high Canadian Arctic. A review of the Apollo lunar surface activities and personal communications with Apollo lunar crew members provided a knowledge base of plausible scenarios that could potentially injure an astronaut during a lunar extravehicular activity. Objectives were established to 1) demonstrate stabilization, field extraction and transfer of an injured crew member to the habitat and 2) evaluate audio, visual and biomedical communication capabilities with ground controllers at multiple mission control centers. The simulation project's objectives were achieved. Among these objectives were 1) extracting a crew member from a sloped terrain by a two-member team in a 1-g analog environment, 2) establishing real-time communication to multiple space centers, 3) providing biomedical data to flight controllers and crew members, and 4) establishing a medical diagnosis and treatment plan from a remote site. The simulation project provided evidence for the types of equipment and methods needed for planetary space exploration. During the project, the crew members were confronted with a number of unexpected scenarios including environmental, communications, EVA suit, and navigation challenges. These trials provided insight into the challenges of carrying out a medical contingency in an austere environment. The knowledge gained from completing the objectives of this project will be incorporated into the exploration medical requirements involving an incapacitated

  1. The fusion of gerontology and technology in nursing education: History and demonstration of the Gerontological Informatics Reasoning Project--GRIP.

    PubMed

    Dreher, H Michael; Cornelius, Fran; Draper, Judy; Pitkar, Harshad; Manco, Janet; Song, Il-Yeol

    2006-01-01

    Phase I of our Gerontological Reasoning Informatics Project (GRIP) began in the summer of 2002 when all 37 senior undergraduate nursing students in our accelerated BSN nursing program were given PDAs. These students were oriented to use a digitalized geriatric nursing assessment tool embedded in their PDAs in a variety of geriatric clinical agencies. This informatics project was developed to make geriatric nursing more technology oriented and focused on seven modules of geriatric assessment: intellect (I), nutrition (N), self-concept (S), physical activity (P), interpersonal functioning (I), restful sleep (R), and elimination (E)--INSPIRE. Through phase II and now phase III, the GRIP Project has become a major collaboration between the College of Nursing & Health Professions and the College of Information Science and Technology at Drexel University. The digitalized geriatric nursing health assessment tool has undergone a second round of reliability and validity testing and is now used to conduct a 20 minute comprehensive geriatric health assessment on the PDA, making our undergraduate gerontology course the most high-tech clinical course in our nursing curriculum.

  2. EDITORIAL: Plasma Surface Interactions for Fusion

    NASA Astrophysics Data System (ADS)

    2006-05-01

    Because plasma-boundary physics encompasses some of the most important unresolved issues for both the International Thermonuclear Experimental Reactor (ITER) project and future fusion power reactors, there is a strong interest in the fusion community for better understanding and characterization of plasma wall interactions. Chemical and physical sputtering cause the erosion of the limiters/divertor plates and vacuum vessel walls (made of C, Be and W, for example) and degrade fusion performance by diluting the fusion fuel and excessively cooling the core, while carbon redeposition could produce long-term in-vessel tritium retention, degrading the superior thermo-mechanical properties of the carbon materials. Mixed plasma-facing materials are proposed, requiring optimization for different power and particle flux characteristics. Knowledge of material properties as well as characteristics of the plasma material interaction are prerequisites for such optimizations. Computational power will soon reach hundreds of teraflops, so that theoretical and plasma science expertise can be matched with new experimental capabilities in order to mount a strong response to these challenges. To begin to address such questions, a Workshop on New Directions for Advanced Computer Simulations and Experiments in Fusion-Related Plasma Surface Interactions for Fusion (PSIF) was held at the Oak Ridge National Laboratory from 21 to 23 March, 2005. The purpose of the workshop was to bring together researchers in fusion related plasma wall interactions in order to address these topics and to identify the most needed and promising directions for study, to exchange opinions on the present depth of knowledge of surface properties for the main fusion-related materials, e.g., C, Be and W, especially for sputtering, reflection, and deuterium (tritium) retention properties. 
The goal was to suggest the most important next steps needed to facilitate such basic computational and experimental work.

  3. Fusion of cone-beam CT and 3D photographic images for soft tissue simulation in maxillofacial surgery

    NASA Astrophysics Data System (ADS)

    Chung, Soyoung; Kim, Joojin; Hong, Helen

    2016-03-01

    During maxillofacial surgery, prediction of the facial outcome after surgery is a main concern for both surgeons and patients. However, registration of facial CBCT images and 3D photographic images presents some difficulties: regions around the eyes and mouth are affected by facial expressions, and registration speed is low due to the dense point clouds on the surfaces. Therefore, we propose a framework for the fusion of facial CBCT images and 3D photos with skin segmentation and two-stage surface registration. Our method is composed of three major steps. First, to obtain a CBCT skin surface for registration with the 3D photographic surface, the skin is automatically segmented from the CBCT images and the skin surface is generated by surface modeling. Second, to roughly align the scale and orientation of the CBCT skin surface and the 3D photographic surface, point-based registration with four corresponding landmarks located around the mouth is performed. Finally, to merge the CBCT skin surface and the 3D photographic surface, Gaussian-weight-based surface registration is performed within a narrow band of the 3D photographic surface.
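The point-based registration step with four corresponding landmarks can be sketched with the standard Kabsch least-squares algorithm. This recovers rotation and translation only; the paper's coarse alignment also adjusts scale, which this sketch omits, and the landmark coordinates below are invented for illustration:

```python
import numpy as np

def rigid_align(src, dst):
    """Least-squares rotation R and translation t mapping landmark set
    src onto dst (Kabsch algorithm); src, dst are (N, 3) arrays of
    corresponding points."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)      # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t

# Four landmarks (e.g. around the mouth); dst is src rotated and shifted
src = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [0., 0., 1.]])
R_true = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])  # 90° about z
dst = src @ R_true.T + np.array([1., 2., 3.])
R, t = rigid_align(src, dst)
print(np.allclose(R, R_true), np.allclose(t, [1., 2., 3.]))  # True True
```

The dense Gaussian-weight-based refinement would then iterate from this coarse pose over the full surfaces.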

  4. Simulations in the Introductory Astronomy Laboratory: Six Years of Project CLEA

    NASA Astrophysics Data System (ADS)

    Marschall, L. A.

    1998-12-01

    Since 1992, Project CLEA (Contemporary Laboratory Experiences in Astronomy) has been developing computer-based exercises aimed at the introductory astronomy laboratory. These exercises simulate important techniques of astronomical research using digital data and Windows-based software. Each of the 9 exercises developed to date consists of software, technical guides for teachers, and student manuals for the exercises. CLEA software is used widely at many institutions, in a variety of settings from middle school to upperclass astronomy classes. The current design philosophy and goals of Project CLEA will be discussed, along with the results of both formal and informal assessments of the strengths and weaknesses of its approach. Plans for future development will be presented. Project CLEA is supported by grants from Gettysburg College and the National Science Foundation.

  5. Improved point scale climate projections using a block bootstrap simulation and quantile matching method

    NASA Astrophysics Data System (ADS)

    Kokic, Philip; Jin, Huidong; Crimp, Steven

    2013-08-01

    Statistical downscaling methods are commonly used to address the scale mismatch between coarse resolution Global Climate Model output and the regional or local scales required for climate change impact assessments. The effectiveness of a downscaling method can be measured against four broad criteria: consistency with the existing baseline data in terms of means, trends and distributional characteristics; consistency with the broader scale climate data used to generate the projections; the degree of transparency and repeatability; and the plausibility of results produced. Many existing downscaling methods fail to fulfil all of these criteria. In this paper we examine a block bootstrap simulation technique combined with a quantile prediction and matching method for simulating future daily climate data. By utilising this method the distributional properties of the projected data will be influenced by the distribution of the observed data, the trends in predictors derived from the Global Climate Models and the relationship of these predictors to the observed data. Using observed data from several climate stations in Vanuatu and Fiji and out-of-sample validation techniques, we show that the method is successful at projecting various climate characteristics including the variability and auto-correlation of daily temperature and rainfall, the correlations between these variables and between spatial locations. This paper also illustrates how this novel method can produce more effective point scale projections and a more credible alternative to other approaches in the Pacific region.
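The two ingredients named in the title can be sketched independently: a block bootstrap that resamples contiguous runs of observed days (preserving short-range auto-correlation), and a rank-based quantile match onto a target distribution. This is a toy illustration, not the authors' exact procedure; block length and series values are invented:

```python
import random

def block_bootstrap(series, block_len, n_out, seed=0):
    """Resample contiguous blocks of an observed daily series so that
    auto-correlation within each block is preserved."""
    rng = random.Random(seed)
    out = []
    while len(out) < n_out:
        start = rng.randrange(len(series) - block_len + 1)
        out.extend(series[start:start + block_len])
    return out[:n_out]

def quantile_match(simulated, target_quantiles):
    """Replace each simulated value by the value of the same rank in the
    target distribution, preserving the simulation's day-to-day ordering."""
    order = sorted(range(len(simulated)), key=lambda i: simulated[i])
    target = sorted(target_quantiles)
    matched = [0.0] * len(simulated)
    for rank, i in enumerate(order):
        matched[i] = target[rank * len(target) // len(simulated)]
    return matched

# Bootstrap 4 days from an observed record, then match onto a warmed target
sim = block_bootstrap([14.2, 15.1, 13.8, 16.0, 15.5, 14.9], block_len=2, n_out=4)
print(quantile_match(sim, sorted(s + 1.5 for s in sim)))
```

In the paper's setting, the target quantiles would come from GCM-derived predictor trends rather than a fixed offset as in this toy.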

  6. Final report on LDRD project: Simulation/optimization tools for system variability analysis

    SciTech Connect

    R. L. Bierbaum; R. F. Billau; J. E. Campbell; K. D. Marx; R. J. Sikorski; B. M. Thompson; S. D. Wix

    1999-10-01

    This work was conducted during FY98 (Proposal Number 98-0036) and FY99 (Proposal Number 99-0818) under the auspices of the Sandia National Laboratories Laboratory-Directed Research and Development (LDRD) program. Electrical simulation typically treats a single data point in the very large input space of component properties. For electrical simulation to reach its full potential as a design tool, it must be able to address the unavoidable variability and uncertainty in component properties. Component variability is strongly related to the design margin (and reliability) of the end product. During the course of this project, both tools and methodologies were developed to enable analysis of variability in the context of electrical simulation tools. Two avenues to link relevant tools were also developed, and the resultant toolset was applied to a major component.

  7. On the transverse-traceless projection in lattice simulations of gravitational wave production

    NASA Astrophysics Data System (ADS)

    Figueroa, Daniel G.; García-Bellido, Juan; Rajantie, Arttu

    2011-11-01

    It has recently been pointed out that the usual procedure employed in order to obtain the transverse-traceless (TT) part of metric perturbations in lattice simulations was inconsistent with the fact that those fields live in the lattice and not in the continuum. It was claimed that this could lead to a larger amplitude and a wrong shape for the gravitational wave (GW) spectra obtained in numerical simulations of (p)reheating. In order to address this issue, we have defined a consistent prescription in the lattice for extracting the TT part of the metric perturbations. We demonstrate explicitly that the GW spectra obtained with the old continuum-based TT projection only differ marginally in amplitude and shape with respect to the new lattice-based ones. We conclude that one can therefore trust the predictions appearing in the literature on the spectra of GW produced during (p)reheating and similar scenarios simulated on a lattice.

  8. Ion-kinetic simulations of D-3He gas-filled inertial confinement fusion target implosions with moderate to large Knudsen number

    DOE PAGESBeta

    Larroche, O.; Rinderknecht, H. G.; Rosenberg, M. J.; Hoffman, N. M.; Atzeni, S.; Petrasso, R. D.; Amendt, P. A.; Seguin, F. H.

    2016-01-06

    Experiments designed to investigate the transition to non-collisional behavior in D-3He gas-filled inertial confinement fusion target implosions display increasingly large discrepancies with respect to simulations by standard hydrodynamics codes as the expected ion mean-free-paths λc increase with respect to the target radius R (i.e., when the Knudsen number NK = λc/R grows). To properly take large NK values into account, multi-ion-species Vlasov-Fokker-Planck computations of the inner gas in the capsules have been performed for two different values of NK, one moderate and one large. The results, including nuclear yield, reactivity-weighted ion temperatures, nuclear emissivities, and surface brightness, have been compared with the experimental data and with the results of hydrodynamical simulations, some of which include an ad hoc modeling of kinetic effects. The experimental results are quite accurately rendered by the kinetic calculations in the smaller-NK case, much better than by the hydrodynamical calculations. The kinetic effects at play in this case are thus correctly understood. However, in the higher-NK case, the agreement is much worse. Furthermore, the remaining discrepancies are shown to arise from kinetic phenomena (e.g., inter-species diffusion) occurring at the gas-pusher interface, which should be investigated in future work.

  9. CRYOGENICS FOR FUSION

    SciTech Connect

    Dauguet, P.; Bonneton, M.; Fauve, E.; Bernhardt, J. M.; Beauvisage, J.; Andrieu, F.; Gistau-Baguer, G. M.; Boissin, J. C.

    2008-03-16

    Fusion of hydrogen to produce energy is one of the technologies under study to meet mankind's rising energy needs and to substitute for fossil fuels in the future. This technology has been under investigation for more than 30 years, with, for example, the construction of the experimental reactors Tore Supra, DIII-D and JET. With the construction of ITER about to start, the next step toward 'fusion for energy' will be taken. In these projects, extensive use of cryogenic systems is required. Air Liquide has been involved as a cryogenic partner in most of the fusion reactors constructed in the past and at present. In this paper, a review of the cryogenic systems we delivered to Tore Supra, JET, IPR and KSTAR is presented.

  10. Early Career. Harnessing nanotechnology for fusion plasma-material interface research in an in-situ particle-surface interaction facility

    SciTech Connect

    Allain, Jean Paul

    2014-08-08

    This project consisted of fundamental and applied research of advanced in-situ particle-beam interactions with surfaces/interfaces to discover novel materials able to tolerate intense conditions at the plasma-material interface (PMI) in future fusion burning plasma devices. The project established a novel facility that is capable of not only characterizing new fusion nanomaterials but, more importantly probing and manipulating materials at the nanoscale while performing subsequent single-effect in-situ testing of their performance under simulated environments in fusion PMI.

  11. Geophysical Simulations Conducted by the SEG Advanced Modeling Project (SEAM) for a Deepwater Subsalt Resource

    NASA Astrophysics Data System (ADS)

    Fehler, M. C.

    2010-12-01

    Geophysical simulations are playing an increasingly large role both in predicting the future evolution of complex systems and in providing benchmark data to test new analysis approaches. As geophysical inversion schemes for determining model structure become increasingly sophisticated, and their ability to incorporate multiple types of geophysical data increases, there is a need for challenging benchmark datasets to be used for testing and validating the schemes. If simulated datasets are to be used to evaluate the robustness and reliability of inversion schemes, the simulations must be conducted on realistic models and some estimate of the reliability of the simulations must be made. We have developed a model that contains a major salt body and a suite of petroleum reservoirs. A suite of geophysical simulations is being conducted on the model. The goal at the start of the SEAM project was to capture as much physics and realism as possible in a 3D model relevant to geophysical oil and gas exploration. Certain facets of the model were designed to go beyond the capabilities of current geophysical modeling and imaging technology. The philosophy behind this was that enhanced imaging capabilities would evolve and become available over the 10 or more years of the expected lifetime of the model. An important design goal for the SEAM earth model is internal consistency across the domains of rock properties (e.g. fundamental parameters like Vshale, porosity, and pore fluid type), the intermediate level elastic and electromagnetic parameters, and the output simulations for seismic, electromagnetic and gravity fields. By rooting the ultimate simulation back to the rock properties, any changes in the latter are guaranteed to change all the elastic and other parameters automatically, consistently, and with the appropriate correlations. A model founded on rock properties provides a test bed not just for the inversion of seismic data for reflectivity, but also for the

  12. Differentiating self-projection from simulation during mentalizing: evidence from fMRI.

    PubMed

    Schurz, Matthias; Kogler, Christoph; Scherndl, Thomas; Kronbichler, Martin; Kühberger, Anton

    2015-01-01

    We asked participants to predict which of two colors a similar other (student) and a dissimilar other (retiree) likes better. We manipulated if color-pairs were two hues from the same color-category (e.g. green) or two conceptually different colors (e.g. green versus blue). In the former case, the mental state that has to be represented (i.e., the percept of two different hues of green) is predominantly non-conceptual or phenomenal in nature, which should promote mental simulation as a strategy for mentalizing. In the latter case, the mental state (i.e. the percept of green versus blue) can be captured in thought by concepts, which facilitates the use of theories for mentalizing. In line with the self-projection hypothesis, we found that cortical midline areas including vmPFC / orbitofrontal cortex and precuneus were preferentially activated for mentalizing about a similar other. However, activation was not affected by the nature of the color-difference, suggesting that self-projection subsumes simulation-like processes but is not limited to them. This indicates that self-projection is a universal strategy applied in different contexts--irrespective of the availability of theories for mentalizing. Along with midline activations linked to self-projection, we also observed activation in right lateral frontal and dorsal parietal areas showing a theory-like pattern. Taken together, this shows that mentalizing does not operate based on simulation or theory, but that both strategies are used concurrently to predict the choices of others. PMID:25807390

  13. Using historical and projected future climate model simulations as drivers of agricultural and biological models (Invited)

    NASA Astrophysics Data System (ADS)

    Stefanova, L. B.

    2013-12-01

    Climate model evaluation is frequently performed as a first step in analyzing climate change simulations. Atmospheric scientists are accustomed to evaluating climate models through the assessment of model climatology and biases, the models' representation of large-scale modes of variability (such as ENSO, PDO, AMO, etc) and the relationship between these modes and local variability (e.g. the connection between ENSO and the wintertime precipitation in the Southeast US). While these provide valuable information about the fidelity of historical and projected climate model simulations from an atmospheric scientist's point of view, the application of climate model data to fields such as agriculture, ecology and biology may require additional analyses focused on the particular application's requirements and sensitivities. Typically, historical climate simulations are used to determine a mapping between the model and observed climate, either through a simple (additive for temperature or multiplicative for precipitation) or a more sophisticated (such as quantile matching) bias correction on a monthly or seasonal time scale. Plants, animals and humans however are not directly affected by monthly or seasonal means. To assess the impact of projected climate change on living organisms and related industries (e.g. agriculture, forestry, conservation, utilities, etc.), derivative measures such as the heating degree-days (HDD), cooling degree-days (CDD), growing degree-days (GDD), accumulated chill hours (ACH), wet season onset (WSO) and duration (WSD), among others, are frequently useful. We will present a comparison of the projected changes in such derivative measures calculated by applying: (a) the traditional temperature/precipitation bias correction described above versus (b) a bias correction based on the mapping between the historical model and observed derivative measures themselves. In addition, we will present and discuss examples of various application-based climate
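
The two bias-correction styles named above, and one of the derivative measures, can be sketched as follows. This is a minimal illustration with synthetic data; the function names, the 10 °C GDD base, and the synthetic series are assumptions for illustration, not taken from the study:

```python
import numpy as np

# Minimal sketch of the simple bias corrections described above
# (additive for temperature, multiplicative for precipitation), plus one
# derivative measure (growing degree-days).  Data are synthetic.

def correct_temperature(model_hist, obs, model_future):
    """Additive correction: shift the future series by the historical mean bias."""
    return model_future + (obs.mean() - model_hist.mean())

def correct_precip(model_hist, obs, model_future):
    """Multiplicative correction: scale by the historical mean ratio."""
    return model_future * (obs.mean() / model_hist.mean())

def growing_degree_days(tmax, tmin, base=10.0):
    """GDD: accumulated daily mean temperature above a base threshold."""
    return np.maximum((tmax + tmin) / 2.0 - base, 0.0).sum()

rng = np.random.default_rng(0)
obs        = 15.0 + 3.0 * rng.standard_normal(365)  # observed daily T (deg C)
model_hist = 13.5 + 3.0 * rng.standard_normal(365)  # biased model, historical
model_fut  = 16.0 + 3.0 * rng.standard_normal(365)  # biased model, future

corrected = correct_temperature(model_hist, obs, model_fut)
print(f"bias shift applied: {corrected.mean() - model_fut.mean():+.2f} C")
```

The alternative the abstract proposes, correcting the derivative measures themselves, would apply the same mapping step to the historical model and observed GDD (or HDD, CDD, etc.) series rather than to the raw temperatures.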

  14. Fusion Power.

    ERIC Educational Resources Information Center

    Dingee, David A.

    1979-01-01

    Discusses the extraordinary potential, the technical difficulties, and the financial problems that are associated with research and development of fusion power plants as a major source of energy. (GA)

  15. The Geoengineering Model Intercomparison Project Phase 6 (GeoMIP6). Simulation Design and Preliminary Results

    SciTech Connect

    Kravitz, Benjamin S.; Robock, Alan; Tilmes, S.; Boucher, Olivier; English, J.; Irvine, Peter; Jones, Andrew; Lawrence, M. G.; Maccracken, Michael C.; Muri, Helene O.; Moore, John; Niemeier, Ulrike; Phipps, Steven; Sillmann, Jana; Storelvmo, Trude; Wang, Hailong; Watanabe, Shingo

    2015-10-27

    We present a suite of new climate model experiment designs for the Geoengineering Model Intercomparison Project (GeoMIP). This set of experiments, named GeoMIP6 (to be consistent with the Coupled Model Intercomparison Project Phase 6), builds on the previous GeoMIP project simulations, and has been expanded to address several further important topics, including key uncertainties in extreme events, the use of geoengineering as part of a portfolio of responses to climate change, and the relatively new idea of cirrus cloud thinning to allow more longwave radiation to escape to space. We discuss experiment designs, as well as the rationale for those designs, showing preliminary results from individual models when available. We also introduce a new feature, called the GeoMIP Testbed, which provides a platform for simulations that will be performed with a few models and subsequently assessed to determine whether the proposed experiment designs will be adopted as core (Tier 1) GeoMIP experiments. This is meant to encourage various stakeholders to propose new targeted experiments that address their key open science questions, with the goal of making GeoMIP more relevant to a broader set of communities.

  16. The Geoengineering Model Intercomparison Project Phase 6 (GeoMIP6): Simulation design and preliminary results

    DOE PAGESBeta

    Kravitz, Benjamin S.; Robock, Alan; Tilmes, S.; Boucher, Olivier; English, J. M.; Irvine, Peter J.; Jones, Andrew; Lawrence, M. G.; MacCracken, Michael C.; Muri, Helene O.; et al

    2015-10-27

    We present a suite of new climate model experiment designs for the Geoengineering Model Intercomparison Project (GeoMIP). This set of experiments, named GeoMIP6 (to be consistent with the Coupled Model Intercomparison Project Phase 6), builds on the previous GeoMIP project simulations, and has been expanded to address several further important topics, including key uncertainties in extreme events, the use of geoengineering as part of a portfolio of responses to climate change, and the relatively new idea of cirrus cloud thinning to allow more long wave radiation to escape to space. We discuss experiment designs, as well as the rationale for those designs, showing preliminary results from individual models when available. We also introduce a new feature, called the GeoMIP Testbed, which provides a platform for simulations that will be performed with a few models and subsequently assessed to determine whether the proposed experiment designs will be adopted as core (Tier 1) GeoMIP experiments. In conclusion, this is meant to encourage various stakeholders to propose new targeted experiments that address their key open science questions, with the goal of making GeoMIP more relevant to a broader set of communities.

  17. The Geoengineering Model Intercomparison Project Phase 6 (GeoMIP6): Simulation design and preliminary results

    SciTech Connect

    Kravitz, Benjamin S.; Robock, Alan; Tilmes, S.; Boucher, Olivier; English, J. M.; Irvine, Peter J.; Jones, Andrew; Lawrence, M. G.; MacCracken, Michael C.; Muri, Helene O.; Moore, John C.; Niemeier, Ulrike; Phipps, Steven J.; Sillmann, Jana; Storelvmo, Trude; Wang, Hailong; Watanabe, Shingo

    2015-10-27

    We present a suite of new climate model experiment designs for the Geoengineering Model Intercomparison Project (GeoMIP). This set of experiments, named GeoMIP6 (to be consistent with the Coupled Model Intercomparison Project Phase 6), builds on the previous GeoMIP project simulations, and has been expanded to address several further important topics, including key uncertainties in extreme events, the use of geoengineering as part of a portfolio of responses to climate change, and the relatively new idea of cirrus cloud thinning to allow more long wave radiation to escape to space. We discuss experiment designs, as well as the rationale for those designs, showing preliminary results from individual models when available. We also introduce a new feature, called the GeoMIP Testbed, which provides a platform for simulations that will be performed with a few models and subsequently assessed to determine whether the proposed experiment designs will be adopted as core (Tier 1) GeoMIP experiments. In conclusion, this is meant to encourage various stakeholders to propose new targeted experiments that address their key open science questions, with the goal of making GeoMIP more relevant to a broader set of communities.

  18. The Geoengineering Model Intercomparison Project Phase 6 (GeoMIP6): simulation design and preliminary results

    NASA Astrophysics Data System (ADS)

    Kravitz, B.; Robock, A.; Tilmes, S.; Boucher, O.; English, J. M.; Irvine, P. J.; Jones, A.; Lawrence, M. G.; MacCracken, M.; Muri, H.; Moore, J. C.; Niemeier, U.; Phipps, S. J.; Sillmann, J.; Storelvmo, T.; Wang, H.; Watanabe, S.

    2015-10-01

    We present a suite of new climate model experiment designs for the Geoengineering Model Intercomparison Project (GeoMIP). This set of experiments, named GeoMIP6 (to be consistent with the Coupled Model Intercomparison Project Phase 6), builds on the previous GeoMIP project simulations, and has been expanded to address several further important topics, including key uncertainties in extreme events, the use of geoengineering as part of a portfolio of responses to climate change, and the relatively new idea of cirrus cloud thinning to allow more longwave radiation to escape to space. We discuss experiment designs, as well as the rationale for those designs, showing preliminary results from individual models when available. We also introduce a new feature, called the GeoMIP Testbed, which provides a platform for simulations that will be performed with a few models and subsequently assessed to determine whether the proposed experiment designs will be adopted as core (Tier 1) GeoMIP experiments. This is meant to encourage various stakeholders to propose new targeted experiments that address their key open science questions, with the goal of making GeoMIP more relevant to a broader set of communities.

  19. Toward Unanimous Projections for Sea Ice Using CMIP5 Multi-model Simulations

    NASA Astrophysics Data System (ADS)

    Yang, S.; Christensen, J. H.; Langen, P. P.; Thejll, P.

    2015-12-01

    Coupled global climate models have been used to provide future climate projections as major objective tools based on the physical laws that govern the dynamics and thermodynamics of the climate system. However, while climate models in general predict declines in Arctic sea ice cover (i.e., ice extent and volume) from the late 20th century through the next decades in response to increasing anthropogenic forcing, the model-simulated Arctic sea ice shows considerable biases in both the mean and the declining trend in comparison with observations over the satellite era (1979-present). The models also show wide inter-model spread in hindcast and projected sea ice decline, raising the question of uncertainty in model-predicted polar climate. To address the model uncertainty in the Arctic sea ice projection, we analyze the Arctic sea ice extent in the context of surface air temperature (SAT) as simulated in the historical, RCP4.5 and RCP8.5 experiments by 27 CMIP5 models. These 27 models are all we could obtain from the CMIP5 archive with sufficient grid information for processing the sea ice data. Unlike many previous studies, in which only a limited number of models were selected based on metrics of modeled sea ice characteristics in order to reduce the uncertainty of the projected ice, our analysis is applied to all model simulations without discrimination. It is found that the changes in total Arctic sea ice in various seasons in a given model are closely related to the changes in global mean SAT in that model. This relationship appears very similar in all models and agrees well with that in the observational data. In particular, the ratios of the total Arctic sea ice changes in March, September and the annual mean with respect to the baseline climatology (1979-2008) are seen to correlate linearly with the global mean annual SAT anomaly, suggesting that a unanimous projection of sea ice extent may be possible with this relationship. Further analysis is
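
The ice-extent/SAT relationship described above amounts to a linear regression across the ensemble. A minimal sketch with synthetic numbers (the slope and noise level here are invented for illustration, not the study's fitted values):

```python
import numpy as np

# Sketch of the relationship described above: regressing Arctic sea-ice
# extent anomalies on global-mean SAT anomalies across a 27-model
# ensemble.  All numbers are synthetic stand-ins.

rng = np.random.default_rng(42)
sat_anom = np.linspace(0.0, 4.0, 27)          # one SAT anomaly per model (K)
true_slope = -1.5                             # 10^6 km^2 of ice lost per K (assumed)
ice_anom = true_slope * sat_anom + 0.2 * rng.standard_normal(27)

# If every model follows (nearly) the same line, the fitted slope is a
# shared sensitivity, and projecting ice extent reduces to projecting SAT.
slope, intercept = np.polyfit(sat_anom, ice_anom, 1)
print(f"fitted sensitivity: {slope:.2f} x 10^6 km^2 per K of warming")
```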

  20. Projected changes in atmospheric river events in Arizona as simulated by global and regional climate models

    NASA Astrophysics Data System (ADS)

    Rivera, Erick R.; Dominguez, Francina

    2015-12-01

    Inland-penetrating atmospheric rivers (ARs) affect the United States Southwest and significantly contribute to cool season precipitation. In this study, we examine the results from an ensemble of dynamically downscaled simulations from the North American Regional Climate Change Assessment Program (NARCCAP) and their driving general circulation models (GCMs) in order to determine statistically significant changes in the intensity of the cool season ARs impacting Arizona and the associated precipitation. Future greenhouse gas emissions follow the A2 emission scenario from the Intergovernmental Panel on Climate Change Fourth Assessment Report simulations. We find that there is a consistent and clear intensification of the AR-related water vapor transport in both the global and regional simulations which reflects the increase in water vapor content due to warmer atmospheric temperatures, according to the Clausius-Clapeyron relationship. However, the response of AR-related precipitation intensity to increased moisture flux and column-integrated water vapor is weak and no significant changes are projected either by the GCMs or the NARCCAP models. This lack of robust precipitation variations can be explained in part by the absence of meaningful changes in both the large-scale water vapor flux convergence and the maximum positive relative vorticity in the GCMs. Additionally, some global models show a robust decrease in relative humidity which may also be responsible for the projected precipitation patterns.
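
The Clausius-Clapeyron argument invoked above can be made concrete: saturation vapor pressure grows roughly 6-7% per kelvin of warming. A sketch using the Magnus approximation (a standard empirical fit; the coefficients are commonly quoted values, and the reference temperature is an arbitrary choice):

```python
import math

# The Clausius-Clapeyron scaling behind the projected intensification of
# AR water vapor transport: warmer air holds roughly 6-7% more water
# vapor per degree.  Magnus approximation, a standard empirical fit.

def e_sat(t_celsius):
    """Saturation vapor pressure (hPa), Magnus approximation."""
    return 6.112 * math.exp(17.67 * t_celsius / (t_celsius + 243.5))

t = 15.0
rate = (e_sat(t + 1.0) - e_sat(t)) / e_sat(t)
print(f"~{100 * rate:.1f}% more saturation vapor pressure per K near {t:.0f} C")
```

This is why moisture transport can increase robustly even when precipitation changes remain weak: the thermodynamic term scales with temperature, while the dynamical terms (flux convergence, vorticity) need not.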

  1. Projected strengthening of Amazonian dry season by constrained climate model simulations

    NASA Astrophysics Data System (ADS)

    Boisier, Juan P.; Ciais, Philippe; Ducharne, Agnès; Guimberteau, Matthieu

    2015-07-01

    The vulnerability of Amazonian rainforest, and the ecological services it provides, depends on an adequate supply of dry-season water, either as precipitation or stored soil moisture. How the rain-bearing South American monsoon will evolve across the twenty-first century is thus a question of major interest. Extensive savanization, with its loss of forest carbon stock and uptake capacity, is an extreme although very uncertain scenario. We show that the contrasting rainfall projections simulated for Amazonia by 36 global climate models (GCMs) can be reproduced with empirical precipitation models, calibrated with historical GCM data as functions of the large-scale circulation. A set of these simple models was therefore calibrated with observations and used to constrain the GCM simulations. In agreement with the current hydrologic trends, the resulting projection towards the end of the twenty-first century is for a strengthening of the monsoon seasonal cycle, and a dry-season lengthening in southern Amazonia. With this approach, the increase in the area subjected to lengthy, savannah-prone dry seasons is substantially larger than the GCM-simulated one. Our results confirm the dominant picture shown by the state-of-the-art GCMs, but suggest that the `model democracy' view can significantly underestimate these impacts.

  2. Projected changes in atmospheric river events in Arizona as simulated by global and regional climate models

    NASA Astrophysics Data System (ADS)

    Rivera, Erick R.; Dominguez, Francina

    2016-09-01

    Inland-penetrating atmospheric rivers (ARs) affect the United States Southwest and significantly contribute to cool season precipitation. In this study, we examine the results from an ensemble of dynamically downscaled simulations from the North American Regional Climate Change Assessment Program (NARCCAP) and their driving general circulation models (GCMs) in order to determine statistically significant changes in the intensity of the cool season ARs impacting Arizona and the associated precipitation. Future greenhouse gas emissions follow the A2 emission scenario from the Intergovernmental Panel on Climate Change Fourth Assessment Report simulations. We find that there is a consistent and clear intensification of the AR-related water vapor transport in both the global and regional simulations which reflects the increase in water vapor content due to warmer atmospheric temperatures, according to the Clausius-Clapeyron relationship. However, the response of AR-related precipitation intensity to increased moisture flux and column-integrated water vapor is weak and no significant changes are projected either by the GCMs or the NARCCAP models. This lack of robust precipitation variations can be explained in part by the absence of meaningful changes in both the large-scale water vapor flux convergence and the maximum positive relative vorticity in the GCMs. Additionally, some global models show a robust decrease in relative humidity which may also be responsible for the projected precipitation patterns.

  3. A National Collaboratory to Advance the Science of High Temperature Plasma Physics for Magnetic Fusion

    SciTech Connect

    Schissel, David P.; Abla, G.; Burruss, J. R.; Feibush, E.; Fredian, T. W.; Goode, M. M.; Greenwald, M. J.; Keahey, K.; Leggett, T.; Li, K.; McCune, D. C.; Papka, M. E.; Randerson, L.; Sanderson, A.; Stillerman, J.; Thompson, M. R.; Uram, T.; Wallace, G.

    2012-12-20

    This report summarizes the work of the National Fusion Collaboratory (NFC) Project to develop a persistent infrastructure to enable scientific collaboration for magnetic fusion research. The original objective of the NFC project was to develop and deploy a national FES Grid (FusionGrid) that would be a system for secure sharing of computation, visualization, and data resources over the Internet. The goal of FusionGrid was to allow scientists at remote sites to participate as fully in experiments and computational activities as if they were working on site, thereby creating a unified virtual organization of the geographically dispersed U.S. fusion community. The vision for FusionGrid was that experimental and simulation data, computer codes, analysis routines, visualization tools, and remote collaboration tools were to be thought of as network services. In this model, an application service provider (ASP) provides and maintains software resources as well as the necessary hardware resources. The project would create a robust, user-friendly collaborative software environment and make it available to the US FES community. This Grid's resources would be protected by a shared security infrastructure including strong authentication to identify users and authorization to allow stakeholders to control their own resources. In this environment, access to services is stressed rather than data or software portability.

  4. High Level Information Fusion (HLIF) with nested fusion loops

    NASA Astrophysics Data System (ADS)

    Woodley, Robert; Gosnell, Michael; Fischer, Amber

    2013-05-01

    Situation modeling and threat prediction require higher levels of data fusion in order to provide actionable information. Beyond the sensor data and sources the analyst has access to, the use of out-sourced and re-sourced data is becoming common. Through the years, some common frameworks have emerged for dealing with information fusion—perhaps the most ubiquitous being the JDL Data Fusion Group and their initial 4-level data fusion model. Since these initial developments, numerous models of information fusion have emerged, hoping to better capture the human-centric process of data analyses within a machine-centric framework. 21st Century Systems, Inc. has developed Fusion with Uncertainty Reasoning using Nested Assessment Characterizer Elements (FURNACE) to address challenges of high level information fusion and handle bias, ambiguity, and uncertainty (BAU) for Situation Modeling, Threat Modeling, and Threat Prediction. It combines JDL fusion levels with nested fusion loops and state-of-the-art data reasoning. Initial research has shown that FURNACE is able to reduce BAU and improve the fusion process by allowing high level information fusion (HLIF) to affect lower levels without the double counting of information or other biasing issues. The initial FURNACE project was focused on the underlying algorithms to produce a fusion system able to handle BAU and repurposed data in a cohesive manner. FURNACE supports analyst's efforts to develop situation models, threat models, and threat predictions to increase situational awareness of the battlespace. FURNACE will not only revolutionize the military intelligence realm, but also benefit the larger homeland defense, law enforcement, and business intelligence markets.

  5. Transfer matrices combined with Green's functions for the multiple-scattering simulation of electronic projection imaging

    NASA Astrophysics Data System (ADS)

    Mayer, A.; Vigneron, J.-P.

    1999-07-01

    Electronic projection imaging is described in the framework of a multiple-scattering theory, by using a combination of transfer-matrix and Green's-function formalisms. The transfer-matrix methodology is used to compute the wave propagation within the tip and object scattering region, while the Green's-function formalism is used to describe the electron projection from the scatterers towards a distant imaging screen. This full-order theory is needed to overcome the limits of the first Born approximation and deal with three-dimensional effects. In particular, this approach is able to account for sucking-in and standing-wave effects taking place close to or inside the object. The simulation of the electronic diffraction by a model nanoscopic carbon rod, possibly containing inhomogeneities, is considered in detail.

  6. The APOSTLE project: Local Group kinematic mass constraints and simulation candidate selection

    NASA Astrophysics Data System (ADS)

    Fattahi, Azadeh; Navarro, Julio F.; Sawala, Till; Frenk, Carlos S.; Oman, Kyle A.; Crain, Robert A.; Furlong, Michelle; Schaller, Matthieu; Schaye, Joop; Theuns, Tom; Jenkins, Adrian

    2016-03-01

    We use a large sample of isolated dark matter halo pairs drawn from cosmological N-body simulations to identify candidate systems whose kinematics match that of the Local Group (LG) of galaxies. We find, in agreement with the `timing argument' and earlier work, that the separation and approach velocity of the Milky Way (MW) and Andromeda (M31) galaxies favour a total mass for the pair of ~5 × 10^12 M⊙. A mass this large, however, is difficult to reconcile with the small relative tangential velocity of the pair, as well as with the small deceleration from the Hubble flow observed for the most distant LG members. Halo pairs that match these three criteria have average masses a factor of ~2 smaller than suggested by the timing argument, but with large dispersion. Guided by these results, we have selected 12 halo pairs with total mass in the range 1.6-3.6 × 10^12 M⊙ for the APOSTLE project (A Project Of Simulating The Local Environment), a suite of hydrodynamical resimulations at various numerical resolution levels (reaching up to ~10^4 M⊙ per gas particle) that use the subgrid physics developed for the EAGLE project. These simulations reproduce, by construction, the main kinematics of the MW-M31 pair, and produce satellite populations whose overall number, luminosities, and kinematics are in good agreement with observations of the MW and M31 companions. The APOSTLE candidate systems thus provide an excellent testbed to confront directly many of the predictions of the Λ cold dark matter cosmology with observations of our local Universe.
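
The `timing argument' cited above treats the MW-M31 pair as a two-body system on a radial Kepler orbit that has been expanding and recollapsing since the Big Bang. A hedged numerical sketch follows, using round input values (separation, approach velocity, cosmic age) rather than the paper's exact data:

```python
import math

# Sketch of the classic timing argument: the MW and M31 as a two-body
# system on a radial Kepler orbit starting at zero separation at the Big
# Bang.  Given today's separation r, radial velocity vr, and age t0,
# solve the parametric orbit equations for the total mass.  Inputs are
# round illustrative numbers.

G    = 6.674e-11          # m^3 kg^-1 s^-2
MSUN = 1.989e30           # kg
KPC  = 3.086e19           # m
GYR  = 3.156e16           # s

r, vr, t0 = 780 * KPC, -110e3, 13.8 * GYR   # separation, velocity, age

def f(eta):
    # Radial-orbit constraint: vr*t0/r = sin(eta)*(eta - sin(eta)) / (1 - cos(eta))^2
    return (math.sin(eta) * (eta - math.sin(eta))
            / (1.0 - math.cos(eta)) ** 2 - vr * t0 / r)

# Bisection for the development angle eta on the collapsing branch (pi, 2*pi).
lo, hi = math.pi + 1e-6, 2 * math.pi - 1e-6
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if f(lo) * f(mid) <= 0:
        hi = mid
    else:
        lo = mid
eta = 0.5 * (lo + hi)

# With eta known, a = r/(1-cos eta) and Kepler's third law give the mass.
mass = r**3 * (eta - math.sin(eta))**2 / (G * t0**2 * (1 - math.cos(eta))**3)
print(f"timing-argument total mass ~ {mass / MSUN:.1e} Msun")
```

With these inputs the result lands in the few × 10^12 M⊙ range quoted above; the abstract's point is that pairs which also match the tangential velocity and local Hubble flow tend to be about a factor of 2 lighter.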

  7. Education and Public Outreach at The Pavilion Lake Research Project: Fusion of Science and Education using Web 2.0

    NASA Astrophysics Data System (ADS)

    Cowie, B. R.; Lim, D. S.; Pendery, R.; Laval, B.; Slater, G. F.; Brady, A. L.; Dearing, W. L.; Downs, M.; Forrest, A.; Lees, D. S.; Lind, R. A.; Marinova, M.; Reid, D.; Seibert, M. A.; Shepard, R.; Williams, D.

    2009-12-01

    The Pavilion Lake Research Project (PLRP) is an international multi-disciplinary science and exploration effort to explain the origin and preservation potential of freshwater microbialites in Pavilion Lake, British Columbia, Canada. Using multiple exploration platforms including one person DeepWorker submersibles, Autonomous Underwater Vehicles, and SCUBA divers, the PLRP acts as an analogue research site for conducting science in extreme environments, such as the Moon or Mars. In 2009, the PLRP integrated several Web 2.0 technologies to provide a pilot-scale Education and Public Outreach (EPO) program targeting the internet savvy generation. The seamless integration of multiple technologies including Google Earth, Wordpress, Youtube, Twitter and Facebook, facilitated the rapid distribution of exciting and accessible science and exploration information over multiple channels. Field updates, science reports, and multimedia including videos, interactive maps, and immersive visualization were rapidly available through multiple social media channels, partly due to the ease of integration of these multiple technologies. Additionally, the successful application of videoconferencing via a readily available technology (Skype) has greatly increased the capacity of our team to conduct real-time education and public outreach from remote locations. The improved communication afforded by Web 2.0 has increased the quality of EPO provided by the PLRP, and has enabled a higher level of interaction between the science team and the community at large. Feedback from these online interactions suggest that remote communication via Web 2.0 technologies were effective tools for increasing public discourse and awareness of the science and exploration activity at Pavilion Lake.

  8. World, We Have Problems: Simulation for Large Complex, Risky Projects, and Events

    NASA Technical Reports Server (NTRS)

    Elfrey, Priscilla

    2010-01-01

    Prior to a spacewalk during the NASA STS-129 mission in November 2009, Columbia Broadcasting System (CBS) correspondent William Harwood reported that the astronauts "were awakened again," as they had been the day before. Fearing that something not properly connected was causing a leak, the crew, both on the ground and in space, stopped and checked everything. The alarm proved false. The crew did complete its work ahead of schedule, but the incident reminds us that correctly connecting hundreds and thousands of entities, subsystems, and systems, finding leaks, loosening stuck valves, and adding replacements to very large complex systems over time does not occur magically. Major projects everywhere present similar pressures. Lives are at risk. Responsibility is heavy. Large natural and human-created disasters introduce parallel difficulties as people work across the boundaries of their countries, disciplines, languages, and cultures, with known immediate dangers as well as the unexpected. NASA has long accepted that, when humans have to go where humans cannot go, simulation is the sole solution. The Agency uses simulation to achieve consensus, reduce ambiguity and uncertainty, understand problems, make decisions, support design, do planning and troubleshooting, as well as for operations, training, testing, and evaluation. Simulation is at the heart of all such complex systems, products, projects, programs, and events. Difficult, hazardous short- and, especially, long-term activities have a persistent need for simulation, from the first insight into a possibly workable idea or answer until the final report, perhaps beyond our lifetime, is put in the archive. With simulation we create a common mental model, try out breakdowns of machinery or teamwork, and find opportunities for improvement. Lifecycle simulation proves to be increasingly important as risks and consequences intensify. Across the world, disasters are increasing. We anticipate more of them, as the results of global warming

  9. Changing Climate Extremes in the Northeast: CMIP5 Simulations and Projections

    NASA Astrophysics Data System (ADS)

    Thibeault, J. M.; Seth, A.

    2013-12-01

    Extreme climate events are known to have severe impacts on human and natural systems. As greenhouse warming progresses, a major concern is the potential for an increase in the frequency and intensity of extreme events. The Northeast (defined as the Northeast US, southern Quebec, and southeastern Ontario) is sensitive to climate extremes. The region is prone to flooding and drought, which poses challenges for infrastructure and water resource management, and increases risks to agriculture and forests. Extreme heat can be dangerous to human health, especially in the large urban centers of the Northeast. Annual average temperatures have steadily increased since the 1970s, accompanied by more frequent extremely hot weather, a longer growing season, and fewer frost days. Heavy precipitation events have become more frequent in recent decades. This research examines multi-model projections of annual and monthly extreme indices for the Northeast, using extreme indices computed by the Expert Team on Climate Change Detection and Indices (ETCCDI) for twenty-three global climate models participating in the Coupled Model Intercomparison Project Phase 5 (CMIP5) for the 20th century historical and RCP8.5 experiments. Model simulations are compared to HadEX2 and ERA-interim gridded observations. CMIP5 simulations are consistent with observations - conditions in the Northeast are already becoming warmer and wetter. Projections indicate significant shifts toward warmer and wetter conditions by the middle century (2041-2070). Most indices are projected to be largely outside their late 20th century ranges by the late century (2071-2099). These results provide important information to stakeholders developing plans to lessen the adverse impacts of a warmer and wetter climate in the Northeast.
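
A few of the fixed-threshold ETCCDI-style indices referred to above are simple to compute from daily data. A sketch with synthetic temperatures (the seasonal-cycle parameters and the growing-season proxy are illustrative assumptions, not ETCCDI definitions beyond FD and SU):

```python
import numpy as np

# Sketch of ETCCDI-style fixed-threshold indices computed from one year
# of synthetic daily temperatures.  FD and SU follow the standard
# definitions; the "growing season" function is a simplified proxy.

def frost_days(tmin):
    """FD: number of days with daily minimum temperature below 0 C."""
    return int((tmin < 0.0).sum())

def summer_days(tmax):
    """SU: number of days with daily maximum temperature above 25 C."""
    return int((tmax > 25.0).sum())

def growing_season_proxy(tmean, thresh=5.0):
    """Simplified proxy: total days with mean temperature above 5 C."""
    return int((tmean > thresh).sum())

days = np.arange(365)
tmean = 10.0 - 15.0 * np.cos(2 * np.pi * days / 365)  # idealized seasonal cycle (C)
tmin, tmax = tmean - 5.0, tmean + 5.0

print(frost_days(tmin), summer_days(tmax), growing_season_proxy(tmean))
```

Comparing such indices between a model's historical run and gridded observations (HadEX2, ERA-Interim) is exactly the evaluation step the abstract describes, before examining their projected shifts under RCP8.5.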

  10. Design and simulation of a micromirror array for a projection TV

    NASA Astrophysics Data System (ADS)

    Choi, Bumkyoo; Lee, Junghoon; Jung, Kyuwon; Shin, Hyungjae

    1999-10-01

    The design of a micromirror for a projection TV is investigated. A static structural analysis is performed to obtain an optimal shape for the micromirror using the commercial FEM package ANSYS. A solid model is created, and mapped meshes are applied to it in order to satisfy the symmetry condition. A stress analysis shows that the maximum stress does not exceed the allowable stress, taken as the yield strength. A modal analysis is also carried out to find the approximate natural frequencies for different design parameters. The result can be used to see which design parameter is strongly dominant. The micromirror was fabricated by Samsung Electronics. Dynamic deflection experiments confirm the results of the simulation.

  11. eScience for molecular-scale simulations and the eMinerals project.

    PubMed

    Salje, E K H; Artacho, E; Austen, K F; Bruin, R P; Calleja, M; Chappell, H F; Chiang, G-T; Dove, M T; Frame, I; Goodwin, A L; Kleese van Dam, K; Marmier, A; Parker, S C; Pruneda, J M; Todorov, I T; Trachenko, K; Tyer, R P; Walker, A M; White, T O H

    2009-03-13

    We review the work carried out within the eMinerals project to develop eScience solutions that facilitate a new generation of molecular-scale simulation work. Technological developments include integration of compute and data systems, development of collaborative frameworks, and new researcher-friendly tools for grid job submission, XML data representation, information delivery, metadata harvesting, and metadata management. A number of diverse science applications illustrate how these tools are being used for large parameter-sweep studies, an emerging type of study for which the integration of computing, data, and collaboration is essential.

  12. Simulation of the time-projection chamber with triple GEMs for the LAMPS at RAON

    NASA Astrophysics Data System (ADS)

    Jhang, Genie; Lee, Jung Woo; Moon, Byul; Hong, Byungsik; Ahn, Jung Keun; Lee, Jong-Won; Lee, Kyong Sei; Kim, Young Jin; Lee, Hyo Sang

    2016-03-01

    The time-projection chamber (TPC) with triple gas-electron multipliers (GEMs) is designed for the large-acceptance multipurpose spectrometer (LAMPS) at the new radioactive ion-beam facility RAON, a pure Korean term for the accelerator complex, in Korea. The simulation environment has been set up to test the performance of the designed chamber, and the software package for analysis has been developed. Particle identification has been demonstrated to be possible up to 2 GeV/c in momentum for particles with charge numbers 1 and 2 by using the simulated heavy-ion events. The transverse-momentum resolutions are expected to be about 2% for protons and about 1.3% for pions in the relatively high-momentum region. The total reconstruction efficiencies are estimated to be about 90% and 80% for charged pions and protons, respectively.
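
The momentum measurement underlying the quoted resolutions rests on the standard curvature relation pT [GeV/c] = 0.3 z B R, with B in tesla and R in meters. A sketch with illustrative field and radius values (not the LAMPS design numbers):

```python
# Sketch of the standard relation a TPC uses to measure momentum: a
# charged track in a solenoidal field B bends with curvature radius R,
# and pT [GeV/c] = 0.3 * z * B[T] * R[m] for charge number z.  The field
# and radii below are illustrative, not the LAMPS design values.

def pt_gev(radius_m, b_tesla, z=1):
    """Transverse momentum (GeV/c) from track curvature radius."""
    return 0.3 * z * b_tesla * radius_m

# At the same curvature, a charge-2 particle carries twice the momentum;
# combined with dE/dx this is how z=1 and z=2 species are separated.
print(pt_gev(13.33, 0.5), pt_gev(13.33, 0.5, z=2))
```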

  13. MEchatronic REspiratory System SImulator for Neonatal Applications (MERESSINA) project: a novel bioengineering goal

    PubMed Central

    Scaramuzzo, Rosa T; Ciantelli, Massimiliano; Baldoli, Ilaria; Bellanti, Lisa; Gentile, Marzia; Cecchi, Francesca; Sigali, Emilio; Tognarelli, Selene; Ghirri, Paolo; Mazzoleni, Stefano; Menciassi, Arianna; Cuttano, Armando; Boldrini, Antonio; Laschi, Cecilia; Dario, Paolo

    2013-01-01

Respiratory function is mandatory for extrauterine life, but is sometimes impaired in newborns due to prematurity, congenital malformations, or acquired pathologies. Mechanical ventilation is standard care, but long-term complications, such as bronchopulmonary dysplasia, are still widely reported. Therefore, continuous medical education is mandatory for the correct management of assistance devices. Commercially available breathing function simulators rarely reproduce the relevant anatomy and physiology realistically. The aim of this study is to develop a high-fidelity mechatronic simulator of neonatal airways and lungs for staff training and mechanical ventilator testing. The project is divided into three different phases: (1) a review study on respiratory physiology and pathophysiology and on already available single- and multi-compartment models; (2) the prototyping phase; and (3) the on-field system validation. PMID:23966804

  14. Time domain holography: Forward projection of simulated and measured sound pressure fields

    NASA Astrophysics Data System (ADS)

    de La Rochefoucauld, Ombeline; Melon, Manuel; Garcia, Alexandre

    2004-07-01

    In this article, the fundamental principles of forward projecting time domain acoustic pressure fields are summarized. Four different numerical approaches are presented and compared both with simulated and measured signals. The approaches differ in their definition domain: Frequency/time and space/wave vector domains. The simulated source is a planar baffled piston excited with a Gaussian pulsed velocity. The pressure radiated by two different real sources has been measured: The first source is made up of two baffled loudspeakers (a Gaussian white noise can be radiated by a third loudspeaker). The second one is a baffled aluminum plate excited by a short impact at its center. The influence of parameters such as the sound source radius, the array size, the number of microphones and the propagation distance is studied. Finally, results concerning the optimization of the sampling of the sound field are presented.
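The space/wave-vector-domain approach mentioned above is conventionally implemented as an angular-spectrum propagation: transform the measured field over space, multiply by a propagation phase factor, and transform back. A minimal single-frequency 1-D sketch of that idea (the paper's methods also cover time-domain variants; the parameters here are illustrative):

```python
import numpy as np

def forward_project(p, dx, freq, z, c=343.0):
    """Propagate a single-frequency pressure line p(x) forward a distance z
    using the angular-spectrum (space/wave-vector domain) method."""
    kx = 2.0 * np.pi * np.fft.fftfreq(len(p), dx)   # spatial frequencies
    k = 2.0 * np.pi * freq / c                      # acoustic wavenumber
    kz = np.sqrt(k**2 - kx**2 + 0j)                 # evanescent parts become imaginary
    return np.fft.ifft(np.fft.fft(p) * np.exp(1j * kz * z))

# A uniform (plane-wave) field keeps unit magnitude and only gains phase exp(i*k*z):
field = forward_project(np.ones(64), dx=0.01, freq=1000.0, z=0.1)
```

Because `kz` turns imaginary for spatial frequencies beyond the acoustic wavenumber, evanescent components decay rather than grow on forward projection.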

  15. Introducing the Illustris Project: simulating the coevolution of dark and visible matter in the Universe

    NASA Astrophysics Data System (ADS)

    Vogelsberger, Mark; Genel, Shy; Springel, Volker; Torrey, Paul; Sijacki, Debora; Xu, Dandan; Snyder, Greg; Nelson, Dylan; Hernquist, Lars

    2014-10-01

We introduce the Illustris Project, a series of large-scale hydrodynamical simulations of galaxy formation. The highest resolution simulation, Illustris-1, covers a volume of (106.5 Mpc)³, has a dark mass resolution of 6.26 × 10⁶ M⊙, and an initial baryonic matter mass resolution of 1.26 × 10⁶ M⊙. At z = 0 gravitational forces are softened on scales of 710 pc, and the smallest hydrodynamical gas cells have an extent of 48 pc. We follow the dynamical evolution of 2 × 1820³ resolution elements and in addition passively evolve 1820³ Monte Carlo tracer particles, reaching a total particle count of more than 18 billion. The galaxy formation model includes: primordial and metal-line cooling with self-shielding corrections, stellar evolution, stellar feedback, gas recycling, chemical enrichment, supermassive black hole growth, and feedback from active galactic nuclei. Here we describe the simulation suite, and contrast basic predictions of our model for the present-day galaxy population with observations of the local universe. At z = 0 our simulation volume contains about 40 000 well-resolved galaxies covering a diverse range of morphologies and colours including early-type, late-type and irregular galaxies. The simulation reproduces reasonably well the cosmic star formation rate density, the galaxy luminosity function, and baryon conversion efficiency at z = 0. It also qualitatively captures the impact of galaxy environment on the red fractions of galaxies. The internal velocity structure of selected well-resolved disc galaxies obeys the stellar and baryonic Tully-Fisher relation together with flat circular velocity curves. In the well-resolved regime, the simulation reproduces the observed mix of early-type and late-type galaxies. Our model predicts a halo mass dependent impact of baryonic effects on the halo mass function and the masses of haloes, caused by feedback from supernovae and active galactic nuclei.

  16. Validation of Solar Sail Simulations for the NASA Solar Sail Demonstration Project

    NASA Technical Reports Server (NTRS)

    Braafladt, Alexander C.; Artusio-Glimpse, Alexandra B.; Heaton, Andrew F.

    2014-01-01

NASA's Solar Sail Demonstration project partner L'Garde is currently assembling a flight-like sail assembly for a series of ground demonstration tests beginning in 2015. For future missions of this sail that might validate solar sail technology, it is necessary to have an accurate sail thrust model. One of the primary requirements of a proposed potential technology validation mission will be to demonstrate solar sail thrust over a set time period, which for this project is nominally 30 days. This requirement would be met by comparing a L'Garde-developed trajectory simulation to the as-flown trajectory. The current sail simulation baseline for L'Garde is a Systems Tool Kit (STK) plug-in that includes a custom-designed model of the L'Garde sail. The STK simulation has been verified for a flat plate model by comparing it to the NASA-developed Solar Sail Spaceflight Simulation Software (S5). S5 matched STK with a high degree of accuracy, and the results of the validation indicate that the L'Garde STK model is accurate enough to meet the potential future mission requirements. Additionally, since the L'Garde sail deviates considerably from a flat plate, a force model for a non-flat sail provided by L'Garde was also tested and compared to a flat plate model in S5. This result will be used in the future as a basis of comparison to the non-flat sail model being developed for STK.

  17. Future of Inertial Fusion Energy

    SciTech Connect

    Nuckolls, J H; Wood, L L

    2002-09-04

    In the past 50 years, fusion R&D programs have made enormous technical progress. Projected billion-dollar scale research facilities are designed to approach net energy production. In this century, scientific and engineering progress must continue until the economics of fusion power plants improves sufficiently to win large scale private funding in competition with fission and non-nuclear energy systems. This economic advantage must be sustained: trillion dollar investments will be required to build enough fusion power plants to generate ten percent of the world's energy. For Inertial Fusion Energy, multi-billion dollar driver costs must be reduced by up to an order of magnitude, to a small fraction of the total cost of the power plant. Major cost reductions could be achieved via substantial improvements in target performance-both higher gain and reduced ignition energy. Large target performance improvements may be feasible through a combination of design innovations, e.g., ''fast ignition,'' propagation down density gradients, and compression of fusion fuel with a combination of driver and chemical energy. The assumptions that limit projected performance of fusion targets should be carefully examined. The National Ignition Facility will enable development and testing of revolutionary targets designed to make possible economically competitive fusion power plants.

  18. (Fusion energy research)

    SciTech Connect

    Phillips, C.A.

    1988-01-01

This report discusses the following topics: principal parameters achieved in experimental devices (FY88); Tokamak Fusion Test Reactor; Princeton Beta Experiment-Modification; S-1 Spheromak; current drive experiment; x-ray laser studies; spacecraft glow experiment; plasma deposition and etching of thin films; theoretical plasma physics; tokamak modeling; compact ignition tokamak; international thermonuclear experimental reactor; Engineering Department; Project Planning and Safety Office; quality assurance and reliability; and technology transfer.

  19. Evaluation of Tropospheric Water Vapor Simulations from the Atmospheric Model Intercomparison Project

    NASA Technical Reports Server (NTRS)

    Gaffen, Dian J.; Rosen, Richard D.; Salstein, David A.; Boyle, James S.

    1997-01-01

    Simulations of humidity from 28 general circulation models for the period 1979-88 from the Atmospheric Model Intercomparison Project are compared with observations from radiosondes over North America and the globe and with satellite microwave observations over the Pacific basin. The simulations of decadal mean values of precipitable water (W) integrated over each of these regions tend to be less moist than the real atmosphere in all three cases; the median model values are approximately 5% less than the observed values. The spread among the simulations is larger over regions of high terrain, which suggests that differences in methods of resolving topographic features are important. The mean elevation of the North American continent is substantially higher in the models than is observed, which may contribute to the overall dry bias of the models over that area. The authors do not find a clear association between the mean topography of a model and its mean W simulation, however, which suggests that the bias over land is not purely a matter of orography. The seasonal cycle of W is reasonably well simulated by the models, although over North America they have a tendency to become moister more quickly in the spring than is observed. The interannual component of the variability of W is not well captured by the models over North America. Globally, the simulated W values show a signal correlated with the Southern Oscillation index but the observations do not. This discrepancy may be related to deficiencies in the radiosonde network, which does not sample the tropical ocean regions well. Overall, the interannual variability of W, as well as its climatology and mean seasonal cycle, are better described by the median of the 28 simulations than by individual members of the ensemble. Tests to learn whether simulated precipitable water, evaporation, and precipitation values may be related to aspects of model formulation yield few clear signals, although the authors find, for
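The precipitable water W compared above is the mass-weighted vertical integral of specific humidity, W = (1/(g·ρw)) ∫ q dp, expressed as a depth of liquid water. A minimal sketch of that diagnostic (the level values are illustrative, not AMIP model output):

```python
import numpy as np

def precipitable_water_mm(q, p_levels):
    """Column precipitable water W (mm) from specific humidity q (kg/kg)
    on pressure levels p_levels (Pa), ordered surface first.
    W = (1/(g * rho_w)) * integral of q dp  -- standard column diagnostic."""
    g, rho_w = 9.81, 1000.0                                   # m/s^2, kg/m^3
    # trapezoidal integration of q over pressure
    integral = abs(np.sum(0.5 * (q[1:] + q[:-1]) * np.diff(p_levels)))
    return 1000.0 * integral / (g * rho_w)                    # metres -> mm

# Constant q of 10 g/kg between 1000 and 500 hPa gives roughly 51 mm:
w = precipitable_water_mm(np.array([0.01, 0.01]),
                          np.array([100000.0, 50000.0]))
```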

  20. Fusion Data Grid Service

    NASA Astrophysics Data System (ADS)

    Shasharina, Svetlana; Wang, Nanbor

    2004-11-01

Simulations and experiments in the fusion and plasma physics community generate large datasets at remote sites. Visualization and analysis of these datasets are difficult because of the incompatibility among the various data formats adopted by simulations, experiments, and analysis tools, and because of the large sizes of the analyzed data. Grid and Web Services technologies are capable of providing solutions for such heterogeneous settings, but need to be customized to field-specific needs and merged with the distributed technologies currently used by the community. This paper describes how we are addressing these issues in the Fusion Grid Service under development. We also present performance results of relevant data transfer mechanisms, including binary SOAP, DIME, GridFTP, MDSplus, and CORBA. We describe the status of data converters (between HDF5 and MDSplus data types), developed in collaboration with MIT (J. Stillerman). Finally, we analyze bottlenecks of the MDSplus data transfer mechanism (work performed in collaboration with General Atomics, D. Schissel and M. Qian).

  1. Two-dimensional simulations of thermonuclear burn in ignition-scale inertial confinement fusion targets under compressed axial magnetic fields

    SciTech Connect

    Perkins, L. J.; Logan, B. G.; Zimmerman, G. B.; Werner, C. J.

    2013-07-15

We report for the first time on full 2-D radiation-hydrodynamic implosion simulations that explore the impact of highly compressed imposed magnetic fields on the ignition and burn of perturbed spherical implosions of ignition-scale cryogenic capsules. Using perturbations that highly convolute the cold fuel boundary of the hotspot and prevent ignition without applied fields, we impose initial axial seed fields of 20–100 T (potentially attainable using present experimental methods) that compress to greater than 4 × 10⁴ T (400 MG) under implosion, thereby relaxing hotspot areal densities and pressures required for ignition and propagating burn by ∼50%. The compressed field is high enough to suppress transverse electron heat conduction, and to allow alphas to couple energy into the hotspot even when highly deformed by large low-mode amplitudes. This might permit the recovery of ignition, or at least significant alpha particle heating, in submarginal capsules that would otherwise fail because of adverse hydrodynamic instabilities.
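The factor-of-2000 field amplification quoted above is consistent with frozen-in flux conservation: if the magnetic flux B·R² threading the fuel is conserved during radial convergence, the field scales as the square of the convergence ratio. A minimal sketch under that idealization (the convergence ratio below is illustrative, not a value from the simulations):

```python
# Frozen-in flux: conserving B * R^2 through the shrinking fuel column gives
# B_final = B_seed * (R0 / Rf)^2.

def compressed_field(b_seed_t: float, convergence: float) -> float:
    """Axial field (T) after radial convergence R0/Rf, assuming conserved flux."""
    return b_seed_t * convergence**2

# A 20 T seed compressed by a radial convergence of ~45 exceeds 4e4 T (400 MG):
b_final = compressed_field(20.0, 45.0)
```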

  2. Three-dimensional visualization simulation assessment system based on multi-source data fusion for the Wenchuan earthquake

    NASA Astrophysics Data System (ADS)

    Fan, Xiangtao; Du, Xiaoping; Tan, Jian; Zhu, Junjie

    2009-05-01

We have developed a visual three-dimensional simulation and assessment system of the Wenchuan earthquake zone using the Digital Earth concept and highly efficient spatial information visualization technology. Our goal is to provide a foundation for earthquake disaster relief and reconstruction. A number of major collapses, landslides, mud-rock flows and other geological disasters are analysed in the system, and the system can be used to monitor and analyse earthquake relief, rescue and insurance to provide first-hand source material for disaster situation assessment. Multi-sensor, multi-temporal, multi-resolution remote sensing data and multi-scale DEM data are integrated into the system to produce a sequence of multi-resolution datasets. We present management methods for a dynamic multi-resolution terrain model and multi-level scenes. These methods are based on model division and layering algorithms, and use high-speed buffering mechanisms to achieve multi-level, large-scale real-time rendering of scenes in the disaster areas, which effectively reduces rendering delay and achieves smooth, seamless roaming across model levels.

  3. Accelerator & Fusion Research Division 1991 summary of activities

    SciTech Connect

    Not Available

    1991-12-01

    This report discusses research projects in the following areas: Heavy-ion fusion accelerator research; magnetic fusion energy; advanced light source; center for x-ray optics; exploratory studies; superconducting magnets; and bevalac operations.

  4. Accelerator Fusion Research Division 1991 summary of activities

    SciTech Connect

    Berkner, Klaus H.

    1991-12-01

    This report discusses research projects in the following areas: Heavy-ion fusion accelerator research; magnetic fusion energy; advanced light source; center for x-ray optics; exploratory studies; superconducting magnets; and bevalac operations.

  5. The Geoengineering Model Intercomparison Project Phase 6 (GeoMIP6): simulation design and preliminary results

    NASA Astrophysics Data System (ADS)

    Kravitz, B.; Robock, A.; Tilmes, S.; Boucher, O.; English, J. M.; Irvine, P. J.; Jones, A.; Lawrence, M. G.; MacCracken, M.; Muri, H.; Moore, J. C.; Niemeier, U.; Phipps, S. J.; Sillmann, J.; Storelvmo, T.; Wang, H.; Watanabe, S.

    2015-06-01

    We present a suite of new climate model experiment designs for the Geoengineering Model Intercomparison Project (GeoMIP). This set of experiments, named GeoMIP6 (to be consistent with the Coupled Model Intercomparison Project Phase 6), builds on the previous GeoMIP simulations, and has been expanded to address several further important topics, including key uncertainties in extreme events, the use of geoengineering as part of a portfolio of responses to climate change, and the relatively new idea of cirrus cloud thinning to allow more longwave radiation to escape to space. We discuss experiment designs, as well as the rationale for those designs, showing preliminary results from individual models when available. We also introduce a new feature, called the GeoMIP Testbed, which provides a platform for simulations that will be performed with a few models and subsequently assessed to determine whether the proposed experiment designs will be adopted as core (Tier 1) GeoMIP experiments. This is meant to encourage various stakeholders to propose new targeted experiments that address their key open science questions, with the goal of making GeoMIP more relevant to a broader set of communities.

  6. Final Report on Project 01-ERD-017 ''Smart Nanostructures From Computer Simulations''

    SciTech Connect

    Grossman, J C; Williamson, A J

    2004-02-13

This project had two main objectives. The first major goal was to develop new, powerful computational simulation capabilities. It was important that these tools have the combination of the accuracy needed to describe the quantum mechanical nature of nanoscale systems and the efficiency required to be applied to realistic, experimentally derived materials. The second major goal was to apply these computational methods to calculate and predict the properties of quantum dots--initially composed of silicon, but then of other elements--which could be used to build novel nanotechnology devices. The driving factor of our purpose has been that, through the development and successful application of these tools, we would generate a new capability at LLNL that could be used to make nanostructured materials ''smarter'', e.g., by selectively predicting how to engineer specific, desired properties. To carry out the necessary work to successfully complete this project and deliver on our goals, we established a two-pronged effort from the beginning: (1) to work on developing new, more efficient algorithms and quantum simulation tools, and (2) to solve problems and make predictions regarding properties of quantum dots which were being studied experimentally here at Livermore.

  7. Effects of baseline conditions on the simulated hydrologic response to projected climate change

    USGS Publications Warehouse

    Koczot, Kathryn M.; Markstrom, Steven L.; Hay, Lauren E.

    2011-01-01

    Changes in temperature and precipitation projected from five general circulation models, using one late-twentieth-century and three twenty-first-century emission scenarios, were downscaled to three different baseline conditions. Baseline conditions are periods of measured temperature and precipitation data selected to represent twentieth-century climate. The hydrologic effects of the climate projections are evaluated using the Precipitation-Runoff Modeling System (PRMS), which is a watershed hydrology simulation model. The Almanor Catchment in the North Fork of the Feather River basin, California, is used as a case study. Differences and similarities between PRMS simulations of hydrologic components (i.e., snowpack formation and melt, evapotranspiration, and streamflow) are examined, and results indicate that the selection of a specific time period used for baseline conditions has a substantial effect on some, but not all, hydrologic variables. This effect seems to be amplified in hydrologic variables, which accumulate over time, such as soil-moisture content. Results also indicate that uncertainty related to the selection of baseline conditions should be evaluated using a range of different baseline conditions. This is particularly important for studies in basins with highly variable climate, such as the Almanor Catchment.

  8. Converting Snow Depth to SWE: The Fusion of Simulated Data with Remote Sensing Retrievals and the Airborne Snow Observatory

    NASA Astrophysics Data System (ADS)

    Bormann, K.; Marks, D. G.; Painter, T. H.; Hedrick, A. R.; Deems, J. S.

    2015-12-01

Snow cover monitoring has greatly benefited from remote sensing technology but, despite their critical importance, spatially distributed measurements of snow water equivalent (SWE) in mountain terrain remain elusive. Current methods of monitoring SWE rely on point measurements and are insufficient for distributed snow science and effective management of water resources. Many studies have shown that the spatial variability in SWE is largely controlled by the spatial variability in snow depth. JPL's Airborne Snow Observatory mission (ASO) combines LiDAR and spectrometer instruments to retrieve accurate and very high-resolution snow depth measurements at the watershed scale, along with other products such as snow albedo. To make best use of these high-resolution snow depths, spatially distributed snow density data are required to convert the measured depths to SWE. Snow density is a spatially and temporally variable property that cannot yet be reliably extracted from remote sensing techniques, and is difficult to extrapolate to basin scales. However, some physically based snow models have shown skill in simulating bulk snow densities and therefore provide a pathway for snow depth to SWE conversion. Leveraging model ability where remote sensing options are non-existent, ASO employs a physically based snow model (iSnobal) to resolve distributed snow density dynamics across the basin. After an adjustment scheme guided by in-situ data, these density estimates are used to derive the elusive spatial distribution of SWE from the observed snow depth distributions from ASO. In this study, we describe how the process of fusing model data with remote sensing retrievals is undertaken in the context of ASO along with estimates of uncertainty in the final SWE volume products. This work will likely be of interest to those working in snow hydrology, water resource management and the broader remote sensing community.
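The depth-to-SWE conversion at the core of this fusion is a simple pointwise relation, SWE = depth × ρ_snow / ρ_water, applied per grid cell once a density field is available. A minimal sketch (the density values here stand in for modeled output such as iSnobal's and are illustrative only):

```python
import numpy as np

def swe_mm(depth_m, density_kgm3):
    """Snow water equivalent (mm) from snow depth (m) and bulk snow
    density (kg/m^3): SWE = depth * rho_snow / rho_water."""
    rho_water = 1000.0
    depth = np.asarray(depth_m, dtype=float)
    density = np.asarray(density_kgm3, dtype=float)
    return depth * density / rho_water * 1000.0   # metres of water -> mm

# 2 m of snow at a bulk density of 300 kg/m^3 holds 600 mm of water:
swe = swe_mm(2.0, 300.0)
```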

  9. A NATIONAL COLLABORATORY TO ADVANCE THE SCIENCE OF HIGH TEMPERATURE PLASMA PHYSICS FOR MAGNETIC FUSION

    SciTech Connect

    Allen R. Sanderson; Christopher R. Johnson

    2006-08-01

This report summarizes the work of the University of Utah, which was a member of the National Fusion Collaboratory (NFC) Project funded by the United States Department of Energy (DOE) under the Scientific Discovery through Advanced Computing (SciDAC) Program to develop a persistent infrastructure to enable scientific collaboration for magnetic fusion research. Initiated in 2001 as a five-year project, the NFC built on past collaborative work performed within the U.S. fusion community and added a component of computer science research done with the USDOE Office of Science, Office of Advanced Scientific Computing Research. The project was itself a collaboration, uniting fusion scientists from General Atomics, MIT, and PPPL with computer scientists from ANL, LBNL, Princeton University, and the University of Utah to form a coordinated team. The group leveraged existing computer science technology where possible and extended or created new capabilities where required. The complete final report is attached as an addendum. The University of Utah's primary technical responsibility in the collaboration was to develop and deploy an advanced scientific visualization service. To achieve this goal, the SCIRun Problem Solving Environment (PSE) is used on FusionGrid as an advanced scientific visualization service. SCIRun is open source software that gives the user the ability to create complex 3D visualizations and 2D graphics. This capability allows for the exploration of complex simulation results and the comparison of simulation and experimental data. SCIRun on FusionGrid gives the scientist a no-license-cost visualization capability that rivals present-day commercial visualization packages. To accelerate the usage of SCIRun within the fusion community, a stand-alone application built on top of SCIRun was developed and deployed.
This application, FusionViewer, allows users who are unfamiliar with SCIRun to quickly create

  10. Simulated effect of deep-sea sedimentation and terrestrial weathering on projections of ocean acidification

    NASA Astrophysics Data System (ADS)

    Cao, Long; Zheng, Meidi; Caldeira, Ken

    2016-04-01

Projections of ocean acidification have often been based on ocean carbon cycle models that do not represent deep-sea sedimentation and terrestrial weathering. Here we use an Earth system model of intermediate complexity to quantify the effect of sedimentation and weathering on projections of ocean acidification under an intensive CO2 emission scenario that releases 5000 Pg C after year 2000. In our simulations, atmospheric CO2 reaches a peak concentration of 2123 ppm near year 2300 with a maximum reduction in surface pH of 0.8. Consideration of deep-sea sedimentation and terrestrial weathering has negligible effect on these peak changes. Only after several millennia do sedimentation and weathering feedbacks substantially affect projected ocean acidification. Ten thousand years from today, in the constant-alkalinity simulation, surface pH is reduced by ˜0.7 with 95% of the polar oceans undersaturated with respect to calcite, and no ocean has a calcite saturation horizon (CSH) that is deeper than 1000 m. With the consideration of sediment feedback alone, surface pH is reduced by ˜0.5 with 35% of the polar oceans experiencing calcite undersaturation, and 8% of the global ocean has a CSH deeper than 1000 m. With the addition of weathering feedback, depending on the weathering parameterizations, surface pH is reduced by 0.2-0.4 with no polar oceans experiencing calcite undersaturation, and 30-80% of the ocean has a CSH that is deeper than 1000 m. Our results indicate that deep-sea sedimentation and terrestrial weathering play an important role in long-term ocean acidification, but have little effect on mitigating ocean acidification in the coming centuries.

  11. Massively parallel simulation with DOE's ASCI supercomputers : an overview of the Los Alamos Crestone project

    SciTech Connect

    Weaver, R. P.; Gittings, M. L.

    2004-01-01

The Los Alamos Crestone Project is part of the Department of Energy's (DOE) Accelerated Strategic Computing Initiative, or ASCI Program. The main goal of this software development project is to investigate the use of continuous adaptive mesh refinement (CAMR) techniques for application to problems of interest to the Laboratory. There are many code development efforts in the Crestone Project, both unclassified and classified codes. In this overview I will discuss the unclassified SAGE and the RAGE codes. The SAGE (SAIC adaptive grid Eulerian) code is a one-, two-, and three-dimensional multimaterial Eulerian massively parallel hydrodynamics code for use in solving a variety of high-deformation flow problems. The RAGE CAMR code is built from the SAGE code by adding various radiation packages, improved setup utilities and graphics packages and is used for problems in which radiation transport of energy is important. The goal of these massively-parallel versions of the codes is to run extremely large problems in a reasonable amount of calendar time. Our target is scalable performance to ∼10,000 processors on a 1 billion CAMR computational cell problem that requires hundreds of variables per cell, multiple physics packages (e.g. radiation and hydrodynamics), and implicit matrix solves for each cycle. A general description of the RAGE code has been published in [1], [2], [3] and [4]. Currently, the largest simulations we do are three-dimensional, using around 500 million computation cells and running for literally months of calendar time using ∼2000 processors. Current ASCI platforms range from several 3-teraOPS supercomputers to one 12-teraOPS machine at Lawrence Livermore National Laboratory, the White machine, and one 20-teraOPS machine installed at Los Alamos, the Q machine. Each machine is a system comprised of many component parts that must perform in unity for the successful run of these simulations. Key features of any massively parallel system

  12. A New Approach to Image Fusion Based on Cokriging

    NASA Technical Reports Server (NTRS)

    Memarsadeghi, Nargess; LeMoigne, Jacqueline; Mount, David M.; Morisette, Jeffrey T.

    2005-01-01

We consider the image fusion problem involving remotely sensed data. We introduce cokriging as a method to perform fusion. We investigate the advantages of fusing Hyperion with ALI. The evaluation is performed by comparing the classification of the fused data with that of the input images and by calculating well-chosen quantitative fusion quality metrics. We consider the Invasive Species Forecasting System (ISFS) project as our fusion application. The fusion of ALI with Hyperion data is studied using PCA and wavelet-based fusion. We then propose a geostatistics-based interpolation method called cokriging as a new approach to image fusion.
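Of the baseline methods the abstract compares against, PCA fusion is the simplest to sketch: the first principal component of the (upsampled) multispectral cube is replaced by the histogram-matched high-resolution band, and the transform is inverted. This is an illustration of that baseline under simplified assumptions, not the proposed cokriging method:

```python
import numpy as np

def pca_fuse(ms, pan):
    """PCA pan-sharpening sketch.
    ms:  (bands, H, W) multispectral cube, already resampled to the pan grid.
    pan: (H, W) high-resolution panchromatic band."""
    bands, h, w = ms.shape
    x = ms.reshape(bands, -1).astype(float)
    mu = x.mean(axis=1, keepdims=True)
    u, _, _ = np.linalg.svd(x - mu, full_matrices=False)  # band-space PCs
    scores = u.T @ (x - mu)                               # component scores
    p = pan.reshape(-1).astype(float)
    p = (p - p.mean()) / (p.std() + 1e-12)                # match pan to PC1 stats
    scores[0] = p * scores[0].std() + scores[0].mean()    # substitute PC1
    return (u @ scores + mu).reshape(bands, h, w)         # invert the transform

rng = np.random.default_rng(0)
ms = rng.random((4, 8, 8))
fused = pca_fuse(ms, rng.random((8, 8)))
```

Because only the first component is swapped and the pan band is matched to its statistics, the per-band means of the cube are preserved; cokriging instead exploits the spatial cross-covariance between the bands.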

  13. Hardware-Accelerated Simulated Radiography

    SciTech Connect

    Laney, D; Callahan, S; Max, N; Silva, C; Langer, S; Frank, R

    2005-08-04

    We present the application of hardware accelerated volume rendering algorithms to the simulation of radiographs as an aid to scientists designing experiments, validating simulation codes, and understanding experimental data. The techniques presented take advantage of 32-bit floating point texture capabilities to obtain solutions to the radiative transport equation for X-rays. The hardware accelerated solutions are accurate enough to enable scientists to explore the experimental design space with greater efficiency than the methods currently in use. An unsorted hexahedron projection algorithm is presented for curvilinear hexahedral meshes that produces simulated radiographs in the absorption-only regime. A sorted tetrahedral projection algorithm is presented that simulates radiographs of emissive materials. We apply the tetrahedral projection algorithm to the simulation of experimental diagnostics for inertial confinement fusion experiments on a laser at the University of Rochester.
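In the absorption-only regime described above, each pixel of a simulated radiograph is a Beer-Lambert line integral of the attenuation coefficient along the ray. A minimal CPU sketch of that quantity on a regular grid (the GPU algorithms in the paper project hexahedral and tetrahedral meshes; the mesh handling is omitted here and the values are illustrative):

```python
import numpy as np

def radiograph(mu, dz, i0=1.0):
    """Absorption-only simulated radiograph on a regular grid:
    I = I0 * exp(-integral of mu along z), approximated as a sum over cells.
    mu: (nx, ny, nz) attenuation coefficients; dz: cell thickness along z."""
    optical_depth = mu.sum(axis=2) * dz        # line integral along each ray
    return i0 * np.exp(-optical_depth)         # transmitted intensity image

# A uniform slab with total optical depth 1.0 transmits exp(-1) of the beam:
image = radiograph(np.ones((4, 4, 10)), dz=0.1)
```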

  14. Simulation technology used for risk assessment in the deep exploration project in China

    NASA Astrophysics Data System (ADS)

    jiao, J.; Huang, D.; Liu, J.

    2013-12-01

Deep exploration has been carried out in China for five years, employing various heavy-duty instruments and equipment for gravity, magnetic, seismic and electromagnetic prospecting, as well as an ultra-deep drilling rig for obtaining deep samples. Deep exploration is a large and complex systems-engineering effort that crosses multiple disciplines and requires great investment, so advanced technical means are needed for the verification, appraisal and optimization of geophysical prospecting equipment under development. To reduce the risk of application and exploration, efficient management concepts and skills must be strengthened in order to consolidate management measures and workflows for this ambitious project. Evidence, prediction, evaluation and related decision strategies must therefore be considered together to meet practical scientific requirements, technical limits and future extensions. Simulation is proposed as a tool for carrying out dynamic tests on actual or imagined systems; in practice, it must be combined with the instruments and equipment to accomplish R&D tasks. In this paper, simulation techniques are introduced into the R&D process for heavy-duty equipment and high-end engineering projects. Based on information recently provided by a drilling group, a digital model is constructed by combining geographical data, 3D visualization, database management and virtual reality technologies. The result is an R&D strategy in which data processing, instrument application, expected results and uncertainties, and even the operating environment, are simulated systematically or simultaneously, in order to obtain an optimal outcome as well as an equipment-updating strategy.
The simulation technology is able to adjust, verify, appraise and optimize the primary plan in response to changes in

  15. Security on the US Fusion Grid

    SciTech Connect

    Burruss, Justin R.; Fredian, Tom W.; Thompson, Mary R.

    2005-06-01

    The National Fusion Collaboratory project is developing and deploying new distributed computing and remote collaboration technologies with the goal of advancing magnetic fusion energy research. This work has led to the development of the US Fusion Grid (FusionGrid), a computational grid composed of collaborative, compute, and data resources from the three large US fusion research facilities and with users both in the US and in Europe. Critical to the development of FusionGrid was the creation and deployment of technologies to ensure security in a heterogeneous environment. These solutions to the problems of authentication, authorization, data transfer, and secure data storage, as well as the lessons learned during the development of these solutions, may be applied outside of FusionGrid and scale to future computing infrastructures such as those for next-generation devices like ITER.

  16. Data security on the national fusion grid

    SciTech Connect

    Burruss, Justin R.; Fredian, Tom W.; Thompson, Mary R.

    2005-06-01

    The National Fusion Collaboratory project is developing and deploying new distributed computing and remote collaboration technologies with the goal of advancing magnetic fusion energy research. This work has led to the development of the US Fusion Grid (FusionGrid), a computational grid composed of collaborative, compute, and data resources from the three large US fusion research facilities and with users both in the US and in Europe. Critical to the development of FusionGrid was the creation and deployment of technologies to ensure security in a heterogeneous environment. These solutions to the problems of authentication, authorization, data transfer, and secure data storage, as well as the lessons learned during the development of these solutions, may be applied outside of FusionGrid and scale to future computing infrastructures such as those for next-generation devices like ITER.

  17. POST2 End-To-End Descent and Landing Simulation for the Autonomous Landing and Hazard Avoidance Technology Project

    NASA Technical Reports Server (NTRS)

    Fisher, Jody L.; Striepe, Scott A.

    2007-01-01

    The Program to Optimize Simulated Trajectories II (POST2) is used as a basis for an end-to-end descent and landing trajectory simulation that is essential in determining the design and performance capability of lunar descent and landing system models and lunar environment models for the Autonomous Landing and Hazard Avoidance Technology (ALHAT) project. This POST2-based ALHAT simulation provides descent and landing simulation capability by integrating lunar environment and lander system models (including terrain, sensor, guidance, navigation, and control models), along with the data necessary to design and operate a landing system for robotic, human, and cargo lunar-landing success. This paper presents the current and planned development and model validation of the POST2-based end-to-end trajectory simulation used for testing and evaluating the performance of ALHAT project systems and models.

  18. To Create Space on Earth: The Space Environment Simulation Laboratory and Project Apollo

    NASA Technical Reports Server (NTRS)

    Walters, Lori C.

    2003-01-01

    Few undertakings in the history of humanity can compare to the great technological achievement known as Project Apollo. Among those who witnessed Armstrong's flickering television image were thousands of people who had directly contributed to this historic moment. Amongst those in this vast anonymous cadre were the personnel of the Space Environment Simulation Laboratory (SESL) at the Manned Spacecraft Center (MSC) in Houston, Texas. SESL houses two large thermal-vacuum chambers with solar simulation capabilities. At a time when NASA engineers had a limited understanding of the effects of the extremes of space on hardware and crews, SESL was designed to literally create the conditions of space on Earth. With interior dimensions of 90 feet in height and a 55-foot diameter, Chamber A dwarfed the Apollo command/service module (CSM) it was constructed to test. The chamber can pump down to a vacuum of 1 × 10⁻⁶ torr, simulating an altitude greater than 130 miles above the Earth. A "lunar plane" capable of rotating a 150,000-pound test vehicle 180 deg replicates the revolution of a craft in space. To reproduce the temperature extremes of space, the interior chamber walls cool to -280 °F as two banks of carbon arc modules simulate the unfiltered solar light/heat of the Sun. With capabilities similar to those of Chamber A, early Chamber B tests included the Gemini modular maneuvering unit, the Apollo EVA mobility unit, and the lunar module. Since Gemini astronaut Charles Bassett first ventured into the chamber in 1966, Chamber B has assisted astronauts in testing hardware and preparing them for work in the harsh extremes of space.

  19. Simulation of extreme rainfall and projection of future changes using the GLIMCLIM model

    NASA Astrophysics Data System (ADS)

    Rashid, Md. Mamunur; Beecham, Simon; Chowdhury, Rezaul Kabir

    2016-08-01

    In this study, the performance of the Generalized LInear Modelling of daily CLImate sequence (GLIMCLIM) statistical downscaling model in simulating extreme rainfall indices and annual maximum daily rainfall (AMDR) was assessed by downscaling daily rainfall from National Centers for Environmental Prediction (NCEP) reanalysis and Coupled Model Intercomparison Project Phase 5 (CMIP5) general circulation model (GCM) output datasets (four GCMs and two scenarios); changes were then estimated for the future period 2041-2060. The model reproduced the monthly variations in the extreme rainfall indices reasonably well when forced by the NCEP reanalysis datasets. Frequency Adapted Quantile Mapping (FAQM) was used to remove bias in the daily rainfall simulated with CMIP5 GCM forcing, which reduced the discrepancy between observed and simulated extreme rainfall indices. Although the observed AMDR were within the 2.5th and 97.5th percentiles of the simulated AMDR, the model consistently under-predicted the inter-annual variability of AMDR. A non-stationary model was developed using the generalized linear model for location, scale and shape to estimate the AMDR with an annual exceedance probability of 0.01. The study shows that, in general, AMDR is likely to decrease in the future. The Onkaparinga catchment will also experience drier conditions due to an increase in consecutive dry days coinciding with decreases in heavy (>long-term 90th percentile) rainfall days, the empirical 90th quantile of rainfall, and the maximum 5-day consecutive total rainfall for the future period (2041-2060) compared to the base period (1961-2000).
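Empirical quantile mapping is the core idea behind bias-correction schemes of the family FAQM belongs to: each model value is mapped, via its quantile in the model distribution, onto the observed distribution. A minimal numpy sketch of the generic technique (the function name and toy data are illustrative, not the paper's exact FAQM variant):

```python
import numpy as np

def quantile_map(obs, model_hist, model_fut):
    """Empirical quantile mapping: correct model output so that its
    distribution matches the observations (generic sketch)."""
    # Find each future value's empirical quantile within the historical model run...
    q = np.searchsorted(np.sort(model_hist), model_fut) / len(model_hist)
    q = np.clip(q, 0.0, 1.0)
    # ...and map that quantile onto the observed distribution.
    return np.quantile(obs, q)

# Toy example: a model that is wet-biased by a factor of 2.
rng = np.random.default_rng(0)
obs = rng.gamma(2.0, 5.0, size=1000)      # "observed" daily rainfall
model = obs * 2.0                         # biased model rainfall
corrected = quantile_map(obs, model, model)
```

After correction the mapped series follows the observed distribution, so the factor-of-2 wet bias in the mean is removed.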

  20. Review of alternative concepts for magnetic fusion

    SciTech Connect

    Krakowski, R.A.; Miller, R.L.; Hagenson, R.L.

    1980-01-01

    Although the tokamak represents the mainstay of the world's quest for magnetic fusion power, with the tandem mirror serving as a primary backup concept in the US fusion program, a wide range of alternative fusion concepts (AFCs) have been and are being pursued. This review presents a summary of past and present reactor projections for a majority of AFCs. Whenever possible, quantitative results are given.

  1. Simulator Network Project Report: A tool for improvement of teaching materials and targeted resource usage in Skills Labs

    PubMed Central

    Damanakis, Alexander; Blaum, Wolf E.; Stosch, Christoph; Lauener, Hansjörg; Richter, Sabine; Schnabel, Kai P.

    2013-01-01

    During the last decade, medical education in the German-speaking world has been striving to become more practice-oriented. This is currently being achieved in many schools through the implementation of simulation-based instruction in Skills Labs. Simulators are thus an essential part of this type of medical training, and their acquisition and operation by a Skills Lab require a large outlay of resources. Therefore, the Practical Skills Committee of the Medical Education Society (GMA) introduced a new project, which aims to improve the flow of information between the Skills Labs and enable a transparent assessment of the simulators via an online database (the Simulator Network). PMID:23467581

  2. Simulator Network project report: a tool for improvement of teaching materials and targeted resource usage in Skills Labs.

    PubMed

    Damanakis, Alexander; Blaum, Wolf E; Stosch, Christoph; Lauener, Hansjörg; Richter, Sabine; Schnabel, Kai P

    2013-01-01

    During the last decade, medical education in the German-speaking world has been striving to become more practice-oriented. This is currently being achieved in many schools through the implementation of simulation-based instruction in Skills Labs. Simulators are thus an essential part of this type of medical training, and their acquisition and operation by a Skills Lab require a large outlay of resources. Therefore, the Practical Skills Committee of the Medical Education Society (GMA) introduced a new project, which aims to improve the flow of information between the Skills Labs and enable a transparent assessment of the simulators via an online database (the Simulator Network).

  3. Is social projection based on simulation or theory? Why new methods are needed for differentiating.

    PubMed

    Bazinger, Claudia; Kühberger, Anton

    2012-12-01

    The literature on social cognition reports many instances of a phenomenon titled 'social projection' or 'egocentric bias'. These terms indicate egocentric predictions, i.e., an over-reliance on the self when predicting the cognition, emotion, or behavior of other people. The classic method to diagnose egocentric prediction is to establish high correlations between our own and other people's cognition, emotion, or behavior. We argue that this method is incorrect because there is a different way to come to a correlation between own and predicted states, namely, through the use of theoretical knowledge. Thus, the use of correlational measures is not sufficient to identify the source of social predictions. Based on the distinction between simulation theory and theory theory, we propose the following alternative methods for inferring prediction strategies: independent vs. juxtaposed predictions, the use of 'hot' mental processes, and the use of participants' self-reports.

  4. Terascale Optimal PDE Simulations

    SciTech Connect

    David Keyes

    2009-07-28

    The Terascale Optimal PDE Solvers (TOPS) Integrated Software Infrastructure Center (ISIC) was created to develop and implement algorithms and support scientific investigations performed by DOE-sponsored researchers. These simulations often involve the solution of partial differential equations (PDEs) on terascale computers. The TOPS Center researched, developed and deployed an integrated toolkit of open-source, optimal complexity solvers for the nonlinear partial differential equations that arise in many DOE application areas, including fusion, accelerator design, global climate change and reactive chemistry. The algorithms created as part of this project were also designed to reduce current computational bottlenecks by orders of magnitude on terascale computers, enabling scientific simulation on a scale heretofore impossible.
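The nonlinear PDE solves such a toolkit targets reduce, at each step, to a Newton iteration on a discretized residual. A toy dense-Jacobian sketch for a 1D nonlinear boundary-value problem (illustrative only; TOPS-style solvers are matrix-free, preconditioned Newton-Krylov methods, and `newton_solve` is not any TOPS API):

```python
import numpy as np

def newton_solve(residual, u0, tol=1e-10, max_iter=20):
    """Plain Newton iteration with a finite-difference Jacobian --
    a toy stand-in for production Newton-Krylov solvers."""
    u = u0.copy()
    n = u.size
    for _ in range(max_iter):
        r = residual(u)
        if np.linalg.norm(r) < tol:
            break
        J = np.empty((n, n))
        eps = 1e-7
        for j in range(n):            # finite-difference Jacobian, column by column
            du = np.zeros(n)
            du[j] = eps
            J[:, j] = (residual(u + du) - r) / eps
        u -= np.linalg.solve(J, r)    # Newton update
    return u

# 1D nonlinear problem u'' = u**2 - 1 on three interior grid points, u = 0 at the ends.
h = 0.25
def residual(u):
    upad = np.concatenate(([0.0], u, [0.0]))
    return (upad[2:] - 2 * upad[1:-1] + upad[:-2]) / h**2 - (u**2 - 1.0)

u = newton_solve(residual, np.zeros(3))
```

The mild nonlinearity makes this converge in a handful of iterations; at scale, the dense Jacobian and direct solve are exactly the bottlenecks that optimal-complexity solvers replace.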

  5. Cold fusion, Alchemist's dream

    SciTech Connect

    Clayton, E.D.

    1989-09-01

    In this report the following topics relating to cold fusion are discussed: muon catalysed cold fusion; piezonuclear fusion; sundry explanations pertaining to cold fusion; cosmic ray muon catalysed cold fusion; vibrational mechanisms in excited states of D{sub 2} molecules; barrier penetration probabilities within the hydrogenated metal lattice/piezonuclear fusion; branching ratios of D{sub 2} fusion at low energies; fusion of deuterons into {sup 4}He; secondary D+T fusion within the hydrogenated metal lattice; {sup 3}He to {sup 4}He ratio within the metal lattice; shock induced fusion; and anomalously high isotopic ratios of {sup 3}He/{sup 4}He.

  6. Cirrus Parcel Model Comparison Project. Phase 1; The Critical Components to Simulate Cirrus Initiation Explicitly

    NASA Technical Reports Server (NTRS)

    Lin, Ruei-Fong; Starr, David O'C.; DeMott, Paul J.; Cotton, Richard; Sassen, Kenneth; Jensen, Eric; Einaudi, Franco (Technical Monitor)

    2001-01-01

    The Cirrus Parcel Model Comparison Project, a project of the GCSS (GEWEX Cloud System Studies) Working Group on Cirrus Cloud Systems, involves the systematic comparison of current models of ice crystal nucleation and growth for specified, typical, cirrus cloud environments. In Phase I of the project reported here, simulated cirrus cloud microphysical properties are compared for situations of "warm" (-40 C) and "cold" (-60 C) cirrus, both subject to updrafts of 4, 20 and 100 centimeters per second. Five models participated. The various models employ explicit microphysical schemes wherein the size distribution of each class of particles (aerosols and ice crystals) is resolved into bins or treated separately. Simulations are made including both the homogeneous and heterogeneous ice nucleation mechanisms. A single initial aerosol population of sulfuric acid particles is prescribed for all simulations. To isolate the treatment of the homogeneous freezing (of haze droplets) nucleation process, the heterogeneous nucleation mechanism is disabled for a second parallel set of simulations. Qualitative agreement is found for the homogeneous-nucleation-only simulations, e.g., the number density of nucleated ice crystals increases with the strength of the prescribed updraft. However, significant quantitative differences are found. Detailed analysis reveals that the homogeneous nucleation rate, haze particle solution concentration, and water vapor uptake rate by ice crystal growth (particularly as controlled by the deposition coefficient) are critical components that lead to differences in predicted microphysics. Systematic bias exists between results based on a modified classical theory approach and models using an effective freezing temperature approach to the treatment of nucleation. Each approach is constrained by critical freezing data from laboratory studies, but each includes assumptions that can only be justified by further laboratory research. 
Consequently, it is not yet

  7. Simulated lesion, human observer performance comparison between thin-section dedicated breast CT images versus computed thick-section simulated projection images of the breast.

    PubMed

    Chen, L; Boone, J M; Abbey, C K; Hargreaves, J; Bateni, C; Lindfors, K K; Yang, K; Nosratieh, A; Hernandez, A; Gazi, P

    2015-04-21

    The objective of this study was to compare the lesion detection performance of human observers between thin-section computed tomography images of the breast and thick-section (>40 mm) simulated projection images of the breast. Three radiologists and six physicists each executed a two-alternative forced-choice (2AFC) study involving simulated spherical lesions placed mathematically into breast images produced on a prototype dedicated breast CT scanner. The breast image data sets from 88 patients were used to create 352 pairs of image data. Spherical lesions with diameters of 1, 2, 3, 5, and 11 mm were simulated and adaptively positioned into 3D breast CT image data sets; the native thin-section (0.33 mm) images were averaged to produce images with different slice thicknesses; average section thicknesses of 0.33, 0.71, 1.5 and 2.9 mm were representative of breast CT, while the average 43 mm slice thickness served to simulate projection images of the breast. The percent correct of the human observers' responses was evaluated in the 2AFC experiments. Radiologists' lesion detection performance was significantly (p < 0.05) better with thin-section images than with thick-section images similar to mammography, for all but the 1 mm diameter lesions. For example, the average performance of the three radiologists for 3 mm diameter lesions was 92% correct for thin-section breast CT images versus 67% for the simulated projection images. A gradual reduction in observer performance was observed as the section thickness increased beyond about 1 mm. While a performance difference based on breast density was seen in both the breast CT and the projection image results, the average radiologist performance using breast CT images in dense breasts outperformed the performance using simulated projection images in fatty breasts for all lesion diameters except 11 mm. 
The average radiologist performance outperformed that of the average physicist observer, however trends
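Two bookkeeping steps from a study like this are easy to sketch: scoring percent correct over 2AFC trials, and averaging thin CT sections into thicker ones. A numpy illustration (function names and toy data are hypothetical, not the study's code):

```python
import numpy as np

def percent_correct(chosen, signal_idx):
    """Fraction of 2AFC trials where the observer picked the
    signal-present alternative."""
    return np.mean(np.asarray(chosen) == np.asarray(signal_idx))

def rebin_sections(volume, factor):
    """Average groups of `factor` thin sections along axis 0 to emulate
    a thicker slice (e.g. factor=9 turns 0.33 mm sections into ~3 mm)."""
    n = (volume.shape[0] // factor) * factor   # drop the incomplete tail group
    return volume[:n].reshape(-1, factor, *volume.shape[1:]).mean(axis=1)

# Toy usage with hypothetical data
rng = np.random.default_rng(1)
vol = rng.normal(size=(130, 8, 8))      # 130 thin sections of pure noise
thick = rebin_sections(vol, 9)          # 14 thicker sections
pc = percent_correct([0, 1, 1, 0], [0, 1, 0, 0])   # 3 of 4 trials correct
```

Averaging sections suppresses uncorrelated noise (variance drops roughly by the averaging factor) but also dilutes small lesions over the slab, which is the trade-off the study measures.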

  8. Simulated lesion, human observer performance comparison between thin-section dedicated breast CT images versus computed thick-section simulated projection images of the breast

    PubMed Central

    Chen, L; Boone, JM; Abbey, CK; Hargreaves, J; Bateni, C; Lindfors, KK; Yang, K; Nosratieh, A; Hernandez, A; Gazi, P

    2015-01-01

    Objectives The objective of this study was to compare the lesion detection performance of human observers between thin-section computed tomography images of the breast and thick-section (>40 mm) simulated projection images of the breast. Methods Three radiologists and six physicists each executed a two-alternative forced-choice (2AFC) study involving simulated spherical lesions placed mathematically into breast images produced on a prototype dedicated breast CT scanner. The breast image data sets from 88 patients were used to create 352 pairs of image data. Spherical lesions with diameters of 1, 2, 3, 5, and 11 mm were simulated and adaptively positioned into 3D breast CT image data sets; the native thin-section (0.33 mm) images were averaged to produce images with different slice thicknesses; average section thicknesses of 0.33 mm, 0.71 mm, 1.5 mm, and 2.9 mm were representative of breast CT, while the average 43 mm slice thickness served to simulate projection images of the breast. Results The percent correct of the human observers’ responses was evaluated in the 2AFC experiments. Radiologists’ lesion detection performance was significantly (p<0.05) better with thin-section images than with thick-section images similar to mammography, for all but the 1 mm diameter lesions. For example, the average performance of the three radiologists for 3 mm diameter lesions was 92% correct for thin-section breast CT images versus 67% for the simulated projection images. A gradual reduction in observer performance was observed as the section thickness increased beyond about 1 mm. While a performance difference based on breast density was seen in both the breast CT and the projection image results, the average radiologist performance using breast CT images in dense breasts outperformed the performance using simulated projection images in fatty breasts for all lesion diameters except 11 mm. The average radiologist performance outperformed that of the

  9. PROJECT CLEA: Two Decades of Astrophysics Research Simulations for Astronomy Education

    NASA Astrophysics Data System (ADS)

    Marschall, Laurence A.; Snyder, G.; Cooper, P.

    2013-01-01

    Since 1992, Project CLEA (Contemporary Laboratory Experiences in Astronomy) has been developing simulations for the astronomy laboratory that engage students in the experience of modern astrophysical research. Though designed for introductory undergraduate courses, CLEA software can be flexibly configured for use in high-school classes and in upper-level observational astronomy classes, and has found usage in a wide spectrum of classrooms and on-line courses throughout the world. Now at the two-decade mark, CLEA has produced 16 exercises covering a variety of planetary, stellar, and extragalactic research topics at wavelengths from radio to X-ray. Project CLEA’s most recent product, VIREO, the Virtual Educational Observatory, is a flexible all-sky environment for developing a variety of further exercises. We review the current CLEA offerings and look to the future, especially describing further challenges in developing and maintaining the functionality of CLEA and similar activities as the current investigators wind down the funded development process. This research was sponsored by the National Science Foundation, Gettysburg College, and NASA's XMM-Newton mission.

  10. A system for testing distributed information fusion applications for maritime surveillance

    NASA Astrophysics Data System (ADS)

    Wehn, Hans; Happe, Jens; Guitouni, Adel; Valin, Pierre; Bossé, Éloi

    2008-03-01

    A PRECARN partnership project, called CanCoastWatch (CCW), is bringing together a team of researchers from industry, government, and academia for creating an advanced simulation test bed for the purpose of evaluating the effectiveness of Network Enabled Operations in a Coastal Wide Area Surveillance situation. The test bed allows experimenting with higher-level distributed information fusion, dynamic resource management and configuration management given multiple constraints on the resources and their communications networks. The test bed provides general services that are useful for testing many fusion applications. This includes a multi-layer plug-and-play architecture, and a general multi-agent framework based on John Boyd's OODA loop.

  11. Container cargo simulation modeling for measuring impacts of infrastructure investment projects in Pearl River Delta

    NASA Astrophysics Data System (ADS)

    Li, Jia-Qi; Shibasaki, Ryuichi; Li, Bo-Wei

    2010-03-01

    In the Pearl River Delta (PRD), there is severe competition between container ports, particularly those in Hong Kong, Shenzhen, and Guangzhou, for collecting international maritime container cargo. In addition, the second phase of the Nansha terminal in Guangzhou’s port and the first phase of the Da Chang Bay container terminal in Shenzhen opened last year. Under these circumstances, there is an increasing need to quantitatively measure the impact these infrastructure investments have on regional cargo flows. The analysis should include the effects of container terminal construction, berth deepening, and access road construction. The authors have been developing a model for international cargo simulation (MICS) which can simulate the movement of cargo. The volume of origin-destination (OD) container cargo in the East Asian region was used as an input, in order to evaluate the effects of international freight transportation policies. This paper focuses on the PRD area and, by incorporating a more detailed network, evaluates the impact of several infrastructure investment projects on freight movement.

  12. The boreal summer intraseasonal oscillation simulated by four Chinese AGCMs participating in the CMIP5 project

    NASA Astrophysics Data System (ADS)

    Zhao, Chongbo; Zhou, Tianjun; Song, Lianchun; Ren, Hongli

    2014-09-01

    The performances of four Chinese AGCMs participating in the Coupled Model Intercomparison Project Phase 5 (CMIP5) in the simulation of the boreal summer intraseasonal oscillation (BSISO) are assessed. The authors focus on the major characteristics of BSISO: the intensity, significant period, and propagation. The results show that the four AGCMs can reproduce boreal summer intraseasonal signals of precipitation; however, their limitations are also evident. Compared with the Climate Prediction Center Merged Analysis of Precipitation (CMAP) data, the models underestimate the strength of the intraseasonal oscillation (ISO) over the eastern equatorial Indian Ocean (IO) during the boreal summer (May to October), but overestimate the intraseasonal variability over the western Pacific (WP). In the model results, the westward propagation dominates, whereas the eastward propagation dominates in the CMAP data. The northward propagation in these models is tilted southwest-northeast, which is also different from the CMAP result. Thus, there is not a northeast-southwest tilted rain belt evolution off the equator during the BSISO's eastward journey in the models. The biases of the BSISO are consistent with the summer mean state, especially the vertical shear. Analysis also shows that there is a positive feedback between the intraseasonal precipitation and the summer mean precipitation. The positive feedback processes may amplify the models' biases in the BSISO simulation.
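Isolating the intraseasonal band from a daily time series is the standard first step in ISO/BSISO diagnostics like these. A generic FFT bandpass sketch (20-90 days is a common BSISO window; this is an assumption for illustration, not the paper's exact filtering method):

```python
import numpy as np

def bandpass_fft(x, dt_days, low_days, high_days):
    """Zero out Fourier components outside the [low_days, high_days]
    period band and invert -- a simple intraseasonal bandpass filter."""
    X = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), d=dt_days)
    keep = (freqs >= 1.0 / high_days) & (freqs <= 1.0 / low_days)
    X[~keep] = 0.0                      # remove mean, annual cycle, high-frequency noise
    return np.fft.irfft(X, n=len(x))

# Toy series: a 50-day oscillation buried under an annual cycle and noise
t = np.arange(1460)                     # four years of daily data
sig = np.sin(2 * np.pi * t / 50)
series = sig + 2 * np.sin(2 * np.pi * t / 365) \
         + 0.1 * np.random.default_rng(2).normal(size=t.size)
iso = bandpass_fft(series, 1.0, 20.0, 90.0)
```

The recovered `iso` series tracks the 50-day component while the much larger annual cycle is removed, which is exactly what lets ISO intensity and propagation be compared across models.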

  13. Monte Carlo simulations for the space radiation superconducting shield project (SR2S)

    NASA Astrophysics Data System (ADS)

    Vuolo, M.; Giraudo, M.; Musenich, R.; Calvelli, V.; Ambroglini, F.; Burger, W. J.; Battiston, R.

    2016-02-01

    Astronauts on deep-space long-duration missions will be exposed for long time to galactic cosmic rays (GCR) and Solar Particle Events (SPE). The exposure to space radiation could lead to both acute and late effects in the crew members and well defined countermeasures do not exist nowadays. The simplest solution given by optimized passive shielding is not able to reduce the dose deposited by GCRs below the actual dose limits, therefore other solutions, such as active shielding employing superconducting magnetic fields, are under study. In the framework of the EU FP7 SR2S Project - Space Radiation Superconducting Shield - a toroidal magnetic system based on MgB2 superconductors has been analyzed through detailed Monte Carlo simulations using Geant4 interface GRAS. Spacecraft and magnets were modeled together with a simplified mechanical structure supporting the coils. Radiation transport through magnetic fields and materials was simulated for a deep-space mission scenario, considering for the first time the effect of secondary particles produced in the passage of space radiation through the active shielding and spacecraft structures. When modeling the structures supporting the active shielding systems and the habitat, the radiation protection efficiency of the magnetic field is severely decreasing compared to the one reported in previous studies, when only the magnetic field was modeled around the crew. This is due to the large production of secondary radiation taking place in the material surrounding the habitat.
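The deflection an active magnetic shield relies on can be illustrated with the standard Boris particle pusher for charged-particle motion in a static magnetic field (a toy stand-in for full Geant4/GRAS transport; the normalized units and uniform field here are assumptions, not the SR2S toroidal configuration):

```python
import numpy as np

def boris_push(x, v, q_over_m, B_func, dt, steps):
    """Boris algorithm: advance a charged particle through a static
    magnetic field. The magnetic force only rotates the velocity, so
    particle speed (and kinetic energy) is conserved."""
    for _ in range(steps):
        t = 0.5 * dt * q_over_m * B_func(x)     # half-step rotation vector
        s = 2.0 * t / (1.0 + np.dot(t, t))
        v_prime = v + np.cross(v, t)
        v = v + np.cross(v_prime, s)            # rotated velocity
        x = x + v * dt
    return x, v

# Uniform field along z: the particle gyrates in the x-y plane.
B = lambda x: np.array([0.0, 0.0, 1.0])
x0, v0 = np.zeros(3), np.array([1.0, 0.0, 0.0])
x1, v1 = boris_push(x0, v0, q_over_m=1.0, B_func=B, dt=0.01, steps=1000)
```

Because the push is a pure rotation, the particle is deflected without losing energy; in a shield that energy is instead carried past the habitat, while (as the abstract stresses) secondaries produced in surrounding material are not captured by this field-only picture.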

  14. Collaborative Simulation and Testing of the Superconducting Dipole Prototype Magnet for the FAIR Project

    NASA Astrophysics Data System (ADS)

    Zhu, Yinfeng; Zhu, Zhe; Xu, Houchang; Wu, Weiyue

    2012-08-01

    The superconducting dipole prototype magnet of the collector ring for the Facility for Antiproton and Ion Research (FAIR) is an international cooperation project. The collaborative simulation and testing of the developed prototype magnet is presented in this paper. To evaluate the mechanical strength of the coil case during quench, a 3-dimensional (3D) electromagnetic (EM) model was developed based on the solid97 magnetic vector element in the ANSYS commercial software, which includes the air region, coil and yoke. EM analysis was carried out with a peak operating current of 278 A. Then the solid97 element was transferred into the solid185 element, the coupled analysis was switched from electromagnetic to structural, and the finite element model for the coil case and glass-fiber reinforced composite (G10) spacers was established using the ANSYS Parametric Design Language, based on the 3D model from the CATIA V5 software. To simulate the friction characteristics inside the coil case, the conta173 surface-to-surface contact element was used. The results for the coil case and G10 spacers show that they are safe and have sufficient strength, on the basis of testing in discharge and quench scenarios.

  15. Monte Carlo simulations for the space radiation superconducting shield project (SR2S).

    PubMed

    Vuolo, M; Giraudo, M; Musenich, R; Calvelli, V; Ambroglini, F; Burger, W J; Battiston, R

    2016-02-01

    Astronauts on deep-space long-duration missions will be exposed for long time to galactic cosmic rays (GCR) and Solar Particle Events (SPE). The exposure to space radiation could lead to both acute and late effects in the crew members and well defined countermeasures do not exist nowadays. The simplest solution given by optimized passive shielding is not able to reduce the dose deposited by GCRs below the actual dose limits, therefore other solutions, such as active shielding employing superconducting magnetic fields, are under study. In the framework of the EU FP7 SR2S Project - Space Radiation Superconducting Shield - a toroidal magnetic system based on MgB2 superconductors has been analyzed through detailed Monte Carlo simulations using Geant4 interface GRAS. Spacecraft and magnets were modeled together with a simplified mechanical structure supporting the coils. Radiation transport through magnetic fields and materials was simulated for a deep-space mission scenario, considering for the first time the effect of secondary particles produced in the passage of space radiation through the active shielding and spacecraft structures. When modeling the structures supporting the active shielding systems and the habitat, the radiation protection efficiency of the magnetic field is severely decreasing compared to the one reported in previous studies, when only the magnetic field was modeled around the crew. This is due to the large production of secondary radiation taking place in the material surrounding the habitat. PMID:26948010

  17. Imaging in turbid media by modified filtered back projection method using data from Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Aggarwal, Ashwani; Vasu, Ram M.

    2003-07-01

    Noninvasive diagnosis in medicine has attracted considerable attention in recent years. Several methods are already available for imaging biological tissue, such as X-ray computerized tomography, magnetic resonance imaging, and ultrasound imaging, but each of these methods has its own disadvantages. Optical tomography, which uses NIR light, is one of the emerging methods in the field of medical imaging because it is non-invasive in nature. The main difficulty in using light to image tissue is that light is highly scattered inside tissue, so its propagation is not confined to straight lines as is the case in X-ray tomography. It is therefore necessary to understand how light propagates in tissue. Several methods exist for modeling light interaction with tissue; the Monte Carlo method is a simple technique for simulating light transport through tissue, its only drawback being high computational cost. Once the data are obtained from the Monte Carlo simulation, they must be inverted to reconstruct the tissue image. There are standard methods of reconstruction, such as the algebraic reconstruction technique and filtered back projection, but these methods cannot be used directly when light is the probing radiation, because light is highly scattered inside the tissue. The standard filtered back projection method has therefore been modified so that the zigzag paths of photons are taken into consideration while back projecting the data. This is achieved by dividing the tissue domain into a square grid and storing the average path traversed in each grid element. The reconstruction obtained using this modification is observed to be much better than the result of the standard filtered back projection method.
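The grid bookkeeping this abstract describes can be sketched directly: each segment of a photon's zigzag path contributes its length to the grid cell it crosses, and the accumulated per-cell path lengths later weight the backprojection. A simplified numpy illustration (midpoint binning and the function name are assumptions for illustration, not the authors' code):

```python
import numpy as np

def accumulate_path(grid, path, cell_size):
    """Add the length of each segment of a (zigzag) photon path to the
    grid cell containing the segment midpoint -- the per-cell path
    bookkeeping a path-weighted backprojection relies on."""
    path = np.asarray(path, dtype=float)
    for p0, p1 in zip(path[:-1], path[1:]):
        mid = 0.5 * (p0 + p1)
        i, j = (mid // cell_size).astype(int)
        if 0 <= i < grid.shape[0] and 0 <= j < grid.shape[1]:
            grid[i, j] += np.linalg.norm(p1 - p0)   # segment length into that cell
    return grid

# A 2x2-cell domain and a two-segment zigzag photon path
grid = np.zeros((2, 2))
path = [(0.2, 0.2), (0.8, 0.8), (1.6, 1.2)]
grid = accumulate_path(grid, path, cell_size=1.0)
```

Running many simulated photons through this accumulator and normalizing by photon count yields the average-path-per-cell map that replaces the straight-ray assumption of standard filtered back projection.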

  18. Sensor fusion for intelligent process control.

    SciTech Connect

    Connors, John J.; Hill, Kevin; Hanekamp, David; Haley, William F.; Gallagher, Robert J.; Gowin, Craig; Farrar, Arthur R.; Sheaffer, Donald A.; DeYoung, Mark A.; Bertram, Lee A.; Dodge, Craig; Binion, Bruce; Walsh, Peter M.; Houf, William G.; Desam, Padmabhushana R.; Tiwary, Rajiv; Stokes, Michael R.; Miller, Alan J.; Michael, Richard W.; Mayer, Raymond M.; Jiao, Yu; Smith, Philip J.; Arbab, Mehran; Hillaire, Robert G.

    2004-08-01

    An integrated system for the fusion of product and process sensors and controls for production of flat glass was envisioned, having as its objective the maximization of throughput and product quality subject to emission limits, furnace refractory wear, and other constraints. Although the project was prematurely terminated, stopping the work short of its goal, the tasks that were completed show the value of the approach and objectives. Though the demonstration was to have been done on a flat glass production line, the approach is applicable to control of production in the other sectors of the glass industry. Furthermore, the system architecture is also applicable in other industries utilizing processes in which product uniformity is determined by ability to control feed composition, mixing, heating and cooling, chemical reactions, and physical processes such as distillation, crystallization, drying, etc. The first phase of the project, with Visteon Automotive Systems as industrial partner, was focused on simulation and control of the glass annealing lehr. That work produced the analysis and computer code that provide the foundation for model-based control of annealing lehrs during steady state operation and through color and thickness changes. In the second phase of the work, with PPG Industries as the industrial partner, the emphasis was on control of temperature and combustion stoichiometry in the melting furnace, to provide a wider operating window, improve product yield, and increase energy efficiency. A program of experiments with the furnace, CFD modeling and simulation, flow measurements, and sensor fusion was undertaken to provide the experimental and theoretical basis for an integrated, model-based control system utilizing the new infrastructure installed at the demonstration site for the purpose. 
Although the project was terminated during the first year of the second phase of the work, the results of these first steps toward implementation

  19. Congress turns cold on fusion

    SciTech Connect

    Marshall, E.

    1984-06-22

    A 5% cut in fusion research budgets will force some programs to be dropped in order to keep the large machinery running unless US and European scientists collaborate instead of competing. Legislators became uneasy about the escalating costs of the new devices. The 1984 budget of $470 million for magnetic fusion research is only half the projected cost of the Tokamak Fusion Core Experiment (TFCX), planned to ignite, for the first time, a self-sustaining burn. Planning for the TFCX continued despite the message from Congress. Work at the large institutions at Princeton, MIT, etc. may survive at the expense of other programs, some of which will lose academic programs as well. Scientists point to the loss of new ideas and approaches when projects are cancelled. Enthusiasm is growing for international collaboration.

  20. The EAGLE project: simulating the evolution and assembly of galaxies and their environments

    NASA Astrophysics Data System (ADS)

    Schaye, Joop; Crain, Robert A.; Bower, Richard G.; Furlong, Michelle; Schaller, Matthieu; Theuns, Tom; Dalla Vecchia, Claudio; Frenk, Carlos S.; McCarthy, I. G.; Helly, John C.; Jenkins, Adrian; Rosas-Guevara, Y. M.; White, Simon D. M.; Baes, Maarten; Booth, C. M.; Camps, Peter; Navarro, Julio F.; Qu, Yan; Rahmati, Alireza; Sawala, Till; Thomas, Peter A.; Trayford, James

    2015-01-01

    We introduce the Virgo Consortium's Evolution and Assembly of GaLaxies and their Environments (EAGLE) project, a suite of hydrodynamical simulations that follow the formation of galaxies and supermassive black holes in cosmologically representative volumes of a standard Λ cold dark matter universe. We discuss the limitations of such simulations in light of their finite resolution and poorly constrained subgrid physics, and how these affect their predictive power. One major improvement is our treatment of feedback from massive stars and active galactic nuclei (AGN) in which thermal energy is injected into the gas without the need to turn off cooling or decouple hydrodynamical forces, allowing winds to develop without predetermined speed or mass loading factors. Because the feedback efficiencies cannot be predicted from first principles, we calibrate them to the present-day galaxy stellar mass function and the amplitude of the galaxy-central black hole mass relation, also taking galaxy sizes into account. The observed galaxy stellar mass function is reproduced to ≲ 0.2 dex over the full resolved mass range, 10⁸ < M*/M⊙ ≲ 10¹¹, a level of agreement close to that attained by semi-analytic models, and unprecedented for hydrodynamical simulations. We compare our results to a representative set of low-redshift observables not considered in the calibration, and find good agreement with the observed galaxy specific star formation rates, passive fractions, Tully-Fisher relation, total stellar luminosities of galaxy clusters, and column density distributions of intergalactic C IV and O VI. While the mass-metallicity relations for gas and stars are consistent with observations for M* ≳ 10⁹ M⊙ (M* ≳ 10¹⁰ M⊙ at intermediate resolution), they are insufficiently steep at lower masses. For the reference model, the gas fractions and temperatures are too high for clusters of galaxies, but for galaxy groups these discrepancies can be resolved by adopting a higher
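The calibration target mentioned above, the galaxy stellar mass function, is in essence a binned comoving number density per dex of stellar mass. A minimal sketch, with a hypothetical box volume and randomly drawn masses standing in for simulation output:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical stellar masses, log-uniform over the resolved range
# 1e8-1e11 Msun quoted in the abstract (stand-in for simulation output).
logM = rng.uniform(8.0, 11.0, size=5000)

volume = 100.0**3                       # assumed (100 Mpc)^3 comoving box
bins = np.arange(8.0, 11.25, 0.25)      # 0.25 dex mass bins
counts, edges = np.histogram(logM, bins=bins)
phi = counts / volume / np.diff(edges)  # number density [Mpc^-3 dex^-1]

# An agreement claim like "reproduced to <= 0.2 dex" compares
# log10(phi_sim / phi_obs) bin by bin against observations.
```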

  1. Final Technical Report for Project "Improving the Simulation of Arctic Clouds in CCSM3"

    SciTech Connect

    Stephen J. Vavrus

    2008-11-15

    This project has focused on the simulation of Arctic clouds in CCSM3 and how the modeled cloud amount (and climate) can be improved substantially by altering the parameterized low cloud fraction. The new formula, dubbed 'freezedry', alleviates the bias of excessive low clouds during polar winter by reducing the cloud amount under very dry conditions. During winter, freezedry decreases the low cloud amount over the coldest regions in high latitudes by over 50% locally and more than 30% averaged across the Arctic (Fig. 1). The cloud reduction causes an Arctic-wide drop of 15 W m{sup -2} in surface cloud radiative forcing (CRF) during winter and about a 50% decrease in mean annual Arctic CRF. Consequently, wintertime surface temperatures fall by up to 4 K on land and 2-8 K over the Arctic Ocean, thus significantly reducing the model's pronounced warm bias (Fig. 1). While improving the polar climate simulation in CCSM3, freezedry has virtually no influence outside of very cold regions (Fig. 2) or during summer (Fig. 3), which are space and time domains that were not targeted. Furthermore, the simplicity of this parameterization allows it to be readily incorporated into other GCMs, many of which also suffer from excessive wintertime polar cloudiness, based on the results from the CMIP3 archive (Vavrus et al., 2008). Freezedry also affects CCSM3's sensitivity to greenhouse forcing. In a transient-CO{sub 2} experiment, the model version with freezedry warms up to 20% less in the North Polar and South Polar regions (1.5 K and 0.5 K smaller warming, respectively) (Fig. 4). Paradoxically, the muted high-latitude response occurs despite a much larger increase in cloud amount with freezedry during non-summer months (when clouds warm the surface), apparently because of the colder modern reference climate. These results of the freezedry parameterization have recently been published (Vavrus and D. Waliser, 2008: An improved parameterization for simulating Arctic cloud amount in
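The freezedry idea, scaling the diagnosed low-cloud fraction down when the column is very dry, can be sketched as a one-line parameterization. The linear-ramp form and the constants below (reference mixing ratio, floor) are illustrative assumptions, not the published coefficients:

```python
import numpy as np

def freezedry(cloud_frac, qv, q_ref=0.003, floor=0.15):
    """Scale the diagnosed low-cloud fraction down under very dry
    conditions.  The linear ramp in water vapor mixing ratio qv [kg/kg]
    and the constants q_ref and floor are illustrative assumptions."""
    return cloud_frac * np.clip(qv / q_ref, floor, 1.0)

# Moist mid-latitude column: untouched.
print(freezedry(0.9, qv=0.008))    # -> 0.9
# Dry polar-winter column: cut hard (clipped at the floor, about 0.135).
print(freezedry(0.9, qv=0.0003))
```

Because the factor reaches 1 for any reasonably moist column, the scheme leaves warm regions and summer conditions essentially unchanged, which matches the targeted behavior described above.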

  2. S2-Project: Near-fault earthquake ground motion simulation in the Sulmona alluvial basin

    NASA Astrophysics Data System (ADS)

    Faccioli, E.; Stupazzini, M.; Galadini, F.; Gori, S.

    2008-12-01

    Recently the Italian Department of Civil Protection (DPC), in cooperation with Istituto Nazionale di Geofisica e Vulcanologia (INGV), has promoted the 'S2' research project (http://nuovoprogettoesse2.stru.polimi.it/) aimed at the design, testing and application of an open-source code for seismic hazard assessment (SHA). The tool envisaged will likely differ in several important respects from an existing international initiative (OpenSHA, Field et al., 2003). In particular, while "the OpenSHA collaboration model envisions scientists developing their own attenuation relationships and earthquake rupture forecasts, which they will deploy and maintain in their own systems", the main purpose of the S2 project is to provide a flexible computational tool for SHA, primarily suited to the needs of DPC, which are not necessarily scientific needs. Within S2, a crucial issue is to make alternative approaches available to quantify the ground motion, with emphasis on the near-field region. The SHA architecture envisaged will allow for the use of ground motion descriptions other than those yielded by empirical attenuation equations, for instance user-generated motions provided by deterministic source and wave propagation simulations. In this contribution, after a brief presentation of Project S2, we illustrate some preliminary 3D scenario simulations performed in the alluvial basin of Sulmona (Central Italy), as an example of the type of descriptions that can be handled in the future SHA architecture. In detail, we selected some seismogenic sources (from the DISS database), believed to be responsible for a number of destructive historical earthquakes, and derived from them a family of simplified geometrical and mechanical source models spanning a reasonable range of parameters, so that the extent of the main uncertainties can be covered. Then, purely deterministic (for frequencies < 2 Hz) and hybrid deterministic-stochastic source and propagation simulations are

  3. Protoplast Fusion

    PubMed Central

    Yamada, Yasuyuki; Hara, Yasuhiro; Katagi, Hiroaki; Senda, Mitsugi

    1980-01-01

    The relation between the composition of the phospholipid molecular species in a cell membrane and the velocity of protoplast fusion was studied using cells cultured at a low temperature (10 C). Cells cultured at a low temperature contained larger proportions of phospholipids of low phase transition point, the 1,2-dilinoleoyl-type, than those cultured at a normal temperature (25 C). When treated with polyethylene glycol 6000, protoplasts from cells cultured at 10 C fused and progressed to the fused sphere stage more rapidly than did those from cells cultured at 25 C. PMID:16661339

  4. Splenogonadal fusion.

    PubMed

    Tsingoglou, S; Wilkinson, A W

    1976-04-01

    The fusion between splenic tissue and the left gonad or the derivatives of the left mesonephros is a rare congenital anomaly, first described in detail by Pommer in 1887/9 and divided into two forms by Putschar and Manion in 1956. In the first, or continuous, type a cord of splenic or fibrous tissue connects the spleen and the gonadal-mesonephric structures. In the second type the fused splenomesonephric structures have lost continuity with the main spleen. An example of the continuous form is presented and the previous reports are briefly reviewed.

  5. Design, Results, Evolution and Status of the ATLAS Simulation at Point1 Project

    NASA Astrophysics Data System (ADS)

    Ballestrero, S.; Batraneanu, S. M.; Brasolin, F.; Contescu, C.; Fazio, D.; Di Girolamo, A.; Lee, C. J.; Pozo Astigarraga, M. E.; Scannicchio, D. A.; Sedov, A.; Twomey, M. S.; Wang, F.; Zaytsev, A.

    2015-12-01

    During the LHC Long Shutdown 1 (LS1) period, which started in 2013, the Simulation at Point1 (Sim@P1) project takes advantage, in an opportunistic way, of the TDAQ (Trigger and Data Acquisition) HLT (High-Level Trigger) farm of the ATLAS experiment. This farm provides more than 1300 compute nodes, which are particularly suited for running event generation and Monte Carlo production jobs that are mostly CPU and not I/O bound. It is capable of running up to 2700 Virtual Machines (VMs), each with 8 CPU cores, for a total of up to 22000 parallel jobs. This contribution gives a review of the design, the results, and the evolution of the Sim@P1 project, operating a large scale OpenStack based virtualized platform deployed on top of the ATLAS TDAQ HLT farm computing resources. During LS1, Sim@P1 was one of the most productive ATLAS sites: it delivered more than 33 million CPU-hours and it generated more than 1.1 billion Monte Carlo events. The design aspects are presented: the virtualization platform exploited by Sim@P1 avoids interference with TDAQ operations and guarantees the security and the usability of the ATLAS private network. The cloud mechanism allows the separation of the needed support on both infrastructural (hardware, virtualization layer) and logical (Grid site support) levels. This paper focuses on the operational aspects of such a large system during the upcoming LHC Run 2 period: simple, reliable, and efficient tools are needed to quickly switch from Sim@P1 to TDAQ mode and back, to exploit the resources when they are not used for the data acquisition, even for short periods. The evolution of the central OpenStack infrastructure is described, as it was upgraded from the Folsom to the Icehouse release, including the scalability issues addressed.

  6. Project MANTIS: A MANTle Induction Simulator for coupling geodynamic and electromagnetic modeling

    NASA Astrophysics Data System (ADS)

    Weiss, C. J.

    2009-12-01

    A key component of testing geodynamic hypotheses resulting from 3D mantle convection simulations is the ability to easily translate the predicted physicochemical state into the model space relevant for an independent geophysical observation, such as Earth's seismic, geodetic or electromagnetic response. In this contribution a new parallel code for simulating low-frequency, global-scale electromagnetic induction phenomena is introduced that has the same Earth discretization as the popular CitcomS mantle convection code. Hence, projection of the CitcomS model into the model space of electrical conductivity is greatly simplified, and focuses solely on the node-to-node, physics-based relationship between these Earth parameters without the need for "upscaling", "downscaling", averaging or harmonizing with some other model basis such as spherical harmonics. Preliminary performance tests of the MANTIS code on shared and distributed memory parallel compute platforms show favorable scaling (>70% efficiency) for up to 500 processors. As with CitcomS, an OpenDX visualization widget (VISMAN) is also provided for 3D rendering and interactive interrogation of model results. Details of the MANTIS code will be briefly discussed here, focusing on compatibility with CitcomS modeling, as will be preliminary results in which the electromagnetic response of a CitcomS model is evaluated. VISMAN rendering of a tomography-derived electrical conductivity model overlain by a 1x1 deg crustal conductivity map. Grey scale represents the log_10 magnitude of conductivity [S/m]. Arrows are horizontal components of a hypothetical magnetospheric source field used to electromagnetically excite the conductivity model.
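The node-to-node mapping described above, from a CitcomS-style temperature field to electrical conductivity on the same discretization, might look like the following Arrhenius-law sketch; the prefactor and activation energy are placeholder values, not calibrated mantle properties:

```python
import numpy as np

K_B = 8.617e-5   # Boltzmann constant [eV/K]

def conductivity(T, sigma0=3.0, E_a=1.5):
    """Map nodal temperature [K] to electrical conductivity [S/m] with an
    Arrhenius law.  sigma0 [S/m] and activation energy E_a [eV] are
    placeholder values, not calibrated mantle properties."""
    return sigma0 * np.exp(-E_a / (K_B * np.asarray(T, dtype=float)))

# Same node set in and out: no upscaling, downscaling, or re-gridding.
T_nodes = np.array([1600.0, 2000.0, 2600.0])   # hypothetical temperatures
sigma = conductivity(T_nodes)                  # hotter nodes conduct better
```

Because the two codes share a discretization, this per-node evaluation is the entire projection step; no interpolation between meshes is involved.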

  7. Neural Network Approach To Sensory Fusion

    NASA Astrophysics Data System (ADS)

    Pearson, John C.; Gelfand, Jack J.; Sullivan, W. E.; Peterson, Richard M.; Spence, Clay D.

    1988-08-01

    We present a neural network model for sensory fusion based on the design of the visual/acoustic target localization system of the barn owl. This system adaptively fuses its separate visual and acoustic representations of object position into a single joint representation used for head orientation. The building block in this system, as in much of the brain, is the neuronal map. Neuronal maps are large arrays of locally interconnected neurons that represent information in a map-like form; that is, parameter values are systematically encoded by the position of neural activation in the array. The computational load is distributed to a hierarchy of maps, and the computation is performed in stages by transforming the representation from map to map via the geometry of the projections between the maps and the local interactions within the maps. For example, azimuthal position is computed from the frequency and binaural phase information encoded in the signals of the acoustic sensors, while elevation is computed in a separate stream using binaural intensity information. These separate streams are merged in their joint projection onto the external nucleus of the inferior colliculus, a two-dimensional array of cells which contains a map of acoustic space. This acoustic map, and the visual map of the retina, jointly project onto the optic tectum, creating a fused visual/acoustic representation of position in space that is used for object localization. In this paper we describe our mathematical model of the stage of visual/acoustic fusion in the optic tectum. The model assumes that the acoustic projection from the external nucleus onto the tectum is roughly topographic and one-to-many, while the visual projection from the retina onto the tectum is topographic and one-to-one. A simple process of self-organization alters the strengths of the acoustic connections, effectively forming a focused beam of strong acoustic connections whose inputs are coincident with the visual inputs.
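The self-organization step, strengthening acoustic connections whose activity coincides with the visual input, can be illustrated with a simple Hebbian rule. The learning rate, map sizes, and normalization below are assumptions for the sketch, not the paper's exact model:

```python
import numpy as np

rng = np.random.default_rng(2)
n_map = 20                       # cells per map (acoustic and tectal)
# One-to-many acoustic projection: broad, uniform initial weights.
W = np.full((n_map, n_map), 0.05)

def hebbian_step(W, acoustic, visual, lr=0.1):
    """Strengthen acoustic connections whose activity coincides with the
    one-to-one visual input, then renormalize each tectal cell's total
    input.  An illustrative Hebb rule, not the paper's exact model."""
    W = W + lr * np.outer(visual, acoustic)
    return W / W.sum(axis=1, keepdims=True)

# Repeated coincident visual/acoustic stimulation at aligned positions
# focuses a "beam" of strong acoustic connections along the diagonal.
for _ in range(50):
    pos = rng.integers(n_map)
    acoustic = np.zeros(n_map); acoustic[pos] = 1.0
    visual = np.zeros(n_map);   visual[pos] = 1.0
    W = hebbian_step(W, acoustic, visual)
```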

  8. Advanced fission and fossil plant economics-implications for fusion

    SciTech Connect

    Delene, J.G.

    1994-09-01

    In order for fusion energy to be a viable option for electric power generation, it must either directly compete with future alternatives or serve as a reasonable backup if the alternatives become unacceptable. This paper discusses projected costs for the most likely competitors with fusion power for baseload electric capacity and what these costs imply for fusion economics. The competitors examined include advanced nuclear fission and advanced fossil-fired plants. The projected costs and their basis are discussed. The estimates for these technologies are compared with cost estimates for magnetic and inertial confinement fusion plants. The conclusion of the analysis is that fusion faces formidable economic competition. Although the cost level for fusion appears greater than that for fission or fossil, the costs are not so high as to preclude fusion's potential competitiveness.

  9. A Study of the Efficacy of Project-Based Learning Integrated with Computer-Based Simulation--STELLA

    ERIC Educational Resources Information Center

    Eskrootchi, Rogheyeh; Oskrochi, G. Reza

    2010-01-01

    Incorporating computer-simulation modelling into project-based learning may be effective but requires careful planning and implementation. Teachers, especially, need pedagogical content knowledge which refers to knowledge about how students learn from materials infused with technology. This study suggests that students learn best by actively…

  10. Analysis of the steam injection at the Visalia Superfund Project with fully compositional nonisothermal finite difference simulations.

    PubMed

    Kuhlman, Myron I

    2002-05-01

    By injecting steam, over 1.1 million pounds of creosote have been recovered at the Visalia, California Superfund Site from an aquifer 102 ft underground. In the first 6 weeks of injection, 320,000 lb of creosote were recovered or destroyed, versus less than 1 lb per day with pump and treat. The finite difference simulator STARS, which is widely used in the oil industry to model thermal recovery, has been used to simulate simplified models of the project, to analyze recovery mechanisms, and to demonstrate how the operation of similar projects can be improved. The simulations indicate that vaporization of dense nonaqueous phase liquids (DNAPLs) is the most important recovery mechanism, that liquid production is enhanced because a gas phase is present, and that the project could have been completed more rapidly if an additional injector or producer had been added in the center of the site. In addition, the mineralization (conversion to carbon dioxide) of DNAPLs could result from reaction with water, injected air or, most likely, both. The mechanisms are likely to be similar to subcritical water oxidation. While this analysis suggests methods to improve the operation of future steam projects, Visalia has been a very successful demonstration of the potential of steam injection to clean up recalcitrant hydrocarbons and will be an inspiration for future projects.

  11. Public Relations on Fusion in Europe

    NASA Astrophysics Data System (ADS)

    Ongena, J.; van Oost, G.; Paris, P. J.

    2000-10-01

    A summary will be presented of PR efforts on fusion energy research in Europe. A 3-D movie of a fusion research experimental reactor, made entirely by virtual animation, was completed at the start of this year. Two versions exist: a short version of 3 min., as a video clip, and a longer version of nearly 8 min. Both can be viewed in 3D, using special projections and passive glasses, or as normal VHS video projections. A new CD-ROM for individual and classroom use will be presented, discussing (i) the different energy forms, (ii) general principles of fusion, (iii) current research efforts and (iv) future prospects of fusion. This CD-ROM is now produced in English, German, French, Spanish, Italian and Portuguese. Several new brochures and leaflets intended to increase public awareness of fusion in Europe will be on display.

  12. Revised Comparisons of Simulated Hydrodynamics and Water Quality for Projected Demands in 2046, Pueblo Reservoir, Southeastern Colorado

    USGS Publications Warehouse

    Ortiz, Roderick F.; Miller, Lisa D.

    2009-01-01

    Pueblo Reservoir is one of southeastern Colorado's most valuable water resources. The reservoir provides irrigation, municipal, and industrial water to various entities throughout the region. The reservoir also provides flood control, recreational activities, sport fishing, and wildlife enhancement to the region. The Southern Delivery System (SDS) project is a regional water-delivery project that has been proposed to provide a safe, reliable, and sustainable water supply through the foreseeable future (2046) for Colorado Springs, Fountain, Security, and Pueblo West. Discussions with the Bureau of Reclamation and the U.S. Geological Survey led to a cooperative agreement to simulate the hydrodynamics and water quality of Pueblo Reservoir. This work has been completed and described in a previously published report, U.S. Geological Survey Scientific Investigations Report 2008-5056. Additionally, there was a need to make comparisons of simulated hydrodynamics and water quality for projected demands associated with the various Environmental Impact Statements (EIS) alternatives and plans by Pueblo West to discharge treated wastewater into the reservoir. Wastewater plans by Pueblo West are fully independent of the SDS project. This report compares simulated hydrodynamics and water quality for projected demands in Pueblo Reservoir resulting from changes in inflow and water quality entering the reservoir, and from changes to withdrawals from the reservoir as projected for the year 2046. Four of the seven EIS alternatives were selected for scenario simulations. The four U.S. Geological Survey simulation scenarios were the No Action scenario (EIS Alternative 1), the Downstream Diversion scenario (EIS Alternative 2), the Upstream Return-Flow scenario (EIS Alternative 4), and the Upstream Diversion scenario (EIS Alternative 7). 
Additionally, the results of an Existing Conditions scenario (year 2006 demand conditions) were compared to the No Action scenario (projected demands in

  13. The CMIP6 Sea-Ice Model Intercomparison Project (SIMIP): understanding sea ice through climate-model simulations

    NASA Astrophysics Data System (ADS)

    Notz, Dirk; Jahn, Alexandra; Holland, Marika; Hunke, Elizabeth; Massonnet, François; Stroeve, Julienne; Tremblay, Bruno; Vancoppenolle, Martin

    2016-09-01

    A better understanding of the role of sea ice for the changing climate of our planet is the central aim of the diagnostic Coupled Model Intercomparison Project 6 (CMIP6)-endorsed Sea-Ice Model Intercomparison Project (SIMIP). To reach this aim, SIMIP requests sea-ice-related variables from climate-model simulations that allow for a better understanding and, ultimately, improvement of biases and errors in sea-ice simulations with large-scale climate models. This then allows us to better understand to what degree CMIP6 model simulations relate to reality, thus improving our confidence in answering sea-ice-related questions based on these simulations. Furthermore, the SIMIP protocol provides a standard for sea-ice model output that will streamline and hence simplify the analysis of the simulated sea-ice evolution in research projects independent of CMIP. To reach its aims, SIMIP provides a structured list of model output that allows for an examination of the three main budgets that govern the evolution of sea ice, namely the heat budget, the momentum budget, and the mass budget. In this contribution, we explain the aims of SIMIP in more detail and outline how its design allows us to answer some of the most pressing questions that sea ice still poses to the international climate-research community.
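A budget-oriented output standard like SIMIP's invites a simple closure check: the reported total tendency should equal the sum of the process terms. A minimal sketch, with hypothetical term names and values rather than the official CMIP6 variables:

```python
def budget_residual(tendency, terms):
    """Residual of a SIMIP-style budget: the reported total tendency minus
    the sum of the requested process terms.  A nonzero residual flags
    model output that does not close."""
    return tendency - sum(terms.values())

# Hypothetical mass-budget terms [kg m-2 s-1]; the names are illustrative,
# not the official CMIP6 output variables.
terms = {
    "congelation_growth": 2.0e-6,
    "frazil_growth":      0.5e-6,
    "top_melt":          -1.2e-6,
    "basal_melt":        -0.8e-6,
}
residual = budget_residual(0.5e-6, terms)   # ~0: this budget closes
```

The same check applies to the heat and momentum budgets the protocol requests; a persistent residual points to a missing or misdiagnosed process term.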

  14. The JUMP student project: two weeks of space simulation in a Mars-like environment.

    NASA Astrophysics Data System (ADS)

    de Crombrugghe, Guerric; de Lobkowicz, Ysaline; van Vynckt, Delphine; Reydams, Marc; Denies, Jonathan; Jago, Alban; Le Maire, Victor

    JUMP is a student initiative whose aim is to simulate, for two weeks, the life of astronauts in a Mars-like environment. The simulation will be held in the Mars Desert Research Station (MDRS), a habitat installed by the Mars Society (MS) in the Utah desert. The crew is composed of six students, helped by a remote support team of four students, all from different backgrounds (engineering, physics, mathematics, biology, and architecture) and degrees (bachelor, master, PhD), under the supervision of researchers from several institutes. Several research activities will be conducted during the simulation. We shall report on the science and technical results, and implications for Earth-Mars comparative studies. JASE: The Jump Astronaut Safety Experiment (JASE) consists of a deployable Yagi antenna with basic electronics, providing an extremely light and simple way to warn of solar flares and observe Jupiter bursts. JADE: The Jump Angular Detection Experiment (JADE) is an innovative angular particle detector used to determine the irradiation of the surface and monitor the charged particle distribution in Mars' atmosphere. Even if its resolution is low, it is a very light solution compared to pixel detectors. JAPE: The Jump Astronaut Potatoes Experiment (JAPE) will try to grow and eat, in a space-like environment, high-performance potatoes developed by the Groupe de Recherche en Physiologie Végétale (GRPV) of the UCL in the framework of the Micro-Ecological Life Support System Alternative (MELiSSA) project of ESA. JABE: The Jump soil Analysis with a Backpack drill Experiment (JABE) aims to validate a sampling procedure, generate vertical humidity profiles with a MEMS sensor, and analyze soil samples with a spectrometer. The crew will therefore use a backpack drill, which is portable, fast and easy to use. JARE: The goal of the Jump Astronaut-Rover interaction Experiment (JARE) is to determine how a rover can help an astronaut in his task, and how it is possible to improve this

  15. Evaluation of Arctic Sea Ice Thickness Simulated by Arctic Ocean Model Intercomparison Project Models

    NASA Technical Reports Server (NTRS)

    Johnson, Mark; Proshuntinsky, Andrew; Aksenov, Yevgeny; Nguyen, An T.; Lindsay, Ron; Haas, Christian; Zhang, Jinlun; Diansky, Nikolay; Kwok, Ron; Maslowski, Wieslaw; Hakkinen, Sirpa; Ashik, Igor; De Cuevas, Beverly

    2012-01-01

    Six Arctic Ocean Model Intercomparison Project model simulations are compared with estimates of sea ice thickness derived from pan-Arctic satellite freeboard measurements (2004-2008); airborne electromagnetic measurements (2001-2009); ice draft data from moored instruments in Fram Strait, the Greenland Sea, and the Beaufort Sea (1992-2008) and from submarines (1975-2000); and drill hole data from the Arctic basin, Laptev, and East Siberian marginal seas (1982-1986) and coastal stations (1998-2009). Despite an assessment of six models that differ in numerical methods, resolution, domain, forcing, and boundary conditions, the models generally overestimate the thickness of measured ice thinner than approximately 2 m and underestimate the thickness of ice measured thicker than approximately 2 m. In the regions of flat immobile landfast ice (shallow Siberian Seas with depths less than 25-30 m), the models generally overestimate both the total observed sea ice thickness and the rates of September and October ice growth from observations, by more than 4 times and more than one standard deviation, respectively. The models do not reproduce conditions of fast ice formation and growth. Instead, the modeled fast ice is replaced with pack ice which drifts, generating ridges of increasing ice thickness, in addition to thermodynamic ice growth. Considering all observational data sets, the better correlations and smaller differences from observations are from the Estimating the Circulation and Climate of the Ocean, Phase II and Pan-Arctic Ice Ocean Modeling and Assimilation System models.
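The thin-ice-overestimated, thick-ice-underestimated pattern reported here is the kind of result a binned bias statistic exposes. A minimal sketch with invented thickness values (the 2 m split follows the abstract; the data are not from the study):

```python
import numpy as np

def bias_by_thickness(obs, model, edges):
    """Mean model-minus-observation bias within observed-thickness bins,
    the comparison pattern described in the abstract."""
    idx = np.digitize(obs, edges)
    return np.array([(model[idx == i] - obs[idx == i]).mean()
                     if np.any(idx == i) else np.nan
                     for i in range(1, len(edges))])

obs   = np.array([0.5, 1.0, 1.5, 2.5, 3.0, 4.0])   # invented thicknesses [m]
model = np.array([1.0, 1.4, 1.7, 2.2, 2.6, 3.2])   # invented model values
bias = bias_by_thickness(obs, model, edges=[0.0, 2.0, 5.0])
# bias[0] > 0 (thin ice overestimated); bias[1] < 0 (thick ice underestimated)
```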

  16. COMPASS, the COMmunity Petascale Project for Accelerator Science and Simulation, a broad computational accelerator physics initiative

    SciTech Connect

    J.R. Cary; P. Spentzouris; J. Amundson; L. McInnes; M. Borland; B. Mustapha; B. Norris; P. Ostroumov; Y. Wang; W. Fischer; A. Fedotov; I. Ben-Zvi; R. Ryne; E. Esarey; C. Geddes; J. Qiang; E. Ng; S. Li; C. Ng; R. Lee; L. Merminga; H. Wang; D.L. Bruhwiler; D. Dechow; P. Mullowney; P. Messmer; C. Nieter; S. Ovtchinnikov; K. Paul; P. Stoltz; D. Wade-Stein; W.B. Mori; V. Decyk; C.K. Huang; W. Lu; M. Tzoufras; F. Tsung; M. Zhou; G.R. Werner; T. Antonsen; T. Katsouleas

    2007-06-01

    Accelerators are the largest and most costly scientific instruments of the Department of Energy, with uses across a broad range of science, including colliders for particle physics and nuclear science and light sources and neutron sources for materials studies. COMPASS, the Community Petascale Project for Accelerator Science and Simulation, is a broad, four-office (HEP, NP, BES, ASCR) effort to develop computational tools for the prediction and performance enhancement of accelerators. The tools being developed can be used to predict the dynamics of beams in the presence of optical elements and space charge forces, the calculation of electromagnetic modes and wake fields of cavities, the cooling induced by comoving beams, and the acceleration of beams by intense fields in plasmas generated by beams or lasers. In SciDAC-1, the computational tools had multiple successes in predicting the dynamics of beams and beam generation. In SciDAC-2 these tools will be petascale enabled to allow the inclusion of an unprecedented level of physics for detailed prediction.

  17. COMPASS, the COMmunity Petascale project for Accelerator Science and Simulation, a broad computational accelerator physics initiative

    SciTech Connect

    Cary, J.R.; Spentzouris, P.; Amundson, J.; McInnes, L.; Borland, M.; Mustapha, B.; Ostroumov, P.; Wang, Y.; Fischer, W.; Fedotov, A.; Ben-Zvi, I.; Ryne, R.; Esarey, E.; Geddes, C.; Qiang, J.; Ng, E.; Li, S.; Ng, C.; Lee, R.; Merminga, L.; Wang, H.; Bruhwiler, D.L.; Dechow, D.; Mullowney, P.; Messmer, P.; Nieter, C.; Ovtchinnikov, S.; Paul, K.; Stoltz, P.; Wade-Stein, D.; Mori, W.B.; Decyk, V.; Huang, C.K.; Lu, W.; Tzoufras, M.; Tsung, F.; Zhou, M.; Werner, G.R.; Antonsen, T.; Katsouleas, T.; Morris, B.

    2007-07-16

    Accelerators are the largest and most costly scientific instruments of the Department of Energy, with uses across a broad range of science, including colliders for particle physics and nuclear science, and light sources and neutron sources for materials studies. COMPASS, the Community Petascale Project for Accelerator Science and Simulation, is a broad, four-office (HEP, NP, BES, ASCR) effort to develop computational tools for the prediction and performance enhancement of accelerators. The tools being developed can be used to predict the dynamics of beams in the presence of optical elements and space charge forces, to calculate the electromagnetic modes and wake fields of cavities, and to model the cooling induced by comoving beams and the acceleration of beams by intense fields in plasmas generated by beams or lasers. In SciDAC-1, the computational tools had multiple successes in predicting the dynamics of beams and beam generation. In SciDAC-2, these tools will be petascale enabled to allow the inclusion of an unprecedented level of physics for detailed prediction.

  18. COMPASS, the COMmunity Petascale Project for Accelerator Science And Simulation, a Broad Computational Accelerator Physics Initiative

    SciTech Connect

    Cary, J.R.; Spentzouris, P.; Amundson, J.; McInnes, L.; Borland, M.; Mustapha, B.; Norris, B.; Ostroumov, P.; Wang, Y.; Fischer, W.; Fedotov, A.; Ben-Zvi, I.; Ryne, R.; Esarey, E.; Geddes, C.; Qiang, J.; Ng, E.; Li, S.; Ng, C.; Lee, R.; Merminga, L.; /Jefferson Lab /Tech-X, Boulder /UCLA /Colorado U. /Maryland U. /Southern California U.

    2007-11-09

    Accelerators are the largest and most costly scientific instruments of the Department of Energy, with uses across a broad range of science, including colliders for particle physics and nuclear science, and light sources and neutron sources for materials studies. COMPASS, the Community Petascale Project for Accelerator Science and Simulation, is a broad, four-office (HEP, NP, BES, ASCR) effort to develop computational tools for the prediction and performance enhancement of accelerators. The tools being developed can be used to predict the dynamics of beams in the presence of optical elements and space charge forces, to calculate the electromagnetic modes and wake fields of cavities, and to model the cooling induced by comoving beams and the acceleration of beams by intense fields in plasmas generated by beams or lasers. In SciDAC-1, the computational tools had multiple successes in predicting the dynamics of beams and beam generation. In SciDAC-2, these tools will be petascale enabled to allow the inclusion of an unprecedented level of physics for detailed prediction.

  19. Comparison of projection skills of deterministic ensemble methods using pseudo-simulation data generated from multivariate Gaussian distribution

    NASA Astrophysics Data System (ADS)

    Oh, Seok-Geun; Suh, Myoung-Seok

    2016-03-01

    The projection skills of five ensemble methods were analyzed according to simulation skills, training period, and ensemble members, using 198 sets of pseudo-simulation data (PSD) produced by random number generation assuming the simulated temperature of regional climate models. The PSD sets were classified into 18 categories according to the relative magnitude of bias, variance ratio, and correlation coefficient, where each category had 11 sets (including 1 truth set) with 50 samples. The ensemble methods used were as follows: equal weighted averaging without bias correction (EWA_NBC), EWA with bias correction (EWA_WBC), weighted ensemble averaging based on root mean square errors and correlation (WEA_RAC), WEA based on the Taylor score (WEA_Tay), and multivariate linear regression (Mul_Reg). The projection skills of the ensemble methods generally improved compared with the best member in each category. However, their projection skills are significantly affected by the simulation skills of the ensemble members. The weighted ensemble methods showed better projection skills than the non-weighted methods, in particular for the PSD categories having systematic biases and various correlation coefficients. The EWA_NBC showed considerably lower projection skills than the other methods, in particular for the PSD categories with systematic biases. Although Mul_Reg showed relatively good skill, it showed strong sensitivity to the PSD categories, training periods, and number of members. On the other hand, WEA_Tay and WEA_RAC showed relatively superior skill in both accuracy and reliability for all the sensitivity experiments. This indicates that WEA_Tay and WEA_RAC are applicable even for simulation data with systematic biases, a short training period, and a small number of ensemble members.
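The skill-weighted averaging described above can be sketched as follows. The weighting formula here (training-period correlation divided by RMSE) is an illustrative stand-in for the paper's WEA_RAC scheme, not its exact definition:

```python
import numpy as np

def wea_rac(members, truth_train):
    """Weighted ensemble average (WEA_RAC-style sketch): weight each
    ensemble member by its training-period correlation divided by its
    RMSE, so members that track the truth closely dominate the mean."""
    weights = []
    for m in members:
        rmse = np.sqrt(np.mean((m - truth_train) ** 2))
        corr = np.corrcoef(m, truth_train)[0, 1]
        weights.append(max(corr, 0.0) / max(rmse, 1e-12))
    w = np.array(weights)
    w /= w.sum()  # normalize so the weights sum to 1
    # Ensemble projection = weighted sum of the members
    return np.sum(w[:, None] * np.asarray(members), axis=0), w
```

A member with a systematic bias accrues a large RMSE and hence a small weight, which mirrors the abstract's finding that weighted methods outperform equal weighting (EWA_NBC) when members carry systematic biases.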

  20. Geophysical data fusion for subsurface imaging. Phase 1

    SciTech Connect

    Hoekstra, P.; Vandergraft, J.; Blohm, M.; Porter, D.

    1993-08-01

    A geophysical data fusion methodology is under development to combine data from complementary geophysical sensors and incorporate geophysical understanding to obtain three dimensional images of the subsurface. The research reported here is the first phase of a three phase project. The project focuses on the characterization of thin clay lenses (aquitards) in a highly stratified sand and clay coastal geology to depths of up to 300 feet. The sensor suite used in this work includes time-domain electromagnetic induction (TDEM) and near surface seismic techniques. During this first phase of the project, enhancements to the acquisition and processing of TDEM data were studied, by use of simulated data, to assess improvements for the detection of thin clay layers. Secondly, studies were made of the use of compressional wave and shear wave seismic reflection data by using state-of-the-art high frequency vibrator technology. Finally, a newly developed processing technique, called "data fusion," was implemented to process the geophysical data, and to incorporate a mathematical model of the subsurface strata. Examples are given of the results when applied to real seismic data collected at Hanford, WA, and for simulated data based on the geology of the Savannah River Site.

  1. Establishment of an Institute for Fusion Studies. Technical progress report, November 1, 1994--October 31, 1995

    SciTech Connect

    1995-07-01

    The Institute for Fusion Studies is a national center for theoretical fusion plasma physics research. Its purposes are to (1) conduct research on theoretical questions concerning the achievement of controlled fusion energy by means of magnetic confinement--including both fundamental problems of long-range significance, as well as shorter-term issues; (2) serve as a national and international center for information exchange by hosting exchange visits, conferences, and workshops; and (3) train students and postdoctoral research personnel for the fusion energy program and plasma physics research areas. During FY 1995, a number of significant scientific advances were achieved at the IFS, both in long-range fundamental problems as well as in near-term strategic issues, consistent with the Institute's mandate. Examples of these achievements include tokamak edge physics, analytical and computational studies of ion-temperature-gradient-driven turbulent transport, alpha-particle-excited toroidal Alfven eigenmode nonlinear behavior, sophisticated simulations for the Numerical Tokamak Project, and a variety of non-tokamak and non-fusion basic plasma physics applications. Many of these projects were done in collaboration with scientists from other institutions. Research discoveries are briefly described in this report.

  2. Z-Pinch Fusion Propulsion

    NASA Technical Reports Server (NTRS)

    Miernik, Janie

    2011-01-01

    Fusion-based nuclear propulsion has the potential to enable fast interplanetary transportation. Shorter trips are better for humans in the harmful radiation environment of deep space. Nuclear propulsion and power plants can enable high Isp and payload mass fractions because they require less fuel mass. Fusion energy research has characterized the Z-Pinch dense plasma focus method. (1) Lightning is a form of pinched-plasma electrical discharge phenomena. (2) Wire array Z-Pinch experiments are commonly studied and nuclear power plant configurations have been proposed. (3) Used in the field of Nuclear Weapons Effects (NWE) testing in the defense industry, nuclear weapon x-rays are simulated through Z-Pinch phenomena.

  3. Photo-fusion reactions in a new compact device for ELI

    SciTech Connect

    Moustaizis, S. D.; Auvray, P.; Hora, H.; Lalousis, P.; Larour, J.; Mourou, G.

    2012-07-09

    In the last few years, significant progress in technological, experimental, and numerical studies of fusion processes in high-density, high-temperature plasmas, produced by high-intensity laser pulse interaction with clusters in a strong externally applied magnetic field, has enabled us to propose a compact photo-fusion magnetic device for high neutron production. For the purposes of the project, a pulsed magnetic field driver with values up to 110 Tesla has been developed, which allows increasing the trapping time of the high density plasma in the device and improving the neutron yield. Numerical simulations show that the proposed device is capable of producing up to 10^9-10^10 neutrons per laser shot with an external magnetic field of 150 Tesla. The proposed device can be used for experiments and numerical code validation concerning different conventional and/or exotic fusion fuels.

  4. Inertial confinement fusion

    SciTech Connect

    Powers, L.; Condouris, R.; Kotowski, M.; Murphy, P.W.

    1992-01-01

    This issue of the ICF Quarterly contains seven articles that describe recent progress in Lawrence Livermore National Laboratory's ICF program. The Department of Energy recently initiated an effort to design a 1--2 MJ glass laser, the proposed National Ignition Facility (NIF). These articles span various aspects of a program which is aimed at moving forward toward such a facility by continuing to use the Nova laser to gain understanding of NIF-relevant target physics, by developing concepts for an NIF laser driver, and by envisioning a variety of applications for larger ICF facilities. This report discusses research on the following topics: Stimulated Rotational Raman Scattering in Nitrogen; A Maxwell Equation Solver in LASNEX for the Simulation of Moderately Intense Ultrashort Pulse Experiments; Measurements of Radial Heat-Wave Propagation in Laser-Produced Plasmas; Laser-Seeded Modulation Growth on Directly Driven Foils; Stimulated Raman Scattering in Large-Aperture, High-Fluence Frequency-Conversion Crystals; Fission Product Hazard Reduction Using Inertial Fusion Energy; Use of Inertial Confinement Fusion for Nuclear Weapons Effects Simulations.

  5. Accelerators for Fusion Materials Testing

    NASA Astrophysics Data System (ADS)

    Knaster, Juan; Okumura, Yoshikazu

    Fusion materials research is a worldwide endeavor as old as the parallel one working toward the long term stable confinement of ignited plasma. In a fusion reactor, the preservation of the required minimum thermomechanical properties of the in-vessel components exposed to the severe irradiation and heat flux conditions is an indispensable factor for safe operation; it is also an essential goal for the economic viability of fusion. Energy from fusion power will be extracted from the 14 MeV neutron freed as a product of the deuterium-tritium fusion reactions; thus, this kinetic energy must be absorbed and efficiently evacuated and electricity eventually generated by the conventional methods of a thermal power plant. Worldwide technological efforts to understand the degradation of materials exposed to 14 MeV neutron fluxes > 10^18 m^-2 s^-1, as expected in future fusion power plants, have been intense over the last four decades. Existing neutron sources can reach suitable dpa ("displacement-per-atom", the figure of merit to assess materials degradation from being exposed to neutron irradiation), but the differences in the neutron spectrum of fission reactors and spallation sources do not allow one to unravel the physics and to anticipate the degradation of materials exposed to fusion neutrons. Fusion irradiation conditions can be achieved through Li(d,xn) nuclear reactions with suitable deuteron beam current and energy, and an adequate flowing lithium screen. This idea triggered in the late 1970s at Los Alamos National Laboratory (LANL) a campaign working toward the feasibility of continuous wave (CW) high-current linacs framed by the Fusion Materials Irradiation Test (FMIT) project. These efforts continued with the Low Energy Demonstration Accelerator (LEDA) (a validating prototype of the canceled Accelerator Production of Tritium (APT) project), which was proposed in 2002 to the fusion community as a 6.7 MeV, 100 mA CW beam injector for a Li(d,xn) source to bridge

  6. Accelerators for Fusion Materials Testing

    NASA Astrophysics Data System (ADS)

    Knaster, Juan; Okumura, Yoshikazu

    Fusion materials research is a worldwide endeavor as old as the parallel one working toward the long term stable confinement of ignited plasma. In a fusion reactor, the preservation of the required minimum thermomechanical properties of the in-vessel components exposed to the severe irradiation and heat flux conditions is an indispensable factor for safe operation; it is also an essential goal for the economic viability of fusion. Energy from fusion power will be extracted from the 14 MeV neutron freed as a product of the deuterium-tritium fusion reactions; thus, this kinetic energy must be absorbed and efficiently evacuated and electricity eventually generated by the conventional methods of a thermal power plant. Worldwide technological efforts to understand the degradation of materials exposed to 14 MeV neutron fluxes > 10^18 m^-2 s^-1, as expected in future fusion power plants, have been intense over the last four decades. Existing neutron sources can reach suitable dpa ("displacement-per-atom", the figure of merit to assess materials degradation from being exposed to neutron irradiation), but the differences in the neutron spectrum of fission reactors and spallation sources do not allow one to unravel the physics and to anticipate the degradation of materials exposed to fusion neutrons. Fusion irradiation conditions can be achieved through Li(d,xn) nuclear reactions with suitable deuteron beam current and energy, and an adequate flowing lithium screen. This idea triggered in the late 1970s at Los Alamos National Laboratory (LANL) a campaign working toward the feasibility of continuous wave (CW) high-current linacs framed by the Fusion Materials Irradiation Test (FMIT) project. These efforts continued with the Low Energy Demonstration Accelerator (LEDA) (a validating prototype of the canceled Accelerator Production of Tritium (APT) project), which was proposed in 2002 to the fusion community as a 6.7 MeV, 100 mA CW beam injector for a Li(d,xn) source to bridge

  7. Spinning ATOMS Draws Energy from FUSION

    NASA Astrophysics Data System (ADS)

    Turner, Raymond E.

    2004-09-01

    Project FUSION (Facilitating Urban Science Initiatives by Organizational Networking) links scientifically sound, culturally relevant community-based research initiatives to a network of higher education institutions and community-based organizations. Project participants recognize chemistry as the central science and learn of its importance to science exploration at all academic levels. Roxbury Community College, a bridge for economically disadvantaged students aspiring to attend four-year colleges and universities, serves as the research hub for Project FUSION. Expanding on the model of the previous NIGMS-funded ATOMS program, the efforts of Project FUSION increased student participation while also making it a more stable and transferable program. Results from this project show that culturally relevant community-based research programs supported by organizational networking can have a profound effect on student participation in undergraduate research.

  8. Superconducting magnets for fusion applications

    SciTech Connect

    Henning, C.D.

    1987-07-02

    Fusion magnet technology has made spectacular advances in the past decade; to wit, the Mirror Fusion Test Facility and the Large Coil Project. However, further advances are still required for advanced economical fusion reactors. Higher fields to 14 T and radiation-hardened superconductors and insulators will be necessary. Coupled with high rates of nuclear heating and pulsed losses, the next-generation magnets will need still higher current density, better stability and quench protection. Cable-in-conduit conductors coupled with polyimide insulations and better steels seem to be the appropriate path. Neutron fluences up to 10^19 neutrons/cm^2 in niobium tin are achievable. In the future, other amorphous superconductors could raise these limits further to extend reactor life or decrease the neutron shielding and corresponding reactor size.

  9. Fusion Blanket Development in FDF

    NASA Astrophysics Data System (ADS)

    Wong, C. P. C.; Smith, J. P.; Stambaugh, R. D.

    2008-11-01

    To satisfy the electricity and tritium self-sufficiency missions of a Fusion Development Facility (FDF), suitable blanket designs will need to be evaluated, selected and developed. To demonstrate closure of the fusion fuel cycle, 2-3 main tritium breeding blankets will be used to cover most of the available chamber surface area in order to reach the project goal of achieving a tritium breeding ratio, TBR > 1. To demonstrate the feasibility of electricity and tritium production for subsequent devices such as the fusion demonstration power reactor (DEMO), several advanced test blankets will need to be selected and tested on the FDF to demonstrate high coolant outlet temperature necessary for efficient electricity production. Since the design goals for the main and test blankets are different, the design criteria of these blankets will also be different. The considerations in performing the evaluation of blanket and structural material options in concert with the maintenance approach for the FDF will be reported in this paper.

  10. Laser fusion experiments at LLL

    SciTech Connect

    Ahlstrom, H.G.

    1980-06-16

    These notes present the experimental basis and status for laser fusion as developed at LLL. Two other chapters, one authored by K.A. Brueckner and the other by C. Max, present the theoretical implosion physics and laser plasma interaction physics. The notes consist of six sections. The first is an introductory section which provides some of the history of inertial fusion and a simple explanation of the concepts involved. The second section presents an extensive discussion of diagnostic instrumentation used in the LLL Laser Fusion Program. The third section is a presentation of laser facilities and capabilities at LLL. The purpose here is to define capability, not to derive how it was obtained. The fourth and fifth sections present the experimental data on laser-plasma interaction and implosion physics. The last chapter is a short projection of the future.

  11. Fusion energy

    NASA Astrophysics Data System (ADS)

    1990-09-01

    The main purpose of the International Thermonuclear Experimental Reactor (ITER) is to develop an experimental fusion reactor through the united efforts of many technologically advanced countries. The ITER terms of reference, issued jointly by the European Community, Japan, the USSR, and the United States, call for an integrated international design activity and constitute the basis of current activities. Joint work on ITER is carried out under the auspices of the International Atomic Energy Agency (IAEA), according to the terms of quadripartite agreement reached between the European Community, Japan, the USSR, and the United States. The site for joint technical work sessions is at the Max Planck Institute of Plasma Physics, Garching, Federal Republic of Germany. The ITER activities have two phases: a definition phase performed in 1988 and the present design phase (1989 to 1990). During the definition phase, a set of ITER technical characteristics and supporting research and development (R and D) activities were developed and reported. The present conceptual design phase of ITER lasts until the end of 1990. The objectives of this phase are to develop the design of ITER, perform a safety and environmental analysis, develop site requirements, define future R and D needs, and estimate cost, manpower, and schedule for construction and operation. A final report will be submitted at the end of 1990. This paper summarizes progress in the ITER program during the 1989 design phase.

  12. Fusion energy

    SciTech Connect

    Not Available

    1990-09-01

    The main purpose of the International Thermonuclear Experimental Reactor (ITER) is to develop an experimental fusion reactor through the united efforts of many technologically advanced countries. The ITER terms of reference, issued jointly by the European Community, Japan, the USSR, and the United States, call for an integrated international design activity and constitute the basis of current activities. Joint work on ITER is carried out under the auspices of the International Atomic Energy Agency (IAEA), according to the terms of quadripartite agreement reached between the European Community, Japan, the USSR, and the United States. The site for joint technical work sessions is at the Max Planck Institute of Plasma Physics, Garching, Federal Republic of Germany. The ITER activities have two phases: a definition phase performed in 1988 and the present design phase (1989-1990). During the definition phase, a set of ITER technical characteristics and supporting research and development (R&D) activities were developed and reported. The present conceptual design phase of ITER lasts until the end of 1990. The objectives of this phase are to develop the design of ITER, perform a safety and environmental analysis, develop site requirements, define future R&D needs, and estimate cost, manpower, and schedule for construction and operation. A final report will be submitted at the end of 1990. This paper summarizes progress in the ITER program during the 1989 design phase.

  13. Magnetic-Nozzle Studies for Fusion Propulsion Applications: Gigawatt Plasma Source Operation and Magnetic Nozzle Analysis

    NASA Technical Reports Server (NTRS)

    Gilland, James H.; Mikellides, Ioannis; Mikellides, Pavlos; Gregorek, Gerald; Marriott, Darin

    2004-01-01

    This project has been a multiyear effort to assess the feasibility of a key process inherent to virtually all fusion propulsion concepts: the expansion of a fusion-grade plasma through a diverging magnetic field. Current fusion energy research touches on this process only indirectly through studies of plasma divertors designed to remove the fusion products from a reactor. This project was aimed at directly addressing propulsion system issues, without the expense of constructing a fusion reactor. Instead, the program designed, constructed, and operated a facility suitable for simulating fusion reactor grade edge plasmas, and for examining their expansion in an expanding magnetic nozzle. The approach was to create and accelerate a dense (up to 10^20 m^-3) plasma, stagnate it in a converging magnetic field to convert kinetic energy to thermal energy, and examine the subsequent expansion of the hot (hundreds of eV) plasma in a subsequent magnetic nozzle. Throughout the project, there has been a parallel effort between theoretical and numerical design and modelling of the experiment and the experiment itself. In particular, the MACH2 code was used to design and predict the performance of the magnetoplasmadynamic (MPD) plasma accelerator, and to predict the design and expected behavior of the magnetic field coils that could be added later. Progress to date includes the theoretical accelerator design and construction, development of the power and vacuum systems to accommodate the powers and mass flow rates of interest to our research, operation of the accelerator and comparison to theoretical predictions, and computational analysis of future magnetic field coils and the expected performance of an integrated source-nozzle experiment.

  14. Basics of Fusion-Fission Research Facility (FFRF) as a Fusion Neutron Source

    SciTech Connect

    Leonid E. Zakharov

    2011-06-03

    FFRF, standing for the Fusion-Fission Research Facility, represents an option for the next-step project of ASIPP (Hefei, China), aiming at a first fusion-fission multifunctional device [1]. FFRF relies strongly on new Lithium Wall Fusion plasma regimes, the development of which has already started in the US and China. With R/a = 4 m/1 m, Ipl = 5 MA, Btor = 4-6 T, PDT = 50-100 MW, Pfission = 80-4000 MW, and a 1 m thick blanket, FFRF has a unique fusion mission as a stationary fusion neutron source. Its pioneering mission of merging fusion and fission consists in accumulating design, experimental, and operational data for future hybrid applications.

  15. Multi-model ensemble simulation and projection in the climate change in the Mekong River Basin. Part I: temperature.

    PubMed

    Huang, Yong; Wang, Fengyou; Li, Yi; Cai, Tijiu

    2014-11-01

    This paper evaluates the performance of the Coupled Model Intercomparison Project phase 5 (CMIP5) models in simulating annual and decadal temperature in the Mekong River Basin from 1950 to 2005. Using a Bayesian multi-model averaging method, future projections of temperature variation under different scenarios are also analyzed. The results show that the climate models perform better in space than in time: they capture the warming characteristics of the Mekong River Basin, but the accuracy of the simulation is not good enough. The Bayesian multi-model averaging method improves the annual and decadal temperature simulation compared to any single result. The projected temperature in the Mekong River Basin will increase by 0.88 °C/100 years, 2.15 °C/100 years, and 4.96 °C/100 years under the RCP2.6, RCP4.5, and RCP8.5 scenarios, respectively, over the twenty-first century. The findings will help local people and policy-makers formulate regional strategies against the potential threats of the warming scenarios.
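The Bayesian multi-model averaging step can be sketched with the standard BIC approximation to posterior model weights; this is a generic illustration of the technique, not necessarily the exact scheme used in the paper:

```python
import math

def bma_weights(sse_per_model, n_obs):
    """Approximate Bayesian model-averaging weights from each model's
    training-period sum of squared errors (SSE). Uses the BIC
    approximation w_i proportional to exp(-BIC_i / 2), with
    BIC = n * ln(SSE / n); the parameter-count penalty term cancels
    when all models have the same complexity."""
    bics = [n_obs * math.log(sse / n_obs) for sse in sse_per_model]
    b_min = min(bics)  # subtract the minimum for numerical stability
    raw = [math.exp(-(b - b_min) / 2.0) for b in bics]
    total = sum(raw)
    return [r / total for r in raw]
```

The averaged projection is then the weighted sum of the individual model temperature series, which is how a multi-model mean can outperform any single member.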

  16. The New England Climate Adaptation Project: Enhancing Local Readiness to Adapt to Climate Change through Role-Play Simulations

    NASA Astrophysics Data System (ADS)

    Rumore, D.; Kirshen, P. H.; Susskind, L.

    2014-12-01

    Despite scientific consensus that the climate is changing, local efforts to prepare for and manage climate change risks remain limited. How can we raise concern about climate change risks and enhance local readiness to adapt to climate change's effects? In this presentation, we will share the lessons learned from the New England Climate Adaptation Project (NECAP), a participatory action research project that tested science-based role-play simulations as a tool for educating the public about climate change risks and simulating collective risk management efforts. NECAP was a 2-year effort involving the Massachusetts Institute of Technology, the Consensus Building Institute, the National Estuarine Research Reserve System, and four coastal New England municipalities. During 2012-2013, the NECAP team produced downscaled climate change projections, a summary risk assessment, and a stakeholder assessment for each partner community. Working with local partners, we used these assessments to create a tailored, science-based role-play simulation for each site. Through a series of workshops in 2013, NECAP engaged between 115 and 170 diverse stakeholders and members of the public in each partner municipality in playing the simulation and a follow-up conversation about local climate change risks and possible adaptation strategies. Data were collected through before-and-after surveys administered to all workshop participants, follow-up interviews with 25 percent of workshop participants, public opinion polls conducted before and after our intervention, and meetings with public officials. This presentation will report our research findings and explain how science-based role-play simulations can be used to help communicate local climate change risks and enhance local readiness to adapt.

  17. Economic potential of magnetic fusion energy

    SciTech Connect

    Henning, C.D.

    1981-03-10

    Scientific feasibility of magnetic fusion is no longer seriously in doubt. Rapid advances have been made in both tokamak and mirror research, leading to demonstrations in the TFTR tokamak at Princeton in 1982 and the tandem mirror MFTF-B at Livermore in 1985. Accordingly, the basis is established for an aggressive engineering thrust to develop a reactor within this century. However, care must be taken to guide the fusion program towards an economically and environmentally viable goal. While the fusion fuels are essentially free, capital costs of reactors appear to be at least as large as those of current power plants. Accordingly, the price of electricity will not decline, and capital availability for reactor construction will be important. Details of reactor cost projections are discussed and mechanisms suggested for fusion power implementation. Also discussed are some environmental and safety aspects of magnetic fusion.

  18. Projecting Wind Energy Potential Under Climate Change with Ensemble of Climate Model Simulations

    NASA Astrophysics Data System (ADS)

    Jain, A.; Shashikanth, K.; Ghosh, S.; Mukherjee, P. P.

    2013-12-01

    Recent years have witnessed increasing global concern over energy sustainability and security, triggered by a number of issues such as (though not limited to) fossil fuel depletion, energy resource geopolitics, the economic efficiency versus population growth debate, environmental concerns, and climate change. Wind energy is a renewable and sustainable form of energy in which wind turbines convert the kinetic energy of wind into electrical energy. Global warming and differential surface heating may significantly impact the wind velocity and hence the wind energy potential. Sustainable design of wind mills requires understanding the impacts of climate change on wind energy potential, which we evaluate here with multiple General Circulation Models (GCMs). GCMs simulate the climate variables globally under the greenhouse emission scenarios provided as Representative Concentration Pathways (RCPs). Here we use new-generation climate model outputs obtained from the Coupled Model Intercomparison Project phase 5 (CMIP5). We first compute the wind energy potential with reanalysis data (NCEP/NCAR) at a spatial resolution of 2.5°, where the gridded data are fitted to a Weibull distribution and, with the Weibull parameters, the wind energy densities are computed at different grids. The same methodology is then applied to the CMIP5 outputs (resultant of U-wind and V-wind) of MRI, CMCC, BCC, CanESM, and INMCM4 for historical runs. This is performed separately for four seasons globally: MAM, JJA, SON, and DJF. We observe that the multi-model average of wind energy density for the historic period has significant bias with respect to that of the reanalysis product. Here we develop a quantile-based superensemble approach in which GCM quantiles corresponding to selected CDF values are regressed to the reanalysis data. It is observed that this regression approach addresses both the bias in individual GCMs and their combination. With the superensemble, we observe that the historical wind energy density resembles quite well with
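For a Weibull-distributed wind speed with shape k and scale c, the mean wind power density has a closed form, E = 0.5 * rho * c^3 * Gamma(1 + 3/k). A minimal sketch of the per-grid computation outlined above follows; the moment-based fit is a common empirical approximation, since the paper's exact fitting procedure is not specified here:

```python
import math

def weibull_wind_power_density(k, c, rho=1.225):
    """Mean wind power density (W/m^2) for Weibull shape k and scale
    c (m/s): 0.5 * rho * E[v^3], where E[v^3] = c^3 * Gamma(1 + 3/k)."""
    return 0.5 * rho * c ** 3 * math.gamma(1.0 + 3.0 / k)

def weibull_from_moments(mean_v, std_v):
    """Rough moment-based Weibull fit from a grid cell's sample mean
    and standard deviation of wind speed (empirical approximation:
    k ~ (std/mean)^-1.086, then c = mean / Gamma(1 + 1/k))."""
    k = (std_v / mean_v) ** -1.086
    c = mean_v / math.gamma(1.0 + 1.0 / k)
    return k, c
```

Applying these two functions cell by cell to a gridded wind-speed dataset yields the wind energy density maps that the reanalysis and GCM products are compared on.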

  19. Simulation and projection of summer surface air temperature over China: a comparison between a RCM and the driving global model

    NASA Astrophysics Data System (ADS)

    Li, Donghuan; Zhou, Tianjun; Zou, Liwei

    2016-04-01

    The regional climate model RegCM3 (version 3), at a horizontal resolution of 50 km, was employed to downscale historical and projected climate changes over the CORDEX East Asia domain, nested within the global climate system model FGOALS-g2 (Flexible Global Ocean-Atmosphere-Land System Model: Grid-point Version 2). The simulated (1986-2005) and projected (2046-2065) summer surface air temperature changes over China under the RCP8.5 scenario were compared between RegCM3 and FGOALS-g2. The air temperature indices used in this study were tmx (daily maximum temperature), t2m (daily average temperature), and tmn (daily minimum temperature); the extreme high-temperature indices were TXx (maximum tmx), TX90p (warm days), and WSDI (warm spell duration index). Results indicated that both models could reasonably reproduce the climatological distribution of surface air temperature and extreme high-temperature events. Compared to the driving global climate model, the detailed characteristics of summer surface air temperature were better simulated by RegCM3 owing to its higher horizontal resolution. Under the RCP8.5 scenario, summer surface air temperature over China will increase significantly by the middle of the 21st century. RegCM3 projected a larger increase in tmx than in tmn over most regions of China, but in the western Tibetan Plateau the increase in tmn was larger. In the FGOALS-g2 projection, the changes in the three temperature indices (t2m, tmn, and tmx) were similar, with larger increases over northeastern China and the Tibetan Plateau. Extreme high-temperature events were projected to increase significantly in both models: TX90p will increase by more than 60% relative to the present day, while WSDI will roughly double. Key words: summer surface air temperature; extreme high-temperature events; regional climate model; climate change
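The TX90p (warm days) index mentioned above counts days whose daily maximum temperature exceeds the base-period 90th percentile. A simplified sketch (a pooled percentile is used here instead of the calendar-day window of the standard ETCCDI definition, and the synthetic data are purely illustrative):

```python
import numpy as np

def tx90p(tmax_base, tmax_future):
    """Percentage of days in `tmax_future` whose daily maximum
    temperature exceeds the 90th percentile of the base period.

    Simplified, pooled version of the ETCCDI TX90p index.
    """
    threshold = np.percentile(tmax_base, 90)
    return 100.0 * np.mean(tmax_future > threshold)

rng = np.random.default_rng(0)
base = rng.normal(28.0, 3.0, size=2000)   # synthetic present-day summer tmax (degC)
future = base + 2.5                       # uniform warming, for illustration
```

By construction, `tx90p(base, base)` is close to 10%, and any warming shifts the index upward, which is the behavior the abstract reports for the RCP8.5 projections.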

  20. A Reliability-Based Track Fusion Algorithm

    PubMed Central

    Xu, Li; Pan, Liqiang; Jin, Shuilin; Liu, Haibo; Yin, Guisheng

    2015-01-01

    Common track fusion algorithms in multi-sensor systems have several defects: a serious imbalance between accuracy and computational cost, identical treatment of all sensor information regardless of quality, and high fusion errors at inflection points. To address these defects, a track fusion algorithm based on reliability (TFR) is presented for multi-sensor, multi-target environments. To improve information quality, outliers in the local tracks are eliminated first. The reliability of each local track is then calculated, and the local tracks with high reliability are chosen for the state estimation fusion. In contrast to existing methods, TFR reduces the high fusion errors at the inflection points of system tracks and achieves high accuracy at lower computational cost. Simulation results verify the effectiveness and superiority of the algorithm in dense sensor environments. PMID:25950174
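The reliability-based selection and fusion idea can be illustrated with a minimal sketch. Here reliability is simply the normalized inverse estimate variance, and the selection threshold is a stand-in; the paper's actual TFR reliability computation and outlier test are more elaborate:

```python
import numpy as np

def fuse_tracks(estimates, variances, reliability_floor=0.2):
    """Fuse local track estimates of the same target state.

    Each sensor contributes an estimate with an associated variance.
    Reliability is taken as the normalized inverse variance; sensors
    whose reliability falls below `reliability_floor / n_sensors`
    are excluded, and the survivors are combined with
    inverse-variance weights.
    """
    estimates = np.asarray(estimates, dtype=float)
    variances = np.asarray(variances, dtype=float)
    inv_var = 1.0 / variances
    reliability = inv_var / inv_var.sum()
    keep = reliability >= reliability_floor / len(estimates)
    w = inv_var[keep] / inv_var[keep].sum()
    return float(np.dot(w, estimates[keep]))

# Three sensors tracking the same position; the third is poor quality
# and is dropped by the reliability selection step.
fused = fuse_tracks([10.1, 9.9, 14.0], [0.5, 0.5, 25.0])
```

With the two good sensors weighted equally, the fused estimate is 10.0; including the unreliable third sensor would have biased it toward 14.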

  2. High-Fidelity Simulation Meets Athletic Training Education: An Innovative Collaborative Teaching Project

    ERIC Educational Resources Information Center

    Palmer, Elizabeth; Edwards, Taylor; Racchini, James

    2014-01-01

    High-fidelity simulation is frequently used in nursing education to provide students with simulated experiences prior to and throughout clinical coursework that involves direct patient care. These high-tech exercises take advantage of the benefits of a standardized patient or mock patient encounter, while eliminating some of the drawbacks…

  3. Incorporating Reflective Practice into Team Simulation Projects for Improved Learning Outcomes

    ERIC Educational Resources Information Center

    Wills, Katherine V.; Clerkin, Thomas A.

    2009-01-01

    The use of simulation games in business courses is a popular method for providing undergraduate students with experiences similar to those they might encounter in the business world. As such, in 2003 the authors were pleased to find a classroom simulation tool that combined the decision-making and team experiences of a senior management group with…

  4. The accomplishment of the Engineering Design Activities of IFMIF/EVEDA: The European-Japanese project towards a Li(d,xn) fusion relevant neutron source

    NASA Astrophysics Data System (ADS)

    Knaster, J.; Ibarra, A.; Abal, J.; Abou-Sena, A.; Arbeiter, F.; Arranz, F.; Arroyo, J. M.; Bargallo, E.; Beauvais, P.-Y.; Bernardi, D.; Casal, N.; Carmona, J. M.; Chauvin, N.; Comunian, M.; Delferriere, O.; Delgado, A.; Diaz-Arocas, P.; Fischer, U.; Frisoni, M.; Garcia, A.; Garin, P.; Gobin, R.; Gouat, P.; Groeschel, F.; Heidinger, R.; Ida, M.; Kondo, K.; Kikuchi, T.; Kubo, T.; Le Tonqueze, Y.; Leysen, W.; Mas, A.; Massaut, V.; Matsumoto, H.; Micciche, G.; Mittwollen, M.; Mora, J. C.; Mota, F.; Nghiem, P. A. P.; Nitti, F.; Nishiyama, K.; Ogando, F.; O'hira, S.; Oliver, C.; Orsini, F.; Perez, D.; Perez, M.; Pinna, T.; Pisent, A.; Podadera, I.; Porfiri, M.; Pruneri, G.; Queral, V.; Rapisarda, D.; Roman, R.; Shingala, M.; Soldaini, M.; Sugimoto, M.; Theile, J.; Tian, K.; Umeno, H.; Uriot, D.; Wakai, E.; Watanabe, K.; Weber, M.; Yamamoto, M.; Yokomine, T.

    2015-08-01

    The International Fusion Materials Irradiation Facility (IFMIF), presently in its Engineering Validation and Engineering Design Activities (EVEDA) phase under the frame of the Broader Approach Agreement between Europe and Japan, accomplished its EDA phase on schedule in summer 2013 with the release of the engineering design report of the IFMIF plant, which is described here. Many design improvements over former phases have been implemented: a reduction of beam losses and operational costs thanks to the superconducting accelerator concept; the relocation of the quench tank outside the test cell (TC), with a reduction of tritium inventory and a simplification of its replacement in case of failure; the separation of the irradiation modules from the shielding block, gaining irradiation flexibility, enhancing remote handling equipment reliability, and reducing cost; and the water cooling of the liner and biological shielding of the TC, enhancing the efficiency and economy of the related sub-systems. In addition, the maintenance strategy has been modified to allow a shorter yearly stop of irradiation operations and a more careful management of the irradiated samples. The design of the IFMIF plant is intimately linked with the EVA phase carried out since the entry into force of IFMIF/EVEDA in June 2007. These activities and their ongoing accomplishment have been thoroughly described elsewhere (Knaster J et al [19]); combined with the present paper, they allow a clear understanding of the maturity of the European-Japanese international efforts. The released IFMIF Intermediate Engineering Design Report (IIEDR), which could be complemented if required with the outcome of the ongoing EVA, will allow decision making on construction and/or serve as the basis for defining the next step, aligned with the evolving needs of the fusion community.

  5. Simulating Carbon Dynamics and Species Composition Under Projected Changes in Climate in the Puget Sound, Washington, USA

    NASA Astrophysics Data System (ADS)

    Laflower, D.; Hurteau, M. D.

    2014-12-01

    Changing climate has the potential to directly and indirectly alter forest carbon dynamics and species composition, particularly in temperature- or precipitation-limited systems. In light-limited systems, species-specific responses to changing climate could produce an indirect climate effect through altered competitive interactions. Joint Base Lewis-McChord, in Washington, contains one of the largest intact forested areas in the Puget Sound. Management priorities include development of late-successional forests and conservation. We sought to quantify how projected changes in climate would affect species diversity and carbon (C) sequestration given these management priorities. We used LANDIS-II to simulate forest dynamics over 100 years under current climate and under projected climate for two emission scenarios. Preliminary analyses indicate a decrease in soil C, relative to current climate, beginning mid-century for both emission scenarios. Under the low emission scenario the decrease is compensated by increased aboveground C, while the high scenario experiences a decline in aboveground C. Total ecosystem C was consistent between the baseline and low emission climates throughout the simulation. By late century, the high scenario showed an average decrease of 10 Mg C ha-1. Douglas-fir (DF) accounts for the largest fraction of aboveground biomass (AGB) in the study area. Interestingly, DF AGB was fairly consistent between climate scenarios through mid-century but diverged during late century, with the high scenario having the greatest DF AGB (mean 368 Mg ha-1) and current climate the lowest (mean 341 Mg ha-1). We found the inverse relationship when examining all other species. Given the uncertainty associated with climate projections, future simulations will include a larger suite of climate projections and address the secondary effects of climate change (e.g., increased wildfire, disease, or insect outbreaks) that can impact productivity.

  6. Virtual Airspace Modeling and Simulation (VAMS) Project First Technical Interchange Meeting

    NASA Technical Reports Server (NTRS)

    Beard, Robert; Kille, Robert; Kirsten, Richard; Rigterink, Paul; Sielski, Henry; Gratteau, Melinda F. (Editor)

    2002-01-01

    A three-day NASA Virtual Airspace Modeling and Simulation (VAMS) Project Technical Interchange Meeting (TIM) was held at the NASA Ames Research Center in Mountain View, CA, on May 21-23, 2002. The purpose of this meeting was to share initial concept information sponsored by the VAMS Project. An overall goal of the VAMS Project is to develop validated, blended, robust, and transitionable air transportation system concepts over the next five years that will achieve NASA's long-term Enterprise Aviation Capacity goals. This document describes the presentations at the TIM and their related questions and answers, and presents the TIM recommendations.

  7. NASA/Haughton-Mars Project (HMP) 2006 Lunar Medical Contingency Simulation at Devon Island

    NASA Astrophysics Data System (ADS)

    Scheuring, R. A.; Jones, J. A.; Lee, P.; Comtois, J. M.; Chappell, S.; Rafiq, A.; Braham, S.; Hodgson, E.; Sullivan, P.; Wilkinson, N.; Bach, D.; Torney, S.

    2007-03-01

    To develop an evidence base for handling a medical contingency on the lunar surface, a project was conducted using the Moon-Mars analog environment at Devon Island, Nunavut, in the high Canadian Arctic.

  8. The Virtual ChemLab Project: A Realistic and Sophisticated Simulation of Inorganic Qualitative Analysis

    NASA Astrophysics Data System (ADS)

    Woodfield, Brian F.; Catlin, Heidi R.; Waddoups, Gregory L.; Moore, Melissa S.; Swan, Richard; Allen, Rob; Bodily, Greg

    2004-11-01

    We have created a set of sophisticated and realistic laboratory simulations for use in freshman- and sophomore-level chemistry classes and laboratories called Virtual ChemLab. We have completed simulations for Inorganic Qualitative Analysis, Organic Synthesis and Organic Qualitative Analysis, Experiments in Quantum Chemistry, Gas Properties, Titration Experiments, and Calorimetric and Thermochemical Experiments. The purpose of our simulations is to reinforce concepts taught in the classroom, provide an environment for creative learning, and emphasize the thinking behind instructional laboratory experiments. We have used the inorganic simulation extensively with thousands of students in our department at Brigham Young University. We have learned from our evaluation that: (i) students enjoy using these simulations and find them to be an asset in learning effective problem-solving strategies, (ii) students like the fact that they can both reproduce experimental procedures and explore various topics in ways they choose, and (iii) students naturally divide themselves into two groups: creative learners, who excel in an open-ended environment of virtual laboratories, and structured learners, who struggle in this same environment. In this article, we describe the Inorganic Qualitative Analysis simulation; we also share specific evaluation findings from using the inorganic simulation in classroom and laboratory settings.

  9. Modeling and Simulation of Longitudinal Dynamics for Low Energy Ring_High Energy Ring at the Positron-Electron Project

    SciTech Connect

    Rivetta, Claudio; Mastorides, T.; Fox, J.D.; Teytelman, D.; Van Winkle, D.; /SLAC

    2007-03-06

    A time-domain dynamic modeling and simulation tool for beam-cavity interactions in the Low Energy Ring (LER) and High Energy Ring (HER) at the Positron-Electron Project (PEP-II) is presented. Dynamic simulation results for PEP-II are compared to measurements of the actual machine. The motivation for this tool is to explore the stability margins and performance limits of the PEP-II radio-frequency (RF) systems at future higher currents and upgraded RF configurations. It also serves as a test bed for new control algorithms and can define the ultimate limits of the low-level RF (LLRF) architecture. The time-domain program captures the dynamic behavior of the beam-cavity-LLRF interaction based on a reduced model. The ring current is represented by macrobunches. Multiple RF stations in the ring are represented by one or two macrocavities, each capturing the overall behavior of its 2- or 4-cavity RF stations. Station models include nonlinear elements in the klystron and in the signal processing. This enables modeling of the principal longitudinal impedance control loops interacting via the longitudinal beam model. The dynamics of the simulation model are validated by comparing measured growth rates for the LER with simulation results. The simulated behavior of the LER at increased operating currents is presented via low-mode instability growth rates. Different control strategies are compared, and the effects of imperfections in the LLRF signal processing and of the nonlinear drivers and klystrons are explored.

  10. Projected climate regime shift under future global warming from multi-model, multi-scenario CMIP5 simulations

    NASA Astrophysics Data System (ADS)

    Feng, Song; Hu, Qi; Huang, Wei; Ho, Chang-Hoi; Li, Ruopu; Tang, Zhenghong

    2014-01-01

    This study examined shifts in climate regimes over the global land area using the Köppen-Trewartha (K-T) climate classification, analyzing observations during 1900-2010 and simulations during 1900-2100 from twenty global climate models participating in Phase 5 of the Coupled Model Intercomparison Project (CMIP5). Under the Intergovernmental Panel on Climate Change Representative Concentration Pathway 8.5 (RCP8.5) scenario, the models projected a 3-10 °C warming in annual temperature over the global land area by the end of the twenty-first century, with strong (moderate) warming in the high (middle) latitudes of the Northern Hemisphere and weaker warming in the tropics and the Southern Hemisphere. The projected changes in precipitation vary considerably in space and present greater uncertainties among the models. Overall, the models are consistent in projecting increasing precipitation over the high latitudes of the Northern Hemisphere and reduced precipitation in the Mediterranean, southwestern North America, northern and southern Africa, and Australia. Based on the projected changes in temperature and precipitation, the K-T climate types would shift toward warmer and drier types relative to the current climate distribution. Regions of temperate, tropical, and dry climate types are projected to expand, while regions of polar, subpolar, and subtropical climate types are projected to contract. The magnitudes of the projected changes are stronger under the RCP8.5 scenario than under the lower-emission scenario RCP4.5. On average, the climate types in 31.4% and 46.3% of the global land area are projected to change by the end of the twenty-first century under the RCP4.5 and RCP8.5 scenarios, respectively. Further analysis suggests that changes in precipitation played a slightly more important role in causing shifts of climate type during the twentieth century; however, the projected changes in temperature play an increasingly important role and dominate shifts in climate type.
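A K-T-style classification assigns each grid cell a climate type from monthly temperatures and annual precipitation. A heavily simplified sketch (the thresholds, especially the aridity limit, are illustrative stand-ins for the full K-T criteria, which use Patton's dryness formula and seasonal precipitation):

```python
def kt_type(monthly_t, annual_precip_mm):
    """Very simplified Koppen-Trewartha-style group from 12 monthly
    mean temperatures (degC) and annual precipitation (mm)."""
    warm_months = sum(1 for t in monthly_t if t >= 10.0)
    t_mean = sum(monthly_t) / 12.0
    if annual_precip_mm < 20.0 * (t_mean + 7.0):  # crude aridity limit
        return "dry"
    if min(monthly_t) >= 18.0:                    # all months warm
        return "tropical"
    if warm_months >= 8:
        return "subtropical"
    if warm_months >= 4:
        return "temperate"
    if warm_months >= 1:
        return "boreal"
    return "polar"
```

A projected warming then shows up as grid cells migrating across these thresholds, e.g. boreal cells reclassifying as temperate, which is the expansion/contraction pattern the study reports.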

  11. Three-dimensional numerical reservoir simulation of the EGS Demonstration Project at The Geysers geothermal field

    NASA Astrophysics Data System (ADS)

    Borgia, Andrea; Rutqvist, Jonny; Oldenburg, Curt M.; Hutchings, Lawrence; Garcia, Julio; Walters, Mark; Hartline, Craig; Jeanne, Pierre; Dobson, Patrick; Boyle, Katie

    2013-04-01

    The Enhanced Geothermal System (EGS) Demonstration Project, currently underway at the Northwest Geysers, California, aims to demonstrate the feasibility of stimulating a deep high-temperature reservoir (up to 400 °C) through water injection over a 2-year period. On October 6, 2011, injection of 25 l/s started from the Prati 32 well at a depth interval of 1850-2699 m below sea level. After a period of almost 2 months, the injection rate was raised to 63 l/s. The flow rate was then decreased to 44 l/s after an additional 3.5 months and maintained at 25 l/s up to August 20, 2012. Significant well-head pressure changes were recorded at Prati State 31 well, which is separated from Prati 32 by about 500 m at reservoir level. More subdued pressure increases occur at greater distances. The water injection caused induced seismicity in the reservoir in the vicinity of the well. Microseismic monitoring and interpretation shows that the cloud of seismic events is mainly located in the granitic intrusion below the injection zone, forming a cluster elongated SSE-NNW (azimuth 170°) that dips steeply to the west. In general, the magnitude of the events increases with depth and the hypocenter depth increases with time. This seismic cloud is hypothesized to correlate with enhanced permeability in the high-temperature reservoir and its variation with time. Based on the existing borehole data, we use the GMS™ GUI to construct a realistic three-dimensional (3D) geologic model of the Northwest Geysers geothermal field. This model includes, from the top down, a low permeability graywacke layer that forms the caprock for the reservoir, an isothermal steam zone (known as the normal temperature reservoir) within metagraywacke, a hornfels zone (where the high-temperature reservoir is located), and a felsite layer that is assumed to extend downward to the magmatic heat source. We then map this model onto a rectangular grid for use with the TOUGH2 multiphase, multicomponent, non

  12. Heavy Ion Fusion Injector Program

    SciTech Connect

    Yu, S.; Eylon, S.; Chupp, W.W.

    1993-05-01

    A program is underway to construct a 2 MV, 800 mA K⁺ injector for heavy ion fusion. The Electrostatic Quadrupole (ESQ) injector configuration consists of a zeolite source, a diode of up to 1 MV, and several electrostatic quadrupole units that simultaneously focus and accelerate the beam to 2 MV. The key issues of source technology, high-voltage breakdown, beam aberrations, and transient effects will be discussed. Results from ongoing experiments and simulations will be presented.

  13. Polarimeter for the General Fusion SPECTOR machine

    NASA Astrophysics Data System (ADS)

    Carle, Patrick; Froese, Aaron; Wong, Adrian; Howard, Stephen; O'Shea, Peter; Laberge, Michel

    2016-11-01

    A polarimeter has been designed to measure Faraday rotation on the recently built SPECTOR magnetized target fusion machine at General Fusion and to help constrain the profile of its safety factor, q. The polarimeter uses two counter-rotating, circularly polarized 118.8 μm beams to probe the plasma. Grad-Shafranov simulations have been used to investigate the effect of measurement error and chord geometry.
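The measured quantity in such a polarimeter is the Faraday rotation angle along the probing chord, which in SI units follows a standard line-integral formula. A minimal sketch (the chord parameters below are illustrative, not SPECTOR values):

```python
import numpy as np

def faraday_rotation(wavelength_m, n_e, b_parallel, dl):
    """Faraday rotation angle (rad) along a probing chord:

        alpha = 2.62e-13 * lambda^2 * sum(n_e * B_par * dl)

    with wavelength in m, electron density n_e in m^-3, parallel
    magnetic field B_par in T, and path elements dl in m.
    """
    return 2.62e-13 * wavelength_m**2 * np.sum(n_e * b_parallel * dl)

# Illustrative chord: uniform plasma, n_e = 1e20 m^-3, B_par = 0.5 T,
# 0.2 m path, probed at the 118.8 um wavelength quoted above.
alpha = faraday_rotation(118.8e-6, np.full(100, 1e20),
                         np.full(100, 0.5), 0.2 / 100)
```

The long far-infrared wavelength matters because the rotation scales with the wavelength squared; at 118.8 μm this chord gives a rotation of a few degrees, comfortably measurable.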

  14. Simulation of African Easterly Waves and its Projection in Response to Anthropogenic Greenhouse Forcing in a High Resolution AGCM

    NASA Astrophysics Data System (ADS)

    Kunhu Bangalth, Hamza; Raj, Jerry; Bhaskar Gunturu, Udaya; Stenchikov, Georgiy

    2016-04-01

    African Easterly Waves (AEWs) are the primary synoptic-scale disturbances over tropical Africa and the Atlantic, propagating westward from East Africa toward the Atlantic during summer. AEWs play a pivotal role in the initiation and organization of convective rainfall over this region and often act as precursors of Atlantic tropical cyclones. The present study uses a high-resolution AGCM, the High Resolution Atmospheric Model (HiRAM) developed at GFDL, to investigate projected changes in AEW characteristics in response to anthropogenic greenhouse forcing. Ensembles of simulations are conducted at a spatial resolution of ~25 km, with observed SSTs and with SSTs from two coarse-resolution Earth System Models (ESM2M and ESM2G) developed at GFDL, for the historical period (1975-2004). Future projections (to 2050) are also conducted for two Representative Concentration Pathways (RCPs), RCP4.5 and RCP8.5. To test the ability of HiRAM to properly simulate the three-dimensional structure and space-time variability of AEWs, the historical simulations are compared against two reanalysis products, ERA-Interim and MERRA, and against the parent ESMs. Space-time spectral analysis and complex empirical orthogonal function analysis have been conducted to investigate the dispersion characteristics and modes of variability, respectively. The representation of AEWs in HiRAM is comparable to the reanalyses and improved in comparison with the coarse-resolution parent ESMs.

  15. Image computing techniques to extrapolate data for dust tracking in case of an experimental accident simulation in a nuclear fusion plant.

    PubMed

    Camplani, M; Malizia, A; Gelfusa, M; Barbato, F; Antonelli, L; Poggi, L A; Ciparisse, J F; Salgado, L; Richetta, M; Gaudio, P

    2016-01-01

    In this paper, a preliminary shadowgraph-based analysis of dust particle re-suspension due to a loss of vacuum accident (LOVA) in ITER-like nuclear fusion reactors is presented. Dust particles are produced through different mechanisms in nuclear fusion devices, and one of the main issues is that they can be re-suspended by events such as a LOVA. Shadowgraphy is based on an expanded, collimated beam of light, emitted by a laser or a lamp, directed transversely to the flow field. In the STARDUST facility, the dust moving in the flow causes variations of refractive index that can be detected with a CCD camera. The STARDUST fast-camera setup makes it possible to detect and track dust particles moving in the vessel and thereby obtain information about the velocity field of the mobilized dust. In particular, the acquired images are processed so that, in each frame, moving dust particles are detected by applying a background subtraction technique based on the mixture of Gaussians algorithm. The resulting foreground masks are then filtered with morphological operations. Finally, a multi-object tracking algorithm is used to track the detected particles through the experiment. For each particle, a Kalman filter-based tracker is applied, with the particle dynamics described by position, velocity, and acceleration as state variables. The results demonstrate that it is possible to obtain the dust particles' velocity field during a LOVA by automatically processing the data obtained with the shadowgraph approach. PMID:26827318
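The per-particle tracker described above, with position, velocity, and acceleration as state variables, can be sketched per image axis as a generic constant-acceleration Kalman filter. This is not the STARDUST implementation, and the noise covariances below are illustrative:

```python
import numpy as np

def make_ca_model(dt):
    """Constant-acceleration Kalman model (1-D for brevity)."""
    F = np.array([[1.0, dt, 0.5 * dt**2],
                  [0.0, 1.0, dt],
                  [0.0, 0.0, 1.0]])      # state transition
    H = np.array([[1.0, 0.0, 0.0]])     # only position is measured
    return F, H

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle for state x = [pos, vel, acc]."""
    # Predict forward one frame
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the measured particle position z
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + (K @ (z - H @ x)).ravel()
    P = (np.eye(3) - K @ H) @ P
    return x, P

# Track a synthetic particle moving at a constant 2 px/frame.
F, H = make_ca_model(1.0)
Q = 1e-4 * np.eye(3)                    # process noise (illustrative)
R = np.array([[0.01]])                  # measurement noise (illustrative)
x, P = np.zeros(3), np.eye(3)
for t in range(1, 30):
    x, P = kalman_step(x, P, np.array([2.0 * t]), F, H, Q, R)
```

After a few frames the velocity component of the state converges to the true 2 px/frame, which is the per-particle velocity information the dust analysis extracts.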

  18. Viral membrane fusion.

    PubMed

    Harrison, Stephen C

    2015-05-01

    Membrane fusion is an essential step when enveloped viruses enter cells. Lipid bilayer fusion requires catalysis to overcome a high kinetic barrier; viral fusion proteins are the agents that fulfill this catalytic function. Despite a variety of molecular architectures, these proteins facilitate fusion by essentially the same generic mechanism. Stimulated by a signal associated with arrival at the cell to be infected (e.g., receptor or co-receptor binding, or proton binding in an endosome), they undergo a series of conformational changes. A hydrophobic segment (a "fusion loop" or "fusion peptide") engages the target-cell membrane, and collapse of the bridging intermediate thus formed draws the two membranes (virus and cell) together. We know of three structural classes of viral fusion proteins. Structures for both the pre- and postfusion conformations of these proteins illustrate the beginning and end points of a process that can be probed by single-virion measurements of fusion kinetics.

  19. Line-Tension Controlled Mechanism for Influenza Fusion

    PubMed Central

    Risselada, Herre Jelger; Smirnova, Yuliya G.; Grubmüller, Helmut; Marrink, Siewert Jan; Müller, Marcus

    2012-01-01

    Our molecular simulations reveal that wild-type influenza fusion peptides are able to stabilize a highly fusogenic pre-fusion structure, i.e., a peptide bundle formed by four or more transmembrane-arranged fusion peptides. We rationalize that the lipid rim around such a bundle has a non-vanishing rim energy (line tension), which is essential to (i) stabilize the initial contact point between the fusing bilayers, i.e., the stalk, and (ii) drive its subsequent evolution. Such a line-tension-controlled fusion event does not proceed along the hypothesized standard stalk-hemifusion pathway. In modeled influenza fusion, single point mutations in the influenza fusion peptide either completely inhibit fusion (mutants G1V and W14A) or, intriguingly, specifically arrest fusion at a hemifusion state (mutant G1S). Our simulations demonstrate that, within a line-tension-controlled fusion mechanism, these known point mutations either completely inhibit fusion by impairing the peptide's ability to stabilize the required peptide bundle (G1V and W14A) or stabilize a persistent bundle that leads to a kinetically trapped hemifusion state (G1S). In addition, our results suggest that the recently discovered leaky fusion mutant G13A, which is known to facilitate pronounced leakage of the target membrane prior to lipid mixing, reduces membrane integrity by forming a 'super' bundle. Our simulations offer a new interpretation for a number of experimentally observed features of the fusion reaction mediated by the prototypical fusion protein, influenza hemagglutinin, and might bring new insights into the mechanisms of other viral fusion reactions. PMID:22761674

  20. Simulation of five ground-water withdrawal projections for the Black Mesa area, Navajo and Hopi Indian Reservations, Arizona

    USGS Publications Warehouse

    Brown, J.G.; Eychaner, J.H.

    1988-01-01

The N Aquifer is the main source of water in the 5,400 sq mi Black Mesa area in the Navajo and Hopi Indian Reservations in northeastern Arizona. Water in the aquifer is under confined conditions in the central 3,300 sq mi of the area. Maximum saturated thickness is about 1,050 ft. Annual groundwater withdrawals from 1972 through 1986 averaged 5,480 acre-ft and included 3,820 acre-ft used to operate a coal mine on Black Mesa. As a result, water levels have declined in a large part of the aquifer. The coal company has applied for a permanent permit under the Surface Mining Control and Reclamation Act of 1977. An existing mathematical model of the aquifer in the Black Mesa area was converted to a newer model program and recalibrated by using revised estimates of selected aquifer parameters and a finer spatial grid. The model was used to simulate four groundwater withdrawal alternatives that combined the existing and proposed mining plans with projected constant or increasing pumpage for nearby communities. A fifth alternative combined increasing community pumpage with no mine withdrawals and was used as a basis for comparison. Simulated water levels for the year 2031 in the coal-lease area are projected to be 60 ft lower than in 1985 for the proposed mining plan combined with growing community pumpage and > 100 ft lower than predevelopment water levels over an area of 1,660 sq mi. Groundwater would rise to within 100 ft of predevelopment levels < 10 yr after mine withdrawals cease. Withdrawals at the mine were a minor factor in determining simulated water levels at most communities in the study area. Water levels at Tuba City were not affected by mine pumpage in any projection. (Author's abstract)

  1. Simulation of ground-water flow in the Albuquerque Basin, central New Mexico, 1901-95, with projections to 2020

    USGS Publications Warehouse

    Kernodle, J.M.

    1998-01-01

The ground-water-flow model of the Albuquerque Basin (Kernodle, J.M., McAda, D.P., and Thorn, C.R., 1995, Simulation of ground-water flow in the Albuquerque Basin, central New Mexico, with projections to 2020: U.S. Geological Survey Water-Resources Investigations Report 94-4251, 114 p.) was updated to include new information on the hydrogeologic framework (Hawley, J.W., Haase, C.S., and Lozinsky, R.P., 1995, An underground view of the Albuquerque Basin: Proceedings of the 39th Annual New Mexico Water Conference, November 3-4, 1994, p. 37-55). An additional year of ground-water-withdrawal data was appended to the simulation of the historical period and incorporated into the base for future projections to the year 2020. The revised model projects the simulated ground-water levels associated with an areally enlarged occurrence of the relatively high hydraulic conductivity in the upper part of the Santa Fe Group east and west of the Rio Grande in the Albuquerque area and north to Bernalillo. Although the differences between the two model versions are substantial, the revised model does not contradict any previous conclusions about the effect of City of Albuquerque ground-water withdrawals on flow in the Rio Grande or the net benefits of an effort to conserve ground water. Recent revisions to the hydrogeologic model (Hawley, J.W., Haneberg, W.C., and Whitworth, P.M., in press, Hydrogeologic investigations in the Albuquerque Basin, central New Mexico, 1992-1995: Socorro, New Mexico Bureau of Mines and Mineral Resources Open-File Report 402) of the Albuquerque Basin eventually will require that this model version also be revised and updated.

  2. Magneto-Inertial Fusion

    DOE PAGESBeta

    Wurden, G. A.; Hsu, S. C.; Intrator, T. P.; Grabowski, T. C.; Degnan, J. H.; Domonkos, M.; Turchi, P. J.; Campbell, E. M.; Sinars, D. B.; Herrmann, M. C.; et al

    2015-11-17

    In this community white paper, we describe an approach to achieving fusion which employs a hybrid of elements from the traditional magnetic and inertial fusion concepts, called magneto-inertial fusion (MIF). The status of MIF research in North America at multiple institutions is summarized including recent progress, research opportunities, and future plans.

  3. Hot and cold fusion

    SciTech Connect

    Not Available

    1990-08-01

This article presents an overview of cold fusion research and development, as well as of hot fusion research at the Tokamak Fusion Test Reactor at the Princeton Plasma Physics Laboratory and at the inertial confinement facility at Lawrence Livermore National Laboratory.

  4. New Capabilities for Modeling Intense Beams in Heavy Ion Fusion Drivers

    SciTech Connect

    Friedman, A; Barnard, J J; Bieniosek, F M; Celata, C M; Cohen, R H; Davidson, R C; Grote, D P; Haber, I; Henestroza, E; Lee, E P; Lund, S M; Qin, H; Sharp, W M; Startsev, E; Vay, J L

    2003-09-09

Significant advances have been made in modeling the intense beams of heavy-ion beam-driven Inertial Fusion Energy (Heavy Ion Fusion). In this paper, a roadmap for a validated, predictive driver simulation capability, building on improved codes and experimental diagnostics, is presented, as are examples of progress. The Mesh Refinement and Particle-in-Cell methods were integrated in the WARP code; this capability supported an injector experiment that determined the achievable current rise time, in good agreement with calculations. In a complementary effort, a new injector approach based on the merging of {approx}100 small beamlets was simulated, its basic feasibility established, and an experimental test designed. Time-dependent 3D simulations of the High Current Experiment (HCX) were performed, yielding voltage waveforms for an upcoming study of bunch-end control. Studies of collective beam modes which must be taken into account in driver designs were carried out. The value of using experimental data to tomographically "synthesize" a 4D beam particle distribution and so initialize a simulation was established; this work motivated further development of new diagnostics which yield 3D projections of the beam phase space. Other developments, including improved modeling of ion beam focusing and transport through the fusion chamber environment and onto the target, and of stray electrons and their effects on ion beams, are briefly noted.

  5. Validation of CME Detection Software (CACTus) by Means of Simulated Data, and Analysis of Projection Effects on CME Velocity Measurements

    NASA Astrophysics Data System (ADS)

    Bonte, K.; Jacobs, C.; Robbrecht, E.; de Groof, A.; Berghmans, D.; Poedts, S.

    2011-05-01

In the context of space weather forecasting, automated detection of coronal mass ejections (CMEs) becomes more and more important for efficiently handling the large data flow expected from recently launched and future solar missions. In this paper we validate the detection software package "CACTus" by applying the program to synthetic data from our 3D time-dependent CME simulations instead of observational data. The main strength of this study is that we know in advance what should be detected. We describe the sensitivities and strengths of automated detection, specifically for the CACTus program, resulting in a better understanding of CME detection on the one hand and a calibration of the CACTus software on the other, suggesting possible improvements to the package. In addition, the simulation is an ideal tool to investigate projection effects on CME velocity measurements.
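
    The projection effect mentioned in the final sentence can be illustrated with simple geometry: a coronagraph measures only the velocity component in the plane of the sky, so a CME propagating at angle θ from the Sun-observer line appears slower by a factor sin θ. The sketch below (a hypothetical helper, not part of the CACTus package) shows the corresponding deprojection:

    ```python
    import math

    def deproject_speed(v_plane_of_sky, angle_from_sun_observer_line_deg):
        """Estimate the radial CME speed from the measured plane-of-sky speed.

        For a CME propagating at angle theta from the Sun-observer line,
        v_pos = v_radial * sin(theta). Illustrative geometry only, not the
        CACTus measurement algorithm itself.
        """
        theta = math.radians(angle_from_sun_observer_line_deg)
        if math.sin(theta) == 0.0:
            raise ValueError("a CME along the line of sight shows no plane-of-sky motion")
        return v_plane_of_sky / math.sin(theta)

    # A limb CME (theta = 90 deg) needs no correction; a CME at 30 deg from the
    # line of sight is underestimated by a factor of 2 in the plane of the sky.
    assert deproject_speed(500.0, 90.0) == 500.0
    assert abs(deproject_speed(500.0, 30.0) - 1000.0) < 1e-6
    ```
    
    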

  6. The QuakeSim Project: Numerical Simulations for Active Tectonic Processes

    NASA Technical Reports Server (NTRS)

    Donnellan, Andrea; Parker, Jay; Lyzenga, Greg; Granat, Robert; Fox, Geoffrey; Pierce, Marlon; Rundle, John; McLeod, Dennis; Grant, Lisa; Tullis, Terry

    2004-01-01

In order to develop a solid earth science framework for understanding and studying active tectonic and earthquake processes, this task develops simulation and analysis tools to study the physics of earthquakes using state-of-the-art modeling, data manipulation, and pattern recognition technologies. We develop clearly defined, accessible data formats and code protocols as inputs to the simulations. These are adapted to high-performance computers because the solid earth system is extremely complex and nonlinear, resulting in computationally intensive problems with millions of unknowns. With these tools it will be possible to construct the more complex models and simulations necessary to develop hazard assessment systems critical for reducing future losses from major earthquakes.

  7. Simulation of a Forensic Chemistry Problem: A Multidisciplinary Project for Secondary School Chemistry Students.

    ERIC Educational Resources Information Center

    Long, G. A.

    1995-01-01

    Describes a project that uses a multidisciplinary approach to problem solving in analyzing a crime scene and suspect evidence. Requires each student to work effectively in a team, communicate in both written and oral forms, perform hands-on laboratory manipulations, and realize that the entire class was depending on their individual contributions…

  8. A Tire Gasification Senior Design Project That Integrates Laboratory Experiments and Computer Simulation

    ERIC Educational Resources Information Center

    Weiss, Brian; Castaldi, Marco J.

    2006-01-01

A reactor to convert waste rubber tires to useful products such as CO and H2 was investigated in a university undergraduate design project. The student worked individually with mentorship from a faculty member who aided the student with professional critique. The student was able to research the background of the field and conceive of a novel…

  9. Project GeoSim: A GIS-Based Simulation Laboratory for Introductory Geography.

    ERIC Educational Resources Information Center

    Shaffer, Clifford A.

    This report describes a multidisciplinary project by members of Virginia Polytechnic Institute and State University's Departments of Geography and Computer Science, and College of Education, to develop computer-aided education software for introductory geography at the college and high school levels. GeoSim's goal was to produce major changes in…

  10. Discrete event simulation of NASA's Remote Exploration and Experimentation Project (REE)

    NASA Technical Reports Server (NTRS)

    Dunphy, J.; Rogstad, S.

    2001-01-01

    The Remote Exploration and Experimentation Project (REE) is a new initiative at JPL to be able to place a supercomputer on board a spacecraft and allow large amounts of data reduction and compression to be done before science results are returned to Earth.

  11. The Virtual ChemLab Project: A Realistic and Sophisticated Simulation of Inorganic Qualitative Analysis

    ERIC Educational Resources Information Center

    Woodfield, Brian F.; Catlin, Heidi R.; Waddoups, Gregory L.; Moore, Melissa S.; Swan, Richard; Bodily, Greg; Allen, Rob

    2004-01-01

The Virtual ChemLab project is an instructional laboratory that provides practical experience by connecting theory with laboratory practice, teaching laboratory techniques, and exercising the associated cognitive processes. The lab gives students the freedom to explore, to repeat procedures, and to focus on the underlying principles of…

  12. Plasma asymmetry due to the magnetic filter in fusion-type negative ion sources: Comparisons between two and three-dimensional particle-in-cell simulations

    SciTech Connect

    Fubiani, G. Boeuf, J. P.

    2014-07-15

    Previously reported 2D Particle-In-Cell Monte Carlo Collisions (PIC-MCC) simulations of negative ion sources under conditions similar to those of the ITER neutral beam injection system have shown that the presence of the magnetic filter tends to generate asymmetry in the plasma properties in the extraction region. In this paper, we show that these conclusions are confirmed by 3D PIC-MCC simulations and we provide quantitative comparisons between the 2D and 3D model predictions.

  13. Simulation of the Ground-Water Flow System in 1992, and Simulated Effects of Projected Ground-Water Withdrawals in 2020 in the New Jersey Coastal Plain

    USGS Publications Warehouse

    Gordon, Alison D.

    2003-01-01

In 1992, ground-water withdrawals from the unconfined and confined aquifers in the New Jersey Coastal Plain totaled about 300 million gallons per day, and about 70 percent (200 million gallons per day) of this water was pumped from confined aquifers. The withdrawals have created large cones of depression in several Coastal Plain aquifers near populated areas, particularly in Camden and Ocean Counties. The continued decline of water levels in confined aquifers could cause saltwater intrusion, reduction of stream discharge near the outcrop areas of these aquifers, and depletion of the ground-water supply. Because of this, withdrawals from wells located within these critical areas have been reduced in the Potomac-Raritan-Magothy aquifer system, the Englishtown aquifer system, and the Wenonah-Mount Laurel aquifer. A computer-based model that simulates freshwater and saltwater flow was used to simulate transient ground-water flow conditions and the location of the freshwater-saltwater interface during 1989-92 in the New Jersey Coastal Plain. This simulation was used as the baseline for comparison of water levels and flow budgets. Four hypothetical withdrawal scenarios were simulated in which ground-water withdrawals were either increased or decreased. In scenario 1, withdrawals from wells located within critical area 2 in the Potomac-Raritan-Magothy aquifer system were reduced by amounts ranging from 0 to 35 percent of withdrawals prior to 1992. Critical area 2 is mainly located in Camden County, and most of Burlington and Gloucester Counties. With the reductions, water levels recovered about 30 feet in the regional cone of depression centered in Camden County in the Upper Potomac-Raritan-Magothy aquifer and by 20 ft in the Lower and Middle Potomac-Raritan-Magothy aquifers. In scenarios 2 to 4, withdrawals projected for 2020 were input to the model. In scenario 2, withdrawal restrictions within the critical areas were imposed in the Potomac-Raritan-Magothy aquifer

  14. A Simulation-Based LED Design Project in Photonics Instruction Based on Industry-University Collaboration

    ERIC Educational Resources Information Center

    Chang, S. -H.; Chen, M. -L.; Kuo, Y. -K.; Shen, Y. -C.

    2011-01-01

    In response to the growing industrial demand for light-emitting diode (LED) design professionals, based on industry-university collaboration in Taiwan, this paper develops a novel instructional approach: a simulation-based learning course with peer assessment to develop students' professional skills in LED design as required by industry as well as…

  15. A Simulated Method Of Image Reconstruction By Projection In Fourier Domain

    NASA Astrophysics Data System (ADS)

    Ma, Ming-gang

    1984-12-01

In the typical image processing and pattern recognition laboratory, a CT machine is not available, so we use a simulated method to study image reconstruction algorithms and to develop the application software. A Fortran application program was completed. The results are quite satisfactory but contain some errors.
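
    Projection-domain reconstruction rests on the Fourier slice theorem: the 1D Fourier transform of a parallel projection equals a central slice of the object's 2D Fourier transform. The paper's software is in Fortran; the snippet below is my own minimal numerical check of the theorem in Python, not the authors' code.

    ```python
    import numpy as np

    # Simple rectangular "phantom" standing in for the CT test object.
    phantom = np.zeros((64, 64))
    phantom[24:40, 20:44] = 1.0

    # Parallel projection at angle 0: sum along the y-axis (rows).
    projection = phantom.sum(axis=0)

    # Fourier slice theorem: the 1D FFT of that projection equals the
    # ky = 0 row (central slice) of the phantom's 2D FFT.
    slice_1d = np.fft.fft(projection)
    central_slice = np.fft.fft2(phantom)[0, :]

    assert np.allclose(slice_1d, central_slice)
    ```

    Filtered backprojection builds on exactly this identity: each measured projection fills in one slice of the object's 2D spectrum.
    
    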

  16. Viral membrane fusion

    SciTech Connect

    Harrison, Stephen C.

    2015-05-15

Membrane fusion is an essential step when enveloped viruses enter cells. Lipid bilayer fusion requires catalysis to overcome a high kinetic barrier; viral fusion proteins are the agents that fulfill this catalytic function. Despite a variety of molecular architectures, these proteins facilitate fusion by essentially the same generic mechanism. Stimulated by a signal associated with arrival at the cell to be infected (e.g., receptor or co-receptor binding, proton binding in an endosome), they undergo a series of conformational changes. A hydrophobic segment (a “fusion loop” or “fusion peptide”) engages the target-cell membrane and collapse of the bridging intermediate thus formed draws the two membranes (virus and cell) together. We know of three structural classes for viral fusion proteins. Structures for both pre- and postfusion conformations illustrate the beginning and end points of a process that can be probed by single-virion measurements of fusion kinetics. - Highlights: • Viral fusion proteins overcome the high energy barrier to lipid bilayer merger. • Different molecular structures but the same catalytic mechanism. • Review describes properties of three known fusion-protein structural classes. • Single-virion fusion experiments elucidate mechanism.

17. Community Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    SciTech Connect

    Spentzouris, Panagiotis; Cary, John; Mcinnes, Lois Curfman; Mori, Warren; Ng, Cho; Ng, Esmond; Ryne, Robert; /LBL, Berkeley

    2008-07-01

    The design and performance optimization of particle accelerators is essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC1 Accelerator Science and Technology project, the SciDAC2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modeling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC1) to high-fidelity multi-physics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors.

  18. Community petascale project for accelerator science and simulation : Advancing computational science for future accelerators and accelerator technologies.

    SciTech Connect

    Spentzouris, P.; Cary, J.; McInnes, L. C.; Mori, W.; Ng, C.; Ng, E.; Ryne, R.

    2008-01-01

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modelling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R & D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors.

19. Community Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    SciTech Connect

    Spentzouris, Panagiotis; Cary, John; Mcinnes, Lois Curfman; Mori, Warren; Ng, Cho; Ng, Esmond; Ryne, Robert; /LBL, Berkeley

    2011-10-21

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modelling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors.

  20. Simulating the Cranfield geological carbon sequestration project with high-resolution static models and an accurate equation of state

    DOE PAGESBeta

    Soltanian, Mohamad Reza; Amooie, Mohammad Amin; Cole, David R.; Graham, David E.; Hosseini, Seyyed Abolfazl; Hovorka, Susan; Pfiffner, Susan M.; Phelps, Tommy Joe; Moortgat, Joachim

    2016-10-11

In this study, a field-scale carbon dioxide (CO2) injection pilot project was conducted as part of the Southeast Regional Sequestration Partnership (SECARB) at Cranfield, Mississippi. We present higher-order finite element simulations of the compositional two-phase CO2-brine flow and transport during the experiment. High-resolution static models of the formation geology in the Detailed Area Study (DAS) located below the oil-water contact (brine saturated) are used to capture the impact of connected flow paths on breakthrough times in two observation wells. Phase behavior is described by the cubic-plus-association (CPA) equation of state, which takes into account the polar nature of water molecules. Parameter studies are performed to investigate the importance of Fickian diffusion, permeability heterogeneity, relative permeabilities, and capillarity. Simulation results for the pressure response in the injection well and the CO2 breakthrough times at the observation wells show good agreement with the field data. For the high injection rates and short duration of the experiment, diffusion is relatively unimportant (high Péclet numbers), while relative permeabilities have a profound impact on the pressure response. High-permeability pathways, created by fluvial deposits, strongly affect the CO2 transport and highlight the importance of properly characterizing the formation heterogeneity in future carbon sequestration projects.
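
    The diffusion argument can be stated compactly: the Péclet number Pe = vL/D compares advective to diffusive transport rates, and Pe ≫ 1 justifies neglecting Fickian diffusion. A minimal sketch, with assumed illustrative values rather than field data:

    ```python
    def peclet_number(velocity, length, diffusion_coeff):
        """Pe = v*L/D: ratio of advective to diffusive transport rates.

        Pe >> 1 means Fickian diffusion is negligible over the length scale
        of interest, as found at the high injection rates of the Cranfield
        experiment. The example values below are assumptions, not field data.
        """
        return velocity * length / diffusion_coeff

    # e.g. interstitial velocity 1e-5 m/s, inter-well distance 70 m,
    # aqueous diffusion coefficient 1e-9 m^2/s:
    pe = peclet_number(1e-5, 70.0, 1e-9)
    assert pe > 1e5  # strongly advection-dominated
    ```
    
    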

  1. The Dust Management Project: Characterizing Lunar Environments and Dust, Developing Regolith Mitigation Technology and Simulants

    NASA Technical Reports Server (NTRS)

    Hyatt, Mark J.; Straka, Sharon A.

    2010-01-01

A return to the Moon to extend human presence, pursue scientific activities, use the Moon to prepare for future human missions to Mars, and expand Earth's economic sphere, will require investment in developing new technologies and capabilities to achieve affordable and sustainable human exploration. From the operational experience gained and lessons learned during the Apollo missions, conducting long-term operations in the lunar environment will be a particular challenge, given the difficulties presented by the unique physical properties and other characteristics of lunar regolith, including dust. The Apollo missions and other lunar explorations have identified significant lunar dust-related problems that will challenge future mission success. Comprised of regolith particles ranging in size from tens of nanometers to microns, lunar dust is a manifestation of the complex interaction of the lunar soil with multiple mechanical, electrical, and gravitational effects. The environmental and anthropogenic factors affecting the perturbation, transport, and deposition of lunar dust must be studied in order to mitigate its potentially harmful effects on exploration systems and human explorers. The Dust Management Project (DMP) is tasked with the evaluation of lunar dust effects, assessment of the resulting risks, and development of mitigation and management strategies and technologies related to Exploration Systems architectures. To this end, the DMP supports the overall goal of the Exploration Technology Development Program (ETDP) of addressing the relevant high priority technology needs of multiple elements within the Constellation Program (CxP) and sister ETDP projects. Project scope, plans, and accomplishments will be presented.

  2. The fusion breeder

    NASA Astrophysics Data System (ADS)

    Moir, Ralph W.

    1982-10-01

The fusion breeder is a fusion reactor designed with special blankets to maximize the transmutation by 14 MeV neutrons of uranium-238 to plutonium or thorium to uranium-233 for use as a fuel for fission reactors. Breeding fissile fuels has not been a goal of the U.S. fusion energy program. This paper suggests it is time for a policy change to make the fusion breeder a goal of the U.S. fusion program and the U.S. nuclear energy program. There is wide agreement that many approaches will work and will produce fuel for five equal-sized LWRs, and that some will support as many as 20 LWRs, at electricity costs within 20% of those at today's price of uranium ($30/lb of U3O8). The blankets designed to suppress fissioning, called symbiotes, fusion fuel factories, or just fusion breeders, will have safety characteristics more like pure fusion reactors and will support as many as 15 equal power LWRs. The blankets designed to maximize fast fission of fertile material will have safety characteristics more like fission reactors and will support 5 LWRs. This author strongly recommends development of the fission-suppressed blanket type, a point of view not agreed upon by everyone. There is, however, wide agreement that, to meet the market price for uranium which would result in LWR electricity within 20% of today's cost with either blanket type, fusion components can cost severalfold more than would be allowed for pure fusion to meet the goal of making electricity alone at 20% over today's fission costs. Also widely agreed is that the critical-path item for the fusion breeder is fusion development itself; however, development of fusion-breeder-specific items (blankets, fuel cycle) should be started now in order to have the fusion breeder by the time the rise in uranium prices forces other more costly choices.

  3. Research on compressive fusion by multiwavelet transform

    NASA Astrophysics Data System (ADS)

    Yang, Senlin; Wan, Guobin; Li, Yuanyuan; Zhao, Xiaoxia; Chong, Xin

    2014-02-01

A new strategy for image fusion is developed on the basis of block compressed sensing (BCS) and the multiwavelet transform (MWT). Since BCS with a structured random matrix requires little memory and enables fast computation, first, images with large amounts of data can be compressively sampled into block images for fusion. Second, taking full advantage of multiwavelet properties such as symmetry, orthogonality, short support, and a higher number of vanishing moments, the compressive sampling of block images can be better described by the MWT. The compressive measurements are then fused with a linear weighting strategy based on the MWT decomposition. Finally, the fused compressive samplings are reconstructed by the smoothed projection Landweber algorithm, with consideration of blocking artifacts. Experimental results show the validity of the proposed method, and a field test indicates that compressive fusion gives resolution similar to that of traditional MWT fusion.
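
    The measurement-domain fusion step relies on the linearity of compressive sampling: because each block measurement is y = Φx, a weighted sum of two images' measurements equals the measurement of the weighted sum of the images. Below is a minimal sketch of block compressed sampling and linear-weighting fusion; a random Gaussian Φ stands in for the paper's structured random matrix, and the MWT decomposition and smoothed projection Landweber reconstruction are omitted.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    BLOCK = 8   # BCS samples each non-overlapping block independently
    M = 48      # measurements per 64-pixel block (subsampling ratio 0.75)
    # Shared measurement matrix for every block (Gaussian stand-in).
    phi = rng.standard_normal((M, BLOCK * BLOCK)) / np.sqrt(M)

    def sample_blocks(img):
        """Compressively sample each non-overlapping BLOCK x BLOCK block: y = phi @ x."""
        h, w = img.shape
        meas = []
        for i in range(0, h, BLOCK):
            for j in range(0, w, BLOCK):
                x = img[i:i + BLOCK, j:j + BLOCK].ravel()
                meas.append(phi @ x)
        return np.array(meas)

    img_a = rng.random((16, 16))
    img_b = rng.random((16, 16))

    # Linear-weighting fusion performed directly in the measurement domain:
    # by linearity of sampling, it equals sampling the fused image itself.
    fused_meas = 0.5 * sample_blocks(img_a) + 0.5 * sample_blocks(img_b)
    direct_meas = sample_blocks(0.5 * img_a + 0.5 * img_b)
    assert np.allclose(fused_meas, direct_meas)
    ```

    This linearity is what makes it valid to fuse in the compressive domain first and reconstruct only once, rather than reconstructing each source image before fusing.
    
    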

  4. Magnetic mirror fusion: status and prospects

    SciTech Connect

    Post, R.F.

    1980-02-11

Two improved mirror systems, the tandem mirror (TM) and the field-reversed mirror (FRM), are being intensively studied. The twin practical aims of these studies are to improve the economic prospects for mirror fusion power plants and to reduce the size and/or complexity of such plants relative to earlier approaches to magnetic fusion. While at the present time the program emphasis is still strongly oriented toward answering scientific questions, the emphasis is shifting as the data accumulates and as larger facilities - ones with a heavy technological and engineering orientation - are being prepared. The experimental and theoretical progress that led to the new look in mirror fusion research is briefly reviewed, the new TM and FRM ideas are outlined, and the projected future course of mirror fusion research is discussed.

  5. The Numerical Tokamak Project (NTP) simulation of turbulent transport in the core plasma: A grand challenge in plasma physics

    SciTech Connect

    Not Available

    1993-12-01

The long-range goal of the Numerical Tokamak Project (NTP) is the reliable prediction of tokamak performance using physics-based numerical tools describing tokamak physics. The NTP is accomplishing the development of the most advanced particle and extended fluid models on massively parallel processing (MPP) environments as part of a multi-institutional, multi-disciplinary numerical study of tokamak core fluctuations. The NTP is a continuing focus of the Office of Fusion Energy's theory and computation program. Near-term HPCC work concentrates on developing a predictive numerical description of the core plasma transport in tokamaks driven by low-frequency collective fluctuations. This work addresses one of the greatest intellectual challenges to our understanding of the physics of tokamak performance and needs the most advanced computational resources to progress. We are conducting detailed comparisons of kinetic and fluid numerical models of tokamak turbulence. These comparisons are stimulating the improvement of each and the development of hybrid models which embody aspects of both. The combination of emerging massively parallel processing hardware and algorithmic improvements will result in an estimated 10**2--10**6 performance increase. Development of information processing and visualization tools is accelerating our comparison of computational models to one another, to experimental data, and to analytical theory, providing a bootstrap effect in our understanding of the target physics. The measure of success is the degree to which the experimentally observed scaling of fluctuation-driven transport may be predicted numerically. The NTP is advancing the HPCC Initiative through its state-of-the-art computational work. We are pushing the capability of high performance computing through our efforts which are strongly leveraged by OFE support.

  6. INTRODUCTION: Status report on fusion research

    NASA Astrophysics Data System (ADS)

    Burkart, Werner

    2005-10-01

    members' personal views on the latest achievements in fusion research, including magnetic and inertial confinement scenarios. The report describes fusion fundamentals and progress in fusion science and technology, with ITER as a possible partner in the realization of self-sustainable burning plasma. The importance of the socio-economic aspects of energy production using fusion power plants is also covered. Noting that applications of plasma science are of broad interest to the Member States, the report addresses the topic of plasma physics to assist in understanding the achievements of better coatings, cheaper light sources, improved heat-resistant materials and other high-technology materials. Nuclear fusion energy production is intrinsically safe, but for ITER the full range of hazards will need to be addressed, including minimising radiation exposure, to accomplish the goal of a sustainable and environmentally acceptable production of energy. We anticipate that the role of the Agency will in future evolve from supporting scientific projects and fostering information exchange to the preparation of safety principles and guidelines for the operation of burning fusion plasmas with a Q > 1. Technical progress in inertial and magnetic confinement, as well as in alternative concepts, will lead to a further increase in international cooperation. New means of communication will be needed, utilizing the best resources of modern information technology to advance interest in fusion. However, today the basis of scientific progress is still through journal publications and, with this in mind, we trust that this report will find an interested readership. We acknowledge with thanks the support of the members of the IFRC as an advisory body to the Agency. 
Seven chairmen have presided over the IFRC since its first meeting in 1971 in Madison, USA, ensuring that the IAEA fusion efforts were based on the best professional advice possible, and that information on fusion developments has

  7. Engineering Challenges in Antiproton Triggered Fusion Propulsion

    SciTech Connect

    Cassenti, Brice; Kammash, Terry

    2008-01-21

    During the last decade antiproton-triggered fusion propulsion has been investigated as a method for achieving high specific impulse and high thrust in a nuclear pulse propulsion system. In general the antiprotons are injected into a pellet containing fusion fuel with a small amount of fissionable material (i.e., an amount less than the critical mass), where the products from the fission are then used to trigger a fusion reaction. Initial calculations and simulations indicate that, if magnetically insulated inertial confinement fusion is used, the pellets should yield a specific impulse of between 100,000 and 300,000 seconds at high thrust. The engineering challenges associated with this propulsion system are significant. For example, the antiprotons must be precisely focused. The pellet must be designed to contain the fission and initial fusion products, which will require strong magnetic fields. The fusion fuel must be contained for a sufficiently long time to effectively release the fusion energy, and the payload must be shielded from the radiation, especially the excess neutrons emitted, in addition to many other particles. We will review the recent progress, possible engineering solutions, and the potential performance of these systems.

  8. Downscaling seasonal to centennial simulations on distributed computing infrastructures using WRF model. The WRF4G project

    NASA Astrophysics Data System (ADS)

    Cofino, A. S.; Fernández Quiruelas, V.; Blanco Real, J. C.; García Díez, M.; Fernández, J.

    2013-12-01

    Nowadays Grid computing is a powerful computational tool ready to be used by the scientific community in different areas (such as biomedicine, astrophysics, climate, etc.). However, the use of these distributed computing infrastructures (DCIs) is not yet common practice in climate research, and only a few teams and applications in this area take advantage of them. Thus, the WRF4G project objective is to popularize the use of this technology in the atmospheric sciences area. In order to achieve this objective, one of the most widely used applications has been taken (WRF; a limited-area model, successor of the MM5 model), which has a user community of more than 8000 researchers worldwide. This community develops its research activity in different areas and could benefit from the advantages of Grid resources (case-study simulations, regional hindcast/forecast, sensitivity studies, etc.). The WRF model is used by many groups in the climate research community to carry out downscaling simulations, so this community will also benefit. However, Grid infrastructures have some drawbacks for the execution of applications that make intensive use of CPU and memory for a long period of time. This makes it necessary to develop a specific framework (middleware). This middleware encapsulates the application and provides appropriate services for the monitoring and management of the simulations and the data. Thus, another objective of the WRF4G project consists of the development of a generic adaptation of WRF to DCIs. It should simplify access to the DCIs for researchers, and also free them from the technical and computational aspects of the use of these DCIs. Finally, in order to demonstrate the ability of WRF4G to solve actual scientific challenges of interest and relevance to climate science (implying a high computational cost), we will show results from different kinds of downscaling experiments, like ERA-Interim re-analysis, CMIP5 models

  9. Control Room Training for the Hyper-X Project Utilizing Aircraft Simulation

    NASA Technical Reports Server (NTRS)

    Lux-Baumann, Jesica; Dees, Ray; Fratello, David

    2006-01-01

    The NASA Dryden Flight Research Center flew two Hyper-X research vehicles and achieved hypersonic speeds over the Pacific Ocean in March and November 2004. To train the flight and mission control room crew, the NASA Dryden simulation capability was utilized to generate telemetry and radar data, which was used in nominal and emergency mission scenarios. During these control room training sessions personnel were able to evaluate and refine data displays, flight cards, mission parameter allowable limits, and emergency procedure checklists. Practice in the mission control room ensured that all primary and backup Hyper-X staff were familiar with the nominal mission and knew how to respond to anomalous conditions quickly and successfully. This report describes the technology in the simulation environment and the Mission Control Center, the need for and benefit of control room training, and the rationale and results of specific scenarios unique to the Hyper-X research missions.

  10. The use of sequential indicator simulation to characterize geostatistical uncertainty; Yucca Mountain Site Characterization Project

    SciTech Connect

    Hansen, K.M.

    1992-10-01

    Sequential indicator simulation (SIS) is a geostatistical technique designed to aid in the characterization of uncertainty about the structure or behavior of natural systems. This report discusses a simulation experiment designed to study the quality of uncertainty bounds generated using SIS. The results indicate that, while SIS may produce reasonable uncertainty bounds in many situations, factors like the number and location of available sample data, the quality of variogram models produced by the user, and the characteristics of the geologic region to be modeled, can all have substantial effects on the accuracy and precision of estimated confidence limits. It is recommended that users of SIS conduct validation studies for the technique on their particular regions of interest before accepting the output uncertainty bounds.
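
    The sequential indicator idea described above can be illustrated in a few lines: nodes are visited along a random path, the probability that a threshold is exceeded at each node is estimated from nearby conditioning or previously simulated indicator values, and a 0/1 value is drawn from that probability. The sketch below is a toy 1-D illustration under stated simplifications (inverse-distance/exponential weighting deliberately stands in for full indicator kriging; function and parameter names are hypothetical), not the Yucca Mountain implementation:

```python
import numpy as np

def sis_1d(n, data, vrange=10.0, p_global=0.5, seed=0):
    """Toy sequential indicator simulation on a 1-D grid of n nodes.

    data: dict {grid index: 0/1 indicator} of conditioning values.
    Exponentially decaying distance weights stand in for the indicator
    kriging weights of real SIS (a deliberate simplification).
    """
    rng = np.random.default_rng(seed)
    sim = np.full(n, -1, dtype=int)          # -1 marks "not yet simulated"
    for i, v in data.items():
        sim[i] = v                            # honor conditioning data exactly
    path = rng.permutation([i for i in range(n) if sim[i] < 0])
    for i in path:
        known = np.flatnonzero(sim >= 0)      # data + previously simulated
        d = np.abs(known - i).astype(float)
        near = known[d <= 3 * vrange]         # limit to the correlation range
        if near.size == 0:
            p = p_global                      # fall back to the global proportion
        else:
            w = np.exp(-np.abs(near - i) / vrange)   # covariance-like decay
            p = float(np.clip(np.sum(w * sim[near]) / np.sum(w), 0.0, 1.0))
        sim[i] = int(rng.random() < p)        # draw from the local probability
    return sim
```

    Repeating the draw with different seeds yields an ensemble of realizations whose node-by-node spread is the uncertainty measure the abstract discusses.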

  11. Frontier of Fusion Research: Path to the Steady State Fusion Reactor by Large Helical Device

    NASA Astrophysics Data System (ADS)

    Motojima, Osamu

    2006-12-01

    ITER, the International Thermonuclear Experimental Reactor, which will be built at Cadarache in France, has finally started this year, 2006. Since the thermal energy produced by fusion reactions divided by the external heating power, i.e., the Q value, will be larger than 10, this is a big step in half a century of fusion research aimed at taming nuclear fusion for the 6.5 billion people on the Earth. The Sun's power has lasted steadily and safely for billions of years. As a potentially safe, environmentally friendly, and economically competitive energy source, fusion should provide a sustainable future energy supply for all mankind for tens of thousands of years. At the frontier of fusion research, important milestones have recently been marked on the long road toward a true prototype fusion reactor. In its own right, research into harnessing turbulent burning plasmas, and thereby controlling the fusion reaction, is one of the grand challenges of complex systems science. After a brief overview of the status of world fusion projects, the focus turns to fusion research at the National Institute for Fusion Science (NIFS) in Japan, which plays the role of an inter-university institute and coordinating center of excellence for academic fusion research, through the Large Helical Device (LHD), the world's largest superconducting heliotron device, operated as a national users' facility. The current status of the LHD project is presented, focusing on the experimental program and the recent achievements in basic parameters and in steady-state operation. Since its start in 1998, remarkable progress has resulted in a temperature of 140 million degrees, the highest density of 500 thousand billion/cc with an internal density barrier (IDB), the highest steady average beta of 4.5% among helical plasma devices, and the largest total input energy of 1.6 GJ among all magnetic confinement fusion devices. 
Finally, a perspective is given on the ITER Broad Approach program

  12. Frontier of Fusion Research: Path to the Steady State Fusion Reactor by Large Helical Device

    SciTech Connect

    Motojima, Osamu

    2006-12-01

    ITER, the International Thermonuclear Experimental Reactor, which will be built at Cadarache in France, has finally started this year, 2006. Since the thermal energy produced by fusion reactions divided by the external heating power, i.e., the Q value, will be larger than 10, this is a big step in half a century of fusion research aimed at taming nuclear fusion for the 6.5 billion people on the Earth. The Sun's power has lasted steadily and safely for billions of years. As a potentially safe, environmentally friendly, and economically competitive energy source, fusion should provide a sustainable future energy supply for all mankind for tens of thousands of years. At the frontier of fusion research, important milestones have recently been marked on the long road toward a true prototype fusion reactor. In its own right, research into harnessing turbulent burning plasmas, and thereby controlling the fusion reaction, is one of the grand challenges of complex systems science. After a brief overview of the status of world fusion projects, the focus turns to fusion research at the National Institute for Fusion Science (NIFS) in Japan, which plays the role of an inter-university institute and coordinating center of excellence for academic fusion research, through the Large Helical Device (LHD), the world's largest superconducting heliotron device, operated as a national users' facility. The current status of the LHD project is presented, focusing on the experimental program and the recent achievements in basic parameters and in steady-state operation. Since its start in 1998, remarkable progress has resulted in a temperature of 140 million degrees, the highest density of 500 thousand billion/cc with an internal density barrier (IDB), the highest steady average beta of 4.5% among helical plasma devices, and the largest total input energy of 1.6 GJ among all magnetic confinement fusion devices. 
Finally, a perspective is given on the ITER Broad Approach program

  13. Simulation of Plasma Jet Merger and Liner Formation within the PLX- α Project

    NASA Astrophysics Data System (ADS)

    Samulyak, Roman; Chen, Hsin-Chiang; Shih, Wen; Hsu, Scott

    2015-11-01

    Detailed numerical studies of the propagation and merger of high-Mach-number argon plasma jets and the formation of plasma liners have been performed using the newly developed method of Lagrangian particles (LP). The LP method significantly improves the accuracy and mathematical rigor of common particle-based numerical methods such as smoothed particle hydrodynamics while preserving their main advantages compared to grid-based methods. A brief overview of the LP method will be presented. The Lagrangian particle code implements the main relevant physics models, such as an equation of state for argon undergoing atomic physics transformations, radiation losses in the optically thin limit, and heat conduction. Simulations of the merger of two plasma jets are compared with experimental data from past PLX experiments. Simulations quantify the effect of oblique shock waves, ionization, and radiation processes on the jet merger process. Results of preliminary simulations of future PLX-alpha experiments involving the ~ π / 2 -solid-angle plasma-liner configuration with 9 guns will also be presented. Partially supported by ARPA-E's ALPHA program.

  14. The Living Heart Project: A robust and integrative simulator for human heart function

    PubMed Central

    Baillargeon, Brian; Rebelo, Nuno; Fox, David D.; Taylor, Robert L.; Kuhl, Ellen

    2014-01-01

    The heart is not only our most vital, but also our most complex organ: Precisely controlled by the interplay of electrical and mechanical fields, it consists of four chambers and four valves, which act in concert to regulate its filling, ejection, and overall pump function. While numerous computational models exist to study either the electrical or the mechanical response of its individual chambers, the integrative electro-mechanical response of the whole heart remains poorly understood. Here we present a proof-of-concept simulator for a four-chamber human heart model created from computed tomography and magnetic resonance images. We illustrate the governing equations of excitation-contraction coupling and discretize them using a single, unified finite element environment. To illustrate the basic features of our model, we visualize the electrical potential and the mechanical deformation across the human heart throughout its cardiac cycle. To compare our simulation against common metrics of cardiac function, we extract the pressure-volume relationship and show that it agrees well with clinical observations. Our prototype model allows us to explore and understand the key features, physics, and technologies to create an integrative, predictive model of the living human heart. Ultimately, our simulator will open opportunities to probe landscapes of clinical parameters, and guide device design and treatment planning in cardiac diseases such as stenosis, regurgitation, or prolapse of the aortic, pulmonary, tricuspid, or mitral valve. PMID:25267880
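
    The excitation-contraction coupling mentioned above is commonly written as a reaction-diffusion equation for the transmembrane potential coupled to quasi-static finite elasticity with an active stress. The following is a generic sketch of this class of formulation, not necessarily the exact equations of the Living Heart model:

```latex
% Electrophysiology (monodomain form): transmembrane potential \phi,
% gating/concentration variables \mathbf{r}, conductivity tensor \mathbf{D}
C_m \frac{\partial \phi}{\partial t}
  = \nabla \cdot \left( \mathbf{D} \, \nabla \phi \right) - I_{\mathrm{ion}}(\phi, \mathbf{r})

% Mechanics (quasi-static balance of linear momentum), with the second
% Piola-Kirchhoff stress split into passive and potential-driven active parts
% along the fiber direction \mathbf{f}_0:
\nabla \cdot \left( \mathbf{F} \mathbf{S} \right) = \mathbf{0},
\qquad
\mathbf{S} = \mathbf{S}_{\mathrm{passive}}(\mathbf{C})
           + S_{\mathrm{active}}(\phi) \, \mathbf{f}_0 \otimes \mathbf{f}_0
```

    Here F is the deformation gradient and C = F^T F the right Cauchy-Green tensor; discretizing both fields in one finite element environment, as the abstract describes, couples them monolithically.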

  15. The Living Heart Project: A robust and integrative simulator for human heart function.

    PubMed

    Baillargeon, Brian; Rebelo, Nuno; Fox, David D; Taylor, Robert L; Kuhl, Ellen

    2014-11-01

    The heart is not only our most vital, but also our most complex organ: Precisely controlled by the interplay of electrical and mechanical fields, it consists of four chambers and four valves, which act in concert to regulate its filling, ejection, and overall pump function. While numerous computational models exist to study either the electrical or the mechanical response of its individual chambers, the integrative electro-mechanical response of the whole heart remains poorly understood. Here we present a proof-of-concept simulator for a four-chamber human heart model created from computed tomography and magnetic resonance images. We illustrate the governing equations of excitation-contraction coupling and discretize them using a single, unified finite element environment. To illustrate the basic features of our model, we visualize the electrical potential and the mechanical deformation across the human heart throughout its cardiac cycle. To compare our simulation against common metrics of cardiac function, we extract the pressure-volume relationship and show that it agrees well with clinical observations. Our prototype model allows us to explore and understand the key features, physics, and technologies to create an integrative, predictive model of the living human heart. Ultimately, our simulator will open opportunities to probe landscapes of clinical parameters, and guide device design and treatment planning in cardiac diseases such as stenosis, regurgitation, or prolapse of the aortic, pulmonary, tricuspid, or mitral valve. PMID:25267880

  16. The Change of First-flowering Date over South Korea Projected from Downscaled IPCC AR5 Simulation: Peach and Pear

    NASA Astrophysics Data System (ADS)

    Ahn, J. B.; Hur, J.

    2014-12-01

    The variations in the first-flowering date (FFD) of peach (Prunus persica) and pear (Pyrus pyrifolia) under future climate change in South Korea are investigated using simulations obtained from five models of the fifth Coupled Model Intercomparison Project. For the study, daily temperature simulations under the Historical (1986-2005) and RCP4.5 and RCP8.5 (2071-2090) scenarios are statistically downscaled to 50 peach and pear FFD (FFDpeach and FFDpear, respectively) observation sites over South Korea. The number of days transformed to standard temperature (DTS) method is selected as the phenological model and applied to the simulations for estimating FFDpeach and FFDpear over South Korea, due to its superior performance for the target plants and region compared to the growing degree days (GDD) and chill days (CD) methods. In the analysis, mean temperatures for early spring (February to April) over South Korea in 2090 under the RCP4.5 and 8.5 scenarios are projected to increase by 1.9 K and 3.3 K, respectively. Among the early spring months of February to April, February shows the largest temperature increase, of 2.1 K and 3.7 K for the RCP4.5 and 8.5 scenarios, respectively. The increased temperature during February and March accelerates the plant growth rate and thereby advances FFDpeach by 7.0 and 12.7 days and FFDpear by 6.1 and 10.7 days, respectively. These results imply that the present flowering of peach and pear in the middle of April will have advanced to late March or early April by the end of this century. Acknowledgements This work was carried out with the support of the Rural Development Administration Cooperative Research Program for Agriculture Science and Technology Development under Grant Project No. PJ009953, Republic of Korea.
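
    Phenological models of the kind compared above (GDD, DTS) all reduce to accumulating a daily temperature response until a calibrated threshold is reached; the methods differ only in the weighting function. The sketch below illustrates the classical GDD sum and an Arrhenius-type DTS weighting; the base temperature, activation energy, and thresholds are illustrative placeholders, not the values calibrated in the study:

```python
import math

def gdd_flowering_day(tmean_c, t_base=5.0, threshold=150.0):
    """Day index on which cumulative growing degree days reach the threshold.
    tmean_c: iterable of daily mean temperatures in deg C."""
    total = 0.0
    for day, t in enumerate(tmean_c, start=1):
        total += max(0.0, t - t_base)       # only warmth above the base counts
        if total >= threshold:
            return day
    return None                             # threshold not reached in the series

def dts_flowering_day(tmean_k, ea=75000.0, ts=281.15, r=8.314, threshold=30.0):
    """DTS method: each day counts as exp(Ea*(T - Ts)/(R*T*Ts)) 'standard days'
    at the standard temperature Ts (Arrhenius weighting; constants illustrative).
    tmean_k: iterable of daily mean temperatures in kelvin."""
    total = 0.0
    for day, t in enumerate(tmean_k, start=1):
        total += math.exp(ea * (t - ts) / (r * t * ts))
        if total >= threshold:
            return day
    return None
```

    The advance in flowering date reported in the abstract corresponds to warmer series crossing the threshold earlier, which the sketch reproduces directly.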

  17. The Fight for Fusion: A Modern Nuclear War.

    ERIC Educational Resources Information Center

    Rogers, Adam; Sereda, David

    1992-01-01

    Describes the work of Bogdan Maglich with helium-based fusion and barriers to its development resulting from lack of government support, competition for funding, and political pet projects. Compares tritium-based to helium-based fusion and the potential for nonradioactive nuclear power to supply the world's energy requirements with no negative…

  18. The WASCAL regional climate simulations for West Africa - how to add value to existing climate projections

    NASA Astrophysics Data System (ADS)

    Arnault, J.; Heinzeller, D.; Klein, C.; Dieng, D.; Smiatek, G.; Bliefernicht, J.; Sylla, M. B.; Kunstmann, H.

    2015-12-01

    With climate change being one of the most severe challenges to rural Africa in the 21st century, West Africa is facing an urgent need to develop effective adaptation and mitigation measures to protect its constantly growing population. WASCAL (West African Science Service Center on Climate Change and Adapted Land Use) is a large-scale research-focused program designed to enhance the resilience of human and environmental systems to climate change and increased variability. An integral part of its climate services is the provisioning of a new set of high resolution, ensemble-based regional climate change scenarios for the region of West Africa. In this contribution, we present the overall concept of the WASCAL regional climate projections and provide information on the dissemination of the data. We discuss the model performance over the validation period for two of the three regional climate models employed, the Weather Research and Forecasting (WRF) model and the COSMO model in Climate Mode (COSMO-CLM) of the Consortium for Small-scale Modeling, and give details about a novel precipitation database used to verify the models. Particular attention is paid to the representation of the dynamics of the West African Summer Monsoon and to the added value of our high resolution models over existing data sets. We further present results on the climate change signal obtained from the WRF model runs for the periods 2020-2050 and 2070-2100 and compare them to current state-of-the-art projections from the CORDEX project. As an example, the figure shows the different climate change signals obtained for the total annual rainfall with respect to the 1980-2010 mean (WRF-E: WASCAL 12km high-resolution run MPI-ESM + WRFV3.5.1, CORDEX-E: 50km medium-resolution run MPI-ESM + RCA4, CORDEX-G: 50km medium-resolution run GFDL-ESM + RCA4).

  19. Analyzing and Projecting U.S. Wildfire Potential Based on NARCCAP Regional Climate Simulations

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Mearns, L. O.

    2012-12-01

    Wildfires usually ignite and spread under hot, dry, and windy conditions. Wildfires, especially catastrophic mega-fires, have increased in recent decades in the United States and other parts of the world. Among the converging factors were extreme weather events such as extended drought. Furthermore, climate has been projected to become warmer worldwide and drier with more frequent droughts in many subtropical and mid-latitude regions including parts of the U.S. due to the greenhouse effect. As a result, wildfires are expected to increase in the future. This study analyzes current features and projects future trends of wildfire potential in the continental United States. Fire potential is measured by fire indices including the Keetch-Byram Drought Index and the Fosberg Fire Weather Index. The meteorological data used to calculate the fire indices are the dynamical downscaling produced by the North American Regional Climate Change Assessment Program (NARCCAP). Current fire potential generally increases from the eastern to the western coast and from the cool to the warm season. Fire potential has large seasonal and inter-annual variability and spatial connections. Fire potential has shown overall increasing trends in recent decades. The trends are projected to continue this century due to the greenhouse effect. Future fire potential will increase significantly in the Rocky Mountains in all seasons and in the Southeast during summer and autumn. Future climate change will also reduce the windows of prescribed burning, which is one of the forest management tools for reducing wildfire risks. The research results are expected to provide useful information for assessing the ecological, environmental, and social impacts of future wildfires and developing mitigation strategies.
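
    The Keetch-Byram Drought Index used above is a daily bookkeeping of soil moisture deficiency on a 0-800 scale (hundredths of an inch of deficit). A sketch of the standard daily update follows, with maximum temperature in degrees Fahrenheit and rainfall in inches; the coefficients and the 0.20 in rainfall threshold follow the usual published formulation, reproduced here as an illustration rather than as the NARCCAP processing code:

```python
import math

def kbdi_series(tmax_f, rain_in, annual_rain_in=45.0, kbdi0=100.0):
    """Daily Keetch-Byram Drought Index (0 = saturated soil, 800 = extreme
    drought). tmax_f: daily maximum temperature (deg F); rain_in: daily
    rainfall (inches); annual_rain_in: mean annual rainfall (inches)."""
    kbdi = kbdi0
    carry = 0.0                     # consecutive rainfall, for the 0.20 in threshold
    out = []
    for t, r in zip(tmax_f, rain_in):
        if r > 0.0:
            # only net rainfall beyond 0.20 in of a consecutive wet spell
            # reduces the index (100 index points per inch of net rain)
            net = max(0.0, carry + r - 0.20) - max(0.0, carry - 0.20)
            carry += r
            kbdi = max(0.0, kbdi - 100.0 * net)
        else:
            carry = 0.0             # dry day resets the wet-spell accumulator
        # daily drought factor: evapotranspiration driven by max temperature,
        # damped by climatological annual rainfall
        df = (800.0 - kbdi) * (0.968 * math.exp(0.0486 * t) - 8.30) * 1e-3 \
             / (1.0 + 10.88 * math.exp(-0.0441 * annual_rain_in))
        kbdi = min(800.0, kbdi + max(0.0, df))
        out.append(kbdi)
    return out
```

    Hot rain-free spells drive the index toward 800, while a soaking rain pulls it sharply down, which is exactly the behavior the fire-potential analysis above exploits.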

  20. Materials research for fusion

    NASA Astrophysics Data System (ADS)

    Knaster, J.; Moeslang, A.; Muroga, T.

    2016-05-01

    Fusion materials research started in the early 1970s following the observation of the degradation of irradiated materials used in the first commercial fission reactors. The technological challenges of fusion energy are intimately linked with the availability of suitable materials capable of reliably withstanding the extremely severe operational conditions of fusion reactors. Although fission and fusion materials exhibit common features, fusion materials research is broader. The harder mono-energetic spectrum associated with the deuterium-tritium fusion neutrons (14.1 MeV compared to <2 MeV on average for fission neutrons) releases significant amounts of hydrogen and helium as transmutation products that might lead to a (at present undetermined) degradation of structural materials after a few years of operation. Overcoming the historical lack of a fusion-relevant neutron source for materials testing is an essential pending step in fusion roadmaps. Structural materials development, together with research on functional materials capable of sustaining unprecedented power densities during plasma operation in a fusion reactor, have been the subject of decades of worldwide research efforts underpinning the present maturity of the fusion materials research programme.

  1. AeroCom INSITU Project: Comparison of Aerosol Optical Properties from In-situ Surface Measurements and Model Simulations

    NASA Astrophysics Data System (ADS)

    Schmeisser, L.; Andrews, E.; Schulz, M.; Fiebig, M.; Zhang, K.; Randles, C. A.; Myhre, G.; Chin, M.; Stier, P.; Takemura, T.; Krol, M. C.; Bian, H.; Skeie, R. B.; da Silva, A. M., Jr.; Kokkola, H.; Laakso, A.; Ghan, S.; Easter, R. C.

    2015-12-01

    AeroCom, an open international collaboration of scientists seeking to improve global aerosol models, recently initiated a project comparing model output to in-situ, surface-based measurements of aerosol optical properties. The model/measurement comparison project, called INSITU, aims to evaluate the performance of a suite of AeroCom aerosol models with site-specific observational data in order to inform iterative improvements to model aerosol modules. Surface in-situ data have the unique property of being traceable to physical standards, which is a significant asset in accomplishing the overarching goal of bettering the accuracy of aerosol processes and the predictive capability of global climate models. The INSITU project looks at how well models reproduce aerosol climatologies on a variety of time scales, aerosol characteristics and behaviors (e.g., aerosol persistence and the systematic relationships between aerosol optical properties), and aerosol trends. Though INSITU is a multi-year endeavor, preliminary phases of the analysis, using GOCART and other models participating in this AeroCom project, show substantial model biases in absorption and scattering coefficients compared to surface measurements, though the sign and magnitude of the bias vary with location and optical property. Spatial patterns in the biases highlight model weaknesses, e.g., the inability of models to properly simulate aerosol characteristics at sites with complex topography (see Figure 1). Additionally, differences in modeled and measured systematic variability of aerosol optical properties suggest that some models are not accurately capturing specific aerosol co-dependencies, for example, the tendency of in-situ surface single scattering albedo to decrease with decreasing aerosol extinction coefficient. 
This study elucidates specific problems with current aerosol models and suggests additional model runs and perturbations that could further evaluate the discrepancies between measured and modeled

  2. INTRODUCTION: Status report on fusion research

    NASA Astrophysics Data System (ADS)

    Burkart, Werner

    2005-10-01

    members' personal views on the latest achievements in fusion research, including magnetic and inertial confinement scenarios. The report describes fusion fundamentals and progress in fusion science and technology, with ITER as a possible partner in the realization of self-sustainable burning plasma. The importance of the socio-economic aspects of energy production using fusion power plants is also covered. Noting that applications of plasma science are of broad interest to the Member States, the report addresses the topic of plasma physics to assist in understanding the achievements of better coatings, cheaper light sources, improved heat-resistant materials and other high-technology materials. Nuclear fusion energy production is intrinsically safe, but for ITER the full range of hazards will need to be addressed, including minimising radiation exposure, to accomplish the goal of a sustainable and environmentally acceptable production of energy. We anticipate that the role of the Agency will in future evolve from supporting scientific projects and fostering information exchange to the preparation of safety principles and guidelines for the operation of burning fusion plasmas with a Q > 1. Technical progress in inertial and magnetic confinement, as well as in alternative concepts, will lead to a further increase in international cooperation. New means of communication will be needed, utilizing the best resources of modern information technology to advance interest in fusion. However, today the basis of scientific progress is still through journal publications and, with this in mind, we trust that this report will find an interested readership. We acknowledge with thanks the support of the members of the IFRC as an advisory body to the Agency. 
Seven chairmen have presided over the IFRC since its first meeting in 1971 in Madison, USA, ensuring that the IAEA fusion efforts were based on the best professional advice possible, and that information on fusion developments has

  3. The Ohio River Valley CO2 Storage Project AEP Mountaineer Plant, West Virginia Numerical Simulation and Risk Assessment Report

    SciTech Connect

    Neeraj Gupta

    2008-03-31

    A series of numerical simulations of carbon dioxide (CO{sub 2}) injection were conducted as part of a program to assess the potential for geologic sequestration in deep geologic reservoirs (the Rose Run and Copper Ridge formations) at the American Electric Power (AEP) Mountaineer Power Plant outside of New Haven, West Virginia. The simulations were executed using the H{sub 2}O-CO{sub 2}-NaCl operational mode of the Subsurface Transport Over Multiple Phases (STOMP) simulator (White and Oostrom, 2006). The objective of the Rose Run formation modeling was to predict CO{sub 2} injection rates using data from the core analysis conducted on the samples. A systematic screening procedure was applied to the Ohio River Valley CO{sub 2} storage site utilizing the Features, Elements, and Processes (FEP) database for geological storage of CO{sub 2} (Savage et al., 2004). The objective of the screening was to identify potential risk categories for the long-term geological storage of CO{sub 2} at the Mountaineer Power Plant in New Haven, West Virginia. Over 130 FEPs in seven main classes were assessed for the project based on site characterization information gathered in a geological background study, testing in a deep well drilled on the site, and general site conditions. In evaluating the database, it was apparent that many of the items were not applicable to the Mountaineer site based on its geologic framework and environmental setting. Nine FEPs were identified for further consideration for the site. These FEPs generally fell into categories related to variations in subsurface geology, well completion materials, and the behavior of CO{sub 2} in the subsurface. Results from the screening were used to provide guidance on injection system design, developing a monitoring program, performing reservoir simulations, and other risk assessment efforts. Initial work indicates that the significant FEPs may be accounted for by focusing the storage program on these potential issues. The

  4. Using Discrete Event Simulation to predict KPI's at a Projected Emergency Room.

    PubMed

    Concha, Pablo; Neriz, Liliana; Parada, Danilo; Ramis, Francisco

    2015-01-01

    Discrete Event Simulation (DES) is a powerful tool in the design of clinical facilities. DES enables facilities to be built or adapted to achieve the expected Key Performance Indicators (KPIs), such as average waiting times according to acuity, average stay times, and others. Our computational model was built and validated using expert judgment and supporting statistical data. One scenario studied resulted in a 50% decrease in the average cycle time of patients compared to the original model, mainly by modifying the patient's attention model. PMID:26262262
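
    The core of a DES model like the one described is an event queue: arrival and service-completion events are processed in time order, and KPIs such as waiting time fall out of the bookkeeping. The sketch below is a minimal illustration with a FIFO queue and a pool of doctors; the arrival rate, service rate, and staffing level are hypothetical placeholders, not the parameters of the validated ER model:

```python
import heapq
import random
import statistics

def simulate_er(n_patients=200, mean_iat=10.0, mean_service=8.0,
                doctors=2, seed=1):
    """Minimal discrete-event ER simulation: Poisson arrivals, exponential
    service, FIFO queue, `doctors` identical servers. Returns the average
    waiting time (arrival -> start of treatment), in the same time units."""
    rng = random.Random(seed)
    events = []                 # heap of (time, seq, kind, patient_id)
    seq = 0
    t = 0.0
    for pid in range(n_patients):
        t += rng.expovariate(1.0 / mean_iat)
        heapq.heappush(events, (t, seq, "arrive", pid)); seq += 1
    free = doctors
    queue = []                  # waiting patients, FIFO
    arrival = {}
    waits = []
    while events:
        now, _, kind, pid = heapq.heappop(events)
        if kind == "arrive":
            arrival[pid] = now
            queue.append(pid)
        else:                   # "done": a doctor frees up
            free += 1
        while free > 0 and queue:   # dispatch as long as doctors are idle
            nxt = queue.pop(0)
            waits.append(now - arrival[nxt])
            free -= 1
            heapq.heappush(events,
                           (now + rng.expovariate(1.0 / mean_service),
                            seq, "done", nxt)); seq += 1
    return statistics.mean(waits)
```

    Scenario analysis of the kind the abstract reports amounts to re-running this loop with modified parameters (staffing, routing, service model) and comparing the resulting KPIs.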

  5. Review of the Fusion Theory and Computing Program. Fusion Energy Sciences Advisory Committee (FESAC)

    SciTech Connect

    Antonsen, Thomas M.; Berry, Lee A.; Brown, Michael R.; Dahlburg, Jill P.; Davidson, Ronald C.; Greenwald, Martin; Hegna, Chris C.; McCurdy, William; Newman, David E.; Pellegrini, Claudio; Phillips, Cynthia K.; Post, Douglass E.; Rosenbluth, Marshall N.; Sheffield, John; Simonen, Thomas C.; Van Dam, James

    2001-08-01

    At the November 14-15, 2000, meeting of the Fusion Energy Sciences Advisory Committee, a Panel was set up to address questions about the Theory and Computing program, posed in a charge from the Office of Fusion Energy Sciences (see Appendix A). The area of theory and computing/simulations had been considered at the FESAC Knoxville meeting of 1999 and in the deliberations of the Integrated Program Planning Activity (IPPA) in 2000. A National Research Council committee provided a detailed review of the scientific quality of the fusion energy sciences program, including theory and computing, in 2000.

  6. Overview of theory and modeling in the heavy ion fusion virtual national laboratory

    NASA Astrophysics Data System (ADS)

    Davidson, R. C.; Kaganovich, I. D.; Lee, W. W.; Qin, H.; Startsev, E. A.; Tzenov, S.; Friedman, A.; Barnard, J. J.; Cohen, R. H.; Grote, D. P.; Lund, S. M.; Sharp, W. M.; Celata, C. M.; de Hoon, M.; Henestroza, E.; Lee, E. P.; Yu, S. S.; Vay, J.-L.; Welch, D. R.; Rose, D. V.; Olson, C. L.

    2002-07-01

    This article presents analytical and simulation studies of intense heavy ion beam propagation, including the injection, acceleration, transport and compression phases, and beam transport and focusing in background plasma in the target chamber. Analytical theory and simulations that support the High Current Experiment (HCX), the Neutralized Transport Experiment (NTX), and the advanced injector development program, are being used to provide a basic understanding of the nonlinear beam dynamics and collective processes, and to develop design concepts for the next-step Integrated Beam Experiment (IBX), an Integrated Research Experiment (IRE), and a heavy ion fusion driver. Three-dimensional nonlinear perturbative simulations have been applied to collective instabilities driven by beam temperature anisotropy, and to two-stream interactions between the beam ions and any unwanted background electrons; three-dimensional particle-in-cell simulations of the 2-MV electrostatic quadrupole (ESQ) injector have clarified the influence of pulse rise time; analytical studies and simulations of the drift compression process have been carried out; syntheses of a four-dimensional particle distribution function from phase-space projections have been developed; and studies of the generation and trapping of stray electrons in the beam self-fields have been performed. Particle-in-cell simulations, involving preformed plasma, are being used to study the influence of charge and current neutralization on the focusing of the ion beam in NTX and in a fusion chamber.

  7. Overview of theory and modeling in the Heavy Ion Fusion Virtual National Laboratory

    SciTech Connect

    Davidson, R.C.; Kaganovich, I.D.; Lee, W.W.; Qin, H.; Startsev, E.A.; Tzenov, S.; Friedman, A.; Barnard, J.J.; Cohen, R.H.; Grote, D.P.; Lund, S.M.; Sharp, W.M.; Celata, C.M.; de Hoon, M.; Henestroza, E.; Lee, E.P.; Yu, S.S.; Vay, J-L.; Welch, D.R.; Rose, D.V.; Olson, C.L.

    2002-05-01

    This paper presents analytical and simulation studies of intense heavy ion beam propagation, including the injection, acceleration, transport and compression phases, and beam transport and focusing in background plasma in the target chamber. Analytical theory and simulations that support the High Current Experiment (HCX), the Neutralized Transport Experiment (NTX), and the advanced injector development program, are being used to provide a basic understanding of the nonlinear beam dynamics and collective processes, and to develop design concepts for the next-step Integrated Beam Experiment (IBX), an Integrated Research Experiment (IRE), and a heavy ion fusion driver. 3-D nonlinear perturbative simulations have been applied to collective instabilities driven by beam temperature anisotropy, and to two-stream interactions between the beam ions and any unwanted background electrons; 3-D particle-in-cell simulations of the 2 MV Electrostatic Quadrupole (ESQ) injector have clarified the influence of pulse rise time; analytical studies and simulations of the drift compression process have been carried out; syntheses of a 4-D particle distribution function from phase-space projections have been developed; and studies of the generation and trapping of stray electrons in the beam self fields have been performed. Particle-in-cell simulations, involving pre-formed plasma, are being used to study the influence of charge and current neutralization on the focusing of the ion beam in NTX and in a fusion chamber.

  8. Overview of Theory and Modeling in the Heavy Ion Fusion Virtual National Laboratory

    SciTech Connect

    Davidson, R. C.; Kaganovich, I. D.; Lee, W. W.; Qin, H.; Startsev, E. A.; Tzenov, S; Friedman, A; Barnard, J J; Cohen, R H; Grote, D P; Lund, S M; Sharp, W M; Henestroza, E; Lee, E P; Yu, S S; Vay, J -L; Welch, D R; Rose, D V; Olson, C L; Celata, C. M.

    2003-04-09

    This paper presents analytical and simulation studies of intense heavy ion beam propagation, including the injection, acceleration, transport and compression phases, and beam transport and focusing in background plasma in the target chamber. Analytical theory and simulations that support the High Current Experiment (HCX), the Neutralized Transport Experiment (NTX), and the advanced injector development program are being used to provide a basic understanding of the nonlinear beam dynamics and collective processes, and to develop design concepts for the next-step Integrated Beam Experiment (IBX), an Integrated Research Experiment (IRE), and a heavy ion fusion driver. Three-dimensional (3-D) nonlinear perturbative simulations have been applied to collective instabilities driven by beam temperature anisotropy and to two-stream interactions between the beam ions and any unwanted background electrons. Three-dimensional particle-in-cell simulations of the 2 MV Electrostatic Quadrupole (ESQ) injector have clarified the influence of pulse rise time. Analytical studies and simulations of the drift compression process have been carried out. Syntheses of a four-dimensional (4-D) particle distribution function from phase-space projections have been developed. And, studies of the generation and trapping of stray electrons in the beam self-fields have been performed. Particle-in-cell simulations, involving preformed plasma, are being used to study the influence of charge and current neutralization on the focusing of the ion beam in Neutralized Transport Experiment and in a fusion chamber.

  9. Numerical simulation of hole-closure experiments on a large centrifuge. [Subseabed Disposal Project

    SciTech Connect

    McTigue, D.F.; Sutherland, H.J.; Dawson, P.R.

    1987-04-01

    The creep closure of a slot in water-saturated clay has been simulated numerically with the finite-element computer code NEPTUNE. The calculations model scaled experiments performed on the Sandia 25-foot centrifuge at 90 g. The material model used represents the clay matrix as a linearly viscous fluid, the volumetric deformation rate of which depends upon the "effective" mean normal stress. Simulations have been run for a range of material properties and a variety of possible boundary conditions. The calculations indicate that the deformation is dominated by shearing, while volumetric creep is relatively unimportant, i.e., the characteristic time for the deformation is short in comparison to that for the flow of interstitial fluid. Velocities of the order observed in the centrifuge experiments (≈1 m/s) are calculated for a shear viscosity of 10³ Pa·s, which is of a magnitude consistent with values reported in the literature. The overall kinematics of the experimental flow field is reproduced satisfactorily for reasonable boundary conditions on the material in the neighborhood of the hole. The calculations demonstrate that the formulation implemented in the NEPTUNE computer code is able to predict certain observable aspects of the creep deformation of water-saturated clay. Thus, the comparison between experimental observations and computational results provides partial validation of the code.

  10. The immersed boundary projection method and its application to simulation and control of flows around low-aspect-ratio wings

    NASA Astrophysics Data System (ADS)

    Taira, Kunihiko

    First, we present a new formulation of the immersed boundary method that is algebraically identical to the traditional fractional step algorithm. This method, called the immersed boundary projection method, allows for the simulations of incompressible flows over arbitrarily shaped bodies under motion and/or deformation in both two and three dimensions. The no-slip condition along the immersed boundary is enforced simultaneously with the incompressibility constraint through a single projection. The boundary force is determined implicitly without any constitutive relations for the rigid body formulation, which in turn allows the use of high CFL numbers in our simulations compared to past methods. Next, the above immersed boundary projection method is used to analyze three-dimensional separated flows around low-aspect-ratio flat-plate wings. A number of simulations highlighting the unsteady nature of the separated flows are performed for Re=300 and 500 with various aspect ratios, angles of attack, and planform geometries. The aspect ratio and angle of attack are found to have a large influence on the stability of the wake profile and the force experienced by the low-aspect-ratio wing. At early times, following an impulsive start, topologies of the wake vortices are found to be the same across different aspect ratios and angles of attack. Behind low-aspect-ratio rectangular plates, leading-edge vortices form and eventually separate as hairpin vortices following the start-up. This phenomenon is found to be similar to dynamic stall observed behind pitching plates. The detached structure would then interact with the tip vortices, reducing the downward velocity induced by the tip vortices acting upon the leading-edge vortex. At large time, depending on the aspect ratio and angles of attack, the wakes reach one of the three states: (i) a steady state, (ii) a periodic unsteady state, or (iii) an aperiodic unsteady state. We have observed that the tip effects in three

  11. Using simulated historical time series to prioritize fuel treatments on landscapes across the United States: The LANDFIRE prototype project

    USGS Publications Warehouse

    Keane, R.E.; Rollins, M.; Zhu, Z.-L.

    2007-01-01

    Canopy and surface fuels in many fire-prone forests of the United States have increased over the last 70 years as a result of modern fire exclusion policies, grazing, and other land management activities. The Healthy Forest Restoration Act and National Fire Plan establish a national commitment to reduce fire hazard and restore fire-adapted ecosystems across the USA. The primary index used to prioritize treatment areas across the nation is Fire Regime Condition Class (FRCC), computed as the departure of current conditions from historical fire and landscape conditions. This paper describes a process that uses an extensive set of ecological models to map FRCC from a departure statistic computed from simulated time series of historical landscape composition. This mapping process uses a data-driven, biophysical approach where georeferenced field data, biogeochemical simulation models, and spatial data libraries are integrated using spatial statistical modeling to map environmental gradients that are then used to predict vegetation and fuels characteristics over space. These characteristics are then fed into a landscape fire and succession simulation model to simulate a time series of historical landscape compositions, which are compared to the composition of current landscapes to compute departure and the FRCC values. Intermediate products from this process are then used to create ancillary vegetation, fuels, and fire regime layers that are useful in the eventual planning and implementation of fuel and restoration treatments at local scales. The complex integration of varied ecological models at different scales is described, and problems encountered during the implementation of this process in the LANDFIRE prototype project are addressed. © 2007 Elsevier B.V. All rights reserved.
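    The departure statistic at the heart of this process can be sketched in a few lines. The overlap-of-compositions formula below is the commonly published FRCC departure measure (100 minus the summed class-by-class overlap of current and reference percentages); it is not necessarily the exact LANDFIRE implementation, and the class labels and percentages are hypothetical.

```python
def frcc_departure(current, historical):
    """Departure of a current landscape composition from a simulated
    historical (reference) composition.

    Both inputs map succession-class labels to percent of landscape
    area (each summing to ~100). Similarity is the overlap of the two
    distributions; departure = 100 - similarity.
    """
    classes = set(current) | set(historical)
    similarity = sum(min(current.get(c, 0.0), historical.get(c, 0.0))
                     for c in classes)
    return 100.0 - similarity

# Hypothetical succession-class percentages, for illustration only.
hist = {"early": 30.0, "mid-open": 25.0, "mid-closed": 20.0, "late": 25.0}
curr = {"early": 10.0, "mid-open": 15.0, "mid-closed": 50.0, "late": 25.0}
print(frcc_departure(curr, hist))  # → 30.0
```

In typical FRCC usage the departure value is then binned into condition classes (roughly: low, moderate, and high departure) to prioritize treatment areas.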

  12. Muon Catalyzed Fusion

    NASA Technical Reports Server (NTRS)

    Armour, Edward A.G.

    2007-01-01

    Muon catalyzed fusion is a process in which a negatively charged muon combines with two nuclei of isotopes of hydrogen, e.g., a proton and a deuteron or a deuteron and a triton, to form a muonic molecular ion in which the binding is so tight that nuclear fusion occurs. The muon is normally released after fusion has taken place and so can catalyze further fusions. As the muon has a mean lifetime of 2.2 microseconds, this is the maximum period over which a muon can participate in this process. This article gives an outline of the history of muon catalyzed fusion from 1947, when it was first realised that such a process might occur, to the present day. It includes a description of the contribution that Drachman has made to the theory of muon catalyzed fusion and the influence this has had on the author's research.

  13. Status of fusion maintenance

    SciTech Connect

    Fuller, G.M.

    1984-01-01

    Effective maintenance will be an essential ingredient in determining fusion system productivity. This level of productivity will result only after close attention is paid to the entire system as an entity and appropriate integration of the elements is made. The status of fusion maintenance is reviewed in the context of the entire system. While there are many challenging developmental tasks ahead in fusion maintenance, the required technologies are available in several high-technology industries, including nuclear fission.

  14. Fusion: The controversy continues

    SciTech Connect

    1989-07-01

    Nuclear fusion-the power of the stars that promises mankind an inexhaustible supply of energy-seems concurrently much closer and still distant this month. The recent flurry of announcements concerning the achievement of a cold fusion reaction has-if nothing else-underscored the historic importance of the basic fusion reaction which uses hydrogen ions to fuel an energy-producing reaction.

  15. Community Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    SciTech Connect

    Spentzouris, P.; Cary, J.; McInnes, L.C.; Mori, W.; Ng, C.; Ng, E.; Ryne, R.; /LBL, Berkeley

    2011-11-14

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modeling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single-physics process modeling (covered under SciDAC-1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors. ComPASS is in the first year of executing its plan to develop the next-generation HPC accelerator modeling tools. ComPASS aims to develop an integrated simulation environment that will utilize existing and new accelerator physics modules with petascale capabilities, by employing modern computing and solver technologies. The ComPASS vision is to deliver to accelerator scientists a virtual accelerator and virtual prototyping modeling environment, with the necessary multiphysics, multiscale capabilities. The plan for this development includes delivering accelerator modeling applications appropriate for each stage of the ComPASS software evolution. Such applications are already being used to address challenging problems in accelerator design and optimization. The ComPASS organization

  16. The Aquila comparison project: the effects of feedback and numerical methods on simulations of galaxy formation

    NASA Astrophysics Data System (ADS)

    Scannapieco, C.; Wadepuhl, M.; Parry, O. H.; Navarro, J. F.; Jenkins, A.; Springel, V.; Teyssier, R.; Carlson, E.; Couchman, H. M. P.; Crain, R. A.; Dalla Vecchia, C.; Frenk, C. S.; Kobayashi, C.; Monaco, P.; Murante, G.; Okamoto, T.; Quinn, T.; Schaye, J.; Stinson, G. S.; Theuns, T.; Wadsley, J.; White, S. D. M.; Woods, R.

    2012-06-01

    We compare the results of various cosmological gas-dynamical codes used to simulate the formation of a galaxy in the Λ cold dark matter structure formation paradigm. The various runs (13 in total) differ in their numerical hydrodynamical treatment [smoothed particle hydrodynamics (SPH), moving mesh and adaptive mesh refinement] but share the same initial conditions and adopt in each case their latest published model of gas cooling, star formation and feedback. Despite the common halo assembly history, we find large code-to-code variations in the stellar mass, size, morphology and gas content of the galaxy at z= 0, due mainly to the different implementations of star formation and feedback. Compared with observation, most codes tend to produce an overly massive galaxy, smaller and less gas rich than typical spirals, with a massive bulge and a declining rotation curve. A stellar disc is discernible in most simulations, although its prominence varies widely from code to code. There is a well-defined trend between the effects of feedback and the severity of the disagreement with observed spirals. In general, models that are more effective at limiting the baryonic mass of the galaxy come closer to matching observed galaxy scaling laws, but often to the detriment of the disc component. Although numerical convergence is not particularly good for any of the codes, our conclusions hold at two different numerical resolutions. Some differences can also be traced to the different numerical techniques; for example, more gas seems able to cool and become available for star formation in grid-based codes than in SPH. However, this effect is small compared to the variations induced by different feedback prescriptions. We conclude that state-of-the-art simulations cannot yet uniquely predict the properties of the baryonic component of a galaxy, even when the assembly history of its host halo is fully specified. 
Developing feedback algorithms that can effectively regulate the mass of

  17. Energy Simulations of Commercial Buildings for DOE’s Standards Development Projects

    SciTech Connect

    Somasundaram, Sriram; Winiarski, David W.; Taylor, Zachary T.; Jarnagin, Ronald E.

    2006-01-01

    The U.S. Department of Energy (DOE) has been mandated by the U.S. Congress to promulgate energy conservation standards for certain commercial and industrial equipment [Energy Policy and Conservation Act, 42 United States Code 6311 et seq. (EPCA)], in particular specific classes of commercial space conditioning and service water heating equipment. In support of the DOE rulemakings that help establish these standards, Pacific Northwest National Laboratory (PNNL) conducted energy simulation analysis to develop energy consumption characteristics and energy load profiles for commercial buildings. DOE uses life-cycle cost effectiveness as a key criterion in establishing energy conservation standards. In the U.S., however, electrical energy costs for commercial buildings can vary by time of day or year, and peak electrical demand can play a significant role in determining the total cost of energy for a commercial building. Hence, it is important to understand not only total electrical energy consumption but also building electric load profiles during the year.

  18. Is social projection based on simulation or theory? Why new methods are needed for differentiating

    PubMed Central

    Bazinger, Claudia; Kühberger, Anton

    2012-01-01

    The literature on social cognition reports many instances of a phenomenon titled ‘social projection’ or ‘egocentric bias’. These terms indicate egocentric predictions, i.e., an over-reliance on the self when predicting the cognition, emotion, or behavior of other people. The classic method to diagnose egocentric prediction is to establish high correlations between our own and other people's cognition, emotion, or behavior. We argue that this method is incorrect because there is a different way to come to a correlation between own and predicted states, namely, through the use of theoretical knowledge. Thus, the use of correlational measures is not sufficient to identify the source of social predictions. Based on the distinction between simulation theory and theory theory, we propose the following alternative methods for inferring prediction strategies: independent vs. juxtaposed predictions, the use of ‘hot’ mental processes, and the use of participants’ self-reports. PMID:23209342

  19. Fusion programs in applied plasma physics and development and technology at GA Technologies, Inc.

    NASA Astrophysics Data System (ADS)

    Overskei, D. O.

    1988-01-01

    Research carried out by GA for the Department of Energy Office of Fusion Energy provides key information and insight necessary for the development of fusion power systems. Highlights of the fusion theory effort described in this report include progress in numerical simulations of turbulent transport in tokamak plasmas, extension of novel theories of the H-mode, development and application of advanced codes for evaluating ECRF current drive efficiency, and new understanding and techniques for dealing with high-beta tokamak equilibria. Experimental plasma research efforts are addressing several important issues in fusion research. Neutron and alpha-particle spectroscopy and triton confinement diagnostics are being developed to enable fusion researchers to understand alpha-particle confinement and slowdown in burning plasmas. Development of Li beam diagnostic systems continued and has shown a capability for measuring both magnetic field pitch angle and relative current density profiles. Experiments on Ergodic Magnetic Divertor (EMD) phenomena on the Texas Experimental Tokamak (TEXT) continued to demonstrate low plasma edge temperatures and impurity reduction that make the concept attractive for reactor applications. GA led efforts continuing the Resonant Island Divertor (RID) experiments on TEXT using the EMD as a controlled magnetic perturbation. Research carried out in GA's Development and Technology programs included reactor systems design studies and development of ferritic steels suitable for use as a structural material in fusion reactors. In the reactor systems design area, GA participated in the TITAN Reversed-Field Pinch (RFP) Reactor Design Study. GA is responsible for project operation, safety design and analysis, and blanket shield neutronics calculations for this study.

  20. Magnetic-confinement fusion

    NASA Astrophysics Data System (ADS)

    Ongena, J.; Koch, R.; Wolf, R.; Zohm, H.

    2016-05-01

    Our modern society requires environmentally friendly solutions for energy production. Energy can be released not only from the fission of heavy nuclei but also from the fusion of light nuclei. Nuclear fusion is an important option for a clean and safe solution for our long-term energy needs. The extremely high temperatures required for the fusion reaction are routinely realized in several magnetic-fusion machines. Since the early 1990s, up to 16 MW of fusion power has been released in pulses of a few seconds, corresponding to a power multiplication close to break-even. Our understanding of the very complex behaviour of a magnetized plasma at temperatures between 150 and 200 million °C surrounded by cold walls has also advanced substantially. This steady progress has resulted in the construction of ITER, a fusion device with a planned fusion power output of 500 MW in pulses of 400 s. ITER should provide answers to remaining important questions on the integration of physics and technology, through a full-size demonstration of a tenfold power multiplication, and on nuclear safety aspects. Here we review the basic physics underlying magnetic fusion: past achievements, present efforts and the prospects for future production of electrical energy. We also discuss questions related to the safety, waste management and decommissioning of a future fusion power plant.

  1. Meteorite fusion crust variability.

    NASA Astrophysics Data System (ADS)

    Thaisen, Kevin G.; Taylor, Lawrence A.

    2009-06-01

    Two assumptions commonly employed in meteorite interpretation are that fusion crust compositions represent the bulk-rock chemistry of the interior meteorite and that the vesicles within the fusion crust result from the release of implanted solar wind volatiles. Electron microprobe analyses of thin sections from lunar meteorite Miller Range (MIL) 05035 and eucrite Bates Nunataks (BTN) 00300 were performed to determine if the chemical compositions of the fusion crust varied and/or represented the published bulk-rock composition. It was determined that fusion crust compositions are significantly influenced by the incorporation of fragments from the substrate, and by the composition and grain size of those minerals. Because of compositional heterogeneities throughout the meteorite, one cannot assume that fusion crust composition represents the bulk-rock composition. If the compositional variability within the fusion crust and mineralogical differences among thin sections go unnoticed, then the perceived composition and petrogenetic models of formation will be incorrect. The formation of vesicles within these fusion crusts was also compared to current theories attributing vesicles to a solar wind origin. Previous work from the STONE-5 experiment, where terrestrial rocks were exposed on the exterior of a spacecraft heat shield, produced a vesicular fusion crust without prolonged exposure to solar wind, suggesting that the high temperatures experienced by a meteorite during passage through the Earth's atmosphere are sufficient to cause boiling of the melt. Therefore, the assumption that all vesicles found within a fusion crust are due to the release of implanted solar wind volatiles may not be justified.

  2. Use of Generalized Fluid System Simulation Program (GFSSP) for Teaching and Performing Senior Design Projects at the Educational Institutions

    NASA Technical Reports Server (NTRS)

    Majumdar, A. K.; Hedayat, A.

    2015-01-01

    This paper describes the experience of the authors in using the Generalized Fluid System Simulation Program (GFSSP) to teach the Design of Thermal Systems class at the University of Alabama in Huntsville. GFSSP is a finite-volume-based thermo-fluid system network analysis code, developed at NASA Marshall Space Flight Center, and is extensively used in NASA, the Department of Defense, and the aerospace industry for propulsion system design, analysis, and performance evaluation. The educational version of GFSSP is freely available to all US higher-education institutions. The main purpose of the paper is to illustrate the utilization of this user-friendly code for thermal systems design and fluid engineering courses and to encourage instructors to utilize the code for class assignments as well as senior design projects.

  3. Projection on Proper elements for code control: Verification, numerical convergence, and reduced models. Application to plasma turbulence simulations

    NASA Astrophysics Data System (ADS)

    Cartier-Michaud, T.; Ghendrih, P.; Sarazin, Y.; Abiteboul, J.; Bufferand, H.; Dif-Pradalier, G.; Garbet, X.; Grandgirard, V.; Latu, G.; Norscini, C.; Passeron, C.; Tamain, P.

    2016-02-01

    The Projection on Proper elements (PoPe) is a novel method of code control dedicated to (1) checking the correct implementation of models, (2) determining the convergence of numerical methods, and (3) characterizing the residual errors of any given solution at very low cost. The basic idea is to establish a bijection between a simulation and the set of equations that generates it. Recovering the equations is direct and relies on a statistical measure of the weight of the various operators. This method can be used in any number of dimensions and any regime, including chaotic ones. It also provides a procedure to design reduced models and quantify their cost-to-benefit ratio. PoPe is applied to a kinetic and a fluid code of plasma turbulence.
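    The core idea, regressing a measured time derivative onto candidate operators so that the fitted weights recover the equation that generated the data, can be illustrated on a one-dimensional toy problem. The model, coefficients, and operator library below are invented for illustration; PoPe itself targets full plasma-turbulence codes with many operators.

```python
import numpy as np

# Generate data from du/dt = a*u + b with known coefficients.
a_true, b_true = -0.5, 1.0
dt, n = 1e-3, 5000
u = np.empty(n)
u[0] = 0.1
for k in range(n - 1):
    u[k + 1] = u[k] + dt * (a_true * u[k] + b_true)   # explicit Euler

# PoPe-style projection: regress the estimated time derivative onto the
# candidate operators {u, 1}; the fitted weights recover (a, b), and the
# residual of the fit characterizes the numerical error of the solution.
dudt = np.gradient(u, dt)
A = np.column_stack([u, np.ones(n)])
(a_est, b_est), *_ = np.linalg.lstsq(A, dudt, rcond=None)
print(a_est, b_est)  # close to -0.5 and 1.0
```

In the same spirit, dropping an operator from the library and refitting shows how much fidelity a reduced model loses, which is the cost-to-benefit quantification the abstract mentions.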

  4. Solving the chemical master equation by a fast adaptive finite state projection based on the stochastic simulation algorithm.

    PubMed

    Sidje, R B; Vo, H D

    2015-11-01

    The mathematical framework of the chemical master equation (CME) uses a Markov chain to model the biochemical reactions that are taking place within a biological cell. Computing the transient probability distribution of this Markov chain allows us to track the composition of molecules inside the cell over time, with important practical applications in a number of areas such as molecular biology or medicine. However the CME is typically difficult to solve, since the state space involved can be very large or even countably infinite. We present a novel way of using the stochastic simulation algorithm (SSA) to reduce the size of the finite state projection (FSP) method. Numerical experiments that demonstrate the effectiveness of the reduction are included.
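    The SSA the authors build on is Gillespie's algorithm, which samples exact trajectories of the Markov chain. A minimal sketch for a toy birth-death network follows (the network, rates, and horizon are illustrative assumptions, not from the paper); states visited by such runs can guide which states to keep in the truncated FSP space.

```python
import random

def ssa_birth_death(k_prod=10.0, k_deg=1.0, x0=0, t_end=50.0, seed=0):
    """Gillespie stochastic simulation algorithm (SSA) for the network
    0 -> X (rate k_prod),  X -> 0 (rate k_deg * x).

    Returns the copy number of X at time t_end.
    """
    rng = random.Random(seed)
    t, x = 0.0, x0
    while True:
        a1, a2 = k_prod, k_deg * x      # reaction propensities
        a0 = a1 + a2
        t += rng.expovariate(a0)        # exponential time to next reaction
        if t > t_end:
            return x
        if rng.random() * a0 < a1:      # choose which reaction fires
            x += 1
        else:
            x -= 1

samples = [ssa_birth_death(seed=s) for s in range(200)]
print(sum(samples) / len(samples))  # near the stationary mean k_prod/k_deg = 10
```

The FSP then solves the CME restricted to a finite set of states; seeding that set from SSA samples, as the abstract proposes, keeps the projection small while capturing where the probability mass actually lives.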

  5. The Neurona at Home project: Simulating a large-scale cellular automata brain in a distributed computing environment

    NASA Astrophysics Data System (ADS)

    Acedo, L.; Villanueva-Oller, J.; Moraño, J. A.; Villanueva, R.-J.

    2013-01-01

    The Berkeley Open Infrastructure for Network Computing (BOINC) has become the standard open-source solution for grid computing on the Internet. Volunteers use their computers to complete a small part of the task assigned by a dedicated server. We have developed a BOINC project called Neurona@Home whose objective is to simulate a cellular automata random network with, at least, one million neurons. We consider a cellular automata version of the integrate-and-fire model in which excitatory and inhibitory nodes can activate or deactivate neighbor nodes according to a set of probabilistic rules. Our aim is to determine the phase diagram of the model and its behaviour and to compare it with the electroencephalographic signals measured in real brains.
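    A toy version of such a cellular-automata network of excitatory and inhibitory nodes can be sketched as follows. The network size, threshold, and update rule here are illustrative assumptions, not the Neurona@Home model's actual parameters or rules.

```python
import random

def simulate_network(n=1000, k=10, p_exc=0.8, threshold=2, steps=100, seed=42):
    """Toy cellular-automata neural network: each node is excitatory or
    inhibitory, projects to k random neighbors, and fires on the next
    step when its summed input reaches a threshold.

    Returns the fraction of nodes active at each time step, the kind of
    aggregate signal one would compare against EEG-like traces.
    """
    rng = random.Random(seed)
    neighbors = [[rng.randrange(n) for _ in range(k)] for _ in range(n)]
    sign = [1 if rng.random() < p_exc else -1 for _ in range(n)]
    active = [rng.random() < 0.05 for _ in range(n)]   # sparse initial firing
    activity = []
    for _ in range(steps):
        inputs = [0] * n
        for i in range(n):
            if active[i]:
                for j in neighbors[i]:
                    inputs[j] += sign[i]   # excite or inhibit each target
        active = [inputs[i] >= threshold for i in range(n)]
        activity.append(sum(active) / n)
    return activity

act = simulate_network()
print(act[-1])
```

Sweeping parameters such as the excitatory fraction or the threshold and recording whether activity dies out, saturates, or oscillates is one way to map the kind of phase diagram the project is after; BOINC distributes exactly this sort of embarrassingly parallel parameter sweep.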

  6. Computational problems in magnetic fusion research

    SciTech Connect

    Killeen, J.

    1981-08-31

    Numerical calculations have had an important role in fusion research since its beginning, but the application of computers to plasma physics has advanced rapidly in the last few years. One reason for this is the increasing sophistication of the mathematical models of plasma behavior, and another is the increased speed and memory of the computers which made it reasonable to consider numerical simulation of fusion devices. The behavior of a plasma is simulated by a variety of numerical models. Some models used for short times give detailed knowledge of the plasma on a microscopic scale, while other models used for much longer times compute macroscopic properties of the plasma dynamics. The computer models used in fusion research are surveyed. One of the most active areas of research is in time-dependent, three-dimensional, resistive magnetohydrodynamic models. These codes are reviewed briefly.

  7. Simulations of DT experiments in TFTR

    SciTech Connect

    Budny, R.; Bell, M.G.; Biglari, H.; Bitter, M.; Bush, C.; Cheng, C.Z.; Fredrickson, E.; Grek, B.; Hill, K.W.; Hsuan, H.; Janos, A.; Jassby, D.L.; Johnson, D.; Johnson, L.C.; LeBlanc, B.; McCune, D.C.; Mikkelsen, D.R.; Park, H.; Ramsey, A.T.; Sabbagh, S.A.; Scott, S.; Schivell, J.; Strachan, J.D.; Stratton, B.C.; Synakowski, E.; Taylor, G.; Zarnstorff, M.C.; Zweben, S.J.

    1991-12-01

    A transport code (TRANSP) is used to simulate future deuterium-tritium (DT) experiments in TFTR. The simulations are derived from 14 TFTR DD discharges, and the modeling of one supershot is discussed in detail to indicate the degree of accuracy of the TRANSP modeling. Fusion energy yields and α-particle parameters are calculated, including profiles of the α slowing-down time, average energy, and the Alfvén speed and frequency. Two types of simulations are discussed. The main emphasis is on the DT equivalent, where an equal mix of D and T is substituted for the D in the initial target plasma and for the D⁰ in the neutral-beam injection, but the other measured beam and plasma parameters are unchanged. This simulation does not assume that α heating will enhance the plasma parameters, or that confinement will increase with T. The maximum relative fusion yield calculated for these simulations is Q_DT ≈ 0.3, and the maximum α contribution to the central toroidal β is β_α(0) ≈ 0.5%. The stability of toroidicity-induced Alfvén eigenmodes (TAE) and kinetic ballooning modes (KBM) is discussed. The TAE mode is predicted to become unstable for some of the equivalent simulations, particularly after the termination of neutral-beam injection. In the second type of simulation, empirical supershot scaling relations are used to project the performance at the maximum expected beam power. The MHD stability of the simulations is discussed.

  8. The NINJA-2 project: detecting and characterizing gravitational waveforms modelled using numerical binary black hole simulations

    NASA Astrophysics Data System (ADS)

    Aasi, J.; Abbott, B. P.; Abbott, R.; Abbott, T.; Abernathy, M. R.; Accadia, T.; Acernese, F.; Ackley, K.; Adams, C.; Adams, T.; Addesso, P.; Adhikari, R. X.; Affeldt, C.; Agathos, M.; Aggarwal, N.; Aguiar, O. D.; Ain, A.; Ajith, P.; Alemic, A.; Allen, B.; Allocca, A.; Amariutei, D.; Andersen, M.; Anderson, R.; Anderson, S. B.; Anderson, W. G.; Arai, K.; Araya, M. C.; Arceneaux, C.; Areeda, J.; Aston, S. M.; Astone, P.; Aufmuth, P.; Aulbert, C.; Austin, L.; Aylott, B. E.; Babak, S.; Baker, P. T.; Ballardin, G.; Ballmer, S. W.; Barayoga, J. C.; Barbet, M.; Barish, B. C.; Barker, D.; Barone, F.; Barr, B.; Barsotti, L.; Barsuglia, M.; Barton, M. A.; Bartos, I.; Bassiri, R.; Basti, A.; Batch, J. C.; Bauchrowitz, J.; Bauer, Th S.; Behnke, B.; Bejger, M.; Beker, M. G.; Belczynski, C.; Bell, A. S.; Bell, C.; Bergmann, G.; Bersanetti, D.; Bertolini, A.; Betzwieser, J.; Beyersdorf, P. T.; Bilenko, I. A.; Billingsley, G.; Birch, J.; Biscans, S.; Bitossi, M.; Bizouard, M. A.; Black, E.; Blackburn, J. K.; Blackburn, L.; Blair, D.; Bloemen, S.; Blom, M.; Bock, O.; Bodiya, T. P.; Boer, M.; Bogaert, G.; Bogan, C.; Bond, C.; Bondu, F.; Bonelli, L.; Bonnand, R.; Bork, R.; Born, M.; Boschi, V.; Bose, Sukanta; Bosi, L.; Bradaschia, C.; Brady, P. R.; Braginsky, V. B.; Branchesi, M.; Brau, J. E.; Briant, T.; Bridges, D. O.; Brillet, A.; Brinkmann, M.; Brisson, V.; Brooks, A. F.; Brown, D. A.; Brown, D. D.; Brückner, F.; Buchman, S.; Bulik, T.; Bulten, H. J.; Buonanno, A.; Burman, R.; Buskulic, D.; Buy, C.; Cadonati, L.; Cagnoli, G.; Calderón Bustillo, J.; Calloni, E.; Camp, J. B.; Campsie, P.; Cannon, K. C.; Canuel, B.; Cao, J.; Capano, C. D.; Carbognani, F.; Carbone, L.; Caride, S.; Castiglia, A.; Caudill, S.; Cavaglià, M.; Cavalier, F.; Cavalieri, R.; Celerier, C.; Cella, G.; Cepeda, C.; Cesarini, E.; Chakraborty, R.; Chalermsongsak, T.; Chamberlin, S. J.; Chao, S.; Charlton, P.; Chassande-Mottin, E.; Chen, X.; Chen, Y.; Chincarini, A.; Chiummo, A.; Cho, H. 
S.; Chow, J.; Christensen, N.; Chu, Q.; Chua, S. S. Y.; Chung, S.; Ciani, G.; Clara, F.; Clark, J. A.; Cleva, F.; Coccia, E.; Cohadon, P.-F.; Colla, A.; Collette, C.; Colombini, M.; Cominsky, L.; Constancio, M., Jr.; Conte, A.; Cook, D.; Corbitt, T. R.; Cordier, M.; Cornish, N.; Corpuz, A.; Corsi, A.; Costa, C. A.; Coughlin, M. W.; Coughlin, S.; Coulon, J.-P.; Countryman, S.; Couvares, P.; Coward, D. M.; Cowart, M.; Coyne, D. C.; Coyne, R.; Craig, K.; Creighton, J. D. E.; Crowder, S. G.; Cumming, A.; Cunningham, L.; Cuoco, E.; Dahl, K.; Dal Canton, T.; Damjanic, M.; Danilishin, S. L.; D'Antonio, S.; Danzmann, K.; Dattilo, V.; Daveloza, H.; Davier, M.; Davies, G. S.; Daw, E. J.; Day, R.; Dayanga, T.; Debreczeni, G.; Degallaix, J.; Deléglise, S.; Del Pozzo, W.; Denker, T.; Dent, T.; Dereli, H.; Dergachev, V.; De Rosa, R.; DeRosa, R. T.; DeSalvo, R.; Dhurandhar, S.; Díaz, M.; Di Fiore, L.; Di Lieto, A.; Di Palma, I.; Di Virgilio, A.; Donath, A.; Donovan, F.; Dooley, K. L.; Doravari, S.; Dossa, S.; Douglas, R.; Downes, T. P.; Drago, M.; Drever, R. W. P.; Driggers, J. C.; Du, Z.; Dwyer, S.; Eberle, T.; Edo, T.; Edwards, M.; Effler, A.; Eggenstein, H.; Ehrens, P.; Eichholz, J.; Eikenberry, S. S.; Endrőczi, G.; Essick, R.; Etzel, T.; Evans, M.; Evans, T.; Factourovich, M.; Fafone, V.; Fairhurst, S.; Fang, Q.; Farinon, S.; Farr, B.; Farr, W. M.; Favata, M.; Fehrmann, H.; Fejer, M. M.; Feldbaum, D.; Feroz, F.; Ferrante, I.; Ferrini, F.; Fidecaro, F.; Finn, L. S.; Fiori, I.; Fisher, R. P.; Flaminio, R.; Fournier, J.-D.; Franco, S.; Frasca, S.; Frasconi, F.; Frede, M.; Frei, Z.; Freise, A.; Frey, R.; Fricke, T. T.; Fritschel, P.; Frolov, V. V.; Fulda, P.; Fyffe, M.; Gair, J.; Gammaitoni, L.; Gaonkar, S.; Garufi, F.; Gehrels, N.; Gemme, G.; Genin, E.; Gennai, A.; Ghosh, S.; Giaime, J. A.; Giardina, K. D.; Giazotto, A.; Gill, C.; Gleason, J.; Goetz, E.; Goetz, R.; Gondan, L.; González, G.; Gordon, N.; Gorodetsky, M. L.; Gossan, S.; Goßler, S.; Gouaty, R.; Gräf, C.; Graff, P. 
B.; Granata, M.; Grant, A.; Gras, S.; Gray, C.; Greenhalgh, R. J. S.; Gretarsson, A. M.; Groot, P.; Grote, H.; Grover, K.; Grunewald, S.; Guidi, G. M.; Guido, C.; Gushwa, K.; Gustafson, E. K.; Gustafson, R.; Hammer, D.; Hammond, G.; Hanke, M.; Hanks, J.; Hanna, C.; Hanson, J.; Harms, J.; Harry, G. M.; Harry, I. W.; Harstad, E. D.; Hart, M.; Hartman, M. T.; Haster, C.-J.; Haughian, K.; Heidmann, A.; Heintze, M.; Heitmann, H.; Hello, P.; Hemming, G.; Hendry, M.; Heng, I. S.; Heptonstall, A. W.; Heurs, M.; Hewitson, M.; Hild, S.; Hoak, D.; Hodge, K. A.; Holt, K.; Hooper, S.; Hopkins, P.; Hosken, D. J.; Hough, J.; Howell, E. J.; Hu, Y.; Hughey, B.; Husa, S.; Huttner, S. H.; Huynh, M.; Huynh-Dinh, T.; Ingram, D. R.; Inta, R.; Isogai, T.; Ivanov, A.; Iyer, B. R.; Izumi, K.; Jacobson, M.; James, E.; Jang, H.; Jaranowski, P.; Ji, Y.

    2014-06-01

    The Numerical INJection Analysis (NINJA) project is a collaborative effort between members of the numerical relativity and gravitational-wave (GW) astrophysics communities. The purpose of NINJA is to study the ability to detect GWs emitted from merging binary black holes (BBH) and recover their parameters with next-generation GW observatories. We report here on the results of the second NINJA project, NINJA-2, which employs 60 complete BBH hybrid waveforms consisting of a numerical portion modelling the late inspiral, merger, and ringdown stitched to a post-Newtonian portion modelling the early inspiral. In a ‘blind injection challenge’ similar to that conducted in recent Laser Interferometer Gravitational Wave Observatory (LIGO) and Virgo science runs, we added seven hybrid waveforms to two months of data recoloured to predictions of Advanced LIGO (aLIGO) and Advanced Virgo (AdV) sensitivity curves during their first observing runs. The resulting data was analysed by GW detection algorithms and 6 of the waveforms were recovered with false alarm rates smaller than 1 in a thousand years. Parameter-estimation algorithms were run on each of these waveforms to explore the ability to constrain the masses, component angular momenta and sky position of these waveforms. We find that the strong degeneracy between the mass ratio and the BHs’ angular momenta will make it difficult to precisely estimate these parameters with aLIGO and AdV. We also perform a large-scale Monte Carlo study to assess the ability to recover each of the 60 hybrid waveforms with early aLIGO and AdV sensitivity curves. Our results predict that early aLIGO and AdV will have a volume-weighted average sensitive distance of 300 Mpc (1 Gpc) for 10M⊙ + 10M⊙ (50M⊙ + 50M⊙) BBH coalescences. We demonstrate that neglecting the component angular momenta in the waveform models used in matched-filtering will result in a reduction in sensitivity for systems with large component angular momenta. This
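    The detection and sensitivity results above rest on matched filtering. A minimal sketch of the optimal matched-filter SNR, ρ² = 4 Δf Σ |h̃(f)|² / Sₙ(f), is shown below; the power-law template and noise curve are illustrative stand-ins, not the aLIGO/AdV sensitivity curves used by NINJA-2.

```python
import numpy as np

def optimal_snr(strain_fd, psd, df):
    """Optimal matched-filter SNR of a frequency-domain template
    against a one-sided noise power spectral density."""
    return float(np.sqrt(4.0 * df * np.sum(np.abs(strain_fd) ** 2 / psd)))

# Toy example: inspiral-like amplitude scaling over a band, with an
# analytic noise curve that rises steeply at low frequency.
df = 0.25                                   # Hz, frequency resolution
f = np.arange(20.0, 1024.0, df)             # Hz
psd = 1e-46 * (1.0 + (100.0 / f) ** 4)      # illustrative noise PSD
h = 1e-23 * (f / 100.0) ** (-7.0 / 6.0)     # illustrative template amplitude
print(f"optimal SNR = {optimal_snr(h, psd, df):.1f}")
```

    Since the SNR scales linearly with the template amplitude, halving the distance to a source doubles the SNR, which is why sensitivity is quoted as a volume-weighted average distance.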

  9. Future engineering needs of mirror fusion reactors

    SciTech Connect

    Thomassen, K.I.

    1982-07-30

    Fusion research has matured during the last decade, and significant insight into the future program needs has emerged. While some will properly note that the crystal ball is cloudy, it is equally important to note that the shape and outline of our course are discernible. In this short summary paper, I will draw upon the National Mirror Program Plan for mirror projects and on available design studies of these projects to put the specific needs of the mirror program in perspective.

  10. The need and prospects for improved fusion reactors

    NASA Astrophysics Data System (ADS)

    Krakowski, R. A.; Miller, R. L.; Hagenson, R. L.

    1986-09-01

    Conceptual fusion reactor studies over the past 10-15 yr have projected systems that may be too large, complex, and costly to be of commercial interest. One main direction for improved fusion reactors points toward smaller, higher-power-density approaches. First-order economic issues (i.e., unit direct cost and cost of electricity) are used to support the need for more compact fusion reactors. The results of a number of recent conceptual designs of reversed-field pinch, spheromak, and tokamak fusion reactors are summarized as examples of more compact approaches. While the focus has been placed on increasing the fusion-power-core mass power density beyond the minimum economic threshold of 100-200 kWe/tonne, other means of improving the overall attractiveness of fusion as a long-term energy source are also addressed.

  11. A methodology for hard/soft information fusion in the condition monitoring of aircraft

    NASA Astrophysics Data System (ADS)

    Bernardo, Joseph T.

    2013-05-01

    Condition-based maintenance (CBM) refers to the philosophy of performing maintenance when the need arises, based upon indicators of deterioration in the condition of the machinery. Traditionally, CBM involves equipping machinery with electronic sensors that continuously monitor components and collect data for analysis. The addition of the multisensory capability of human cognitive functions (i.e., sensemaking, problem detection, planning, adaptation, coordination, naturalistic decision making) to traditional CBM may create a fuller picture of machinery condition. Cognitive systems engineering techniques provide an opportunity to utilize a dynamic resource: people acting as soft sensors. The literature on techniques to fuse data from electronic sensors is extensive, but little work exists on fusing data from humans with that from electronic sensors (i.e., hard/soft fusion). The purpose of my research is to explore, observe, investigate, analyze, and evaluate the fusion of pilot and maintainer knowledge, experiences, and sensory perceptions with digital maintenance resources. Hard/soft information fusion has the potential to increase problem detection capability, improve flight safety, and increase mission readiness. This proposed project consists of the creation of a methodology based upon the Living Laboratories framework, a research methodology built upon cognitive engineering principles. This study performs a critical assessment of the concept, which will support the development of activities to demonstrate hard/soft information fusion in operationally relevant scenarios of aircraft maintenance. It consists of fieldwork and knowledge elicitation to inform a simulation and a prototype.

  12. Impact of a statistical bias correction on the projected simulated hydrological changes obtained from three GCMs and two hydrology models

    NASA Astrophysics Data System (ADS)

    Hagemann, Stefan; Chen, Cui; Haerter, Jan O.; Gerten, Dieter; Heinke, Jens; Piani, Claudio

    2010-05-01

    Future climate model scenarios depend crucially on their adequate representation of the hydrological cycle. Within the European project "Water and Global Change" (WATCH), special care is taken to couple state-of-the-art climate model output to a suite of hydrological models. This coupling is expected to lead to a better assessment of changes in the hydrological cycle. However, due to the systematic model errors of climate models, their output is often not directly applicable as input for hydrological models. Thus, a methodology of statistical bias correction has been developed, which can be used to correct climate model output to produce internally consistent fields that have the same statistical intensity distribution as the observations. As observations, global re-analysed daily data of precipitation and temperature obtained in the WATCH project are used. We will apply the bias correction to global climate model data of precipitation and temperature from the GCMs ECHAM5/MPIOM, CNRM-CM3 and LMDZ-4, and intercompare the bias-corrected data with the original GCM data and the observations. Then, the original and the bias-corrected GCM data will be used to force two global hydrology models: (1) the hydrological model of the Max Planck Institute for Meteorology (MPI-HM), consisting of the Simplified Land surface (SL) scheme and the Hydrological Discharge (HD) model, and (2) the dynamic vegetation model LPJmL, operated by the Potsdam Institute for Climate Impact Research. The impact of the bias correction on the projected simulated hydrological changes will be analysed, and the resulting behaviour of the two hydrology models will be compared.
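    A statistical bias correction of the kind described, matching the model's intensity distribution to the observed one, can be sketched as empirical quantile mapping. The gamma-distributed "observations" and "model" series below are synthetic stand-ins, not WATCH data or the project's actual correction method.

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_scen):
    """Empirical quantile mapping: find the quantile of each scenario
    value within the historical model distribution, then return the
    corresponding quantile of the observed distribution."""
    model_sorted = np.sort(model_hist)
    ranks = np.searchsorted(model_sorted, model_scen, side="right") / len(model_sorted)
    ranks = np.clip(ranks, 1e-6, 1 - 1e-6)   # keep quantiles in (0, 1)
    return np.quantile(np.sort(obs_hist), ranks)

rng = np.random.default_rng(42)
obs = rng.gamma(2.0, 3.0, 5000)      # "observed" daily precipitation
model = rng.gamma(2.0, 4.0, 5000)    # biased model: systematically too wet
scen = rng.gamma(2.0, 4.4, 5000)     # future scenario, bias still present
corrected = quantile_map(model, obs, scen)
print(f"model mean {model.mean():.1f} -> corrected scenario mean {corrected.mean():.1f}")
```

    Because the mapping is monotonic, the corrected fields remain internally consistent in rank order while their intensity distribution matches the observations, which is the property the abstract emphasizes.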

  13. Fostering interprofessional communication through case discussions and simulated ward rounds in nursing and medical education: A pilot project

    PubMed Central

    Wershofen, Birgit; Heitzmann, Nicole; Beltermann, Esther; Fischer, Martin R.

    2016-01-01

    Background: Poor communication between physicians and nursing staff can result in inadequate interprofessional collaboration, with negative effects on patient health. In order to ensure optimal health care for patients, it is important to strengthen interprofessional communication and collaboration between physicians and nurses during their education. Aim: The aim of this project is to foster communication between medical and nursing students through interprofessional case discussions and simulated ward rounds as a form of training. Method: In 2013-15 a total of 39 nursing students and 22 medical students participated in eight seminars, each covering case discussions and simulated ward rounds. The seminar was evaluated based on student assessment of the educational objectives. Results: Students who voluntarily signed up for the seminar profited from the interprofessional interaction and gathered positive experiences working in a team. Conclusion: Through practicing case discussions and ward rounds as a group, interprofessional communication could be fostered between medical and nursing students. Students took advantage of the opportunity to ask questions of those from the other profession and realized that interprofessional interaction can lead to improved health care. PMID:27280139

  14. The Atmospheric Chemistry and Climate Model Intercomparison Project (ACCMIP): Overview and Description of Models, Simulations and Climate Diagnostics

    NASA Technical Reports Server (NTRS)

    Lamarque, J.-F.; Shindell, D. T.; Naik, V.; Plummer, D.; Josse, B.; Righi, M.; Rumbold, S. T.; Schulz, M.; Skeie, R. B.; Strode, S.; Young, P. J.; Cionni, I.; Dalsoren, S.; Eyring, V.; Bergmann, D.; Cameron-Smith, P.; Collins, W. J.; Doherty, R.; Faluvegi, G.; Folberth, G.; Ghan, S. J.; Horowitz, L. W.; Lee, Y. H.; MacKenzie, I. A.; Nagashima, T.

    2013-01-01

    The Atmospheric Chemistry and Climate Model Intercomparison Project (ACCMIP) consists of a series of time slice experiments targeting the long-term changes in atmospheric composition between 1850 and 2100, with the goal of documenting composition changes and the associated radiative forcing. In this overview paper, we introduce the ACCMIP activity, the various simulations performed (with a requested set of 14) and the associated model output. The 16 ACCMIP models have a wide range of horizontal and vertical resolutions, vertical extent, chemistry schemes and interaction with radiation and clouds. While anthropogenic and biomass burning emissions were specified for all time slices in the ACCMIP protocol, it is found that the natural emissions are responsible for a significant range across models, mostly in the case of ozone precursors. The analysis of selected present-day climate diagnostics (precipitation, temperature, specific humidity and zonal wind) reveals biases consistent with state-of-the-art climate models. The model-to-model comparison of changes in temperature, specific humidity and zonal wind between 1850 and 2000 and between 2000 and 2100 indicates mostly consistent results. However, models that are clear outliers are different enough from the other models to significantly affect their simulation of atmospheric chemistry.

  15. NIHAO project - I. Reproducing the inefficiency of galaxy formation across cosmic time with a large sample of cosmological hydrodynamical simulations

    NASA Astrophysics Data System (ADS)

    Wang, Liang; Dutton, Aaron A.; Stinson, Gregory S.; Macciò, Andrea V.; Penzo, Camilla; Kang, Xi; Keller, Ben W.; Wadsley, James

    2015-11-01

    We introduce project NIHAO (Numerical Investigation of a Hundred Astrophysical Objects), a set of 100 cosmological zoom-in hydrodynamical simulations performed using the GASOLINE code, with an improved implementation of the SPH algorithm. The haloes in our study range from dwarf (M200 ∼ 5 × 10⁹ M⊙) to Milky Way (M200 ∼ 2 × 10¹² M⊙) masses, and represent an unbiased sampling of merger histories, concentrations and spin parameters. The particle masses and force softenings are chosen to resolve the mass profile to below 1 per cent of the virial radius at all masses, ensuring that galaxy half-light radii are well resolved. Using the same treatment of star formation and stellar feedback for every object, the simulated galaxies reproduce the observed inefficiency of galaxy formation across cosmic time as expressed through the stellar mass versus halo mass relation, and the star formation rate versus stellar mass relation. We thus conclude that stellar feedback is the chief piece of physics required to limit the efficiency of star formation in galaxies less massive than the Milky Way.

  16. A Simulation of the Front End Signal Digitization for the ATLAS Muon Spectrometer thin RPC trigger upgrade project

    NASA Astrophysics Data System (ADS)

    Meng, Xiangting; Chapman, John; Levin, Daniel; Dai, Tiesheng; Zhu, Junjie; Zhou, Bing; Um Atlas Group Team

    2016-03-01

    The ATLAS Muon Spectrometer Phase-I (and Phase-II) upgrade includes the BIS78 muon trigger detector project: two sets of eight very thin Resistive Plate Chambers (tRPCs) combined with small Monitored Drift Tube (MDT) chambers in the pseudorapidity region 1 < |η| < 1.3. The tRPCs will comprise a triplet of readout layers in each of the η and azimuthal φ coordinates, with about 400 readout strips per layer. The anticipated hit rate is 100-200 kHz per strip. Digitization of the strip signals will be done by 32-channel CERN HPTDC chips. The HPTDC is a highly configurable ASIC designed by the CERN Microelectronics group. It can work in both triggered and trigger-less modes, and can be read out in parallel or serially. For Phase-I operation, a stringent latency requirement of 43 bunch crossings (1075 ns) is imposed. The latency budget for the front-end digitization must be kept to a minimum, ideally less than 350 ns. We conducted detailed HPTDC latency simulations using the behavioral Verilog code from the CERN group. We will report the results of these simulations, run for the anticipated detector operating environment and for various HPTDC configurations.

  17. Evaluation of Present-day Aerosols over China Simulated from the Atmospheric Chemistry and Climate Model Intercomparison Project (ACCMIP)

    NASA Astrophysics Data System (ADS)

    Liao, H.; Chang, W.

    2014-12-01

    High concentrations of aerosols over China lead to strong radiative forcing that is important for both regional and global climate. To understand the representation of aerosols in China in current global climate models, we extensively evaluate the simulated present-day aerosol concentrations and aerosol optical depth (AOD) over China from the 12 models that participated in the Atmospheric Chemistry and Climate Model Intercomparison Project (ACCMIP), using ground-based measurements and satellite remote sensing. Ground-based measurements of aerosol concentrations used in this work include those from the China Meteorological Administration (CMA) Atmosphere Watch Network (CAWNET) and observed fine-mode aerosol concentrations collected from the literature. The ground-based measurements of AOD in China are taken from the AErosol RObotic NETwork (AERONET), from sites with CIMEL sun photometers operated by the Institute of Atmospheric Physics, Chinese Academy of Sciences, and from the Chinese Sun Hazemeter Network (CSHNET). We find that the ACCMIP models generally underestimate concentrations of all major aerosol species in China. On an annual mean basis, the multi-model mean concentrations of sulfate, nitrate, ammonium, black carbon, and organic carbon are underestimated by 63%, 73%, 54%, 53%, and 59%, respectively. The multi-model mean AOD values show low biases of 20-40% at the studied sites in China. The ACCMIP models can reproduce the seasonal variation of nitrate but cannot capture well the seasonal variations of other aerosol species. Our analyses indicate that current global models generally underestimate the role of aerosols in China in climate simulations.

  18. Ferrocyanide Safety Project: Comparison of actual and simulated ferrocyanide waste properties

    SciTech Connect

    Scheele, R.D.; Burger, L.L.; Sell, R.L.; Bredt, P.R.; Barrington, R.J.

    1994-09-01

    In the 1950s, additional high-level radioactive waste storage capacity was needed to accommodate the wastes that would result from the production and recovery of additional nuclear defense materials. To provide this additional waste storage capacity, the Hanford Site operating contractor developed a process to decontaminate aqueous wastes by precipitating radiocesium as an alkali nickel ferrocyanide; this process allowed disposal of the aqueous waste. The radiocesium scavenging process as developed was used to decontaminate (1) first-cycle bismuth phosphate (BiPO₄) wastes, (2) acidic wastes resulting from uranium recovery operations, and (3) the supernate from neutralized uranium recovery wastes. The radiocesium scavenging process was often coupled with other scavenging processes to remove radiostrontium and radiocobalt. Because all defense materials recovery processes used nitric acid solutions, all of the wastes contained nitrate, which is a strong oxidizer. The variety of wastes treated, and the occasional coupling of radiostrontium and radiocobalt scavenging processes with the radiocesium scavenging process, resulted in ferrocyanide-bearing wastes having many different compositions. In this report, we compare selected physical, chemical, and radiochemical properties measured for Tanks C-109 and C-112 wastes and selected physical and chemical properties of simulated ferrocyanide wastes to assess the representativeness of simulants prepared by WHC.

  19. Markov Chain Monte Carlo simulation for projection of end stage renal disease patients in Greece.

    PubMed

    Rodina-Theocharaki, A; Bliznakova, K; Pallikarakis, N

    2012-07-01

    End stage renal disease (ESRD) treatment methods are considered to be among the most expensive procedures for chronic conditions worldwide, and they also have a severe impact on patients' quality of life. During the last decade, Greece has been among the countries with the highest incidence and prevalence, while at the same time having the lowest kidney transplantation rates. Predicting the future number of patients on Renal Replacement Therapy (RRT) is essential for health care providers in order to achieve more effective resource management. In this study a Markov Chain Monte Carlo (MCMC) simulation is presented for predicting the future number of ESRD patients in Greece for the period 2009-2020. The MCMC model comprises Monte Carlo sampling techniques applied on probability distributions of the constructed Markov Chain. The model predicts that there will be 15,147 prevalent patients on RRT in Greece by 2020. Additionally, a cost-effectiveness analysis was performed on a scenario of gradually reducing the number of hemodialysis patients in favor of increasing the number of transplantations by 2020. The proposed scenario showed net savings of 86.54 million Euros for the period 2009-2020 compared to the base-case prediction. PMID:22024418
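    A projection of this kind can be sketched by Monte Carlo sampling of patient state paths through an annual Markov transition matrix. The states, transition probabilities, and incidence rate below are illustrative assumptions, not the fitted values from the Greek registry.

```python
import random

# Hypothetical annual transition probabilities between RRT states
# (rows sum to 1; numbers are illustrative only).
STATES = ["hemodialysis", "peritoneal", "transplant", "dead"]
P = {
    "hemodialysis": [0.80, 0.02, 0.05, 0.13],
    "peritoneal":   [0.10, 0.75, 0.05, 0.10],
    "transplant":   [0.03, 0.01, 0.93, 0.03],
    "dead":         [0.00, 0.00, 0.00, 1.00],
}

def project(cohort, years, incidence=1000, seed=1):
    """Monte Carlo projection: sample each patient's yearly transition
    through the Markov chain, adding new incident patients every year.
    Returns the number of prevalent (alive) patients on RRT."""
    rng = random.Random(seed)
    patients = list(cohort)
    for _ in range(years):
        patients = [rng.choices(STATES, weights=P[s])[0] for s in patients]
        patients += ["hemodialysis"] * incidence   # new incident cases
    return sum(1 for s in patients if s != "dead")

cohort = ["hemodialysis"] * 8000 + ["transplant"] * 2000
print(project(cohort, years=11), "prevalent RRT patients after 11 years")
```

    A cost-effectiveness scenario like the one in the abstract would rerun this projection with modified transition probabilities (more transplants, fewer hemodialysis patients) and compare the per-state annual costs of the two runs.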

  20. Satellite quenching timescales in clusters from projected phase space measurements matched to simulated orbits

    NASA Astrophysics Data System (ADS)

    Oman, Kyle A.; Hudson, Michael J.

    2016-09-01

    We measure the star formation quenching efficiency and timescale in cluster environments. Our method uses N-body simulations to estimate the probability distribution of possible orbits for a sample of observed SDSS galaxies in and around clusters based on their position and velocity offsets from their host cluster. We study the relationship between their star formation rates and their likely orbital histories via a simple model in which star formation is quenched once a delay time after infall has elapsed. Our orbit library method is designed to isolate the environmental effect on the star formation rate due to a galaxy's present-day host cluster from `pre-processing' in previous group hosts. We find that quenching of satellite galaxies of all stellar masses in our sample (10^9–10^11.5 M⊙) by massive (> 10^13 M⊙) clusters is essentially 100 per cent efficient. Our fits show that all galaxies quench on their first infall, approximately at or within a Gyr of their first pericentric passage. There is little variation in the onset of quenching from galaxy-to-galaxy: the spread in this time is at most ∼2 Gyr at fixed M★. Higher mass satellites quench earlier, with very little dependence on host cluster mass in the range probed by our sample.
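    The delay-time quenching model described above can be sketched in a few lines: a satellite is quenched once the delay has elapsed since first infall. The uniform infall-time distribution below is an illustrative assumption, standing in for the simulation-derived orbit library.

```python
import numpy as np

def quenched_fraction(t_infall, t_delay):
    """Fraction of satellites quenched under a simple delay model:
    a galaxy is quenched once t_delay has elapsed since first infall."""
    return float(np.mean(np.asarray(t_infall) > t_delay))

# Illustrative population: times since first infall drawn uniformly
# over 0-10 Gyr (a real orbit library would supply these per galaxy).
rng = np.random.default_rng(0)
t_infall = rng.uniform(0.0, 10.0, 100000)
for t_delay in (2.0, 4.0, 6.0):
    frac = quenched_fraction(t_infall, t_delay)
    print(f"t_delay = {t_delay} Gyr -> quenched fraction {frac:.2f}")
```

    Fitting t_delay amounts to matching the predicted quenched fraction against the observed one in bins of stellar mass and projected phase space, which is how a spread of at most a couple of Gyr can be constrained.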