Multi-physics CFD simulations in engineering
NASA Astrophysics Data System (ADS)
Yamamoto, Makoto
2013-08-01
Nowadays, Computational Fluid Dynamics (CFD) software is adopted as a design and analysis tool in a great number of engineering fields, and single-physics CFD can be regarded as sufficiently mature from a practical point of view. The main target of existing CFD software is single-phase flows such as water and air. However, many multi-physics problems exist in engineering. Most of them consist of flow coupled with other physics, and the interactions between the different physics are very important. Obviously, multi-physics phenomena are critical in developing machines and processes. A multi-physics phenomenon is typically very complex, and it is difficult to predict by simply adding other physics to the flow phenomenon. Therefore, multi-physics CFD techniques are still under research and development. This stems from the facts that the processing speed of current computers is not fast enough for conducting multi-physics simulations, and that physical models other than flow physics have not been suitably established. In the near future, we therefore have to develop various physical models and efficient CFD techniques in order to make multi-physics simulations in engineering successful. In the present paper, I describe the present state of multi-physics CFD simulations, and then show some numerical results, such as ice accretion and the electro-chemical machining process of a three-dimensional compressor blade, which were obtained in my laboratory. Multi-physics CFD simulation is likely to be a key technology in the near future.
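The coupling the author describes, flow physics advanced together with an additional physical model, is often realized in practice by operator splitting: each physics is advanced in turn over a shared time step, exchanging source terms. The sketch below uses two toy scalar equations and invented coefficients purely to illustrate the pattern; it is not the author's solver.

```python
def advance_coupled(u, T, dt, steps):
    """Toy operator-split coupling: a 'flow' variable u and a second
    physics variable T exchange source terms once per time step."""
    for _ in range(steps):
        u = u + dt * (-0.5 * u + 0.1 * T)   # flow update, forced by T
        T = T + dt * (-0.2 * T + 0.3 * u)   # second physics, forced by the flow
    return u, T

# Start with a flow disturbance and no secondary field, march to t = 10.
u, T = advance_coupled(1.0, 0.0, dt=0.01, steps=1000)
```

The loose (staggered) coupling shown here is the cheapest option; strongly interacting physics may instead require sub-iterating both updates to convergence within each step.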
Dynamic Biological Functioning Important for Simulating and Stabilizing Ocean Biogeochemistry
NASA Astrophysics Data System (ADS)
Buchanan, P. J.; Matear, R. J.; Chase, Z.; Phipps, S. J.; Bindoff, N. L.
2018-04-01
The biogeochemistry of the ocean exerts a strong influence on the climate by modulating atmospheric greenhouse gases. In turn, ocean biogeochemistry depends on numerous physical and biological processes that change over space and time. Accurately simulating these processes is fundamental for accurately simulating the ocean's role within the climate. However, our simulation of these processes is often simplistic, despite a growing understanding of underlying biological dynamics. Here we explore how new parameterizations of biological processes affect simulated biogeochemical properties in a global ocean model. We combine 6 different physical realizations with 6 different biogeochemical parameterizations (36 unique ocean states). The biogeochemical parameterizations, all previously published, aim to more accurately represent the response of ocean biology to changing physical conditions. We make three major findings. First, oxygen, carbon, alkalinity, and phosphate fields are more sensitive to changes in the ocean's physical state than to changes in biological processes. Only nitrate is more sensitive to changes in biological processes, and we suggest that assessment protocols for ocean biogeochemical models formally include the marine nitrogen cycle to assess their performance. Second, we show that dynamic variations in the production, remineralization, and stoichiometry of organic matter in response to changing environmental conditions benefit the simulation of ocean biogeochemistry. Third, dynamic biological functioning reduces the sensitivity of biogeochemical properties to physical change. Carbon and nitrogen inventories were 50% and 20% less sensitive to physical changes, respectively, in simulations that incorporated dynamic biological functioning. These results highlight the importance of a dynamic biology for ocean properties and climate.
Electrical Storm Simulation to Improve the Learning Physics Process
ERIC Educational Resources Information Center
Martínez Muñoz, Miriam; Jiménez Rodríguez, María Lourdes; Gutiérrez de Mesa, José Antonio
2013-01-01
This work is part of a research project whose main objective is to understand the impact that the use of Information and Communication Technology (ICT) has on the teaching and learning process in the subject of Physics. We will show that, with the use of a storm simulator, physics students improve their learning process: on the one hand, they understand…
Physical Uncertainty Bounds (PUB)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vaughan, Diane Elizabeth; Preston, Dean L.
2015-03-19
This paper introduces and motivates the need for a new methodology for determining upper bounds on the uncertainties in simulations of engineered systems due to limited fidelity in the composite continuum-level physics models needed to simulate the systems. We show that traditional uncertainty quantification methods provide, at best, a lower bound on this uncertainty. We propose to obtain bounds on the simulation uncertainties by first determining bounds on the physical quantities or processes relevant to system performance. By bounding these physics processes, as opposed to carrying out statistical analyses of the parameter sets of specific physics models or simply switching out the available physics models, one can obtain upper bounds on the uncertainties in simulated quantities of interest.
ERIC Educational Resources Information Center
Weiss, Charles J.
2017-01-01
An introduction to digital stochastic simulations for modeling a variety of physical and chemical processes is presented. Despite the importance of stochastic simulations in chemistry, the prevalence of turn-key software solutions can impose a layer of abstraction between the user and the underlying approach obscuring the methodology being…
Displaying Computer Simulations Of Physical Phenomena
NASA Technical Reports Server (NTRS)
Watson, Val
1991-01-01
Paper discusses computer simulation as means of experiencing and learning to understand physical phenomena. Covers both present simulation capabilities and major advances expected in near future. Visual, aural, tactile, and kinesthetic effects used to teach such physical sciences as dynamics of fluids. Recommends classrooms in universities, government, and industry be linked to advanced computing centers so computer simulations integrated into education process.
Development of IR imaging system simulator
NASA Astrophysics Data System (ADS)
Xiang, Xinglang; He, Guojing; Dong, Weike; Dong, Lu
2017-02-01
To overcome the disadvantages of traditional semi-physical simulation and injection simulation equipment in the performance evaluation of infrared imaging systems (IRIS), a low-cost and reconfigurable IRIS simulator, which can simulate the realistic physical process of infrared imaging, is proposed to test and evaluate the performance of the IRIS. According to the theoretical simulation framework and the theoretical models of the IRIS, the architecture of the IRIS simulator is constructed. The 3D scenes are generated and the infrared atmospheric transmission effects are simulated in real time on the computer using OGRE technology. The physical effects of the IRIS are classified as the signal response characteristic, modulation transfer characteristic, and noise characteristic, and they are simulated in real time on a single-board signal processing platform based on an FPGA core processor using a high-speed parallel computation method.
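The three effect classes named in the abstract can be illustrated on a toy 1-D scanline: a linear signal response, a small smoothing kernel as a stand-in for the modulation transfer characteristic, and additive Gaussian noise. All coefficients below are invented for demonstration and have nothing to do with the simulator's calibrated models.

```python
import random

def simulate_scanline(radiance, gain=2.0, offset=0.1, noise_sigma=0.05, seed=42):
    """Toy IR sensor chain: linear response -> 3-tap MTF blur -> additive noise."""
    rng = random.Random(seed)
    response = [gain * r + offset for r in radiance]           # signal response
    blurred = []
    for i in range(len(response)):                             # MTF as smoothing
        left = response[max(i - 1, 0)]
        right = response[min(i + 1, len(response) - 1)]
        blurred.append(0.25 * left + 0.5 * response[i] + 0.25 * right)
    return [b + rng.gauss(0.0, noise_sigma) for b in blurred]  # noise characteristic

# A single hot pixel gets blurred and noised by the chain.
out = simulate_scanline([0.0, 0.0, 1.0, 0.0, 0.0])
```

A real simulator would replace each stage with measured curves (responsivity, 2-D MTF, per-pixel noise statistics), but the stage ordering is the same.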
Kinetic Theory and Simulation of Single-Channel Water Transport
NASA Astrophysics Data System (ADS)
Tajkhorshid, Emad; Zhu, Fangqiang; Schulten, Klaus
Water translocation between various compartments of a system is a fundamental process in the biology of all living cells and in a wide variety of technological problems. The process is of interest in different fields of physiology, physical chemistry, and physics, and many scientists have tried to describe it through physical models. Owing to advances in computer simulation of molecular processes at an atomic level, water transport has been studied in a variety of molecular systems ranging from biological water channels to artificial nanotubes. While simulations have successfully described various kinetic aspects of water transport, offering a simple, unified model of trans-channel water translocation turned out to be a nontrivial task.
Method for simulating discontinuous physical systems
Baty, Roy S.; Vaughn, Mark R.
2001-01-01
The mathematical foundations of conventional numerical simulation of physical systems provide no consistent description of the behavior of such systems when subjected to discontinuous physical influences. As a result, the numerical simulation of such problems requires ad hoc encoding of specific experimental results in order to address the behavior of such discontinuous physical systems. In the present invention, these foundations are replaced by a new combination of generalized function theory and nonstandard analysis. The result is a class of new approaches to the numerical simulation of physical systems which allows the accurate and well-behaved simulation of discontinuous and other difficult physical systems, as well as simpler physical systems. Applications of this new class of numerical simulation techniques to process control, robotics, and apparatus design are outlined.
Physical Processes and Applications of the Monte Carlo Radiative Energy Deposition (MRED) Code
NASA Astrophysics Data System (ADS)
Reed, Robert A.; Weller, Robert A.; Mendenhall, Marcus H.; Fleetwood, Daniel M.; Warren, Kevin M.; Sierawski, Brian D.; King, Michael P.; Schrimpf, Ronald D.; Auden, Elizabeth C.
2015-08-01
MRED is a Python-language scriptable computer application that simulates radiation transport. It is the computational engine for the on-line tool CRÈME-MC. MRED is based on C++ code from Geant4 with additional Fortran components to simulate electron transport and nuclear reactions with high precision. We provide a detailed description of the structure of MRED and the implementation of the simulation of physical processes used to simulate radiation effects in electronic devices and circuits. Extensive discussion and references are provided that illustrate the validation of models used to implement specific simulations of relevant physical processes. Several applications of MRED are summarized that demonstrate its ability to predict and describe basic physical phenomena associated with irradiation of electronic circuits and devices. These include effects from single-particle radiation (including both direct and indirect ionization effects), dose enhancement effects, and displacement damage effects. MRED simulations have also helped to identify new single event upset mechanisms not previously observed by experiment, but since confirmed, including upsets due to muons and energetic electrons.
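The Monte Carlo transport that MRED performs can be caricatured in a few lines: particles sample exponentially distributed free paths and interact (depositing energy) if the sampled path falls inside the material. This is a generic textbook sketch with an arbitrary interaction coefficient, not MRED's Geant4-based physics lists.

```python
import math
import random

def interacting_fraction(mu, thickness, n_particles=10000, seed=1):
    """Toy Monte Carlo: fraction of mono-energetic particles that interact
    within a slab of given thickness, for interaction coefficient mu
    (per unit length). Analytically this tends to 1 - exp(-mu * thickness)."""
    rng = random.Random(seed)
    interacted = 0
    for _ in range(n_particles):
        # Sample an exponential free path; 1 - random() avoids log(0).
        path = -math.log(1.0 - rng.random()) / mu
        if path < thickness:
            interacted += 1
    return interacted / n_particles

frac = interacting_fraction(mu=1.0, thickness=1.0)
```

The statistical error shrinks as 1/sqrt(N), which is why production codes like MRED invest heavily in variance reduction and parallel execution.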
NASA Technical Reports Server (NTRS)
1981-01-01
The development of a coal gasification system design and mass and energy balance simulation program for the TVA and other similar facilities is described. The materials-process-product model (MPPM) and the advanced system for process engineering (ASPEN) computer program were selected from available steady state and dynamic models. The MPPM was selected to serve as the basis for development of system level design model structure because it provided the capability for process block material and energy balance and high-level systems sizing and costing. The ASPEN simulation serves as the basis for assessing detailed component models for the system design modeling program. The ASPEN components were analyzed to identify particular process blocks and data packages (physical properties) which could be extracted and used in the system design modeling program. While ASPEN physical properties calculation routines are capable of generating physical properties required for process simulation, not all required physical property data are available, and must be user-entered.
An integrated algorithm for hypersonic fluid-thermal-structural numerical simulation
NASA Astrophysics Data System (ADS)
Li, Jia-Wei; Wang, Jiang-Feng
2018-05-01
In this paper, a fluid-structural-thermal integrated method is presented based on the finite volume method. A unified system of integral equations is developed as the governing equations for the physical processes of aero-heating and structural heat transfer. The whole physical field is discretized using an upwind finite volume method. To demonstrate its capability, a numerical simulation of Mach 6.47 flow over a stainless steel cylinder shows good agreement with measured values, and the method dynamically simulates the coupled physical processes. Thus, the integrated algorithm proves to be efficient and reliable.
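The finite-volume machinery underlying such an integrated solver can be illustrated with the structural-heat-transfer half of the problem alone: 1-D heat conduction, explicit update, uniform cells, face fluxes balanced per cell. This is a didactic sketch, not the paper's coupled scheme, and all values are illustrative.

```python
def fvm_heat_step(T, alpha, dx, dt):
    """One explicit finite-volume update of 1-D heat conduction.
    End cells act as fixed-temperature (Dirichlet) boundaries."""
    new = T[:]
    for i in range(1, len(T) - 1):
        flux_in = alpha * (T[i - 1] - T[i]) / dx    # diffusive flux through left face
        flux_out = alpha * (T[i] - T[i + 1]) / dx   # diffusive flux through right face
        new[i] = T[i] + dt / dx * (flux_in - flux_out)
    return new

# A cold bar held at 100 degrees on both ends relaxes to a uniform 100.
T = [100.0] + [0.0] * 8 + [100.0]
for _ in range(2000):
    T = fvm_heat_step(T, alpha=1.0, dx=1.0, dt=0.4)  # dt*alpha/dx^2 <= 0.5 for stability
```

In a fluid-thermal-structural code the same per-cell flux balance is written for mass, momentum, and energy simultaneously, with upwinding applied to the convective fluxes.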
ERIC Educational Resources Information Center
Koka, Andre
2017-01-01
This study examined the effectiveness of a brief theory-based intervention on muscular strength among adolescents in a physical education setting. The intervention adopted a process-based mental simulation technique. The self-reported frequency of practising for and actual levels of abdominal muscular strength/endurance as one component of…
Evaluating crown fire rate of spread predictions from physics-based models
C. M. Hoffman; J. Ziegler; J. Canfield; R. R. Linn; W. Mell; C. H. Sieg; F. Pimont
2015-01-01
Modeling the behavior of crown fires is challenging due to the complex set of coupled processes that drive the characteristics of a spreading wildfire and the large range of spatial and temporal scales over which these processes occur. Detailed physics-based modeling approaches such as FIRETEC and the Wildland Urban Interface Fire Dynamics Simulator (WFDS) simulate...
An Empirical Study of Combining Communicating Processes in a Parallel Discrete Event Simulation
1990-12-01
…dynamics of the cost/performance criteria which typically made up computer resource acquisition decisions… offering a broad range of tradeoffs in the way… processes has a significant impact on simulation performance. It is the hypothesis of this…
The Australian Computational Earth Systems Simulator
NASA Astrophysics Data System (ADS)
Mora, P.; Muhlhaus, H.; Lister, G.; Dyskin, A.; Place, D.; Appelbe, B.; Nimmervoll, N.; Abramson, D.
2001-12-01
Numerical simulation of the physics and dynamics of the entire earth system offers an outstanding opportunity for advancing earth system science and technology but represents a major challenge due to the range of scales and physical processes involved, as well as the magnitude of the software engineering effort required. However, new simulation and computer technologies are bringing this objective within reach. Under a special competitive national funding scheme to establish new Major National Research Facilities (MNRF), the Australian government together with a consortium of Universities and research institutions have funded construction of the Australian Computational Earth Systems Simulator (ACcESS). The Simulator or computational virtual earth will provide the research infrastructure to the Australian earth systems science community required for simulations of dynamical earth processes at scales ranging from microscopic to global. It will consist of thematic supercomputer infrastructure and an earth systems simulation software system. The Simulator models and software will be constructed over a five year period by a multi-disciplinary team of computational scientists, mathematicians, earth scientists, civil engineers and software engineers. The construction team will integrate numerical simulation models (3D discrete elements/lattice solid model, particle-in-cell large deformation finite-element method, stress reconstruction models, multi-scale continuum models etc) with geophysical, geological and tectonic models, through advanced software engineering and visualization technologies. 
When fully constructed, the Simulator aims to provide the software and hardware infrastructure needed to model solid earth phenomena including global scale dynamics and mineralisation processes, crustal scale processes including plate tectonics, mountain building, interacting fault system dynamics, and micro-scale processes that control the geological, physical and dynamic behaviour of earth systems. ACcESS represents a part of Australia's contribution to the APEC Cooperation for Earthquake Simulation (ACES) international initiative. Together with other national earth systems science initiatives including the Japanese Earth Simulator and US General Earthquake Model projects, ACcESS aims to provide a driver for scientific advancement and technological breakthroughs including: quantum leaps in understanding of earth evolution at global, crustal, regional and microscopic scales; new knowledge of the physics of crustal fault systems required to underpin the grand challenge of earthquake prediction; new understanding and predictive capabilities of geological processes such as tectonics and mineralisation.
Physics-based interactive volume manipulation for sharing surgical process.
Nakao, Megumi; Minato, Kotaro
2010-05-01
This paper presents a new set of techniques by which surgeons can interactively manipulate patient-specific volumetric models for sharing surgical process. To handle physical interaction between the surgical tools and organs, we propose a simple surface-constraint-based manipulation algorithm to consistently simulate common surgical manipulations such as grasping, holding and retraction. Our computation model is capable of simulating soft-tissue deformation and incision in real time. We also present visualization techniques in order to rapidly visualize time-varying, volumetric information on the deformed image. This paper demonstrates the success of the proposed methods in enabling the simulation of surgical processes, and the ways in which this simulation facilitates preoperative planning and rehearsal.
Virtual milk for modelling and simulation of dairy processes.
Munir, M T; Zhang, Y; Yu, W; Wilson, D I; Young, B R
2016-05-01
The modeling of dairy processing using a generic process simulator suffers from shortcomings, given that many simulators do not contain milk components in their component libraries. Recently, pseudo-milk components for a commercial process simulator were proposed for simulation and the current work extends this pseudo-milk concept by studying the effect of both total milk solids and temperature on key physical properties such as thermal conductivity, density, viscosity, and heat capacity. This paper also uses expanded fluid and power law models to predict milk viscosity over the temperature range from 4 to 75°C and develops a succinct regressed model for heat capacity as a function of temperature and fat composition. The pseudo-milk was validated by comparing the simulated and actual values of the physical properties of milk. The milk thermal conductivity, density, viscosity, and heat capacity showed differences of less than 2, 4, 3, and 1.5%, respectively, between the simulated results and actual values. This work extends the capabilities of the previously proposed pseudo-milk and of a process simulator to model dairy processes, processing different types of milk (e.g., whole milk, skim milk, and concentrated milk) with different intrinsic compositions, and to predict correct material and energy balances for dairy processes.
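A viscosity-temperature model of the general power-law form the abstract mentions can be sketched as below. The functional form (mu = a * (T/T_ref)^b, T in kelvin) and every coefficient are invented for demonstration; they are not the fitted values from the study, which would come from regression against measured milk data.

```python
def milk_viscosity_pa_s(temp_c, a=2.0e-3, t_ref_k=277.15, b=-8.0):
    """Illustrative power-law viscosity-temperature model:
    mu = a * (T / T_ref) ** b, with T in kelvin and T_ref = 277.15 K (4 degC).
    Coefficients a and b are hypothetical placeholders, not fitted values."""
    t_k = temp_c + 273.15
    return a * (t_k / t_ref_k) ** b

mu_cold = milk_viscosity_pa_s(4.0)    # lower end of the 4-75 degC range
mu_hot = milk_viscosity_pa_s(75.0)    # upper end: viscosity should drop
```

Exposing such a correlation as a component property is what lets a generic process simulator compute correct energy balances for pumping and heat-exchange steps on pseudo-milk.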
Teaching Harmonic Motion in Trigonometry: Inductive Inquiry Supported by Physics Simulations
ERIC Educational Resources Information Center
Sokolowski, Andrzej; Rackley, Robin
2011-01-01
In this article, the authors present a lesson whose goal is to utilise a scientific environment to immerse a trigonometry student in the process of mathematical modelling. The scientific environment utilised during this activity is a physics simulation called "Wave on a String" created by the PhET Interactive Simulations Project at…
Bencala, Kenneth E.
1984-01-01
Solute transport in streams is determined by the interaction of physical and chemical processes. Data from an injection experiment for chloride and several cations indicate significant influence of solute-streambed processes on transport in a mountain stream. These data are interpreted in terms of transient storage processes for all tracers and sorption processes for the cations. Process parameter values are estimated with simulations based on coupled quasi-two-dimensional transport and first-order mass-transfer sorption. Comparative simulations demonstrate the relative roles of the physical and chemical processes in determining solute transport. During the first 24 hours of the experiment, chloride concentrations were attenuated relative to expected plateau levels. Additional attenuation occurred for the sorbing cation strontium. The simulations account for these storage processes. Parameter values determined by calibration compare favorably with estimates from other studies in mountain streams. Without further calibration, the transport of potassium and lithium is adequately simulated using parameters determined in the chloride-strontium simulation and with measured cation distribution coefficients.
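The first-order transient-storage exchange described above can be sketched for a single well-mixed channel cell coupled to a storage zone. The rate constants, volume ratio (taken as 1), and inflow concentration below are invented for illustration; a stream model would calibrate them per reach, as the study does.

```python
def transient_storage_step(c, cs, dt, alpha=0.01, q_over_v=0.05, c_in=1.0):
    """One explicit Euler step of a single-cell transient-storage model:
    channel concentration c is flushed by inflow at c_in and exchanges
    with storage-zone concentration cs at first-order rate alpha
    (equal channel and storage volumes assumed for simplicity)."""
    dc = q_over_v * (c_in - c) + alpha * (cs - c)   # advection + exchange
    dcs = alpha * (c - cs)                          # storage zone lags the channel
    return c + dt * dc, cs + dt * dcs

# A step injection: both zones eventually equilibrate at the plateau c_in.
c, cs = 0.0, 0.0
for _ in range(5000):
    c, cs = transient_storage_step(c, cs, dt=0.5)
```

The lag of `cs` behind `c` during the rise is exactly the mechanism that attenuates plateau concentrations early in an injection experiment.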
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bryan, Frank; Dennis, John; MacCready, Parker
This project aimed to improve long term global climate simulations by resolving and enhancing the representation of the processes involved in the cycling of freshwater through estuaries and coastal regions. This was a collaborative multi-institution project consisting of physical oceanographers, climate model developers, and computational scientists. It specifically targeted the DOE objectives of advancing simulation and predictive capability of climate models through improvements in resolution and physical process representation.
The physics of bat echolocation: Signal processing techniques
NASA Astrophysics Data System (ADS)
Denny, Mark
2004-12-01
The physical principles and signal processing techniques underlying bat echolocation are investigated. It is shown, by calculation and simulation, how the measured echolocation performance of bats can be achieved.
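One signal processing principle central to echolocation is pulse compression: cross-correlating the emitted chirp against the received signal concentrates the echo's energy at its true delay. The sketch below is a generic matched-filter demonstration with a toy chirp, not the specific signal models analyzed in the paper.

```python
import math

def matched_filter_delay(pulse, echo):
    """Estimate an echo's delay (in samples) by sliding the emitted pulse
    over the received signal and picking the lag of maximum correlation."""
    best_lag, best_score = 0, float("-inf")
    for lag in range(len(echo) - len(pulse) + 1):
        score = sum(p * echo[lag + i] for i, p in enumerate(pulse))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

pulse = [math.sin(0.2 * i * i) for i in range(20)]  # toy frequency-swept chirp
echo = [0.0] * 30 + pulse + [0.0] * 10              # noiseless return, 30-sample delay
lag = matched_filter_delay(pulse, echo)
```

Frequency-swept pulses matter here because their sharp autocorrelation peak gives fine range resolution even when the pulse itself is long.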
The Monash University Interactive Simple Climate Model
NASA Astrophysics Data System (ADS)
Dommenget, D.
2013-12-01
The Monash University interactive simple climate model is a web-based interface that allows students and the general public to explore the physical simulation of the climate system with a real global climate model. It is based on the Globally Resolved Energy Balance (GREB) model, a climate model published by Dommenget and Floeter [2011] in the international peer-reviewed science journal Climate Dynamics. The model simulates most of the main physical processes in the climate system in a very simplistic way and therefore allows very fast and simple climate model simulations on a normal PC. Despite its simplicity, the model simulates the climate response to external forcings, such as a doubling of the CO2 concentration, very realistically (similar to state-of-the-art climate models). The Monash simple climate model web interface allows you to study the results of more than 2000 different model experiments in an interactive way, and it offers a number of tutorials on the interactions of physical processes in the climate system, along with some puzzles to solve. By switching physical processes OFF/ON you can deconstruct the climate and learn how all the different processes interact to generate the observed climate and how the processes interact to generate the IPCC-predicted climate change for anthropogenic CO2 increase. The presentation will illustrate how this web-based tool works and what the possibilities are for teaching students with it.
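The energy-balance idea behind a GREB-style model can be reduced to its zero-dimensional core: absorbed shortwave equals emitted longwave, with the greenhouse effect folded into an effective emissivity. This is the standard textbook balance, far simpler than GREB itself, and the emissivity values are illustrative only.

```python
def equilibrium_temp(solar=1361.0, albedo=0.3, emissivity=0.62):
    """Zero-dimensional energy balance: solve
    emissivity * sigma * T**4 = solar * (1 - albedo) / 4 for T (kelvin).
    A stronger greenhouse effect is modeled as a lower effective emissivity;
    the default values are illustrative, not GREB parameters."""
    sigma = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
    return (solar * (1 - albedo) / (4 * sigma * emissivity)) ** 0.25

t_now = equilibrium_temp()                       # roughly Earth-like surface T
t_more_co2 = equilibrium_temp(emissivity=0.60)   # lower emissivity -> warming
```

Switching a process "OFF" in the web interface is conceptually the same move as freezing one of these terms and watching the equilibrium shift.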
Semi-physical Simulation Platform of a Parafoil Nonlinear Dynamic System
NASA Astrophysics Data System (ADS)
Gao, Hai-Tao; Yang, Sheng-Bo; Zhu, Er-Lin; Sun, Qing-Lin; Chen, Zeng-Qiang; Kang, Xiao-Feng
2013-11-01
Focusing on the problems in the process of simulation and experiment on a parafoil nonlinear dynamic system, such as limited methods, high cost, and low efficiency, we present a semi-physical simulation platform. It is designed by connecting some physical components to a computer, remedying the defect that a pure computer simulation is entirely divorced from the real environment. The main components of the platform and their functions, as well as the simulation flows, are introduced. The feasibility and validity are verified through a simulation experiment. The experimental results show that the platform is significant for improving the quality of the parafoil fixed-point airdrop system, shortening the development cycle, and saving cost.
Expanded Processing Techniques for EMI Systems
2012-07-01
…possible to perform better target detection using physics-based algorithms and the entire data set, rather than simulating a simpler data set and mapping… [Figure 4.25: Plots of simulated MetalMapper data for two oblate spheroidal targets]
Mirus, B.B.; Ebel, B.A.; Heppner, C.S.; Loague, K.
2011-01-01
Concept development simulation with distributed, physics-based models provides a quantitative approach for investigating runoff generation processes across environmental conditions. Disparities within data sets employed to design and parameterize boundary value problems used in heuristic simulation inevitably introduce various levels of bias. The objective was to evaluate the impact of boundary value problem complexity on process representation for different runoff generation mechanisms. The comprehensive physics-based hydrologic response model InHM has been employed to generate base case simulations for four well-characterized catchments. The C3 and CB catchments are located within steep, forested environments dominated by subsurface stormflow; the TW and R5 catchments are located in gently sloping rangeland environments dominated by Dunne and Horton overland flows. Observational details are well captured within all four of the base case simulations, but the characterization of soil depth, permeability, rainfall intensity, and evapotranspiration differs for each. These differences are investigated through the conversion of each base case into a reduced case scenario, all sharing the same level of complexity. Evaluation of how individual boundary value problem characteristics impact simulated runoff generation processes is facilitated by quantitative analysis of integrated and distributed responses at high spatial and temporal resolution. Generally, the base case reduction causes moderate changes in discharge and runoff patterns, with the dominant process remaining unchanged. Moderate differences between the base and reduced cases highlight the importance of detailed field observations for parameterizing and evaluating physics-based models. 
Overall, similarities between the base and reduced cases indicate that the simpler boundary value problems may be useful for concept development simulation to investigate fundamental controls on the spectrum of runoff generation mechanisms.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Huan; Cheng, Liang; Chuah, Mooi Choo
In the generation, transmission, and distribution sectors of the smart grid, intelligence of field devices is realized by programmable logic controllers (PLCs). Many smart-grid subsystems are essentially cyber-physical energy systems (CPES): for instance, the power system process (i.e., the physical part) within a substation is monitored and controlled by a SCADA network with hosts running miscellaneous applications (i.e., the cyber part). To study the interactions between the cyber and physical components of a CPES, several co-simulation platforms have been proposed. However, the network simulators/emulators of these platforms do not include a detailed traffic model that takes into account the impacts of the execution model of PLCs on traffic characteristics. As a result, network traces generated by co-simulation only reveal the impacts of the physical process on the contents of the traffic generated by SCADA hosts, whereas the distinction between PLCs and computing nodes (e.g., a hardened computer running a process visualization application) has been overlooked. To generate realistic network traces using co-simulation for the design and evaluation of applications relying on accurate traffic profiles, it is necessary to establish a traffic model for PLCs. In this work, we propose a parameterized model for PLCs that can be incorporated into existing co-simulation platforms. We focus on the DNP3 subsystem of slave PLCs, which automates the processing of packets from the DNP3 master. To validate our approach, we extract model parameters from both the configuration and network traces of real PLCs. Simulated network traces are generated and compared against those from PLCs. Our evaluation shows that our proposed model captures the essential traffic characteristics of DNP3 slave PLCs, which can be used to extend existing co-simulation platforms and gain further insights into the behaviors of CPES.
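A parameterized PLC traffic model of the kind proposed can be sketched as follows: a DNP3 slave answers each master poll only after its current cyclic scan completes, plus a small processing delay. The parameter names and values (poll interval, scan time, jitter) are invented placeholders; in the paper they would be extracted from real PLC configurations and traces.

```python
import random

def dnp3_response_times(n_polls, poll_interval=1.0, scan_time=0.01,
                        proc_jitter=0.002, seed=7):
    """Toy parameterized PLC traffic model: for each master poll at a fixed
    interval, the slave's response time is the poll time plus the remaining
    portion of its cyclic scan plus a small processing jitter.
    All parameters are illustrative, not measured values."""
    rng = random.Random(seed)
    times = []
    for k in range(n_polls):
        poll_t = k * poll_interval
        scan_phase = rng.uniform(0.0, scan_time)   # where in the scan the poll lands
        times.append(poll_t + scan_phase + rng.uniform(0.0, proc_jitter))
    return times

resp = dnp3_response_times(5)
```

It is precisely this scan-cycle-induced delay structure, absent from a generic host traffic model, that distinguishes PLC traces from those of ordinary SCADA computing nodes.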
The management submodel of the Wind Erosion Prediction System
USDA-ARS?s Scientific Manuscript database
The Wind Erosion Prediction System (WEPS) is a process-based, daily time-step computer model that predicts soil erosion via simulation of the physical processes controlling wind erosion. WEPS comprises several individual modules (submodels) that reflect different sets of physical processes, ...
NASA Astrophysics Data System (ADS)
Yang, Yuansheng; Zhao, Fuze; Feng, Xiaohui
2017-10-01
The dispersion of carbon nanotubes (CNTs) in AZ91D melt by ultrasonic processing and the microstructure formation of the CNTs/AZ91D composite were studied using numerical and physical simulations. The sound field and acoustic streaming were predicted using the finite element method. Meanwhile, the optimal immersion depth of the ultrasonic probe and a suitable ultrasonic power were obtained. A single-bubble model was used to predict ultrasonic cavitation in the AZ91D melt. The relationship between sound pressure amplitude and ultrasonic cavitation was established. Physical simulations of acoustic streaming and ultrasonic cavitation agreed well with the numerical simulations. It was confirmed that the dispersion of carbon nanotubes was remarkably improved by ultrasonic processing. Microstructure formation of the CNTs/AZ91D composite was numerically simulated using the cellular automaton method. In addition, grain refinement was achieved and the growth of dendrites was changed due to the uniform dispersion of CNTs.
NASA Astrophysics Data System (ADS)
Ammouri, Aymen; Ben Salah, Walid; Khachroumi, Sofiane; Ben Salah, Tarek; Kourda, Ferid; Morel, Hervé
2014-05-01
Design of integrated power converters needs prototype-less approaches. Specific simulations are required for the investigation and validation process. Simulation relies on active and passive device models. Models of planar devices, for instance, are still not available in power simulator tools, which imposes a specific limitation on the simulation of integrated power systems. This paper focuses on the development of a physically based planar inductor model and its validation inside a power converter during transient switching. The planar inductor remains a complex device to model, particularly when the skin, proximity, and parasitic capacitance effects are taken into account. A heterogeneous simulation scheme, including circuit and device models, is successfully implemented in the VHDL-AMS language and simulated on the Simplorer platform. The mixed simulation results have been favorably tested and compared with practical measurements. The multi-domain simulation results and measurement data are found to be in close agreement.
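The skin effect the model must capture is governed by the classical skin depth, delta = sqrt(rho / (pi * f * mu)): above the frequency where delta falls below the conductor thickness, the inductor's AC resistance rises. The helper below evaluates that textbook formula (copper resistivity by default); it is a supporting calculation, not the paper's VHDL-AMS model.

```python
import math

def skin_depth(freq_hz, resistivity=1.68e-8, mu_r=1.0):
    """Classical skin depth delta = sqrt(rho / (pi * f * mu)) in metres.
    Defaults to copper resistivity (1.68e-8 ohm*m) and mu_r = 1."""
    mu = mu_r * 4e-7 * math.pi   # permeability mu = mu_r * mu_0
    return math.sqrt(resistivity / (math.pi * freq_hz * mu))

d_1mhz = skin_depth(1e6)   # copper at 1 MHz: a few tens of micrometres
```

Since delta scales as 1/sqrt(f), quadrupling the switching frequency halves the skin depth, which is why planar traces thicker than a skin depth see their effective resistance grow with frequency.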
NASA Astrophysics Data System (ADS)
Rousseau, A. N.; Álvarez; Yu, X.; Savary, S.; Duffy, C.
2015-12-01
Most physically-based hydrological models simulate, to various extents, the relevant watershed processes occurring at different spatiotemporal scales. These models use different physical domain representations (e.g., hydrological response units, discretized control volumes) and numerical solution techniques (e.g., finite difference method, finite element method), as well as a variety of approximations for representing the physical processes. Although several models have been developed so far, very few inter-comparison studies have checked, beyond streamflows, whether different modeling approaches simulate the other watershed-scale processes in a similar fashion. In this study, PIHM (Qu and Duffy, 2007), a fully coupled, distributed model, and HYDROTEL (Fortin et al., 2001; Turcotte et al., 2003, 2007), a pseudo-coupled, semi-distributed model, were compared to check whether the models could corroborate observed streamflows while representing other processes equally well, such as evapotranspiration, snow accumulation/melt, and infiltration. The Young Womans Creek watershed, PA, was used to compare: streamflows (channel routing), actual evapotranspiration, snow water equivalent (snow accumulation and melt), infiltration, recharge, shallow water depth above the soil surface (surface flow), lateral flow into the river (surface and subsurface flow), and height of the saturated soil column (subsurface flow). Despite a lack of observed data for validating most of the simulated processes, the two models can be used as simulation tools for streamflows, actual evapotranspiration, infiltration, lateral flows into the river, and height of the saturated soil column. However, each process exhibits particular differences resulting from the physical parameters and modeling approaches used by each model. These differences should be the object of further analyses to definitively confirm or reject modeling hypotheses.
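Model inter-comparisons of simulated against observed series, such as the streamflows above, are commonly summarized with the Nash-Sutcliffe efficiency. The abstract does not name the metric used, so the following is only an illustrative sketch of how such a comparison is scored.

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 matches the
    mean-of-observations benchmark, negative is worse than the mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Toy hydrographs (made-up numbers, purely for illustration)
obs = [1.0, 2.0, 4.0, 3.0, 2.0]
print(nash_sutcliffe(obs, obs))          # perfect agreement -> 1.0
print(nash_sutcliffe(obs, [2.4] * 5))    # constant at the observed mean -> 0.0
```

The same score can be computed for each simulated process (evapotranspiration, snow water equivalent, etc.) wherever observations exist, which is how "beyond streamflow" comparisons are usually made quantitative.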
NASA Astrophysics Data System (ADS)
Lin, Caiyan; Zhang, Zhongfeng; Pu, Zhaoxia; Wang, Fengyun
2017-10-01
A series of numerical simulations is conducted to understand the formation, evolution, and dissipation of an advection fog event over Shanghai Pudong International Airport (ZSPD) with the Weather Research and Forecasting (WRF) model. Using the current operational settings at the Meteorological Center of East China Air Traffic Management Bureau, the WRF model successfully predicts the fog event at ZSPD. Additional numerical experiments are performed to examine the physical processes associated with the fog event. The results indicate that prediction of this particular fog event is sensitive to microphysical schemes for the time of fog dissipation but not for the time of fog onset. The simulated timing of the arrival and dissipation of the fog, as well as the cloud distribution, is substantially sensitive to the planetary boundary layer and radiation (both longwave and shortwave) processes. Moreover, varying forecast lead times also produces different simulation results for the fog event regarding its onset and duration, suggesting a trade-off between more accurate initial conditions and a proper forecast lead time that allows model physical processes to spin up adequately during the fog simulation. The overall outcomes from this study imply that the complexity of physical processes and their interactions within the WRF model during fog evolution and dissipation is a key area of future research.
NASA Astrophysics Data System (ADS)
Sadi, Toufik; Mehonic, Adnan; Montesi, Luca; Buckwell, Mark; Kenyon, Anthony; Asenov, Asen
2018-02-01
We employ an advanced three-dimensional (3D) electro-thermal simulator to explore the physics and potential of oxide-based resistive random-access memory (RRAM) cells. The physical simulation model has been developed recently, and couples a kinetic Monte Carlo study of electron and ionic transport to the self-heating phenomenon while accounting carefully for the physics of vacancy generation and recombination, and trapping mechanisms. The simulation framework successfully captures resistance switching, including the electroforming, set and reset processes, by modeling the dynamics of conductive filaments in the 3D space. This work focuses on promising yet less studied RRAM structures based on silicon-rich silica (SiOx). We explain the intrinsic nature of resistance switching of the SiOx layer, analyze the effect of self-heating on device performance, highlight the role of the initial vacancy distributions acting as precursors for switching, and also stress the importance of using 3D physics-based models to capture accurately the switching processes. The simulation work is backed by experimental studies. The simulator is useful for improving our understanding of the little-known physics of SiOx resistive memory devices, as well as other oxide-based RRAM systems (e.g. transition metal oxide RRAMs), offering design and optimization capabilities with regard to the reliability and variability of memory cells.
Visualization Methods for Viability Studies of Inspection Modules for the Space Shuttle
NASA Technical Reports Server (NTRS)
Mobasher, Amir A.
2005-01-01
An effective simulation of an object, process, or task must be similar to that object, process, or task. A simulation could consist of a physical device, a set of mathematical equations, a computer program, a person, or some combination of these. There are many reasons for using simulators. Although some are unique to a specific situation, general reasons include (1) safety, (2) scarce resources, (3) teaching/education, (4) additional capabilities, (5) flexibility, and (6) cost. Robot simulators are in use for all of these reasons. Virtual environments such as simulators eliminate physical contact with humans and hence increase the safety of the work environment. Corporations with limited funding and resources may use simulators to accomplish their goals while saving manpower and money. A computer simulation is safer than working with a real robot. Robots are typically a scarce resource: schools rarely have a large number of robots, if any, and factories do not want robots idled from useful work unless absolutely necessary. Robot simulators are useful in teaching robotics, giving a student hands-on experience, even if only simulated. A simulator is also more flexible: a user can quickly change the robot configuration, the workcell, or even replace the robot with a different one altogether. To be useful, a robot simulator must create a model that accurately performs like the real robot. A powerful simulator is usually thought of as a combination of a CAD package with simulation capabilities. Computer Aided Design (CAD) techniques are used extensively by engineers in virtually all areas of engineering. Parts are designed interactively, aided by the graphical display of both wireframe and more realistic shaded renderings.
Once a part's dimensions have been specified to the CAD package, designers can view the part from any direction to examine how it will look and perform in relation to other parts. If changes are deemed necessary, the designer can easily make them and view the results graphically. However, a complex process of moving parts intended for operation in a complex environment can only be fully understood through animated graphical simulation. A CAD package with simulation capabilities allows the designer to develop geometrical models of the process being designed, as well as the environment in which the process will be used, and then test the process in graphical animation much as the actual physical system would be run. By operating the system of moving and stationary parts, the designer can see in simulation how the system will perform under a wide variety of conditions. If, for example, undesired collisions occur between parts of the system, design changes can be made easily without the expense or potential danger of testing the physical system.
Visell, Yon
2015-04-01
This paper proposes a fast, physically accurate method for synthesizing multimodal, acoustic and haptic, signatures of distributed fracture in quasi-brittle heterogeneous materials, such as wood, granular media, or other fiber composites. Fracture processes in these materials are challenging to simulate with existing methods, due to the prevalence of large numbers of disordered, quasi-random spatial degrees of freedom, representing the complex physical state of a sample over the geometric volume of interest. Here, I develop an algorithm for simulating such processes, building on a class of statistical lattice models of fracture that have been widely investigated in the physics literature. This algorithm is enabled through a recently published mathematical construction based on the inverse transform method of random number sampling. It yields a purely time domain stochastic jump process representing stress fluctuations in the medium. The latter can be readily extended by a mean field approximation that captures the averaged constitutive (stress-strain) behavior of the material. Numerical simulations and interactive examples demonstrate the ability of these algorithms to generate physically plausible acoustic and haptic signatures of fracture in complex, natural materials interactively at audio sampling rates.
NASA Astrophysics Data System (ADS)
Poppe, Christian; Dörr, Dominik; Henning, Frank; Kärger, Luise
2018-05-01
Wet compression moulding (WCM) provides large-scale production potential for continuously fiber-reinforced components as a promising alternative to resin transfer moulding (RTM). Lower cycle times are possible due to parallelization of the process steps draping, infiltration, and curing during moulding (viscous draping). Experimental and theoretical investigations indicate a strong mutual dependency between the physical mechanisms that occur during draping and mould filling (fluid-structure interaction). Thus, key process parameters, such as fiber orientation, fiber volume fraction, cavity pressure, and the amount and viscosity of the resin, are physically coupled. To enable time- and cost-efficient product and process development throughout all design stages, accurate process simulation tools are desirable. Separate draping and mould-filling simulation models, as appropriate for the sequential RTM process, cannot be applied to the WCM process because of the physical couplings outlined above. Within this study, a two-dimensional Darcy-Propagation-Element (DPE-2D), based on a finite element formulation with additional control volumes (FE/CV), is presented, verified, and applied to the forming simulation of a generic geometry as a first step towards a fluid-structure interaction model that accounts for simultaneous resin infiltration and draping. The model is implemented in the commercial FE solver Abaqus by means of several user subroutines considering simultaneous draping and 2D infiltration mechanisms. Darcy's equation is solved with respect to a local fiber orientation. Furthermore, the material model can access the local fluid domain properties to update the mechanical forming material parameters, which enables further investigations of the coupled physical mechanisms.
Automated Extraction of Flow Features
NASA Technical Reports Server (NTRS)
Dorney, Suzanne (Technical Monitor); Haimes, Robert
2005-01-01
Computational Fluid Dynamics (CFD) simulations are routinely performed as part of the design process of most fluid handling devices. In order to efficiently and effectively use the results of a CFD simulation, visualization tools are often used. These tools are used in all stages of the CFD simulation, including pre-processing, interim-processing, and post-processing, to interpret the results. Each of these stages requires visualization tools that allow one to examine the geometry of the device, as well as the partial or final results of the simulation. An engineer will typically generate a series of contour and vector plots to better understand the physics of how the fluid is interacting with the physical device. Of particular interest is detecting features such as shocks, re-circulation zones, and vortices (which highlight areas of stress and loss). As the demand for CFD analyses continues to increase, the need for automated feature extraction capabilities has become vital. In the past, feature extraction and identification were interesting concepts, but not required for understanding the physics of a steady flow field. This is because the results of the more traditional tools, like iso-surfaces, cuts, and streamlines, were interactive and easily abstracted so they could be represented to the investigator. These tools worked and properly conveyed the collected information, but at the expense of a great deal of interaction. For unsteady flow fields, the investigator does not have the luxury of spending time scanning only one "snapshot" of the simulation. Automated assistance is required to point out areas of potential interest within the flow. This must not impose a heavy compute burden (the visualization should not significantly slow down the solution procedure in co-processing environments). Methods must be developed to abstract the feature of interest and display it in a manner that physically makes sense.
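As a concrete example of automated vortex detection, the Q-criterion flags regions where rotation dominates strain in the velocity-gradient tensor. The sketch below applies it to a synthetic 2-D vortex; this is a standard feature-extraction heuristic, not necessarily the method used in the work described.

```python
import numpy as np

# Q-criterion vortex detection on a synthetic 2-D velocity field.
# Q = 0.5*(||Omega||^2 - ||S||^2) > 0 marks rotation-dominated regions.
n = 64
x = np.linspace(-1.0, 1.0, n)
X, Y = np.meshgrid(x, x, indexing="ij")

# Gaussian-enveloped vortex: u = -y*f(r), v = x*f(r)
f = np.exp(-(X**2 + Y**2) / 0.2)
u, v = -Y * f, X * f

dx = x[1] - x[0]
du_dx, du_dy = np.gradient(u, dx, dx)   # axis 0 is x, axis 1 is y ("ij" indexing)
dv_dx, dv_dy = np.gradient(v, dx, dx)

# Squared norms of the strain-rate tensor S and rotation tensor Omega
S2 = du_dx**2 + dv_dy**2 + 0.5 * (du_dy + dv_dx) ** 2
O2 = 0.5 * (du_dy - dv_dx) ** 2
Q = 0.5 * (O2 - S2)

print("vortex core detected:", Q[n // 2, n // 2] > 0)
```

Thresholding `Q > 0` (or a connected-component pass over it) is the kind of cheap, batch-friendly detector that co-processing environments need, since it touches each grid point once with no user interaction.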
Automated Extraction of Flow Features
NASA Technical Reports Server (NTRS)
Dorney, Suzanne (Technical Monitor); Haimes, Robert
2004-01-01
Computational Fluid Dynamics (CFD) simulations are routinely performed as part of the design process of most fluid handling devices. In order to efficiently and effectively use the results of a CFD simulation, visualization tools are often used. These tools are used in all stages of the CFD simulation, including pre-processing, interim-processing, and post-processing, to interpret the results. Each of these stages requires visualization tools that allow one to examine the geometry of the device, as well as the partial or final results of the simulation. An engineer will typically generate a series of contour and vector plots to better understand the physics of how the fluid is interacting with the physical device. Of particular interest is detecting features such as shocks, recirculation zones, and vortices (which highlight areas of stress and loss). As the demand for CFD analyses continues to increase, the need for automated feature extraction capabilities has become vital. In the past, feature extraction and identification were interesting concepts, but not required for understanding the physics of a steady flow field. This is because the results of the more traditional tools, like iso-surfaces, cuts, and streamlines, were interactive and easily abstracted so they could be represented to the investigator. These tools worked and properly conveyed the collected information, but at the expense of a great deal of interaction. For unsteady flow fields, the investigator does not have the luxury of spending time scanning only one "snapshot" of the simulation. Automated assistance is required to point out areas of potential interest within the flow. This must not impose a heavy compute burden (the visualization should not significantly slow down the solution procedure in co-processing environments). Methods must be developed to abstract the feature of interest and display it in a manner that physically makes sense.
Mapping the Limitations of Breakthrough Analysis in Fixed-Bed Adsorption
NASA Technical Reports Server (NTRS)
Knox, James Clinton
2017-01-01
The separation of gases through adsorption plays an important role in the chemical processing industry, where the separation step is often the costliest part of a chemical process and thus worthy of careful study and optimization. This work developed several new, archival results on the computer simulations used for the refinement and design of these gas adsorption processes: (1) a new approach to fitting the undetermined heat and mass transfer coefficients in the axially dispersed plug flow equation and associated balance equations; (2) an examination and description of the conditions under which non-physical simulation results can arise; and (3) an approach to determining the limits of the axial dispersion and LDF mass transfer terms above which non-physical simulation results occur.
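A minimal numerical sketch of the model class named above (axially dispersed plug flow coupled to a linear-driving-force uptake) shows how breakthrough curves are produced; every parameter value here is illustrative, not taken from the dissertation.

```python
import numpy as np

# 1-D fixed-bed breakthrough sketch: axially dispersed plug flow with a
# linear-driving-force (LDF) uptake and a linear isotherm q* = K*c.
nz, L = 100, 0.1                        # grid cells, bed length [m]
v, DL = 0.01, 1e-5                      # interstitial velocity [m/s], dispersion [m^2/s]
k_ldf, K, eps_bed = 0.5, 5.0, 0.4       # LDF coefficient [1/s], isotherm slope, porosity
dz = L / nz
dt = 0.2 * min(dz / v, dz**2 / (2 * DL))   # conservative explicit time step
phase_ratio = (1 - eps_bed) / eps_bed

c = np.zeros(nz)                        # fluid-phase concentration
q = np.zeros(nz)                        # adsorbed-phase loading
c_feed, t = 1.0, 0.0
while t < 150.0:
    cp = np.concatenate(([c_feed], c, [c[-1]]))   # inlet + zero-gradient outlet
    adv = -v * (cp[1:-1] - cp[:-2]) / dz          # first-order upwind advection
    disp = DL * (cp[2:] - 2 * cp[1:-1] + cp[:-2]) / dz**2
    uptake = k_ldf * (K * c - q)                  # LDF driving force
    c = c + dt * (adv + disp - phase_ratio * uptake)
    q = q + dt * uptake
    t += dt

print("outlet concentration fraction at t=150 s:", round(float(c[-1]), 3))
```

The interplay the dissertation maps out is visible directly in this discretization: pushing `DL` or `k_ldf` outside physically sensible ranges distorts or destabilizes the computed front, which is the "non-physical results" regime items (2) and (3) characterize.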
Nonlinear intrinsic variables and state reconstruction in multiscale simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dsilva, Carmeline J., E-mail: cdsilva@princeton.edu; Talmon, Ronen, E-mail: ronen.talmon@yale.edu; Coifman, Ronald R., E-mail: coifman@math.yale.edu
2013-11-14
Finding informative low-dimensional descriptions of high-dimensional simulation data (like the ones arising in molecular dynamics or kinetic Monte Carlo simulations of physical and chemical processes) is crucial to understanding physical phenomena, and can also dramatically assist in accelerating the simulations themselves. In this paper, we discuss and illustrate the use of nonlinear intrinsic variables (NIV) in the mining of high-dimensional multiscale simulation data. In particular, we focus on the way NIV allows us to functionally merge different simulation ensembles, and different partial observations of these ensembles, as well as to infer variables not explicitly measured. The approach relies on certain simple features of the underlying process variability to filter out measurement noise and systematically recover a unique reference coordinate frame. We illustrate the approach through two distinct sets of atomistic simulations: a stochastic simulation of an enzyme reaction network exhibiting both fast and slow time scales, and a molecular dynamics simulation of alanine dipeptide in explicit water.
Nonlinear intrinsic variables and state reconstruction in multiscale simulations
NASA Astrophysics Data System (ADS)
Dsilva, Carmeline J.; Talmon, Ronen; Rabin, Neta; Coifman, Ronald R.; Kevrekidis, Ioannis G.
2013-11-01
Finding informative low-dimensional descriptions of high-dimensional simulation data (like the ones arising in molecular dynamics or kinetic Monte Carlo simulations of physical and chemical processes) is crucial to understanding physical phenomena, and can also dramatically assist in accelerating the simulations themselves. In this paper, we discuss and illustrate the use of nonlinear intrinsic variables (NIV) in the mining of high-dimensional multiscale simulation data. In particular, we focus on the way NIV allows us to functionally merge different simulation ensembles, and different partial observations of these ensembles, as well as to infer variables not explicitly measured. The approach relies on certain simple features of the underlying process variability to filter out measurement noise and systematically recover a unique reference coordinate frame. We illustrate the approach through two distinct sets of atomistic simulations: a stochastic simulation of an enzyme reaction network exhibiting both fast and slow time scales, and a molecular dynamics simulation of alanine dipeptide in explicit water.
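NIV builds on diffusion-map embeddings, extending them with a noise-adapted Mahalanobis distance. The plain diffusion-map building block can be sketched as follows: recovering a hidden one-dimensional parameter from noisy two-dimensional observations. The data set and parameters are illustrative, not from the paper.

```python
import numpy as np

# Plain diffusion-map embedding (the building block NIV extends).
# Synthetic data: noisy 2-D observations of a curve driven by one hidden variable.
rng = np.random.default_rng(1)
theta = np.sort(rng.uniform(0.0, 3.0, 300))            # hidden 1-D variable
data = np.c_[np.cos(theta), np.sin(theta)] + 0.01 * rng.normal(size=(300, 2))

# Gaussian kernel on pairwise squared distances, normalized row-stochastic
d2 = ((data[:, None, :] - data[None, :, :]) ** 2).sum(-1)
bandwidth = np.median(d2) / 4.0                        # heuristic kernel scale
P = np.exp(-d2 / bandwidth)
P /= P.sum(axis=1, keepdims=True)

# Leading nontrivial eigenvector gives the intrinsic coordinate
evals, evecs = np.linalg.eig(P)
order = np.argsort(-evals.real)
psi1 = evecs[:, order[1]].real

corr = abs(np.corrcoef(psi1, theta)[0, 1])
print("correlation with hidden parameter:", round(corr, 2))
```

NIV replaces the plain Euclidean distance in `d2` with a Mahalanobis distance estimated from local noise covariances, which is what makes the recovered coordinate invariant to the particular (partial, noisy) observation function — the property that lets different ensembles be merged.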
NASA Astrophysics Data System (ADS)
Forouzan, Mehdi M.; Chao, Chien-Wei; Bustamante, Danilo; Mazzeo, Brian A.; Wheeler, Dean R.
2016-04-01
The fabrication process of Li-ion battery electrodes plays a prominent role in the microstructure and corresponding cell performance. Here, a mesoscale particle dynamics simulation is developed to relate the manufacturing process of a cathode containing Toda NCM-523 active material to physical and structural properties of the dried film. Particle interactions are simulated with shifted-force Lennard-Jones and granular Hertzian functions. LAMMPS, a freely available particle simulator, is used to generate particle trajectories and resulting predicted properties. To make simulations of the full film thickness feasible, the carbon binder domain (CBD) is approximated with μm-scale particles, each representing about 1000 carbon black particles and associated binder. Metrics for model parameterization and validation are measured experimentally and include the following: slurry viscosity, elasticity of the dried film, shrinkage ratio during drying, volume fraction of phases, slurry and dried film densities, and microstructure cross sections. Simulation results are in substantial agreement with experiment, showing that the simulations reasonably reproduce the relevant physics of particle arrangement during fabrication.
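The shifted-force Lennard-Jones interaction named above modifies the plain LJ potential so that both the energy and the force vanish continuously at the cutoff, avoiding force discontinuities during the particle-dynamics integration. A sketch with placeholder parameters (not the values used in the study):

```python
import numpy as np

# Shifted-force Lennard-Jones pair interaction. epsilon, sigma, and the cutoff
# are placeholder values, not parameters from the paper or its LAMMPS input.
eps_lj, sigma, rc = 1.0, 1.0, 2.5

def lj_potential(r):
    sr6 = (sigma / r) ** 6
    return 4.0 * eps_lj * (sr6**2 - sr6)

def lj_force(r):
    """Magnitude of the plain LJ force, -dU/dr."""
    sr6 = (sigma / r) ** 6
    return 24.0 * eps_lj * (2.0 * sr6**2 - sr6) / r

def sf_lj_potential(r):
    """Shifted-force LJ: energy and force both go to zero continuously at rc."""
    return np.where(r < rc,
                    lj_potential(r) - lj_potential(rc) + (r - rc) * lj_force(rc),
                    0.0)

r = np.array([1.0, 1.12, 2.0, rc, 3.0])
print(np.round(sf_lj_potential(r), 4))
```

Because the force is exactly zero at `rc`, particles entering and leaving each other's cutoff sphere feel no impulsive kick, which matters for the stability of the large soft-sphere CBD surrogate particles used in the film-drying simulations.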
Fast Quantum Algorithm for Predicting Descriptive Statistics of Stochastic Processes
NASA Technical Reports Server (NTRS)
Williams Colin P.
1999-01-01
Stochastic processes are used as a modeling tool in several sub-fields of physics, biology, and finance. Analytic understanding of the long-term behavior of such processes is only tractable for very simple types, such as Markovian processes. However, in real-world applications more complex stochastic processes often arise. In physics, the complicating factor might be nonlinearities; in biology it might be memory effects; and in finance it might be the non-random intentional behavior of participants in a market. In the absence of analytic insight, one is forced to understand these more complex stochastic processes via numerical simulation techniques. In this paper we present a quantum algorithm for performing such simulations. In particular, we show how a quantum algorithm can predict arbitrary descriptive statistics (moments) of N-step stochastic processes in just O(√N) time. That is, the quantum complexity is the square root of the classical complexity for performing such simulations. This is a significant speedup in comparison to the current state of the art.
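For contrast with the claimed O(√N) quantum cost, the classical baseline the paper compares against looks like the following Monte Carlo moment estimator. The walk and sample sizes are illustrative choices, not from the paper.

```python
import numpy as np

# Classical Monte Carlo baseline: estimate moments of an N-step stochastic
# process by direct simulation. Each sample path costs O(N) here, versus the
# O(sqrt(N)) claimed for the quantum algorithm.
rng = np.random.default_rng(42)

def moment_of_walk(n_steps, n_paths, k):
    """Estimate the k-th moment of the endpoint of an n_steps random walk."""
    steps = rng.choice([-1.0, 1.0], size=(n_paths, n_steps))
    endpoints = steps.sum(axis=1)
    return float((endpoints ** k).mean())

# For a simple +/-1 walk, the second moment of the endpoint is exactly n_steps,
# so the estimate should be close to 100 here.
m2 = moment_of_walk(100, 20_000, 2)
print("estimated second moment:", round(m2, 1))
```

The classical estimator must unroll all N steps of every sampled path; the quantum speedup comes from evaluating the process's statistics without that full unrolling.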
Workshop on data acquisition and trigger system simulations for high energy physics
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1992-12-31
This report discusses the following topics: DAQSIM: A Data Acquisition System Simulation Tool; Front End and DCC Simulations for the SDC Straw Tube System; Simulation of Non-Blocking Data Acquisition Architectures; Simulation Studies of the SDC Data Collection Chip; Correlation Studies of the Data Collection Circuit & The Design of a Queue for this Circuit; Fast Data Compression & Transmission from a Silicon Strip Wafer; Simulation of SCI Protocols in Modsim; Visual Design with vVHDL; Stochastic Simulation of Asynchronous Buffers; SDC Trigger Simulations; Trigger Rates, DAQ & Online Processing at the SSC; Planned Enhancements to MODSIM II & SIMOBJECT -- an Overview; DAGAR -- A Synthesis System; Proposed Silicon Compiler for Physics Applications; Timed-LOTOS in a PROLOG Environment: an Algebraic Language for Simulation; Modeling and Simulation of an Event Builder for High Energy Physics Data Acquisition Systems; A Verilog Simulation for the CDF DAQ; Simulation to Design with Verilog; The DZero Data Acquisition System: Model and Measurements; DZero Trigger Level 1.5 Modeling; Strategies Optimizing Data Load in the DZero Triggers; Simulation of the DZero Level 2 Data Acquisition System; and A Fast Method for Calculating DZero Level 1 Jet Trigger Properties and Physics Input to DAQ Studies.
A glacier runoff extension to the Precipitation Runoff Modeling System
A. E. Van Beusekom; R. J. Viger
2016-01-01
A module to simulate glacier runoff, PRMSglacier, was added to PRMS (Precipitation Runoff Modeling System), a distributed-parameter, physical-process hydrological simulation code. The extension does not require extensive on-glacier measurements or computational expense but still relies on physical principles over empirical relations as much as is feasible while...
Jiang, Xianan; Waliser, Duane E.; Xavier, Prince K.; ...
2015-05-27
Aimed at reducing deficiencies in representing the Madden-Julian oscillation (MJO) in general circulation models (GCMs), a global model evaluation project on the vertical structure and physical processes of the MJO was coordinated. In this paper, results from the climate simulation component of this project are reported. It is shown that the MJO remains a great challenge in these latest-generation GCMs. The systematic eastward propagation of the MJO is only well simulated in about one fourth of the participating models. The observed vertical westward tilt with altitude of the MJO is well simulated in good MJO models but not in the poor ones. Damped Kelvin wave responses to the east of convection in the lower troposphere could be responsible for the missing MJO preconditioning process in these poor MJO models. Several process-oriented diagnostics were conducted to discriminate key processes for realistic MJO simulations. While large-scale rainfall partition and low-level mean zonal winds over the Indo-Pacific in a model are not found to be closely associated with its MJO skill, two metrics, the low-level relative humidity difference between high- and low-rain events and the seasonal mean gross moist stability, exhibit statistically significant correlations with MJO performance. It is further indicated that increased cloud-radiative feedback tends to be associated with reduced amplitude of intraseasonal variability, which is incompatible with the radiative instability theory previously proposed for the MJO. Finally, results in this study confirm that inclusion of air-sea interaction can lead to significant improvement in simulating the MJO.
NASA Astrophysics Data System (ADS)
Limatahu, I.; Sutoyo, S.; Wasis; Prahani, B. K.
2018-03-01
In previous research, the CCDSR (Condition, Construction, Development, Simulation, and Reflection) learning model was developed to improve the science process skills of pre-service physics teachers. This research aims to analyze the effectiveness of the CCDSR learning model in improving the skills of pre-service physics teachers in creating Science Process Skill (SPS) lesson plans and worksheets in the 2016/2017 academic year. The research used a one-group pre-test and post-test design with 12 pre-service physics teachers in Physics Education, University of Khairun. Data were collected through tests and observation. The lesson plan and worksheet creation skills of the pre-service physics teachers were measured with the Science Process Skill Evaluation Sheet (SPSES), and the data were analyzed with the Wilcoxon t-test and n-gain. The CCDSR learning model consists of 5 phases: (1) Condition, (2) Construction, (3) Development, (4) Simulation, and (5) Reflection. The results showed a significant increase in the skills of creating SPS lesson plans and worksheets at α = 5%, with an average n-gain in the moderate category. Thus, the CCDSR learning model is effective for improving pre-service physics teachers' skills in creating SPS lesson plans and worksheets.
Mock Data Challenge for the MPD/NICA Experiment on the HybriLIT Cluster
NASA Astrophysics Data System (ADS)
Gertsenberger, Konstantin; Rogachevsky, Oleg
2018-02-01
Simulating data processing before receiving the first experimental data is an important task in high-energy physics experiments. This article presents the current Event Data Model and the Mock Data Challenge for the MPD experiment at the NICA accelerator complex, which uses ongoing simulation studies to stress-test the distributed computing infrastructure and experiment software in the full production environment, from simulated data through to physics analysis.
Temperature and composition profile during double-track laser cladding of H13 tool steel
NASA Astrophysics Data System (ADS)
He, X.; Yu, G.; Mazumder, J.
2010-01-01
Multi-track laser cladding is now applied commercially in a range of industries such as automotive, mining and aerospace due to its diversified potential for material processing. The knowledge of temperature, velocity and composition distribution history is essential for a better understanding of the process and subsequent microstructure evolution and properties. Numerical simulation not only helps to understand the complex physical phenomena and underlying principles involved in this process, but it can also be used in the process prediction and system control. The double-track coaxial laser cladding with H13 tool steel powder injection is simulated using a comprehensive three-dimensional model, based on the mass, momentum, energy conservation and solute transport equation. Some important physical phenomena, such as heat transfer, phase changes, mass addition and fluid flow, are taken into account in the calculation. The physical properties for a mixture of solid and liquid phase are defined by treating it as a continuum media. The velocity of the laser beam during the transition between two tracks is considered. The evolution of temperature and composition of different monitoring locations is simulated.
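A drastically reduced 1-D analogue of the thermal part of such a model (explicit finite-difference conduction under a moving Gaussian heat source) illustrates the mechanics of tracking the temperature history; the material values are rough placeholders for steel, not parameters from the paper, and the full model additionally couples fluid flow, mass addition, and solute transport.

```python
import numpy as np

# 1-D explicit finite-difference heat conduction with a moving Gaussian
# volumetric source. All values are illustrative placeholders.
k_th, rho, cp = 25.0, 7800.0, 460.0        # W/(m K), kg/m^3, J/(kg K)
alpha = k_th / (rho * cp)                  # thermal diffusivity [m^2/s]
nx, L = 200, 0.02                          # nodes, domain length [m]
x = np.linspace(0.0, L, nx)
dx = x[1] - x[0]
dt = 0.4 * dx**2 / alpha                   # explicit stability requires <= 0.5
T = np.full(nx, 300.0)                     # initial temperature [K]
q_src, v_beam, w = 5e8, 0.01, 1e-3         # source strength [W/m^3], speed [m/s], width [m]

t = 0.0
while t < 0.5:
    lap = (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dx**2
    src = q_src * np.exp(-((x - v_beam * t) / w) ** 2)   # beam centre moves with t
    T[1:-1] += dt * (alpha * lap + src[1:-1] / (rho * cp))
    t += dt

print("peak temperature rise [K]:", round(float(T.max() - 300.0), 1))
```

Even this toy version shows the characteristic trailing thermal field behind the beam; the 3-D model adds melt-pool convection and the track-to-track transition velocity that the double-track study focuses on.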
Mathematical modeling of high-pH chemical flooding
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhuyan, D.; Lake, L.W.; Pope, G.A.
1990-05-01
This paper describes a generalized compositional reservoir simulator for high-pH chemical flooding processes. This simulator combines the reaction chemistry associated with these processes with the extensive physical- and flow-property modeling schemes of an existing micellar/polymer flood simulator, UTCHEM. Application of the model is illustrated for cases from a simple alkaline preflush to surfactant-enhanced alkaline-polymer flooding.
NASA Astrophysics Data System (ADS)
Supurwoko; Cari; Sarwanto; Sukarmin; Fauzi, Ahmad; Faradilla, Lisa; Summa Dewi, Tiarasita
2017-11-01
The process of learning and teaching physics is often confronted with abstract concepts, which are difficult for students to understand and for teachers to teach. One such topic is the Compton effect. The purpose of this research is to evaluate a computer simulation model of the Compton effect used to improve the higher-order thinking skills of pre-service physics teachers. This research is a case study. The subjects were physics education students who had attended Modern Physics lectures. Data were obtained through an essay test measuring students' higher-order thinking skills and questionnaires measuring students' responses. The results indicate that the computer simulation model can be used to improve students' higher-order thinking skills and their responses. Accordingly, it is suggested that audiences use the simulation media in learning.
Modeling DNP3 Traffic Characteristics of Field Devices in SCADA Systems of the Smart Grid
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Huan; Cheng, Liang; Chuah, Mooi Choo
In the generation, transmission, and distribution sectors of the smart grid, intelligence of field devices is realized by programmable logic controllers (PLCs). Many smart-grid subsystems are essentially cyber-physical energy systems (CPES): for instance, the power system process (i.e., the physical part) within a substation is monitored and controlled by a SCADA network with hosts running miscellaneous applications (i.e., the cyber part). To study the interactions between the cyber and physical components of a CPES, several co-simulation platforms have been proposed. However, the network simulators/emulators of these platforms do not include a detailed traffic model that takes into account the impacts of the execution model of PLCs on traffic characteristics. As a result, network traces generated by co-simulation only reveal the impacts of the physical process on the contents of the traffic generated by SCADA hosts, whereas the distinction between PLCs and computing nodes (e.g., a hardened computer running a process visualization application) has been overlooked. To generate realistic network traces using co-simulation for the design and evaluation of applications relying on accurate traffic profiles, it is necessary to establish a traffic model for PLCs. In this work, we propose a parameterized model for PLCs that can be incorporated into existing co-simulation platforms. We focus on the DNP3 subsystem of slave PLCs, which automates the processing of packets from the DNP3 master. To validate our approach, we extract model parameters from both the configuration and network traces of real PLCs. Simulated network traces are generated and compared against those from PLCs. Our evaluation shows that our proposed model captures the essential traffic characteristics of DNP3 slave PLCs, which can be used to extend existing co-simulation platforms and gain further insights into the behaviors of CPES.
Fast Simulation of the Impact Parameter Calculation of Electrons through Pair Production
NASA Astrophysics Data System (ADS)
Bang, Hyesun; Kweon, MinJung; Huh, Kyoung Bum; Pachmayer, Yvonne
2018-05-01
A fast simulation method is introduced that dramatically reduces the time required for the impact parameter calculation, a key observable in physics analyses of high-energy physics experiments and in detector optimisation studies. The impact parameter of electrons produced through pair production was calculated considering the key related processes using the Bethe-Heitler formula, the Tsai formula and a simple geometric model. The calculations were performed under various conditions and the results were compared with those from full GEANT4 simulations. The computation time of this fast simulation method is about 10⁴ times shorter than that of the full GEANT4 simulation.
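The pair-production physics underlying such a calculation can be illustrated with the standard high-energy approximation, in which a photon traversing a thickness t (in radiation lengths X0) converts to an e+e- pair with probability P = 1 − exp(−7t/9). This is the textbook thin-material formula, not the paper's full Bethe-Heitler/Tsai treatment:

```python
import math

def pair_conversion_prob(thickness_x0):
    """High-energy approximation: probability that a photon converts to an
    e+e- pair while traversing `thickness_x0` radiation lengths of material
    (the mean free path for pair production is ~ 9/7 of a radiation length)."""
    return 1.0 - math.exp(-7.0 * thickness_x0 / 9.0)

p_beam_pipe = pair_conversion_prob(0.003)   # a thin layer, ~0.3% X0 (assumed)
p_thick = pair_conversion_prob(1.0)         # one full radiation length
```

For thin detector layers the probability is nearly linear in thickness, which is why a simple geometric model can reproduce where conversions (and hence electron impact parameters) originate.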
Ultracold-atom quantum simulator for attosecond science
NASA Astrophysics Data System (ADS)
Sala, Simon; Förster, Johann; Saenz, Alejandro
2017-01-01
A quantum simulator based on ultracold optically trapped atoms for simulating the physics of atoms and molecules in ultrashort intense laser fields is introduced. The slowing down by about 13 orders of magnitude allows one to watch in slow motion the tunneling and recollision processes that form the heart of attosecond science. The extreme flexibility of the simulator promises a deeper understanding of strong-field physics, especially for many-body systems beyond the reach of classical computers. The quantum simulator can be realized experimentally in a straightforward manner and is shown to recover the ionization characteristics of atoms in the different regimes of laser-matter interaction.
The new car following model considering vehicle dynamics influence and numerical simulation
NASA Astrophysics Data System (ADS)
Sun, Dihua; Liu, Hui; Zhang, Geng; Zhao, Min
2015-12-01
In this paper, the car-following model is investigated from a cyber-physical viewpoint by considering vehicle dynamics. Driving is a typical cyber-physical process, which couples the cyber aspect of the vehicles' information and driving decisions tightly with the dynamics and physics of the vehicles and the traffic environment. However, the influence of the physical (vehicle) side has been ignored in previous car-following models. In order to describe car-following behavior more realistically, a new car-following model considering vehicle dynamics (D-CFM for short) is proposed. We take the full velocity difference (FVD) car-following model as a case study. The stability condition is derived on the basis of control theory. Analytical results and numerical simulations show that the new model can describe the evolution of traffic congestion. The simulations also show that vehicles exhibit a more realistic starting-process acceleration than in earlier models.
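For reference, the FVD model taken here as the base case is commonly written as dv_n/dt = a[V(Δx_n) − v_n] + λΔv_n, where V is an optimal-velocity function of the headway Δx_n. A minimal explicit-Euler integration on a ring road might look as follows; the OV function and all parameter values are illustrative assumptions, not those of the paper:

```python
import math

def optimal_velocity(headway, v_max=2.0, hc=4.0):
    # Illustrative OV function; this specific form and its parameters are assumptions.
    return (v_max / 2.0) * (math.tanh(headway - hc) + math.tanh(hc))

def step_fvd(x, v, a=0.5, lam=0.3, dt=0.1, road=100.0):
    """One explicit-Euler step of the FVD model for cars on a ring road.
    x, v: positions and velocities; car i follows car (i + 1) % n."""
    n = len(x)
    xn, vn = x[:], v[:]
    for i in range(n):
        lead = (i + 1) % n
        headway = (x[lead] - x[i]) % road
        dv = v[lead] - v[i]
        acc = a * (optimal_velocity(headway) - v[i]) + lam * dv
        vn[i] = v[i] + acc * dt
        xn[i] = x[i] + v[i] * dt
    return xn, vn

# 10 cars, uniformly spaced, starting from rest: uniform spacing is an
# equilibrium, so all velocities relax toward V(10 m headway).
x = [10.0 * i for i in range(10)]
v = [0.0] * 10
for _ in range(500):
    x, v = step_fvd(x, v)
```

A D-CFM-style extension would replace the kinematic acceleration `acc` with one limited by a vehicle-dynamics model (engine torque, drag, tire friction), which is precisely the physical-side influence the paper argues earlier models ignore.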
Simulating galactic dust grain evolution on a moving mesh
NASA Astrophysics Data System (ADS)
McKinnon, Ryan; Vogelsberger, Mark; Torrey, Paul; Marinacci, Federico; Kannan, Rahul
2018-05-01
Interstellar dust is an important component of the galactic ecosystem, playing a key role in multiple galaxy formation processes. We present a novel numerical framework for the dynamics and size evolution of dust grains implemented in the moving-mesh hydrodynamics code AREPO suited for cosmological galaxy formation simulations. We employ a particle-based method for dust subject to dynamical forces including drag and gravity. The drag force is implemented using a second-order semi-implicit integrator and validated using several dust-hydrodynamical test problems. Each dust particle has a grain size distribution, describing the local abundance of grains of different sizes. The grain size distribution is discretised with a second-order piecewise linear method and evolves in time according to various dust physical processes, including accretion, sputtering, shattering, and coagulation. We present a novel scheme for stochastically forming dust during stellar evolution and new methods for sub-cycling of dust physics time-steps. Using this model, we simulate an isolated disc galaxy to study the impact of dust physical processes that shape the interstellar grain size distribution. We demonstrate, for example, how dust shattering shifts the grain size distribution to smaller sizes resulting in a significant rise of radiation extinction from optical to near-ultraviolet wavelengths. Our framework for simulating dust and gas mixtures can readily be extended to account for other dynamical processes relevant in galaxy formation, like magnetohydrodynamics, radiation pressure, and thermo-chemical processes.
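The stiffness that motivates a semi-implicit drag integrator can be seen in a one-dimensional sketch: treating the drag term implicitly keeps the update stable and monotone even when the stopping time is far shorter than the hydrodynamic time-step. This is a first-order illustration of the idea, not the second-order AREPO scheme:

```python
def drag_update_explicit(v_dust, v_gas, t_stop, dt):
    # Explicit Euler for dv/dt = -(v - v_gas)/t_stop: unstable when dt > 2 * t_stop.
    return v_dust - (dt / t_stop) * (v_dust - v_gas)

def drag_update_semi_implicit(v_dust, v_gas, t_stop, dt):
    """Implicit in the drag term: v' = v_gas + (v - v_gas) / (1 + dt/t_stop).
    Decays monotonically toward the gas velocity for any time-step size."""
    return v_gas + (v_dust - v_gas) / (1.0 + dt / t_stop)

# Stiff case: stopping time 100x shorter than the time-step.
v = 10.0
for _ in range(5):
    v = drag_update_semi_implicit(v, v_gas=1.0, t_stop=0.01, dt=1.0)
```

In the same stiff regime a single explicit step overshoots wildly (here to about −890), which is why tightly coupled dust requires an implicit or semi-implicit treatment.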
NASA Technical Reports Server (NTRS)
Howe, Christina L.; Weller, Robert A.; Reed, Robert A.; Sierawski, Brian D.; Marshall, Paul W.; Marshall, Cheryl J.; Mendenhall, Marcus H.; Schrimpf, Ronald D.
2007-01-01
The proton-induced charge deposition in a well-characterized silicon P-i-N focal plane array is analyzed with Monte Carlo based simulations. These simulations include all relevant physical processes, together with pile-up, to accurately describe the experimental data. Simulation results reveal important high-energy events not easily detected through experiment due to low statistics. The effects of each physical mechanism on the device response are shown for a single proton energy as well as for a full proton space flux.
Using artificial intelligence to control fluid flow computations
NASA Technical Reports Server (NTRS)
Gelsey, Andrew
1992-01-01
Computational simulation is an essential tool for the prediction of fluid flow. Many powerful simulation programs exist today. However, using these programs to reliably analyze fluid flow and other physical situations requires considerable human effort and expertise to set up a simulation, determine whether the output makes sense, and repeatedly run the simulation with different inputs until a satisfactory result is achieved. Automating this process is not only of considerable practical importance but will also significantly advance basic artificial intelligence (AI) research in reasoning about the physical world.
NASA Astrophysics Data System (ADS)
Dwivany, Fenny Martha; Esyanti, Rizkita R.; Prapaisie, Adeline; Puspa Kirana, Listya; Latief, Chunaeni; Ginaldi, Ari
2016-11-01
The objective of this research was to determine the effect of microgravity simulation by a 3D clinostat on the ripening process of Cavendish banana (Musa acuminata, AAA group). Physical and physiological changes as well as gene expression were analysed. The results showed that under simulated microgravity the ripening process in banana was delayed and the expression of the MaACO1, MaACS1 and MaACS5 genes was affected.
A Physics-Based Engineering Approach to Predict the Cross Section for Advanced SRAMs
NASA Astrophysics Data System (ADS)
Li, Lei; Zhou, Wanting; Liu, Huihua
2012-12-01
This paper presents a physics-based engineering approach to estimate the heavy-ion-induced upset cross section of 6T SRAM cells from layout and technology parameters. The approach calculates the effects of radiation using a junction photocurrent derived from device physics, and handles the problem with simple SPICE simulations. First, a standard SPICE program on a typical PC is used to predict the SPICE-simulated curve of collected charge vs. distance from the drain-body junction, using the derived junction photocurrent. This curve is then used to calculate the heavy-ion-induced upset cross section with a simple model, which assumes that for advanced SRAMs in nano-scale process technologies the SEU cross section of an SRAM cell is governed more by a “radius of influence” around a heavy-ion strike than by the physical size of a diffusion node in the layout. The upset cross section calculated with this method is in good agreement with test results for 6T SRAM cells fabricated in a 90 nm process technology.
WE-DE-202-01: Connecting Nanoscale Physics to Initial DNA Damage Through Track Structure Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schuemann, J.
Radiation therapy for the treatment of cancer has been established as a highly precise and effective way to eradicate a localized region of diseased tissue. To achieve further significant gains in the therapeutic ratio, we need to move towards biologically optimized treatment planning. To achieve this goal, we need to understand how the radiation-type dependent patterns of induced energy depositions within the cell (physics) connect via molecular, cellular and tissue reactions to treatment outcome such as tumor control and undesirable effects on normal tissue. Several computational biology approaches have been developed connecting physics to biology. Monte Carlo simulations are the most accurate method to calculate physical dose distributions at the nanometer scale, however simulations at the DNA scale are slow and repair processes are generally not simulated. Alternative models that rely on the random formation of individual DNA lesions within one or two turns of the DNA have been shown to reproduce the clusters of DNA lesions, including single strand breaks (SSBs) and double strand breaks (DSBs), without the need for detailed track structure simulations. Efficient computational simulations of initial DNA damage induction facilitate computational modeling of DNA repair and other molecular and cellular processes. Mechanistic, multiscale models provide a useful conceptual framework to test biological hypotheses and help connect fundamental information about track structure and dosimetry at the sub-cellular level to dose-response effects on larger scales. In this symposium we will learn about the current state of the art of computational approaches estimating radiation damage at the cellular and sub-cellular scale. How can understanding the physics interactions at the DNA level be used to predict biological outcome?
We will discuss if and how such calculations are relevant to advance our understanding of radiation damage and its repair, or if the underlying biological processes are too complex for a mechanistic approach. Can computer simulations be used to guide future biological research? We will debate the feasibility of explaining biology from a physicist's perspective. Learning Objectives: Understand the potential applications and limitations of computational methods for dose-response modeling at the molecular, cellular and tissue levels; Learn about the mechanisms of action underlying the induction, repair and biological processing of damage to DNA and other constituents; Understand how effects and processes at one biological scale impact biological processes and outcomes at other scales. Funding: J. Schuemann, NCI/NIH grants; S. McMahon, European Commission FP7 (grant EC FP7 MC-IOF-623630)
Long-term simulation of vertical transport process and its impact on bottom DO in Chesapeake Bay
NASA Astrophysics Data System (ADS)
Du, J.; Shen, J.
2016-02-01
Hypoxia in coastal waters is a widespread phenomenon that appears to have been growing globally for at least 60 years. It is commonly agreed that physical transport processes and biological processes are equally important in determining bottom dissolved oxygen (DO) in Chesapeake Bay; however, the quantitative impact of physical transport processes is rarely documented. In this study, we use a timescale, the vertical exchange time (VET), to quantify the impact that all physical processes might have on bottom DO. VET is simulated from 1985 to 2012, and monthly observed DO data along the deep channel of the Bay's main stem are collected. A conceptual bottom DO budget model is applied, using VET to quantify the physical condition and a net oxygen consumption rate to quantify biological activities. The DO budget model results show that interannual variations of physical conditions account for 88.8% of the interannual variations of observed DO. The high similarity between the VET spatial pattern and the observed DO suggests that physical processes play a key role in regulating the DO condition. Model results also show that long-term VET increases slightly in summer, but no statistically significant trend is found. Correlations among southerly wind strength, the North Atlantic Oscillation index, and VET demonstrate that the physical condition of the Chesapeake Bay is strongly controlled by large-scale climate variation. The relationship is most significant during summer, when the southerly wind dominates throughout the Chesapeake Bay.
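The conceptual bottom-DO budget can be paraphrased as: bottom DO is roughly the boundary (well-ventilated) DO minus the net consumption rate accumulated over the vertical exchange time. The sketch below uses that paraphrase; the symbols and numbers are our illustrative assumptions, not the paper's calibrated values:

```python
def bottom_do(do_boundary, consumption_rate, vet_days):
    """Conceptual budget: water reaching the bottom started with do_boundary
    (mg/L) and lost oxygen at consumption_rate (mg/L/day) for the vet_days it
    was isolated from the surface. Floored at zero (anoxia)."""
    return max(0.0, do_boundary - consumption_rate * vet_days)

# Illustrative numbers: faster vertical exchange (smaller VET) -> higher bottom DO.
do_fast = bottom_do(8.0, 0.15, vet_days=20)   # a well-ventilated year
do_slow = bottom_do(8.0, 0.15, vet_days=50)   # a strongly stratified year
```

The budget makes the paper's partition explicit: interannual DO variability can come from the physical term (VET) or the biological term (consumption rate), and here the physical term is found to dominate.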
3D finite element modelling of sheet metal blanking process
NASA Astrophysics Data System (ADS)
Bohdal, Lukasz; Kukielka, Leon; Chodor, Jaroslaw; Kulakowska, Agnieszka; Patyk, Radoslaw; Kaldunski, Pawel
2018-05-01
The shearing process, such as the blanking of sheet metals, is often used to prepare workpieces for subsequent forming operations. The use of FEM simulation for investigating and optimizing the blanking process is increasing. In the current literature, owing to the limited capability and large computational cost of three-dimensional (3D) analysis, blanking FEM simulations have largely been limited to two-dimensional (2D) plane and axisymmetric problems. However, significant progress in modelling, taking into account the influence of the real material (e.g. its microstructure) and the physical and technological conditions, can be achieved by using 3D numerical analysis methods in this area. The objective of this paper is to present a 3D finite element analysis of ductile fracture, strain distribution and stress in the blanking process under the assumption of geometrical and physical nonlinearities. The physical, mathematical and computer models of the process are elaborated. Dynamic effects, mechanical coupling, a constitutive damage law and contact friction are taken into account. An application in the ANSYS/LS-DYNA program is elaborated. The effect of the main process parameter, the blanking clearance, on the deformation of 1018 steel and the quality of the blank's sheared edge is analyzed. The results of the computer simulations can be used to forecast the quality of the final parts and to optimize the process.
Simulation Needs and Priorities of the Fermilab Intensity Frontier
DOE Office of Scientific and Technical Information (OSTI.GOV)
Elvira, V. D.; Genser, K. L.; Hatcher, R.
2015-06-11
Over a two-year period, the Physics and Detector Simulations (PDS) group of the Fermilab Scientific Computing Division (SCD), collected information from Fermilab Intensity Frontier experiments on their simulation needs and concerns. The process and results of these activities are documented here.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Elber, Ron
Atomically detailed computer simulations of complex molecular events have attracted the imagination of many researchers in the field as providing comprehensive information on chemical, biological, and physical processes. However, one of the greatest limitations of these simulations is that of time scales. The physical time scales accessible to straightforward simulations are too short to address many interesting and important molecular events. In the last decade, significant advances were made in different directions (theory, software, and hardware) that significantly expand the capabilities and accuracies of these techniques. This perspective describes and critically examines some of these advances.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wardle, Kent E.; Frey, Kurt; Pereira, Candido
2014-02-02
This task is aimed at predictive modeling of solvent extraction processes in typical extraction equipment through multiple simulation methods at various scales of resolution. We have conducted detailed continuum fluid dynamics simulations at the process unit level as well as simulations of the molecular-level physical interactions which govern extraction chemistry. By combining information gained through simulations at each of these two tiers with advanced techniques such as the Lattice Boltzmann Method (LBM), which can bridge the two scales, we can develop the tools to work towards predictive simulation for solvent extraction on the equipment scale (Figure 1). The goal of such a tool, along with enabling optimized design and operation of extraction units, would be to allow prediction of stage extraction efficiency under specified conditions. Simulation efforts on each of the two scales are described below. As the initial application of FELBM in the work performed during FY10 was on annular mixing, it is discussed in the context of the continuum scale. In the future, however, it is anticipated that the real value of FELBM will be as a tool for sub-grid model development through highly refined DNS-like multiphase simulations, facilitating the exploration and development of droplet models, including breakup and coalescence, which will be needed for large-scale simulations where droplet-level physics cannot be resolved. In this area it has a significant advantage over traditional CFD methods, as its high computational efficiency allows exploration of significantly greater physical detail, especially as computational resources increase in the future.
Exploring the implication of climate process uncertainties within the Earth System Framework
NASA Astrophysics Data System (ADS)
Booth, B.; Lambert, F. H.; McNeal, D.; Harris, G.; Sexton, D.; Boulton, C.; Murphy, J.
2011-12-01
Uncertainties in the magnitude of future climate change have been a focus of a great deal of research. Much of the work with General Circulation Models has focused on the atmospheric response to changes in atmospheric composition, while other processes remain outside these frameworks. Here we introduce an ensemble of new simulations, based on an Earth System configuration of HadCM3C, designed to explore uncertainties in both physical (atmospheric, oceanic and aerosol physics) and carbon cycle processes, using perturbed-parameter approaches previously used to explore atmospheric uncertainty. Framed in the context of the climate response to future changes in emissions, the resultant future projections represent significantly broader uncertainty than existing concentration-driven GCM assessments. The systematic nature of the ensemble design enables interactions between components to be explored. For example, we show how metrics of physical processes (such as climate sensitivity) are also influenced by carbon cycle parameters. This work suggests that carbon cycle processes contribute as much to uncertainty in future climate projections as the atmospheric feedbacks more conventionally explored. The broad range of climate responses explored within these ensembles, rather than representing a reason for inaction, provides information on lower-likelihood but high-impact changes. For example, while the majority of these simulations suggest that future Amazon forest extent is resilient to the projected climate changes, a small number simulate dramatic forest dieback. This ensemble represents a framework to examine these risks, breaking them down into physical processes (such as ocean temperature drivers of rainfall change) and vegetation processes (where uncertainties point towards requirements for new observational constraints).
Hydrological modelling in forested systems
This chapter provides a brief overview of forest hydrology modelling approaches for answering important global research and management questions. Many hundreds of hydrological models have been applied globally across multiple decades to represent and predict forest hydrological processes. The focus of this chapter is on process-based models and approaches, specifically 'forest hydrology models'; that is, physically based simulation tools that quantify compartments of the forest hydrological cycle. Physically based models can be considered those that describe the conservation of mass, momentum and/or energy.
The Application of SNiPER to the JUNO Simulation
NASA Astrophysics Data System (ADS)
Lin, Tao; Zou, Jiaheng; Li, Weidong; Deng, Ziyan; Fang, Xiao; Cao, Guofu; Huang, Xingtao; You, Zhengyun; JUNO Collaboration
2017-10-01
JUNO (the Jiangmen Underground Neutrino Observatory) is a multipurpose neutrino experiment designed to determine the neutrino mass hierarchy and precisely measure oscillation parameters. As one of its important systems, the JUNO offline software is being developed using the SNiPER software. In these proceedings, we focus on the requirements of the JUNO simulation and present a working solution based on SNiPER. The JUNO simulation framework is in charge of managing event data, detector geometries and materials, physics processes, simulation truth information, etc. It glues the physics generator, detector simulation and electronics simulation modules together to achieve a full simulation chain. In the implementation of the framework, many attractive characteristics of SNiPER have been used, such as dynamic loading, flexible flow control, multiple event management and Python binding. Furthermore, additional efforts have been made to make both detector and electronics simulation flexible enough to accommodate and optimize different detector designs. For the Geant4-based detector simulation, each sub-detector component is implemented as a SNiPER tool, which is a dynamically loadable and configurable plugin, so it is possible to select the detector configuration at runtime. The framework provides the event loop to drive the detector simulation and interacts with Geant4, which is implemented as a passive service. All levels of user actions are wrapped into different customizable tools, so that user functions can be easily extended by just adding new tools. The electronics simulation has been implemented following an event-driven scheme. The SNiPER task component is used to simulate the data processing steps in the electronics modules. The electronics and trigger are synchronized by triggered events containing possible physics signals.
The JUNO simulation software has been released and is being used by the JUNO collaboration to do detector design optimization, event reconstruction algorithm development and physics sensitivity studies.
Process Modeling and Dynamic Simulation for EAST Helium Refrigerator
NASA Astrophysics Data System (ADS)
Lu, Xiaofei; Fu, Peng; Zhuang, Ming; Qiu, Lilong; Hu, Liangbing
2016-06-01
In this paper, process modeling and dynamic simulation for the EAST helium refrigerator have been completed. The cryogenic process model is described and the main components are customized in detail. The process model is controlled by the PLC simulator, and real-time communication between the process model and the controllers is achieved by a customized interface. The process model has been validated against EAST experimental data for the 300-80 K cool-down process. Simulation results indicate that this process simulator reproduces the dynamic behavior of the EAST helium refrigerator very well for the operation of long-pulse plasma discharge. The cryogenic process simulator based on the control architecture is available for operation optimization and control design of EAST cryogenic systems to cope with long-pulse heat loads in the future. Supported by the National Natural Science Foundation of China (No. 51306195) and the Key Laboratory of Cryogenics, Technical Institute of Physics and Chemistry, CAS (No. CRYO201408)
NASA Technical Reports Server (NTRS)
Hochhalter, J. D.; Glaessgen, E. H.; Ingraffea, A. R.; Aquino, W. A.
2009-01-01
Fracture processes within a material begin at the nanometer length scale at which the formation, propagation, and interaction of fundamental damage mechanisms occur. Physics-based modeling of these atomic processes quickly becomes computationally intractable as the system size increases. Thus, a multiscale modeling method, based on the aggregation of fundamental damage processes occurring at the nanoscale within a cohesive zone model, is under development and will enable computationally feasible and physically meaningful microscale fracture simulation in polycrystalline metals. This method employs atomistic simulation to provide an optimization loop with an initial prediction of a cohesive zone model (CZM). This initial CZM is then applied at the crack front region within a finite element model. The optimization procedure iterates upon the CZM until the finite element model acceptably reproduces the near-crack-front displacement fields obtained from experimental observation. With this approach, a comparison can be made between the original CZM predicted by atomistic simulation and the converged CZM that is based on experimental observation. Comparison of the two CZMs gives insight into how atomistic simulation scales.
NASA Astrophysics Data System (ADS)
Zhang, Jun; Li, Ri Yi
2018-06-01
Building energy simulation is an important supporting tool for green building design and building energy consumption assessment. At present, building energy simulation software cannot meet the needs of energy consumption analysis and cabinet-level micro-environment control design for prefabricated buildings. A thermophysical model of prefabricated buildings is proposed in this paper. Based on this physical model, energy consumption calculation software for prefabricated cabin buildings (PCES) has been developed. PCES supports building parameter setting, energy consumption simulation, and analysis of the building thermal process and energy consumption.
Cesium Eluate Physical Property Determination
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baich, M.A.
2001-02-13
Two bench-scale process simulations of the proposed cesium eluate evaporation process, which concentrates eluate produced in the Hanford Site Waste Treatment Plant, were conducted. The primary objective of these experiments was to determine the physical properties and the saturation concentration of the eluate evaporator bottoms while producing a condensate of approximately 0.50 molar HNO3.
A unified dislocation density-dependent physical-based constitutive model for cold metal forming
NASA Astrophysics Data System (ADS)
Schacht, K.; Motaman, A. H.; Prahl, U.; Bleck, W.
2017-10-01
Dislocation-density-dependent physical-based constitutive models of metal plasticity, while computationally efficient and history-dependent, can accurately account for varying process parameters such as strain, strain rate and temperature; different loading modes such as continuous deformation, creep and relaxation; microscopic metallurgical processes; and varying chemical composition within an alloy family. Since these models are founded on the essential phenomena dominating the deformation, they have a large range of usability and validity. They are also suitable for manufacturing-chain simulations, since they can efficiently compute the cumulative effect of the various manufacturing processes by following the material state through the entire manufacturing chain, including interpass periods, and give a realistic prediction of the material behavior and final product properties. In the physical-based constitutive model of cold metal plasticity introduced in this study, the physical processes influencing cold and warm plastic deformation in polycrystalline metals are described using physical/metallurgical internal variables such as the dislocation density and the effective grain size. The evolution of these internal variables is calculated using equations that describe the physical processes dominating the material behavior during cold plastic deformation. For validation, the model is numerically implemented in a general implicit isotropic elasto-viscoplasticity algorithm as a user-defined material subroutine (UMAT) in ABAQUS/Standard and used for finite element simulation of upsetting tests and a complete cold forging cycle of the case-hardenable MnCr steel family.
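A common concrete instance of such a dislocation-density evolution law is the Kocks-Mecking form dρ/dε = k1√ρ − k2ρ (storage minus dynamic recovery) combined with a Taylor flow stress σ = σ0 + αMGb√ρ. The sketch below integrates this pair explicitly; it is a generic textbook form with illustrative parameter values, not the paper's calibrated model:

```python
import math

def kocks_mecking_step(rho, dstrain, k1=3e8, k2=15.0):
    """One strain increment of a Kocks-Mecking-type evolution law:
    storage ~ k1*sqrt(rho), dynamic recovery ~ k2*rho (rho in 1/m^2)."""
    return rho + (k1 * math.sqrt(rho) - k2 * rho) * dstrain

def flow_stress(rho, sigma0=50e6, alpha=0.3, M=3.06, G=80e9, b=2.5e-10):
    # Taylor equation: stress in Pa; G shear modulus (Pa), b Burgers vector (m).
    return sigma0 + alpha * M * G * b * math.sqrt(rho)

rho = 1e12          # assumed initial (annealed) dislocation density, 1/m^2
stresses = []
for _ in range(200):                       # total strain 200 * 0.005 = 1.0
    rho = kocks_mecking_step(rho, dstrain=0.005)
    stresses.append(flow_stress(rho))
```

The density saturates where storage balances recovery, at ρ = (k1/k2)² = 4e14 m⁻², giving the familiar saturating strain-hardening curve; in the paper's unified model the evolution equations additionally carry strain-rate, temperature and composition dependence.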
Monte Carlo Simulation of Massive Absorbers for Cryogenic Calorimeters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brandt, D.; Asai, M.; Brink, P.L.
There is a growing interest in cryogenic calorimeters with macroscopic absorbers for applications such as dark matter direct detection and rare event search experiments. The physics of energy transport in calorimeters with absorber masses exceeding several grams is made complex by the anisotropic nature of the absorber crystals as well as the changing mean free paths as phonons decay to progressively lower energies. We present a Monte Carlo model capable of simulating anisotropic phonon transport in cryogenic crystals. We have initiated the validation process and discuss the level of agreement between our simulation and experimental results reported in the literature, focusing on heat pulse propagation in germanium. The simulation framework is implemented using Geant4, a toolkit originally developed for high-energy physics Monte Carlo simulations. Geant4 has also been used for nuclear and accelerator physics, and for applications in the medical and space sciences. We believe that our current work may open up new avenues for applications in material science and condensed matter physics.
Assessing the Effects of Data Compression in Simulations Using Physically Motivated Metrics
Laney, Daniel; Langer, Steven; Weber, Christopher; ...
2014-01-01
This paper examines whether lossy compression can be used effectively in physics simulations as a possible strategy to combat the expected data-movement bottleneck in future high performance computing architectures. We show that, for the codes and simulations we tested, compression levels of 3-5X can be applied without causing significant changes to important physical quantities. Rather than applying signal processing error metrics, we utilize physics-based metrics appropriate for each code to assess the impact of compression. We evaluate three different simulation codes: a Lagrangian shock-hydrodynamics code, an Eulerian higher-order hydrodynamics turbulence modeling code, and an Eulerian coupled laser-plasma interaction code. We compress relevant quantities after each time-step to approximate the effects of tightly coupled compression and study the compression rates to estimate memory and disk-bandwidth reduction. We find that the error characteristics of compression algorithms must be carefully considered in the context of the underlying physics being modeled.
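The distinction between signal-processing and physics-based metrics can be illustrated with a toy example: quantization (standing in for lossy compression) produces a sizeable pointwise error, while an integral "conserved" quantity is perturbed far less because pointwise errors largely cancel. This is an illustration of the idea only, not the authors' codes or compressors:

```python
def quantize(field, step):
    """Crude lossy 'compression': snap each value to a grid of spacing `step`."""
    return [round(x / step) * step for x in field]

def total_energy(field):
    # Stand-in for a physics-based metric: an integral quantity the
    # simulation is expected to conserve.
    return sum(x * x for x in field)

field = [0.1 * i for i in range(1000)]
compressed = quantize(field, step=0.07)

# Signal-processing view: worst-case pointwise error (up to step/2).
max_pointwise = max(abs(a - b) for a, b in zip(field, compressed))
# Physics view: relative change in the conserved quantity.
rel_energy_err = abs(total_energy(compressed) - total_energy(field)) / total_energy(field)
```

The conserved-quantity error comes out orders of magnitude below the pointwise bound, which is the paper's point in reverse: a compressor that looks acceptable by one metric class may look very different by the other, so the metric must match the physics.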
A study of swing-curve physics in diffraction-based overlay
NASA Astrophysics Data System (ADS)
Bhattacharyya, Kaustuve; den Boef, Arie; Storms, Greet; van Heijst, Joost; Noot, Marc; An, Kevin; Park, Noh-Kyoung; Jeon, Se-Ra; Oh, Nang-Lyeom; McNamara, Elliott; van de Mast, Frank; Oh, SeungHwa; Lee, Seung Yoon; Hwang, Chan; Lee, Kuntack
2016-03-01
With the increase of process complexity in advanced nodes, the requirements on process robustness in overlay metrology continue to tighten. Especially with the introduction of newer materials in the film stack, along with typical stack variations (thickness, optical properties, profile asymmetry, etc.), the physics of signal formation in diffraction-based overlay (DBO) becomes an important consideration in overlay metrology target and recipe selection. To address the signal formation physics, an effort is made to study the swing-curve phenomena across wavelength and polarization on production stacks, using both simulations and experimental DBO measurements. The results provide a wealth of information for robust target and recipe selection. Details from simulation and measurements are reported in this technical publication.
An efficient surrogate-based simulation-optimization method for calibrating a regional MODFLOW model
NASA Astrophysics Data System (ADS)
Chen, Mingjie; Izady, Azizallah; Abdalla, Osman A.
2017-01-01
The simulation-optimization method entails a large number of model simulations, which is computationally intensive or even prohibitive if each model simulation is extremely time-consuming. Statistical models have been examined as surrogates of the high-fidelity physical model during the simulation-optimization process to tackle this problem. Among them, Multivariate Adaptive Regression Splines (MARS), a non-parametric adaptive regression method, is superior in overcoming problems of high dimensionality and discontinuities in the data. Furthermore, the stability and accuracy of a MARS model can be improved by bootstrap aggregating methods, namely, bagging. In this paper, the Bagging MARS (BMARS) method is integrated into a surrogate-based simulation-optimization framework to calibrate a three-dimensional MODFLOW model, which is developed to simulate the groundwater flow in an arid hardrock-alluvium region in northwestern Oman. The physical MODFLOW model is surrogated by the statistical model developed using the BMARS algorithm. The surrogate model, which is fitted and validated using a training dataset generated by the physical model, can approximate solutions rapidly. An efficient Sobol' method is employed to calculate global sensitivities of head outputs to input parameters, which are used to analyze their importance for the model outputs spatiotemporally. Only sensitive parameters are included in the calibration process to further improve the computational efficiency. The normalized root mean square error (NRMSE) between measured and simulated heads at observation wells is used as the objective function to be minimized during optimization. The reasonable history match between the simulated and observed heads demonstrates the feasibility of this highly efficient calibration framework.
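A minimal sketch of such a surrogate-based calibration loop, assuming a hypothetical one-parameter "physical model", a simple interpolation surrogate standing in for BMARS, and a grid search standing in for the optimizer:

```python
import numpy as np

def nrmse(observed, simulated):
    """Normalized root-mean-square error between measured and simulated heads."""
    rmse = np.sqrt(np.mean((observed - simulated) ** 2))
    return rmse / (observed.max() - observed.min())

# Hypothetical 1-parameter "physical model": head response to conductivity K.
def physical_model(K, x):
    return 10.0 / (1.0 + K * x)

# Train a cheap surrogate on a few "expensive" model runs.
x = np.linspace(0.1, 1.0, 20)
observed = physical_model(0.7, x)             # synthetic "measurements"
train_K = np.linspace(0.1, 2.0, 8)
train_out = np.array([physical_model(K, x) for K in train_K])

def surrogate(K):
    # Interpolate each output point over the training parameter values.
    return np.array([np.interp(K, train_K, train_out[:, i])
                     for i in range(len(x))])

# Calibrate by minimizing NRMSE over the surrogate (grid search for brevity).
candidates = np.linspace(0.1, 2.0, 200)
best_K = min(candidates, key=lambda K: nrmse(observed, surrogate(K)))
```

The surrogate is evaluated hundreds of times here while the "physical model" runs only eight times, which is the efficiency argument the framework relies on.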
DOE Office of Scientific and Technical Information (OSTI.GOV)
Genser, Krzysztof; Hatcher, Robert; Kelsey, Michael
The Geant4 simulation toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models rely on measured cross-sections and phenomenological models with physically motivated parameters that are tuned to cover many application domains. To study what uncertainties are associated with the Geant4 physics models, we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Based on modern event-processing infrastructure software, the toolkit offers a variety of attractive features, e.g. flexible run-time configurable workflow, comprehensive bookkeeping, and an easy-to-expand collection of analytical components. The design, implementation technology, and key functionalities of the toolkit are presented in this paper and illustrated with selected results.
NASA Astrophysics Data System (ADS)
Junk, S.
2016-08-01
Today the methods of numerical simulation of sheet metal forming offer a great diversity of possibilities for optimization in product development and in process design. However, the results from simulation are only available as virtual models. Because no forming tools are available during the early stages of product development, physical models that could serve to represent the virtual results are lacking. Physical 3D models can be created using 3D printing to serve as an illustration and provide a better understanding of the simulation results. In this way, the results from the simulation can be made more “comprehensible” within a development team. This paper presents the possibilities of 3D colour printing with particular consideration of the requirements regarding the implementation of sheet metal forming simulation. Using concrete examples of sheet metal forming, the manufacturing of 3D colour models is expounded upon on the basis of simulation results.
NASA Astrophysics Data System (ADS)
Jin, Yongmei
In recent years, theoretical modeling and computational simulation of microstructure evolution and materials properties have been attracting much attention. While significant advances have been made, two major challenges remain. One is the integration of multiple physical phenomena for simulation of complex materials behavior; the other is the bridging of multiple length and time scales in materials modeling and simulation. The research presented in this Thesis is focused mainly on tackling the first major challenge. In this Thesis, a unified Phase Field Microelasticity (PFM) approach is developed. This approach is an advanced version of the phase field method that takes into account the exact elasticity of arbitrarily anisotropic, elastically and structurally inhomogeneous systems. The proposed theory and models are applicable to infinite solids, elastic half-spaces, and finite bodies with arbitrarily shaped free surfaces, which may undergo various concomitant physical processes. The Phase Field Microelasticity approach is employed to formulate the theories and models of martensitic transformation, dislocation dynamics, and crack evolution in single-crystal and polycrystalline solids. It is also used to study strain relaxation in heteroepitaxial thin films through misfit dislocations and surface roughening. Magnetic domain evolution in nanocrystalline thin films is also investigated. Numerous simulation studies are performed. Comparisons with analytical predictions and experimental observations are presented. Agreement verifies the theory and models as realistic simulation tools for computational materials science and engineering. The same Phase Field Microelasticity formalism of individual models of different physical phenomena makes it easy to integrate multiple physical processes into one unified simulation model, where multiple phenomena are treated as various relaxation modes that together act as one common cooperative phenomenon.
The model does not impose a priori constraints on possible microstructure evolution paths. This gives the model predictive power, where the material system itself "chooses" the optimal path for multiple processes. The advances made in this Thesis represent a significant step toward overcoming the first challenge, mesoscale multi-physics modeling and simulation of materials. At the end of this Thesis, the way to tackle the second challenge, bridging over multiple length and time scales in materials modeling and simulation, is discussed based on connections between mesoscale Phase Field Microelasticity modeling and microscopic atomistic calculations as well as macroscopic continuum theory.
WE-DE-202-00: Connecting Radiation Physics with Computational Biology
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
Radiation therapy for the treatment of cancer has been established as a highly precise and effective way to eradicate a localized region of diseased tissue. To achieve further significant gains in the therapeutic ratio, we need to move towards biologically optimized treatment planning. To achieve this goal, we need to understand how the radiation-type dependent patterns of induced energy depositions within the cell (physics) connect via molecular, cellular and tissue reactions to treatment outcomes such as tumor control and undesirable effects on normal tissue. Several computational biology approaches have been developed connecting physics to biology. Monte Carlo simulations are the most accurate method to calculate physical dose distributions at the nanometer scale; however, simulations at the DNA scale are slow and repair processes are generally not simulated. Alternative models that rely on the random formation of individual DNA lesions within one or two turns of the DNA have been shown to reproduce the clusters of DNA lesions, including single strand breaks (SSBs) and double strand breaks (DSBs), without the need for detailed track structure simulations. Efficient computational simulations of initial DNA damage induction facilitate computational modeling of DNA repair and other molecular and cellular processes. Mechanistic, multiscale models provide a useful conceptual framework to test biological hypotheses and help connect fundamental information about track structure and dosimetry at the sub-cellular level to dose-response effects on larger scales. In this symposium we will learn about the current state of the art of computational approaches estimating radiation damage at the cellular and sub-cellular scale. How can understanding the physics interactions at the DNA level be used to predict biological outcome?
We will discuss if and how such calculations are relevant to advance our understanding of radiation damage and its repair, or whether the underlying biological processes are too complex for a mechanistic approach. Can computer simulations be used to guide future biological research? We will debate the feasibility of explaining biology from a physicist's perspective. Learning Objectives: Understand the potential applications and limitations of computational methods for dose-response modeling at the molecular, cellular and tissue levels. Learn about mechanisms of action underlying the induction, repair and biological processing of damage to DNA and other constituents. Understand how effects and processes at one biological scale impact biological processes and outcomes on other scales. J. Schuemann: NCI/NIH grants. S. McMahon: Funding from the European Commission FP7 (grant EC FP7 MC-IOF-623630).
NASA Astrophysics Data System (ADS)
Martin-Bragado, I.; Castrillo, P.; Jaraiz, M.; Pinacho, R.; Rubio, J. E.; Barbolla, J.; Moroz, V.
2005-09-01
Atomistic process simulation is expected to play an important role in the development of next generations of integrated circuits. This work describes an approach for modeling electric charge effects in a three-dimensional atomistic kinetic Monte Carlo process simulator. The proposed model has been applied to the diffusion of electrically active boron and arsenic atoms in silicon. Several key aspects of the underlying physical mechanisms are discussed: (i) the use of the local Debye length to smooth out the atomistic point-charge distribution, (ii) algorithms to correctly update the charge state in a physically accurate and computationally efficient way, and (iii) an efficient implementation of the drift of charged particles in an electric field. High-concentration effects such as band-gap narrowing and degenerate statistics are also taken into account. The efficiency, accuracy, and relevance of the model are discussed.
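The local Debye length used for the smoothing in (i) follows from the textbook definition λ_D = √(ε ε₀ k_B T / (q² n)); a sketch, assuming silicon's relative permittivity and non-degenerate statistics (the 300 K default and the carrier density in the usage below are illustrative):

```python
import math

# Physical constants (SI units)
K_BOLTZMANN = 1.380649e-23      # J/K
Q_ELEMENTARY = 1.602176634e-19  # C
EPS0 = 8.8541878128e-12         # F/m
EPS_SI = 11.7                   # relative permittivity of silicon (assumed)

def debye_length(carrier_density_m3, temperature_k=300.0):
    """Local Debye length over which an atomistic point-charge
    distribution can be smoothed (non-degenerate approximation)."""
    return math.sqrt(EPS_SI * EPS0 * K_BOLTZMANN * temperature_k /
                     (Q_ELEMENTARY ** 2 * carrier_density_m3))
```

For example, at a carrier density of 1e18 cm⁻³ (1e24 m⁻³) and 300 K this gives a screening length of a few nanometres, setting the natural smoothing scale for the point-charge field.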
Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing
2011-01-01
Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century. PMID:21444779
Technique for forcing high Reynolds number isotropic turbulence in physical space
NASA Astrophysics Data System (ADS)
Palmore, John A.; Desjardins, Olivier
2018-03-01
Many common engineering problems involve the study of turbulence interaction with other physical processes. For many such processes, solutions are expressed most naturally in physical space, necessitating simulation methods formulated in physical space. For simulating isotropic turbulence in physical space, linear forcing is a commonly used strategy because it produces realistic turbulence in an easy-to-implement formulation. However, the method resolves a smaller range of scales on a given mesh than spectral forcing. We propose an alternative approach for turbulence forcing in physical space that uses the low-pass filtered velocity field as the basis of the forcing term. This method is shown to double the range of scales captured by linear forcing while maintaining the flexibility and low computational cost of the original method. This translates to a 60% increase of the Taylor microscale Reynolds number on the same mesh. An extension is made to scalar mixing wherein a scalar field is forced to have an arbitrarily chosen, constant variance. Filtered linear forcing of the scalar field allows for control over the length scale of scalar injection, which could be important when simulating scalar mixing.
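The filtered-forcing idea can be sketched for a 1-D periodic field: classic linear forcing adds A·u′ (the velocity fluctuation), while the variant above applies A to a low-pass filtered fluctuation. The sharp spectral cutoff used here is one possible filter choice, not necessarily the authors':

```python
import numpy as np

def filtered_linear_forcing(u, A, k_cut):
    """Forcing term A * u_filtered for a 1-D periodic velocity field u,
    where u_filtered keeps only the fluctuation's wavenumbers |k| <= k_cut
    (a sharp spectral low-pass filter)."""
    u_hat = np.fft.fft(u - u.mean())            # force only the fluctuating part
    k = np.fft.fftfreq(u.size, d=1.0 / u.size)  # integer wavenumbers
    u_hat[np.abs(k) > k_cut] = 0.0              # sharp low-pass cutoff
    return A * np.real(np.fft.ifft(u_hat))
```

Setting `k_cut` large recovers ordinary linear forcing of the fluctuation; a small `k_cut` injects energy only at the largest scales, leaving more of the resolved range for the turbulent cascade.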
Computational studies of physical properties of Nb-Si based alloys
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ouyang, Lizhi
2015-04-16
The overall goal is to provide physical property data supplementing experiments for thermodynamic modeling and other simulations, such as phase field simulations for microstructure and continuum simulations for mechanical properties. These predictive computational modeling and simulation efforts may yield insights that can be used to guide materials design, processing, and manufacture. Ultimately, they may lead to a usable Nb-Si based alloy which could play an important role in the current push towards greener energy. The main objectives of the proposed projects are: (1) developing a first-principles-based supercell approach for calculating thermodynamic and mechanical properties of ordered crystals and disordered lattices including solid solutions; (2) application of the supercell approach to Nb-Si based alloys to compute physical property data that can be used for thermodynamic modeling and other simulations to guide the optimal design of Nb-Si based alloys.
Maffeo, C.; Yoo, J.; Comer, J.; Wells, D. B.; Luan, B.; Aksimentiev, A.
2014-01-01
Over the past ten years, the all-atom molecular dynamics method has grown in the scale of both systems and processes amenable to it and in its ability to make quantitative predictions about the behavior of experimental systems. The field of computational DNA research is no exception, witnessing a dramatic increase in the size of systems simulated with atomic resolution, the duration of individual simulations and the realism of the simulation outcomes. In this topical review, we describe the hallmark physical properties of DNA from the perspective of all-atom simulations. We demonstrate the amazing ability of such simulations to reveal the microscopic physical origins of experimentally observed phenomena and we review the frustrating limitations associated with imperfections of present atomic force fields and inadequate sampling. The review is focused on the following four physical properties of DNA: effective electric charge, response to an external mechanical force, interaction with other DNA molecules and behavior in an external electric field. PMID:25238560
ERIC Educational Resources Information Center
Ingerman, Ake; Linder, Cedric; Marshall, Delia
2009-01-01
This article attempts to describe students' process of learning physics using the notion of experiencing variation as the basic mechanism for learning, and thus explores what variation, with respect to a particular object of learning, students experience in their process of constituting understanding. Theoretically, the analysis relies on…
Algodoo: A Tool for Encouraging Creativity in Physics Teaching and Learning
NASA Astrophysics Data System (ADS)
Gregorcic, Bor; Bodin, Madelen
2017-01-01
Algodoo (http://www.algodoo.com) is a digital sandbox for physics 2D simulations. It allows students and teachers to easily create simulated "scenes" and explore physics through a user-friendly and visually attractive interface. In this paper, we present different ways in which students and teachers can use Algodoo to visualize and solve physics problems, investigate phenomena and processes, and engage in out-of-school activities and projects. Algodoo, with its approachable interface, inhabits a middle ground between computer games and "serious" computer modeling. It is suitable as an entry-level modeling tool for students of all ages and can facilitate discussions about the role of computer modeling in physics.
Use of Flowtran Simulation in Education
ERIC Educational Resources Information Center
Clark, J. Peter; Sommerfeld, Jude T.
1976-01-01
Describes the use in chemical engineering education of FLOWTRAN, a large steady-state simulator of chemical processes with extensive facilities for physical and thermodynamic data-handling and a large library of equipment modules, including cost estimation capability. (MLH)
Numerical simulations for active tectonic processes: increasing interoperability and performance
NASA Technical Reports Server (NTRS)
Donnellan, A.; Fox, G.; Rundle, J.; McLeod, D.; Tullis, T.; Grant, L.
2002-01-01
The objective of this project is to produce a system to fully model earthquake-related data. This task develops simulation and analysis tools to study the physics of earthquakes using state-of-the-art modeling.
Cardiac examination and the effect of dual-processing instruction in a cardiopulmonary simulator.
Sibbald, Matt; McKinney, James; Cavalcanti, Rodrigo B; Yu, Eric; Wood, David A; Nair, Parvathy; Eva, Kevin W; Hatala, Rose
2013-08-01
Use of dual-processing has been widely touted as a strategy to reduce diagnostic error in clinical medicine. However, this strategy has not been tested among medical trainees with complex diagnostic problems. We sought to determine whether dual-processing instruction could reduce diagnostic error across a spectrum of experience with trainees undertaking cardiac physical exam. Three experiments were conducted using a similar design to teach cardiac physical exam using a cardiopulmonary simulator. One experiment was conducted in each of three groups: experienced, intermediate and novice trainees. In all three experiments, participants were randomized to receive undirected or dual-processing verbal instruction during teaching, practice and testing phases. When tested, dual-processing instruction did not change the probability assigned to the correct diagnosis in any of the three experiments. Among intermediates, there was an apparent interaction between the diagnosis tested and the effect of dual-processing instruction. Among relative novices, dual-processing instruction may have dampened the harmful effect of a bias away from the correct diagnosis. Further work is needed to define the role of dual-processing instruction in reducing cognitive error. This study suggests that it cannot be blindly applied to complex diagnostic problems such as cardiac physical exam.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jimbert, P.; Fernandez, J. I.; Eguia, I.
It is well known that one of the main advantages of high speed forming (HSF) processes is the improvement in the forming limits of the materials used. Using the Electromagnetic Forming (EMF) technology, two materials with different mechanical and physical properties have been tested: the AA5754 aluminium and the AZ31B magnesium alloys. The EMF process principle can be described as follows: a significant amount of electrical energy is stored in a bank of capacitors which are suddenly discharged, releasing all the stored energy. This electric discharge runs through a coil which generates an intense transient magnetic field. At the same time, transient Eddy currents are induced in the electrically conductive part placed some millimetres from the coil. Another intense magnetic field is generated by those Eddy currents, but in the opposite direction to the one generated by the coil. A large magnetic repulsion force is created between the part and the coil. This magnetic repulsion between both fields is used to launch the blank with no physical contact and obtain the desired deformation. The Forming Limit Diagrams (FLD) obtained in the EMF experiments were then compared to the ones obtained with the 'Nakazima' method at conventional deformation speed for both alloys. In parallel to these physical experiments, some simulations were carried out. Simulating this process by FEM, however, is a tough task: there are several coupled physical fields and many factors to take into account in a deformation process lasting a few microseconds, and all these factors are tightly related to each other, which is why to this date there is no commercial software able to simulate the EMF process accurately. At LABEIN-Tecnalia we are working with two different software packages to simulate the whole process: Maxwell 3D for the electromagnetic part and PAM-STAMP 2G for the mechanical part of the problem.
Using a Virtual Experiment to Analyze Infiltration Process from Point to Grid-cell Size Scale
NASA Astrophysics Data System (ADS)
Barrios, M. I.
2013-12-01
Hydrological science requires the emergence of a consistent theoretical corpus describing the relationships between dominant physical processes at different spatial and temporal scales. However, the strong spatial heterogeneities and non-linearities of these processes make the development of multiscale conceptualizations difficult. Therefore, understanding scaling is a key issue in advancing this science. This work is focused on the use of virtual experiments to address the scaling of vertical infiltration from a physically based model at the point scale to a simplified physically meaningful modeling approach at the grid-cell scale. Numerical simulations have the advantage of dealing with a wide range of boundary and initial conditions compared with field experimentation. The aim of the work was to show the utility of numerical simulations to discover relationships between the hydrological parameters at both scales, and to use this synthetic experience as a medium to teach the complex nature of this hydrological process. The Green-Ampt model was used to represent vertical infiltration at the point scale, and a conceptual storage model was employed to simulate the infiltration process at the grid-cell scale. Lognormal and beta probability distribution functions were assumed to represent the heterogeneity of soil hydraulic parameters at the point scale. The linkages between point-scale parameters and grid-cell-scale parameters were established by inverse simulations based on the mass balance equation and the averaging of the flow at the point scale. Results have shown numerical stability issues for particular conditions, have revealed the complex nature of the non-linear relationships between the models' parameters at both scales, and indicate that the parameterization of point-scale processes at the coarser scale is governed by the amplification of non-linear effects. The findings of these simulations have been used by the students to identify potential research questions on scale issues.
Moreover, the implementation of this virtual lab improved the students' ability to understand the rationale of these processes and how to transfer the mathematical models to computational representations.
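The point-scale component named above, the Green-Ampt model, can be sketched from its standard implicit relation F = K·t + ψΔθ·ln(1 + F/(ψΔθ)), solved here by fixed-point iteration (which converges because the update map is a contraction for F > 0); the parameter values in the usage are illustrative, not from the study:

```python
import math

def green_ampt_cumulative(K, psi, dtheta, t, tol=1e-10):
    """Cumulative infiltration F(t) [same length unit as psi] from the
    implicit Green-Ampt equation F = K*t + psi*dtheta*ln(1 + F/(psi*dtheta)).
    K: saturated hydraulic conductivity, psi: wetting-front suction head,
    dtheta: moisture deficit, t: time."""
    s = psi * dtheta
    F = K * t if K * t > 0 else tol   # initial guess
    for _ in range(200):              # fixed-point iteration
        F_new = K * t + s * math.log(1.0 + F / s)
        if abs(F_new - F) < tol:
            break
        F = F_new
    return F

def green_ampt_rate(K, psi, dtheta, F):
    """Infiltration capacity f = K * (1 + psi*dtheta / F)."""
    return K * (1.0 + psi * dtheta / F)
```

With, say, K = 1 cm/h, ψ = 10 cm and Δθ = 0.3, the rate decays toward K as the cumulative infiltration grows, the behavior whose heterogeneous point-scale aggregation the virtual experiment upscales to the grid cell.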
Construction material processed using lunar simulant in various environments
NASA Technical Reports Server (NTRS)
Chase, Stan; Ocallaghan-Hay, Bridget; Housman, Ralph; Kindig, Michael; King, John; Montegrande, Kevin; Norris, Raymond; Vanscotter, Ryan; Willenborg, Jonathan; Staubs, Harry
1995-01-01
The manufacture of construction materials from locally available resources in space is an important first step in the establishment of lunar and planetary bases. The objective of the CoMPULSIVE (Construction Material Processed Using Lunar Simulant In Various Environments) experiment is to develop a procedure to produce construction materials by sintering or melting Johnson Space Center Simulant 1 (JSC-1) lunar soil simulant in both earth-based (1-g) and microgravity (approximately 0-g) environments. The characteristics of the resultant materials will be tested to determine their physical and mechanical properties. The physical characteristics include crystalline, thermal, and electrical properties. The mechanical properties include compressive, tensile, and flexural strengths. The simulant, placed in a sealed graphite crucible, will be heated using a high temperature furnace. The crucible will then be cooled by radiative and forced convective means. The core furnace element consists of space-qualified quartz-halogen incandescent lamps with focusing mirrors. Sample temperatures of up to 2200 C are attainable using this heating method.
NASA Astrophysics Data System (ADS)
Qian, Y.; Wang, C.; Huang, M.; Berg, L. K.; Duan, Q.; Feng, Z.; Shrivastava, M. B.; Shin, H. H.; Hong, S. Y.
2016-12-01
This study aims to quantify the relative importance and uncertainties of different physical processes and parameters in affecting simulated surface fluxes and land-atmosphere coupling strength over the Amazon region. We used two-legged coupling metrics, which include both terrestrial (soil moisture to surface fluxes) and atmospheric (surface fluxes to atmospheric state or precipitation) legs, to diagnose the land-atmosphere interaction and coupling strength. Observations made using the Department of Energy's Atmospheric Radiation Measurement (ARM) Mobile Facility during the GoAmazon field campaign, together with satellite and reanalysis data, are used to evaluate model performance. To quantify the uncertainty in physical parameterizations, we performed a 120-member ensemble of simulations with the WRF model using a stratified experimental design including 6 cloud microphysics, 3 convection, 6 PBL and surface layer, and 3 land surface schemes. A multiple-way analysis of variance approach is used to quantitatively analyze the inter- and intra-group (scheme) means and variances. To quantify parameter sensitivity, we conducted an additional 256 WRF simulations in which an efficient sampling algorithm is used to explore the multi-dimensional parameter space. Three uncertainty quantification approaches are applied for sensitivity analysis (SA) of multiple variables of interest to 20 selected parameters in the YSU PBL and MM5 surface layer schemes. Results show consistent parameter sensitivity across the different SA methods. We found that 5 out of 20 parameters contribute more than 90% of the total variance, and first-order effects dominate compared to the interaction effects.
Results of this uncertainty quantification study serve as guidance for better understanding the roles of different physical processes in land-atmosphere interactions, quantifying model uncertainties from various sources such as physical processes, parameters and structural errors, and providing insights for improving the model physics parameterizations.
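The first-order sensitivity idea underlying such an analysis can be illustrated with a crude binning estimator of each parameter's first-order variance fraction, Var(E[y|xⱼ])/Var(y). This is a simplified stand-in for the ANOVA and Sobol'-style methods used in the study, not a reproduction of them:

```python
import numpy as np

def first_order_variance_fractions(samples, outputs, n_bins=16):
    """Crude first-order sensitivity estimate: bin each parameter column,
    average the output within bins, and take the variance of the conditional
    means relative to the total output variance. Assumes enough samples
    that every quantile bin is non-empty."""
    total_var = np.var(outputs)
    fractions = []
    for j in range(samples.shape[1]):
        edges = np.quantile(samples[:, j],
                            np.linspace(0.0, 1.0, n_bins + 1)[1:-1])
        bins = np.digitize(samples[:, j], edges)
        means = np.array([outputs[bins == b].mean() for b in range(n_bins)])
        counts = np.array([(bins == b).sum() for b in range(n_bins)])
        cond_mean_var = np.average((means - outputs.mean()) ** 2,
                                   weights=counts)
        fractions.append(cond_mean_var / total_var)
    return np.array(fractions)
```

A parameter that drives the output shows conditional means that spread almost as widely as the output itself (fraction near 1), while an inert parameter's conditional means cluster at the overall mean (fraction near 0), mirroring the "5 of 20 parameters carry >90% of the variance" finding.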
NASA Astrophysics Data System (ADS)
Hristov, Nebojša; Kari, Aleksandar; Jerković, Damir; Savić, Slobodan; Sirovatka, Radoslav
2015-02-01
Simulation and measurements of muzzle blast overpressure and its physical manifestations are studied in this paper. The use of a silencer can have a great influence on the overpressure intensity. A silencer is regarded as an acoustic transducer and a waveguide. Wave equations for an acoustic point source of directed effect are used for the physical interpretation of overpressure as an acoustic phenomenon. The decomposition approach has proven to be suitable to describe the formation of the output wave of the wave transducer. Electroacoustic analogies are used for the simulations. A measurement chain was used to compare the simulation results with the experimental ones.
Tal, Aner; Wansink, Brian
2011-01-01
Virtual reality (VR) provides a potentially powerful tool for researchers seeking to investigate eating and physical activity. Some unique conditions are necessary to ensure that the psychological processes that influence real eating behavior also influence behavior in VR environments. Accounting for these conditions is critical if VR-assisted research is to accurately reflect real-world situations. The current work discusses key considerations VR researchers must take into account to ensure similar psychological functioning in virtual and actual reality and does so by focusing on the process of spontaneous mental simulation. Spontaneous mental simulation is prevalent under real-world conditions but may be absent under VR conditions, potentially leading to differences in judgment and behavior between virtual and actual reality. For simulation to occur, the virtual environment must be perceived as being available for action. A useful chart is supplied as a reference to help researchers to investigate eating and physical activity more effectively. PMID:21527088
Cloud physics laboratory project science and applications working group
NASA Technical Reports Server (NTRS)
Hung, R. J.
1977-01-01
The conditions of the expansion chamber under zero gravity environment were simulated. The following three branches of fluid mechanics simulation under low gravity environment were accomplished: (1) oscillation of the water droplet which characterizes the nuclear oscillation in nuclear physics, bubble oscillation of two phase flow in chemical engineering, and water drop oscillation in meteorology; (2) rotation of the droplet which characterizes nuclear fission in nuclear physics, formation of binary stars and rotating stars in astrophysics, and breakup of the water droplet in meteorology; and (3) collision and coalescence of the water droplets which characterizes nuclear fusion in nuclear physics and processes of rain formation in meteorology.
Wieland, Birgit; Ropte, Sven
2017-01-01
The production of rotor blades for wind turbines is still a predominantly manual process. Process simulation is an adequate way of improving blade quality without a significant increase in production costs. This paper introduces a module for tolerance simulation for rotor-blade production processes. The investigation focuses on the simulation of temperature distribution for one-sided, self-heated tooling and thick laminates. Experimental data from rotor-blade production and down-scaled laboratory tests are presented. Based on influencing factors that are identified, a physical model is created and implemented as a simulation. This provides an opportunity to simulate temperature and cure-degree distribution for two-dimensional cross sections. The aim of this simulation is to support production processes. Hence, it is modelled as an in situ simulation with direct input of temperature data and real-time capability. A monolithic part of the rotor blade, the main girder, is used as an example for presenting the results. PMID:28981458
Physical and digital simulations for IVA robotics
NASA Technical Reports Server (NTRS)
Hinman, Elaine; Workman, Gary L.
1992-01-01
Space-based materials processing experiments can be enhanced through the use of IVA robotic systems. A program to determine requirements for the implementation of robotic systems in a microgravity environment, and to develop some preliminary concepts for acceleration control of small, lightweight arms, has been initiated with the development of physical and digital simulation capabilities. The physical simulation facilities incorporate a robotic workcell containing a Zymark Zymate II robot instrumented for acceleration measurements, which is able to perform materials transfer functions while flying on NASA's KC-135 aircraft during parabolic maneuvers to simulate reduced gravity. Measurements of accelerations occurring during the reduced-gravity periods will be used to characterize the impacts of robotic accelerations in a microgravity environment in space. Digital simulations are being performed with TREETOPS, a NASA-developed software package used for the dynamic analysis of systems with a tree topology. Extensive use of both simulation tools will enable the design of robotic systems with enhanced acceleration control for use in the space manufacturing environment.
HEP Software Foundation Community White Paper Working Group - Detector Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Apostolakis, J.
A working group on detector simulation was formed as part of the high-energy physics (HEP) Software Foundation's initiative to prepare a Community White Paper that describes the main software challenges and opportunities to be faced in the HEP field over the next decade. The working group met over a period of several months in order to review the current status of the Full and Fast simulation applications of HEP experiments and the improvements that will need to be made in order to meet the goals of future HEP experimental programmes. The scope of the topics covered includes the main components of a HEP simulation application, such as MC truth handling, geometry modeling, particle propagation in materials and fields, physics modeling of the interactions of particles with matter, the treatment of pileup and other backgrounds, as well as signal processing and digitisation. The resulting work programme described in this document focuses on the need to improve both the software performance and the physics of detector simulation. The goals are to increase the accuracy of the physics models and expand their applicability to future physics programmes, while achieving large factors in computing performance gains consistent with projections on available computing resources.
Different modelling approaches to evaluate nitrogen transport and turnover at the watershed scale
NASA Astrophysics Data System (ADS)
Epelde, Ane Miren; Antiguedad, Iñaki; Brito, David; Jauch, Eduardo; Neves, Ramiro; Garneau, Cyril; Sauvage, Sabine; Sánchez-Pérez, José Miguel
2016-08-01
This study presents the simulation of hydrological processes and nutrient transport and turnover processes using two integrated numerical models: the Soil and Water Assessment Tool (SWAT) (Arnold et al., 1998), an empirical and semi-distributed numerical model; and Modelo Hidrodinâmico (MOHID) (Neves, 1985), a physics-based and fully distributed numerical model. This work shows that both models satisfactorily reproduce water and nitrate export at the watershed scale on annual and daily bases, with MOHID providing slightly better results. At the watershed scale, both SWAT and MOHID simulated the denitrification amount similarly and satisfactorily. However, as the MOHID numerical model was the only one able to adequately reproduce the spatial variation of the soil hydrological conditions and the water table level fluctuation, it proved to be the only model capable of reproducing the spatial variation of the nutrient cycling processes that depend on the soil hydrological conditions, such as denitrification. This demonstrates the strength of fully distributed, physics-based models in simulating the spatial variability of nutrient cycling processes that depend on the hydrological conditions of the soils.
How to assess the impact of a physical parameterization in simulations of moist convection?
NASA Astrophysics Data System (ADS)
Grabowski, Wojciech
2017-04-01
A numerical model capable of simulating moist convection (e.g., a cloud-resolving model or a large-eddy simulation model) consists of a fluid flow solver combined with required representations (i.e., parameterizations) of physical processes. The latter typically include cloud microphysics, radiative transfer, and unresolved turbulent transport. Traditional approaches to investigating the impacts of such parameterizations on convective dynamics involve parallel simulations with different parameterization schemes or with different scheme parameters. Such methodologies are not reliable because of the natural variability of a cloud field that is affected by the feedback between the physics and dynamics. For instance, changing the cloud microphysics typically leads to a different realization of the cloud-scale flow, and separating dynamical and microphysical impacts is difficult. This presentation describes a novel modeling methodology, piggybacking, that allows the impact of a physical parameterization on cloud dynamics to be studied with confidence. The focus will be on the impact of the cloud microphysics parameterization. Specific examples of the piggybacking approach will include simulations concerning the hypothesized deep-convection invigoration in polluted environments, the validity of the saturation adjustment in modeling condensation in moist convection, and the separation of physical impacts from statistical uncertainty in simulations applying particle-based Lagrangian microphysics, the super-droplet method.
Planetary geology: Impact processes on asteroids
NASA Technical Reports Server (NTRS)
Chapman, C. R.; Davis, D. R.; Greenberg, R.; Weidenschilling, S. J.
1982-01-01
The fundamental geological and geophysical properties of asteroids were studied by theoretical and simulation studies of their collisional evolution. Numerical simulations incorporating realistic physical models were developed to study the collisional evolution of hypothetical asteroid populations over the age of the solar system. Ideas and models are constrained by the observed distributions of sizes, shapes, and spin rates in the asteroid belt, by properties of Hirayama families, and by experimental studies of cratering and collisional phenomena. It is suggested that many asteroids are gravitationally bound "rubble piles." Those that rotate rapidly may have nonspherical quasi-equilibrium shapes, such as ellipsoids or binaries. Through comparison of models with astronomical data, physical properties of these asteroids (including bulk density) are determined, and physical processes that have operated in the solar system in primordial and subsequent epochs are studied.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bryan, Frank; Dennis, John; MacCready, Parker
This project aimed to improve long-term global climate simulations by resolving and enhancing the representation of the processes involved in the cycling of freshwater through estuaries and coastal regions. This was a collaborative multi-institution project consisting of physical oceanographers, climate model developers, and computational scientists. It specifically targeted the DOE objectives of advancing the simulation and predictive capability of climate models through improvements in resolution and physical process representation. The main computational objectives were: 1. To develop computationally efficient, but physically based, parameterizations of estuary and continental shelf mixing processes for use in an Earth System Model (CESM). 2. To develop a two-way nested regional modeling framework in order to dynamically downscale the climate response of particular coastal ocean regions and to upscale the impact of the regional coastal processes to the global climate in an Earth System Model (CESM). 3. To develop computational infrastructure to enhance the efficiency of data transfer between specific sources and destinations, i.e., a point-to-point communication capability (used in objective 1), within POP, the ocean component of CESM.
NASA Astrophysics Data System (ADS)
Wang, X.; Murtugudde, R. G.; Zhang, D.
2016-12-01
Photosynthesis and respiration are important processes in all ecosystems on the Earth, in which carbon and oxygen are the two main elements. However, the oxygen cycle has received much less attention (relative to the carbon cycle) despite its major role in the Earth system. Oxygen is a sensitive indicator of physical and biogeochemical processes in the ocean and thus a key parameter for understanding the ocean's ecosystem and biogeochemistry. The Oxygen Minimum Zone (OMZ), often seen below 200 m, is a profound feature of the world's oceans. There has been evidence of OMZ expansion over the past few decades in the tropical oceans. Climate models project a continued decline in dissolved oxygen (DO) and an expansion of the tropical OMZs under future warming conditions, which is of great concern because of the implications for marine organisms. We employ a validated three-dimensional model that simulates physical transport (circulation and vertical mixing), biological processes (O2 production and consumption), and ocean-atmosphere O2 exchange to quantify various sources and sinks of DO over 1980-2015. We show how we use observational data to improve our model simulation. We then assess the spatial and temporal variability in simulated DO in the tropical Pacific Ocean, and explore the impacts of physical and biogeochemical processes on the DO dynamics, with a focus on the OMZ. Our analyses indicate that DO in the OMZ has a positive relationship with the 13ºC isotherm depth and a negative relationship with the concentration of dissolved organic material.
Physics and control of wall turbulence for drag reduction.
Kim, John
2011-04-13
Turbulence physics responsible for high skin-friction drag in turbulent boundary layers is first reviewed. A self-sustaining process of near-wall turbulence structures is then discussed from the perspective of controlling this process for the purpose of skin-friction drag reduction. After recognizing that key parts of this self-sustaining process are linear, a linear systems approach to boundary-layer control is discussed. It is shown that singular-value decomposition analysis of the linear system allows us to examine different approaches to boundary-layer control without carrying out the expensive nonlinear simulations. Results from the linear analysis are consistent with those observed in full nonlinear simulations, thus demonstrating the validity of the linear analysis. Finally, fundamental performance limit expected of optimal control input is discussed.
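The singular-value-decomposition analysis mentioned above can be sketched in a few lines: given a linearized input-output operator, the leading right-singular vector is the control input that the system amplifies most, obtained without any nonlinear simulation. The matrix below is a random placeholder, not the linearized boundary-layer operator from the paper.

```python
import numpy as np

# Hypothetical linear operator A mapping wall-control inputs to the
# flow-state response; in the paper this comes from the linearized
# near-wall equations, here it is just an illustrative random matrix.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))

# SVD: the leading right-singular vector is the unit-norm input that is
# amplified most strongly, and sigma[0] is the corresponding maximal gain.
U, sigma, Vt = np.linalg.svd(A)
worst_input = Vt[0]                      # most amplified input direction
gain = np.linalg.norm(A @ worst_input)   # equals sigma[0]
```

Ranking candidate control strategies by how much they reduce this leading gain is one way such a linear analysis can screen approaches before committing to expensive nonlinear simulations.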
Simulating the decentralized processes of the human immune system in a virtual anatomy model.
Sarpe, Vladimir; Jacob, Christian
2013-01-01
Many physiological processes within the human body can be perceived and modeled as large systems of interacting particles or swarming agents. The complex processes of the human immune system prove to be challenging to capture and illustrate without proper reference to the spatial distribution of immune-related organs and systems. Our work focuses on physical aspects of immune system processes, which we implement through swarms of agents. This is our first prototype for integrating different immune processes into one comprehensive virtual physiology simulation. Using agent-based methodology and a 3-dimensional modeling and visualization environment (LINDSAY Composer), we present an agent-based simulation of the decentralized processes in the human immune system. The agents in our model - such as immune cells, viruses and cytokines - interact through simulated physics in two different, compartmentalized and decentralized 3-dimensional environments namely, (1) within the tissue and (2) inside a lymph node. While the two environments are separated and perform their computations asynchronously, an abstract form of communication is allowed in order to replicate the exchange, transportation and interaction of immune system agents between these sites. The distribution of simulated processes, that can communicate across multiple, local CPUs or through a network of machines, provides a starting point to build decentralized systems that replicate larger-scale processes within the human body, thus creating integrated simulations with other physiological systems, such as the circulatory, endocrine, or nervous system. Ultimately, this system integration across scales is our goal for the LINDSAY Virtual Human project. Our current immune system simulations extend our previous work on agent-based simulations by introducing advanced visualizations within the context of a virtual human anatomy model. 
We also demonstrate how to distribute a collection of connected simulations over a network of computers. As a future endeavour, we plan to use parameter tuning techniques on our model to further enhance its biological credibility. We consider these in silico experiments and their associated modeling and optimization techniques as essential components in further enhancing our capabilities of simulating a whole-body, decentralized immune system, to be used both for medical education and research as well as for virtual studies in immunoinformatics.
Training Administrators in Anasynthesis
ERIC Educational Resources Information Center
Silvern, Leonard C.
1971-01-01
The author discusses the application of physical and mathematical systems to non-physical social systems; specifically education and cinema, the process of analysis, synthesis, modeling and simulation. The author describes the course he has developed to instruct students in anasynthesis. (Author/RR)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stewart, R.
Radiation therapy for the treatment of cancer has been established as a highly precise and effective way to eradicate a localized region of diseased tissue. To achieve further significant gains in the therapeutic ratio, we need to move towards biologically optimized treatment planning. To achieve this goal, we need to understand how the radiation-type-dependent patterns of induced energy depositions within the cell (physics) connect, via molecular, cellular and tissue reactions, to treatment outcomes such as tumor control and undesirable effects on normal tissue. Several computational biology approaches have been developed connecting physics to biology. Monte Carlo simulations are the most accurate method to calculate physical dose distributions at the nanometer scale; however, simulations at the DNA scale are slow and repair processes are generally not simulated. Alternative models that rely on the random formation of individual DNA lesions within one or two turns of the DNA have been shown to reproduce the clusters of DNA lesions, including single strand breaks (SSBs) and double strand breaks (DSBs), without the need for detailed track structure simulations. Efficient computational simulations of initial DNA damage induction facilitate computational modeling of DNA repair and other molecular and cellular processes. Mechanistic, multiscale models provide a useful conceptual framework to test biological hypotheses and help connect fundamental information about track structure and dosimetry at the sub-cellular level to dose-response effects on larger scales. In this symposium we will learn about the current state of the art of computational approaches estimating radiation damage at the cellular and sub-cellular scale. How can understanding the physics interactions at the DNA level be used to predict biological outcome?
We will discuss if and how such calculations are relevant to advancing our understanding of radiation damage and its repair, or whether the underlying biological processes are too complex for a mechanistic approach. Can computer simulations be used to guide future biological research? We will debate the feasibility of explaining biology from a physicist's perspective. Learning Objectives: Understand the potential applications and limitations of computational methods for dose-response modeling at the molecular, cellular and tissue levels; learn about the mechanisms of action underlying the induction, repair and biological processing of damage to DNA and other constituents; understand how effects and processes at one biological scale impact biological processes and outcomes at other scales. Funding: J. Schuemann, NCI/NIH grants; S. McMahon, European Commission FP7 (grant EC FP7 MC-IOF-623630).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fields, Laura; Genser, Krzysztof; Hatcher, Robert
Geant4 is the leading detector simulation toolkit used in high energy physics to design detectors and to optimize calibration and reconstruction software. It employs a set of carefully validated physics models to simulate interactions of particles with matter across a wide range of interaction energies. These models, especially the hadronic ones, rely largely on directly measured cross-sections and phenomenological predictions with physically motivated parameters estimated by theoretical calculation or measurement. Because these models are tuned to cover a very wide range of possible simulation tasks, they may not always be optimized for a given process or a given material. This raises several critical questions, e.g. how sensitive Geant4 predictions are to variations of the model parameters, what uncertainties are associated with a particular tune of a Geant4 physics model or a group of models, or how to consistently derive guidance for Geant4 model development and improvement from a wide range of available experimental data. We have designed and implemented a comprehensive, modular, user-friendly software toolkit to study and address such questions. It allows one to easily modify parameters of one or several Geant4 physics models involved in the simulation, and to perform collective analysis of multiple variants of the resulting physics observables of interest and comparison against a variety of corresponding experimental data. Based on modern event-processing infrastructure software, the toolkit offers a variety of attractive features, e.g. a flexible run-time configurable workflow, comprehensive bookkeeping, and an easy-to-expand collection of analytical components. The design, implementation technology, and key functionalities of the toolkit are presented and illustrated with results obtained with key Geant4 hadronic models.
Physical Principle for Generation of Randomness
NASA Technical Reports Server (NTRS)
Zak, Michail
2009-01-01
A physical principle (more precisely, a principle that incorporates mathematical models used in physics) has been conceived as the basis of a method of generating randomness in Monte Carlo simulations. The principle eliminates the need for conventional random-number generators. The Monte Carlo simulation method is among the most powerful computational methods for solving high-dimensional problems in physics, chemistry, economics, and information processing. The Monte Carlo simulation method is especially effective for solving problems in which computational complexity increases exponentially with dimensionality. The main advantage of the Monte Carlo simulation method over other methods is that the demand on computational resources becomes independent of dimensionality. As augmented by the present principle, the Monte Carlo simulation method becomes an even more powerful computational method that is especially useful for solving problems associated with dynamics of fluids, planning, scheduling, and combinatorial optimization. The present principle is based on coupling of dynamical equations with the corresponding Liouville equation. The randomness is generated by non-Lipschitz instability of dynamics triggered and controlled by feedback from the Liouville equation. (In non-Lipschitz dynamics, the derivatives of solutions of the dynamical equations are not required to be bounded.)
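The non-Lipschitz mechanism described above can be illustrated with a textbook toy problem rather than the paper's full Liouville-coupled dynamics: the equation dx/dt = x^(1/3) is non-Lipschitz at the origin, so from x = 0 the solution is non-unique, and an arbitrarily small perturbation selects both whether and in which direction the trajectory escapes. This sketch is only an analogy for the instability-driven randomness; it does not implement the Liouville feedback itself.

```python
import numpy as np

def escape_from_rest(t_end=1.0, dt=1e-4, seed=0):
    """Euler-integrate dx/dt = x**(1/3) from x = 0. The right-hand side
    is non-Lipschitz at the origin, so x = 0 forever and the escaping
    branch x(t) = (2*t/3)**1.5 are both valid solutions. A single
    infinitesimal random kick selects the escaping branch and its sign,
    a toy illustration of randomness generated by dynamical instability."""
    rng = np.random.default_rng(seed)
    x = rng.choice([-1.0, 1.0]) * 1e-12  # infinitesimal perturbation
    t = 0.0
    while t < t_end:
        x += dt * np.sign(x) * abs(x) ** (1.0 / 3.0)
        t += dt
    return x

x_final = escape_from_rest()
# |x_final| approaches the analytic escaping branch (2/3)**1.5 at t = 1,
# while the sign is decided entirely by the unresolvably small kick.
```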
Learning from physics-based earthquake simulators: a minimal approach
NASA Astrophysics Data System (ADS)
Artale Harris, Pietro; Marzocchi, Warner; Melini, Daniele
2017-04-01
Physics-based earthquake simulators aim to generate synthetic seismic catalogs of arbitrary length, accounting for fault interaction, elastic rebound, realistic fault networks, and some simple earthquake nucleation process such as rate-and-state friction. Through comparison of synthetic and real catalogs, seismologists can gain insight into the earthquake occurrence process. Moreover, earthquake simulators can be used to infer some aspects of the statistical behavior of earthquakes within the simulated region, by analyzing timescales not accessible through observations. The development of earthquake simulators is commonly led by the approach "the more physics, the better", pushing seismologists towards ever more Earth-like simulators. However, despite its immediate attractiveness, we argue that this kind of approach makes it more and more difficult to understand which physical parameters are really relevant to describing the features of the seismic catalog in which we are interested. For this reason, here we take the opposite, minimal approach and analyze the behavior of a purposely simple earthquake simulator applied to a set of California faults. The idea is that a simple model may be more informative than a complex one for some specific scientific objectives, because it is more understandable. The model has three main components: the first is a realistic tectonic setting, i.e., a fault dataset of California; the other two are quantitative laws for earthquake generation on each single fault, and the Coulomb Failure Function for modeling fault interaction. The final goal of this work is twofold. On one hand, we aim to identify the minimum set of physical ingredients that can satisfactorily reproduce the features of the real seismic catalog, such as short-term seismic clusters, and to investigate the hypothetical long-term behavior and fault synchronization. On the other hand, we want to investigate the limits of predictability of the model itself.
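The Coulomb Failure Function used for fault interaction follows a standard form: the change dCFF = d_tau + mu * d_sigma_n, where a positive value brings a receiver fault closer to failure. A minimal sketch (the friction coefficient 0.4 is a common illustrative choice, not a value taken from this work):

```python
def coulomb_failure_change(d_shear, d_normal, friction=0.4):
    """Change in the Coulomb Failure Function on a receiver fault:
    dCFF = d_tau + mu * d_sigma_n, with the normal-stress change taken
    positive in extension (unclamping). dCFF > 0 promotes failure.
    The default friction coefficient is illustrative only."""
    return d_shear + friction * d_normal

# A 0.5 bar shear-stress increase combined with 0.2 bar of unclamping:
dcff = coulomb_failure_change(0.5, 0.2)   # 0.58 bar, promotes failure
```

In a simulator of this kind, each rupture would update dCFF on neighbouring faults, advancing or delaying their next failure time.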
NASA Astrophysics Data System (ADS)
Taschuk, M. T.; Tucker, R. T.; LaForge, J. M.; Beaudry, A. L.; Kupsta, M. R.; Brett, M. J.
2013-12-01
The vapour-liquid-solid glancing angle deposition (VLS-GLAD) process is capable of producing complex nanotree structures with control over azimuthal branch orientation and height. We have developed a thin film growth simulation including ballistic deposition, simplified surface diffusion, and droplet-mediated cubic crystal growth for the VLS-GLAD process using the Unreal™ Development Kit. The use of a commercial game engine has provided an interactive environment while allowing a custom physics implementation. Our simulation's output is verified against experimental data, including a volumetric film reconstruction produced using focused ion beam and scanning electron microscopy (SEM), crystallographic texture, and morphological characteristics such as branch orientation. We achieve excellent morphological and texture agreement with experimental data, as well as qualitative agreement with SEM imagery. The fact that the simplified physics in our model reproduces the experimental films indicates the dominant role flux geometry plays in the VLS-GLAD competitive growth process responsible for azimuthally oriented branches and biaxial crystal texture evolution. The simulation's successful reproduction of experimental data indicates that it should have predictive power in designing novel VLS-GLAD structures.
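The ballistic deposition ingredient of such growth models can be sketched in its simplest textbook form: particles fall onto random columns and stick at the height of the tallest neighbour, producing overhangs and competitive roughening. This 1-D lattice toy is far simpler than the 3-D, flux-angle-dependent model described above; all parameters are illustrative.

```python
import numpy as np

def ballistic_deposition(width=64, n_particles=2000, seed=1):
    """Minimal 1-D ballistic deposition with periodic boundaries: each
    particle lands on a random column and sticks at the greater of
    (its own column height + 1) and its neighbours' heights, which
    creates the overhangs characteristic of ballistic growth."""
    rng = np.random.default_rng(seed)
    h = np.zeros(width, dtype=int)
    for _ in range(n_particles):
        i = rng.integers(width)
        left = h[(i - 1) % width]
        right = h[(i + 1) % width]
        h[i] = max(h[i] + 1, left, right)
    return h

heights = ballistic_deposition()
```

Because sticking to a tall neighbour leaves voids beneath, the mean column height grows faster than the flat-fill value `n_particles / width`, a signature of the shadowing competition that the full VLS-GLAD model exploits.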
NASA Astrophysics Data System (ADS)
Redonnet, S.; Ben Khelil, S.; Bulté, J.; Cunha, G.
2017-09-01
With the objective of aircraft noise mitigation, we here address the numerical characterization of the aeroacoustic noise generated by a simplified nose landing gear (NLG), through the use of advanced simulation and signal processing techniques. To this end, the NLG noise physics is first simulated through an advanced hybrid approach, which relies on Computational Fluid Dynamics (CFD) and Computational AeroAcoustics (CAA) calculations. Compared to more traditional hybrid methods (e.g. those relying on the use of an Acoustic Analogy), and although it is used here with some approximations (e.g. in the design of the CFD-CAA interface), the present approach does not rely on restrictive assumptions (e.g. an equivalent noise source, a homogeneous propagation medium), which allows more realism to be incorporated into the prediction. In a second step, the outputs of such CFD-CAA hybrid calculations are processed through both traditional and advanced post-processing techniques, making it possible to further investigate the NLG's noise source mechanisms. Among other things, this work highlights how advanced computational methodologies are now mature enough not only to simulate realistic problems of airframe noise emission, but also to investigate their underlying physics.
Simulating industrial plasma reactors - A fresh perspective
NASA Astrophysics Data System (ADS)
Mohr, Sebastian; Rahimi, Sara; Tennyson, Jonathan; Ansell, Oliver; Patel, Jash
2016-09-01
A key goal of the presented research project, PowerBase, is to produce new integration schemes that enable the manufacturability of 3D integrated power smart systems with high-precision TSV-etched features. The necessary high-aspect-ratio etch is performed via the BOSCH process. Investigations in industrial research often use trial-and-error experimental methods. Simulations provide an alternative way to study the influence of external parameters on the final product, while also giving insights into the physical processes. This presentation investigates the process of simulating an industrial ICP reactor used over high power (up to 2x5 kW) and pressure (up to 200 mTorr) ranges, analysing the specific procedures needed to achieve a compromise between physical correctness and computational speed, while testing commonly made assumptions. This includes, for example, the effect of different physical models and the inclusion of different gas-phase and surface reactions, with the aim of accurately predicting the dependence of surface rates and profiles on external parameters in SF6 and C4F8 discharges. This project has received funding from the Electronic Component Systems for European Leadership Joint Undertaking under Grant Agreement No. 662133 PowerBase.
Sensitivity Analysis and Optimization of Enclosure Radiation with Applications to Crystal Growth
NASA Technical Reports Server (NTRS)
Tiller, Michael M.
1995-01-01
In engineering, simulation software is often used as a convenient means for carrying out experiments to evaluate physical systems. The benefit of using simulations as 'numerical' experiments is that the experimental conditions can be easily modified and repeated at much lower cost than the comparable physical experiment. The goal of these experiments is to 'improve' the process or result of the experiment. In most cases, the computational experiments employ the same trial and error approach as their physical counterparts. When using this approach for complex systems, the cause and effect relationship of the system may never be fully understood and efficient strategies for improvement never utilized. However, when running simulations it is possible to accurately and efficiently determine the sensitivity of the system results with respect to simulation parameters (e.g., initial conditions, boundary conditions, and material properties) by manipulating the underlying computations. This results in a better understanding of the system dynamics and gives us efficient means to improve processing conditions. We begin by discussing the steps involved in performing simulations. Then we consider how sensitivity information about simulation results can be obtained and ways this information may be used to improve the process or result of the experiment. Next, we discuss optimization and the efficient algorithms which use sensitivity information. We draw on all this information to propose a generalized approach for integrating simulation and optimization, with an emphasis on software programming issues. After discussing our approach to simulation and optimization we consider an application involving crystal growth. This application is interesting because it includes radiative heat transfer.
We discuss the computation of radiative view factors and the impact this mode of heat transfer has on our approach. Finally, we demonstrate the results of our optimization.
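The sensitivity idea described above can be illustrated with a minimal sketch. The cooling-law "simulation" and its parameters below are hypothetical, not from the paper; the point is only the central finite-difference estimate of an output's sensitivity to one simulation parameter:

```python
import math

def simulate(k, t=1.0, T0=100.0, T_inf=25.0):
    """Toy 'numerical experiment': exponential cooling toward ambient."""
    return T_inf + (T0 - T_inf) * math.exp(-k * t)

def sensitivity(f, x, h=1e-6):
    """Central finite-difference sensitivity df/dx of a simulation output
    with respect to one parameter (here, the cooling rate k)."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

k = 0.5
dT_dk = sensitivity(simulate, k)  # how strongly the result depends on k
```

In practice the paper advocates computing such derivatives by manipulating the underlying computations rather than by repeated runs, but the finite-difference form makes the quantity being computed concrete.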
NASA Astrophysics Data System (ADS)
Yücel, M.; Emirhan, E.; Bayrak, A.; Ozben, C. S.; Yücel, E. Barlas
2015-11-01
The design and production of a simple, low-cost X-ray imaging system for light industrial applications was targeted in the Nuclear Physics Laboratory of Istanbul Technical University. In this study, the production, transmission and detection of X-rays were simulated for the proposed imaging device. An OX/70-P dental tube was used, and X-ray spectra simulated with Geant4 were validated by comparison with X-ray spectra measured between 20 and 35 keV. The relative detection efficiency of the detector was also determined to confirm the physics processes used in the simulations. Various time-optimization techniques were applied to reduce the simulation time.
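The transmission step of such a simulation is governed by exponential attenuation. A sketch of the Beer-Lambert law follows; the attenuation coefficient, density, and thickness are illustrative values, not parameters from the paper:

```python
import math

def transmitted_fraction(mu_over_rho, rho, thickness_cm):
    """Beer-Lambert attenuation I/I0 = exp(-(mu/rho) * rho * x)
    for a monoenergetic beam through a uniform absorber."""
    return math.exp(-mu_over_rho * rho * thickness_cm)

# Hypothetical values for illustration: mass attenuation coefficient
# 0.5 cm^2/g, density 1.0 g/cm^3, sample thickness 2 cm.
frac = transmitted_fraction(0.5, 1.0, 2.0)
```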
Simulation of secondary emission calorimeter for future colliders
NASA Astrophysics Data System (ADS)
Yetkin, E. A.; Yetkin, T.; Ozok, F.; Iren, E.; Erduran, M. N.
2018-03-01
We present updated results from a simulation study of a conceptual sampling electromagnetic calorimeter based on the secondary electron emission process. We implemented the secondary electron emission process in Geant4 as a user physics list and produced the energy spectrum and yield of secondary electrons. The energy resolution of the SEE calorimeter was σ/E = 41%/√(E/GeV) and its response to electromagnetic showers was linear to within 1.5%. The simulation results were also compared with those of a traditional scintillator calorimeter.
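The quoted resolution can be evaluated directly. The sketch below assumes the stochastic term dominates (constant and noise terms neglected, which is an assumption, not a statement from the paper):

```python
import math

def relative_resolution(E_GeV, stochastic=0.41):
    """Stochastic-term resolution sigma/E = a/sqrt(E), with E in GeV,
    using the reported a = 41% for the SEE calorimeter."""
    return stochastic / math.sqrt(E_GeV)

res_1 = relative_resolution(1.0)      # 41% at 1 GeV
res_100 = relative_resolution(100.0)  # 4.1% at 100 GeV
```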
NASA Astrophysics Data System (ADS)
Denissenkov, Pavel; Perdikakis, Georgios; Herwig, Falk; Schatz, Hendrik; Ritter, Christian; Pignatari, Marco; Jones, Samuel; Nikas, Stylianos; Spyrou, Artemis
2018-05-01
The first-peak s-process elements Rb, Sr, Y and Zr in the post-AGB star Sakurai's object (V4334 Sagittarii) have been proposed to be the result of i-process nucleosynthesis in a post-AGB very-late thermal pulse event. We estimate the nuclear physics uncertainties in the i-process model predictions to determine whether the remaining discrepancies with observations are significant and point to potential issues with the underlying astrophysical model. We find that the dominant source of nuclear physics uncertainty is the prediction of neutron capture rates on unstable neutron-rich nuclei, which can be uncertain by more than a factor of 20 in the band of the i-process. We use a Monte Carlo variation of 52 neutron capture rates and a 1D multi-zone post-processing model for the i-process in Sakurai's object to determine the cumulative effect of these uncertainties on the final elemental abundance predictions. We find that the nuclear physics uncertainties are large and comparable to observational errors. Within these uncertainties the model predictions are consistent with observations. A correlation analysis of the results of our MC simulations reveals that the strongest impact on the predicted abundances of Rb, Sr, Y and Zr is made by the uncertainties in the (n, γ) reaction rates of 85Br, 86Br, 87Kr, 88Kr, 89Kr, 89Rb, 89Sr, and 92Sr. This conclusion is supported by a series of multi-zone simulations in which we increased or decreased one or two reaction rates per run to their maximum and minimum limits. We also show that simple and fast one-zone simulations should not be used instead of more realistic multi-zone stellar simulations for nuclear sensitivity and uncertainty studies of convective-reactive processes. Our findings apply more generally to any i-process site with similar neutron exposure, such as rapidly accreting white dwarfs with near-solar metallicities.
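One realization of a Monte Carlo rate variation of this kind can be sketched as follows. The log-uniform sampling within a symmetric factor-of-20 band is an illustrative assumption on our part, not the paper's exact prescription:

```python
import math
import random

def sample_rate_factors(n_rates, max_factor=20.0, seed=0):
    """Draw one multiplicative variation factor per uncertain (n,gamma)
    rate, log-uniformly in [1/max_factor, max_factor]; each draw defines
    one Monte Carlo realization of the rate set."""
    rng = random.Random(seed)
    lo, hi = -math.log(max_factor), math.log(max_factor)
    return [math.exp(rng.uniform(lo, hi)) for _ in range(n_rates)]

factors = sample_rate_factors(52)  # 52 varied neutron-capture rates
```

Each realization would then be fed through the multi-zone post-processing model, and the spread of final abundances over many realizations gives the propagated uncertainty.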
Proposed standards for peer-reviewed publication of computer code
USDA-ARS?s Scientific Manuscript database
Computer simulation models are mathematical abstractions of physical systems. In the area of natural resources and agriculture, these physical systems encompass selected interacting processes in plants, soils, animals, or watersheds. These models are scientific products and have become important i...
Multiphase Reactive Transport and Platelet Ice Accretion in the Sea Ice of McMurdo Sound, Antarctica
NASA Astrophysics Data System (ADS)
Buffo, J. J.; Schmidt, B. E.; Huber, C.
2018-01-01
On seasonal to interannual timescales, sea ice forms a thermal, chemical, and physical boundary between the atmosphere and hydrosphere over tens of millions of square kilometers of ocean. Its presence affects local and global climate, ocean dynamics, ice shelf processes, and biological communities. Sea ice growth and decay, and the associated thermal and physiochemical processes, remain underrepresented in large-scale models due to the complex physics that dictate oceanic ice formation and evolution. Two phenomena complicate sea ice simulation, particularly in the Antarctic: the multiphase physics of reactive transport brought about by the inhomogeneous solidification of seawater, and the buoyancy-driven accretion of platelet ice, formed by supercooled ice shelf water, onto the basal surface of the overlying ice. Here a one-dimensional finite difference model capable of simulating both processes is developed and tested against ice core data. Temperature, salinity, liquid fraction, fluid velocity, total salt content, and ice structure are computed during model runs. The model results agree well with empirical observations, and simulations highlight the effect platelet ice accretion has on overall ice thickness and characteristics. Results from sensitivity studies emphasize the need to further constrain sea ice microstructure and the associated physics, particularly permeability-porosity relationships, if a complete model of sea ice evolution is to be obtained. Additionally, implications for terrestrial ice shelves and icy moons in the solar system are discussed.
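The thermal core of such a one-dimensional finite-difference model can be sketched as pure heat diffusion with fixed boundary temperatures. This is a deliberately minimal sketch; the actual model also evolves salinity, liquid fraction, and fluid velocity, and the diffusivity and grid values below are illustrative:

```python
def diffuse_1d(T, alpha, dz, dt, steps):
    """Explicit finite-difference heat diffusion with fixed (Dirichlet)
    boundary temperatures; stable for alpha*dt/dz**2 <= 0.5."""
    r = alpha * dt / dz**2
    assert r <= 0.5, "explicit scheme would be unstable"
    T = list(T)
    for _ in range(steps):
        T = [T[0]] + [T[i] + r * (T[i+1] - 2.0 * T[i] + T[i-1])
                      for i in range(1, len(T) - 1)] + [T[-1]]
    return T

# Ice column between a cold surface (-10 C) and seawater near freezing (-1.9 C)
profile = diffuse_1d([-10.0] + [-1.9] * 9,
                     alpha=1.2e-6, dz=0.1, dt=1000.0, steps=500)
```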
Development of Partial Discharging Simulation Test Equipment
NASA Astrophysics Data System (ADS)
Kai, Xue; Genghua, Liu; Yan, Jia; Ziqi, Chai; Jian, Lu
2017-12-01
When recruits who lack on-site work experience are trained in partial discharge testing, their limited skill and improper operation create risks of electric shock and damage to the test equipment. The partial discharge simulation tester uses simulation technology to reproduce the partial discharge test process and its results realistically, so that operators can become familiar with the test procedure and equipment in the classroom. The instructor configures the instrument to display different partial discharge waveforms so that trainees can analyze the test results of different partial discharge types.
NASA Astrophysics Data System (ADS)
Chadburn, Sarah E.; Krinner, Gerhard; Porada, Philipp; Bartsch, Annett; Beer, Christian; Belelli Marchesini, Luca; Boike, Julia; Ekici, Altug; Elberling, Bo; Friborg, Thomas; Hugelius, Gustaf; Johansson, Margareta; Kuhry, Peter; Kutzbach, Lars; Langer, Moritz; Lund, Magnus; Parmentier, Frans-Jan W.; Peng, Shushi; Van Huissteden, Ko; Wang, Tao; Westermann, Sebastian; Zhu, Dan; Burke, Eleanor J.
2017-11-01
It is important that climate models can accurately simulate the terrestrial carbon cycle in the Arctic due to the large and potentially labile carbon stocks found in permafrost-affected environments, which can lead to a positive climate feedback, along with the possibility of future carbon sinks from northward expansion of vegetation under climate warming. Here we evaluate the simulation of tundra carbon stocks and fluxes in three land surface schemes that each form part of major Earth system models (JSBACH, Germany; JULES, UK; ORCHIDEE, France). We use a site-level approach in which comprehensive, high-frequency datasets allow us to disentangle the importance of different processes. The models have improved physical permafrost processes and there is a reasonable correspondence between the simulated and measured physical variables, including soil temperature, soil moisture and snow. We show that if the models simulate the correct leaf area index (LAI), the standard C3 photosynthesis schemes produce the correct order of magnitude of carbon fluxes. Therefore, simulating the correct LAI is one of the first priorities. LAI depends quite strongly on climatic variables alone, as we see by the fact that the dynamic vegetation model can simulate most of the differences in LAI between sites, based almost entirely on climate inputs. However, we also identify an influence from nutrient limitation as the LAI becomes too large at some of the more nutrient-limited sites. We conclude that including moss as well as vascular plants is of primary importance to the carbon budget, as moss contributes a large fraction to the seasonal CO2 flux in nutrient-limited conditions. Moss photosynthetic activity can be strongly influenced by the moisture content of moss, and the carbon uptake can be significantly different from vascular plants with a similar LAI. 
The soil carbon stocks depend strongly on the rate of input of carbon from the vegetation to the soil, and our analysis suggests that an improved simulation of photosynthesis would also lead to an improved simulation of soil carbon stocks. However, the stocks are also influenced by soil carbon burial (e.g. through cryoturbation) and the rate of heterotrophic respiration, which depends on the soil physical state. More detailed below-ground measurements are needed to fully evaluate biological and physical soil processes. Furthermore, even if these processes are well modelled, the soil carbon profiles cannot resemble peat layers as peat accumulation processes are not represented in the models. Thus, we identify three priority areas for model development: (1) dynamic vegetation including (a) climate and (b) nutrient limitation effects; (2) adding moss as a plant functional type; and an (3) improved vertical profile of soil carbon including peat processes.
NASA Astrophysics Data System (ADS)
Stegen, Ronald; Gassmann, Matthias
2017-04-01
The use of a broad variety of agrochemicals is essential to modern industrialized agriculture. Over the last decades, awareness of the side effects of their use has grown, and with it the requirement to reproduce, understand and predict the behaviour of these agrochemicals in the environment, in order to optimize their use and minimize the side effects. Modern modelling has made great progress in understanding and predicting the behaviour of these chemicals with numerical methods. While the behaviour of the applied chemicals is often investigated and modelled, most studies simulate only the parent chemicals, assuming complete dissipation of the substance. However, due to a diversity of chemical, physical and biological processes, the substances are instead transformed into new chemicals, which are themselves transformed until, at the end of the chain, the substance is completely mineralized. During this process, the fate of each transformation product is determined by its own environmental characteristics, and the pathways and products of transformation can differ greatly by substance and environmental influences, which can vary among compartments of the same site. Simulating transformation products introduces additional model uncertainties, so the calibration effort increases compared to simulating the transport and degradation of the primary substance alone. Simulating the necessary physical processes also requires substantial calculation time. As a result, few physically-based models offer the possibility to simulate transformation products at all, and mostly only at the field scale. The few models available at the catchment scale are not optimized for this task, i.e. they can simulate only a single parent compound and up to two transformation products. Thus, for simulations over large physico-chemical parameter spaces, the long calculation time of the underlying hydrological model diminishes the overall performance.
In this study, the structure of the model ZIN-AGRITRA is re-designed for the transport and transformation of an unlimited number of agrochemicals in the soil-water-plant system at the catchment scale. The focus is, besides a good hydrological standard, on a flexible representation of transformation processes and on optimization for large numbers of different substances. The new design achieves a reduction of the calculation time per tested substance, allowing faster exploration of parameter spaces. Additionally, the new concept allows for the consideration of different transformation processes and products in different environmental compartments. A first test of the calculation-time improvements and flexible transformation pathways was performed in a Mediterranean meso-scale catchment, using the insecticide Chlorpyrifos and two of its transformation products, which emerge from different transformation processes, as test substances.
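The transformation-chain idea can be illustrated with a first-order kinetics toy model. The rate constants and the two product species below are hypothetical, not fitted Chlorpyrifos values, and real models resolve transport and compartment-specific pathways on top of this kinetic skeleton:

```python
def degrade_chain(masses, ks, dt, steps):
    """Forward-Euler sketch of a first-order transformation chain:
    substance i degrades at rate ks[i]; its loss feeds substance i+1,
    and the last product's loss represents complete mineralization."""
    m = list(masses)
    for _ in range(steps):
        loss = [k * mi * dt for k, mi in zip(ks, m)]
        for i in range(len(m)):
            m[i] -= loss[i]
            if i > 0:
                m[i] += loss[i - 1]
    return m

# Parent compound plus two hypothetical transformation products
final = degrade_chain([100.0, 0.0, 0.0], ks=[0.05, 0.02, 0.01],
                      dt=0.1, steps=1000)
```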
NASA Astrophysics Data System (ADS)
Ukawa, Akira
1998-05-01
The CP-PACS computer is a massively parallel computer consisting of 2048 processing units and having a peak speed of 614 GFLOPS and 128 GByte of main memory. It was developed over the four years from 1992 to 1996 at the Center for Computational Physics, University of Tsukuba, for large-scale numerical simulations in computational physics, especially those of lattice QCD. The CP-PACS computer has been in full operation for physics computations since October 1996. In this article we describe the chronology of the development, the hardware and software characteristics of the computer, and its performance for lattice QCD simulations.
Bochmann, Esther S; Steffens, Kristina E; Gryczke, Andreas; Wagner, Karl G
2018-03-01
Bochmann, Esther S; Steffens, Kristina E; Gryczke, Andreas; Wagner, Karl G
DOE Office of Scientific and Technical Information (OSTI.GOV)
MacFarlane, Joseph J.; Golovkin, I. E.; Woodruff, P. R.
2009-08-07
This Final Report summarizes work performed under DOE STTR Phase II Grant No. DE-FG02-05ER86258 during the project period from August 2006 to August 2009. The project, “Development of Spectral and Atomic Models for Diagnosing Energetic Particle Characteristics in Fast Ignition Experiments,” was led by Prism Computational Sciences (Madison, WI), and involved collaboration with subcontractors University of Nevada-Reno and Voss Scientific (Albuquerque, NM). In this project, we have: Developed and implemented a multi-dimensional, multi-frequency radiation transport model in the LSP hybrid fluid-PIC (particle-in-cell) code [1,2]. Updated the LSP code to support the use of accurate equation-of-state (EOS) tables generated by Prism’s PROPACEOS [3] code to compute more accurate temperatures in high energy density physics (HEDP) plasmas. Updated LSP to support the use of Prism’s multi-frequency opacity tables. Generated equation of state and opacity data for LSP simulations for several materials being used in plasma jet experimental studies. Developed and implemented parallel processing techniques for the radiation physics algorithms in LSP. Benchmarked the new radiation transport and radiation physics algorithms in LSP and compared simulation results with analytic solutions and results from numerical radiation-hydrodynamics calculations. Performed simulations using Prism radiation physics codes to address issues related to radiative cooling and ionization dynamics in plasma jet experiments. Performed simulations to study the effects of radiation transport and radiation losses due to electrode contaminants in plasma jet experiments. Updated the LSP code to generate output using NetCDF to provide a better, more flexible interface to SPECT3D [4] in order to post-process LSP output. Updated the SPECT3D code to better support the post-processing of large-scale 2-D and 3-D datasets generated by simulation codes such as LSP.
Updated atomic physics modeling to provide for more comprehensive and accurate atomic databases that feed into the radiation physics modeling (spectral simulations and opacity tables). Developed polarization spectroscopy modeling techniques suitable for diagnosing energetic particle characteristics in HEDP experiments. A description of these items is provided in this report. The above efforts lay the groundwork for utilizing the LSP and SPECT3D codes in providing simulation support for DOE-sponsored HEDP experiments, such as plasma jet and fast ignition physics experiments. We believe that taken together, the LSP and SPECT3D codes have unique capabilities for advancing our understanding of the physics of these HEDP plasmas. Based on conversations early in this project with our DOE program manager, Dr. Francis Thio, our efforts emphasized developing radiation physics and atomic modeling capabilities that can be utilized in the LSP PIC code, and performing radiation physics studies for plasma jets. A relatively minor component focused on the development of methods to diagnose energetic particle characteristics in short-pulse laser experiments related to fast ignition physics. The period of performance for the grant was extended by one year to August 2009 with a one-year no-cost extension, at the request of subcontractor University of Nevada-Reno.
Li, Yong; Wang, Hanpeng; Zhu, Weishen; Li, Shucai; Liu, Jian
2015-08-31
Fiber Bragg Grating (FBG) sensors are widely recognized as structural stability monitoring devices for all kinds of geo-materials, either embedded into or bonded onto the structural entities. A physical model in geotechnical engineering, which can accurately simulate the construction processes and their effects on the stability of underground caverns while satisfying the similarity principles, is an actual physical entity. Using a physical model test of the underground caverns of the Shuangjiangkou Hydropower Station, FBG sensors were used to measure the small displacements of key monitoring points in the large-scale physical model during excavation. In building the test specimen, the most successful approach was to embed the FBG sensors in the physical model by making an opening and adding quick-set silicone. The experimental results show that the FBG sensor has higher measuring accuracy than conventional sensors such as electrical resistance strain gages and extensometers, and they are in good agreement with the numerical simulation results. In conclusion, FBG sensors can effectively measure small displacements of monitoring points throughout a physical model test. The results reveal the deformation and failure characteristics of the surrounding rock mass and provide guidance for in situ engineering construction.
ERIC Educational Resources Information Center
Ceberio, Mikel; Almudí, José Manuel; Franco, Ángel
2016-01-01
In recent years, interactive computer simulations have been progressively integrated in the teaching of the sciences and have contributed significant improvements in the teaching-learning process. Practicing problem-solving is a key factor in science and engineering education. The aim of this study was to design simulation-based problem-solving…
Simulating The Dynamical Evolution Of Galaxies In Group And Cluster Environments
NASA Astrophysics Data System (ADS)
Vijayaraghavan, Rukmani
2015-07-01
Galaxy clusters are harsh environments for their constituent galaxies. A variety of physical processes effective in these dense environments transform gas-rich, spiral, star-forming galaxies to elliptical or spheroidal galaxies with very little gas and therefore minimal star formation. The consequences of these processes are well understood observationally. Galaxies in progressively denser environments have systematically declining star formation rates and gas content. However, a theoretical understanding of where, when, and how these processes act, and the interplay between the various galaxy transformation mechanisms in clusters, remains elusive. In this dissertation, I use numerical simulations of cluster mergers as well as galaxies evolving in quiescent environments to develop a theoretical framework to understand some of the physics of galaxy transformation in cluster environments. Galaxies can be transformed in smaller groups before they are accreted by their eventual massive cluster environments, an effect termed `pre-processing'. Galaxy cluster mergers themselves can accelerate many galaxy transformation mechanisms, including tidal and ram pressure stripping of galaxies and galaxy-galaxy collisions and mergers that result in reassemblies of galaxies' stars and gas. Observationally, cluster mergers have distinct velocity and phase-space signatures depending on the observer's line of sight with respect to the merger direction. Using dark matter only as well as hydrodynamic simulations of cluster mergers with random ensembles of particles tagged with galaxy models, I quantify the effects of cluster mergers on galaxy evolution before, during, and after the mergers. Based on my theoretical predictions of the dynamical signatures of these mergers in combination with galaxy transformation signatures, one can observationally identify remnants of mergers and quantify the effect of the environment on galaxies in dense group and cluster environments.
The presence of long-lived, hot X-ray emitting coronae observed in a large fraction of group and cluster galaxies is not well-understood. These coronae are not fully stripped by ram pressure and tidal forces that are efficient in these environments. Theoretically, this is a fascinating and challenging problem that involves understanding and simulating the multitude of physical processes in these dense environments that can remove or replenish galaxies' hot coronae. To solve this problem, I have developed and implemented a robust simulation technique in which I simulate the evolution of a realistic cluster environment with a population of galaxies and their gas. With this technique, it is possible to isolate and quantify the importance of the various cluster physical processes for coronal survival. To date, I have performed hydrodynamic simulations of galaxies being ram pressure stripped in quiescent group and cluster environments. Using these simulations, I have characterized the physics of ram pressure stripping and investigated the survival of these coronae in the presence of tidal and ram pressure stripping. I have also generated synthetic X-ray observations of these simulated systems to compare with observed coronae. I have also performed magnetohydrodynamic simulations of galaxies evolving in a magnetized intracluster medium plasma to isolate the effect of magnetic fields on coronal evolution, as well as the effect of orbiting galaxies in amplifying magnetic fields. This work is an important step towards understanding the effect of cluster environments on galactic gas, and consequently, their long term evolution and impact on star formation rates.
Fast emulation of track reconstruction in the CMS simulation
NASA Astrophysics Data System (ADS)
Komm, Matthias; CMS Collaboration
2017-10-01
Simulated samples of various physics processes are a key ingredient in analyses to unlock the physics behind LHC collision data. Samples with ever-larger statistics are required to keep up with the increasing amounts of recorded data. During sample generation, significant computing time is spent on the reconstruction of charged particle tracks from energy deposits, a cost which additionally scales with the pileup conditions. In CMS, the FastSimulation package is developed to provide a fast alternative to the standard simulation and reconstruction workflow. It employs various techniques to emulate track reconstruction effects in particle collision events. Several analysis groups in CMS are utilizing the package, in particular those requiring many samples to scan the parameter space of physics models (e.g. SUSY) or for the purpose of estimating systematic uncertainties. The strategies for and recent developments in this emulation are presented, including a novel, flexible implementation of tracking emulation that retains a sufficient, tuneable accuracy.
NASA Astrophysics Data System (ADS)
Caviedes-Voullième, Daniel; García-Navarro, Pilar; Murillo, Javier
2012-07-01
Hydrological simulation of rain-runoff processes is often performed with lumped models that rely on calibration to generate storm hydrographs and study catchment response to rain. In this paper, a distributed, physically-based numerical model is used for runoff simulation in a mountain catchment. This approach offers two advantages. The first is that by using shallow-water equations for runoff flow, there is less freedom to calibrate routing parameters (compared with, for example, synthetic hydrograph methods). The second is that spatial distributions of water depth and velocity can be obtained. Furthermore, interactions among the various hydrological processes can be modeled in a physically-based approach which may depend on transient and spatially distributed factors. On the other hand, the numerical approach relies on accurate terrain representation and mesh selection, which also significantly affect the computational cost of the simulations. Hence, we investigate the response of a gauged catchment with this distributed approach. The methodology consists of analyzing the effects the mesh has on the simulations by using a range of meshes. Next, friction is applied to the model and the response to variations and the interaction with the mesh is studied. Finally, a first approach with the well-known SCS Curve Number method is studied to evaluate its behavior when coupled with a shallow-water model for runoff flow. The results show that mesh selection is of great importance, since it may affect the results by a magnitude as large as physical factors such as friction. Furthermore, results proved to be less sensitive to the spatial distribution of roughness than to mesh properties. Finally, the results indicate that SCS-CN may not be suitable for simulating hydrological processes together with a shallow-water model.
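The SCS Curve Number step can be written down directly from its standard formulation (depths in mm); the storm depth and curve number below are illustrative:

```python
def scs_runoff(P_mm, CN):
    """SCS Curve Number direct runoff depth (mm):
    S = 25400/CN - 254 (mm), Ia = 0.2*S, Q = (P - Ia)^2 / (P - Ia + S),
    with Q = 0 when rainfall P does not exceed the initial abstraction Ia."""
    S = 25400.0 / CN - 254.0
    Ia = 0.2 * S
    if P_mm <= Ia:
        return 0.0
    return (P_mm - Ia) ** 2 / (P_mm - Ia + S)

q = scs_runoff(50.0, 80)  # 50 mm storm on CN = 80 terrain -> ~13.8 mm runoff
```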
Study of the coupling between real gas effects and rarefied effects on hypersonic aerodynamics
NASA Astrophysics Data System (ADS)
Chen, Song; Hu, Yuan; Sun, Quanhua
2012-11-01
Hypersonic vehicles travel across the atmosphere at very high speed, and the surrounding gas experiences complicated physical and chemical processes. These processes produce real gas effects at high temperature and rarefied gas effects at high altitude, where the two effects are coupled through molecular collisions. In this study, we aim to identify the individual real gas and rarefied gas effects by simulating hypersonic flow over a 2D cylinder, a sphere and a blunted cone using a continuum-based CFD approach and the direct simulation Monte Carlo method. It is found that physical processes such as vibrational excitation and chemical reactions significantly reduce the shock stand-off distance and flow temperature for flows with small Knudsen numbers. The calculated skin friction and surface heat flux decrease when the real gas effects are considered in simulations. The trend, however, weakens as the Knudsen number increases. It is concluded that the rarefied gas effects weaken the real gas effects on hypersonic flows.
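The degree of rarefaction is conventionally characterized by the Knudsen number Kn = λ/L. A sketch using the hard-sphere mean free path follows; the molecular diameter and the high-altitude pressure are rough illustrative values, not inputs from the paper:

```python
import math

def knudsen_number(T_K, p_Pa, L_m, d_m=3.7e-10):
    """Kn = lambda/L with the hard-sphere mean free path
    lambda = k_B*T / (sqrt(2)*pi*d^2*p); d defaults to roughly air."""
    k_B = 1.380649e-23  # Boltzmann constant, J/K
    lam = k_B * T_K / (math.sqrt(2) * math.pi * d_m**2 * p_Pa)
    return lam / L_m

# Continuum regime at sea level vs rarefied flow at high altitude, 1 m body
kn_sea = knudsen_number(300.0, 101325.0, 1.0)
kn_high = knudsen_number(250.0, 1.0, 1.0)  # p ~ 1 Pa, roughly 80 km altitude
```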
General simulation algorithm for autocorrelated binary processes.
Serinaldi, Francesco; Lombardo, Federico
2017-02-01
The apparent ubiquity of binary random processes in physics and many other fields has attracted considerable attention from the modeling community. However, generating binary sequences with prescribed autocorrelation is a challenging task owing to the discrete nature of the marginal distributions, which makes the application of classical spectral techniques problematic. We show that such methods can effectively be used if we focus on the parent continuous process of beta-distributed transition probabilities rather than on the target binary process. This change of paradigm results in a simulation procedure embedding a spectrum-based iterative amplitude-adjusted Fourier transform method devised for continuous processes. The proposed algorithm is fully general, requires minimal assumptions, and can easily simulate binary signals with power-law and exponentially decaying autocorrelation functions corresponding, for instance, to Hurst-Kolmogorov and Markov processes. An application to rainfall intermittency shows that the proposed algorithm can also simulate surrogate data preserving the empirical autocorrelation.
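The iterative amplitude-adjusted Fourier transform (IAAFT) step embedded in the algorithm alternates between imposing a target power spectrum and restoring the target marginal distribution by rank mapping. The sketch below shows that core iteration only; it omits the paper's beta-distributed transition-probability parent process, so it is a simplification rather than the full algorithm.

```python
import numpy as np

def iaaft(target, n_iter=100, seed=0):
    """Iterative amplitude-adjusted Fourier transform surrogate of `target`.

    Alternates two projections: (1) impose the power spectrum of `target`,
    (2) restore the marginal distribution of `target` by rank remapping.
    """
    rng = np.random.default_rng(seed)
    sorted_vals = np.sort(target)
    target_amp = np.abs(np.fft.rfft(target))   # target spectral amplitudes
    x = rng.permutation(target)                # random shuffle as a start
    for _ in range(n_iter):
        # (1) keep current phases, swap in the target amplitudes
        phases = np.angle(np.fft.rfft(x))
        x = np.fft.irfft(target_amp * np.exp(1j * phases), n=len(target))
        # (2) rank-map back onto the target's sorted values
        ranks = np.argsort(np.argsort(x))
        x = sorted_vals[ranks]
    return x
```

Applied directly to a 0/1 series, the rank-remapping step returns a series with exactly the original numbers of zeros and ones, so the surrogate stays binary while its autocorrelation approaches that of the original.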
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, Austin; Chakraborty, Sudipta; Wang, Dexin
This paper presents a cyber-physical testbed developed to investigate the complex interactions between emerging microgrid technologies such as grid-interactive power sources, control systems, and a wide variety of communication platforms and bandwidths. The cyber-physical testbed consists of three major components for testing and validation: real-time models of a distribution feeder with microgrid assets that are integrated into the National Renewable Energy Laboratory's (NREL) power hardware-in-the-loop (PHIL) platform; real-time capable network-simulator-in-the-loop (NSIL) models; and physical hardware including inverters and a simple system controller. Several load profiles and microgrid configurations were tested to examine the effect on system performance with increasing channel delays and router processing delays in the network simulator. Testing demonstrated that the controller's ability to maintain a target grid import power band was severely diminished with increasing network delays and laid the foundation for future testing of more complex cyber-physical systems.
Physics Guided Data Science in the Earth Sciences
NASA Astrophysics Data System (ADS)
Ganguly, A. R.
2017-12-01
Even as the geosciences are becoming relatively data-rich owing to remote sensing and archived model simulations, established physical understanding and process knowledge cannot be ignored. The ability to leverage both physics and data-intensive sciences may lead to new discoveries and predictive insights. A principled approach to physics-guided data science, where physics informs feature selection, output constraints, and even the architecture of the learning models, is motivated. The possibility of hybrid physics and data science models at the level of component processes is discussed, as are the challenges and opportunities and the relation to other approaches that also bring physics and data together, such as data assimilation. Case studies are presented in climate, hydrology and meteorology.
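One common way physics can constrain a learning model, as motivated above, is as a penalty term added to the data misfit. The sketch below is a generic, hypothetical formulation (the `physics_residual` argument stands in for whatever domain relation applies, e.g. a monotonicity or conservation violation); it is not a method taken from the abstract itself.

```python
import numpy as np

def physics_guided_loss(y_pred, y_obs, physics_residual, lam=1.0):
    """Data misfit plus a penalty on violations of a known physical relation.

    physics_residual : per-sample violation of a physical constraint
                       (<= 0 when the constraint is satisfied); what this
                       is depends entirely on the application.
    lam              : weight trading off data fit against physical consistency
    """
    data_term = np.mean((y_pred - y_obs) ** 2)
    # only positive residuals (actual violations) are penalized
    physics_term = np.mean(np.maximum(physics_residual, 0.0) ** 2)
    return data_term + lam * physics_term
```

A model trained on this loss is pulled toward predictions that both fit the observations and respect the encoded physics, which is the basic mechanism behind output-constraint approaches.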
Cyclotron resonant scattering feature simulations. II. Description of the CRSF simulation process
NASA Astrophysics Data System (ADS)
Schwarm, F.-W.; Ballhausen, R.; Falkner, S.; Schönherr, G.; Pottschmidt, K.; Wolff, M. T.; Becker, P. A.; Fürst, F.; Marcu-Cheatham, D. M.; Hemphill, P. B.; Sokolova-Lapa, E.; Dauser, T.; Klochkov, D.; Ferrigno, C.; Wilms, J.
2017-05-01
Context. Cyclotron resonant scattering features (CRSFs) are formed by scattering of X-ray photons off quantized plasma electrons in the strong magnetic field (of the order 1012 G) close to the surface of an accreting X-ray pulsar. Due to the complex scattering cross-sections, the line profiles of CRSFs cannot be described by an analytic expression. Numerical methods, such as Monte Carlo (MC) simulations of the scattering processes, are required in order to predict precise line shapes for a given physical setup, which can be compared to observations to gain information about the underlying physics in these systems. Aims: A versatile simulation code is needed for the generation of synthetic cyclotron lines; it should make the simulation of sophisticated geometries possible for the first time. Methods: The simulation utilizes the mean free path tables described in the first paper of this series for the fast interpolation of propagation lengths. The code is parallelized to make the very time-consuming simulations possible on convenient time scales. Furthermore, it can generate responses to monoenergetic photon injections, producing Green's functions, which can be used later to generate spectra for arbitrary continua. Results: We develop a new simulation code to generate synthetic cyclotron lines for complex scenarios, allowing for unprecedented physical interpretation of the observed data. An associated XSPEC model implementation is used to fit synthetic line profiles to NuSTAR data of Cep X-4. The code has been developed with the main goal of overcoming previous geometrical constraints in MC simulations of CRSFs. By applying this code also to more simple, classic geometries used in previous works, we furthermore address issues of code verification and cross-comparison of various models. The XSPEC model and the Green's function tables are available online.
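Storing responses to monoenergetic injections means that spectra for arbitrary continua reduce to a discrete superposition of the precomputed Green's functions. A minimal sketch of that folding step, with an assumed matrix layout for the table (not the code's actual file format):

```python
import numpy as np

def fold_continuum(green, continuum_flux, de):
    """Fold an arbitrary continuum through precomputed monoenergetic responses.

    green[i, j]    : photons emerging in output bin i per photon injected
                     at injection energy j (the Green's function table)
    continuum_flux : continuum photon flux density at the injection energies
    de             : width of each injection energy bin
    """
    # The output spectrum is the superposition of the responses to each
    # monoenergetic injection, weighted by the continuum in that bin.
    return green @ (continuum_flux * de)
```

Because the folding is a single matrix-vector product, continuum parameters can be varied cheaply during spectral fitting without rerunning the Monte Carlo simulation.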
Simulation of Plasma Jet Merger and Liner Formation within the PLX-α Project
NASA Astrophysics Data System (ADS)
Samulyak, Roman; Chen, Hsin-Chiang; Shih, Wen; Hsu, Scott
2015-11-01
Detailed numerical studies of the propagation and merger of high Mach number argon plasma jets and the formation of plasma liners have been performed using the newly developed Lagrangian particle (LP) method. The LP method significantly improves the accuracy and mathematical rigor of common particle-based numerical methods such as smoothed particle hydrodynamics while preserving their main advantages compared to grid-based methods. A brief overview of the LP method will be presented. The Lagrangian particle code implements the main relevant physics models, such as an equation of state for argon undergoing atomic physics transformations, radiation losses in the optically thin limit, and heat conduction. Simulations of the merger of two plasma jets are compared with experimental data from past PLX experiments. Simulations quantify the effect of oblique shock waves, ionization, and radiation processes on the jet merger process. Results of preliminary simulations of future PLX-α experiments involving the ~π/2-solid-angle plasma-liner configuration with 9 guns will also be presented. Partially supported by ARPA-E's ALPHA program.
Li, W.; Ma, Q.; Thorne, R. M.; ...
2016-06-10
Various physical processes are known to cause acceleration, loss, and transport of energetic electrons in the Earth's radiation belts, but their quantitative roles at different times and locations need further investigation. During the largest storm of the past decade (17 March 2015), relativistic electrons experienced fairly rapid acceleration up to ~7 MeV within 2 days after an initial substantial dropout, as observed by the Van Allen Probes. In the present paper, we evaluate the relative roles of various physical processes during the recovery phase of this large storm using a 3-D diffusion simulation. By quantitatively comparing the observed and simulated electron evolution, we found that chorus plays a critical role in accelerating electrons up to several MeV near the developing peak location and produces characteristic flat-top pitch angle distributions. By only including radial diffusion, the simulation underestimates the observed electron acceleration, while radial diffusion plays an important role in redistributing electrons and potentially accelerates them to even higher energies. Moreover, plasmaspheric hiss is found to provide efficient pitch angle scattering losses for hundreds of keV electrons, while its scattering effect on >1 MeV electrons is relatively slow. Although an additional loss process is required to fully explain the overestimated electron fluxes at multi-MeV energies, the combined physical processes of radial diffusion and pitch angle and energy diffusion by chorus and hiss reproduce the observed electron dynamics remarkably well, suggesting that quasi-linear diffusion theory is adequate to evaluate radiation belt electron dynamics during this big storm.
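The radial diffusion component of such 3-D simulations evolves the phase space density f according to df/dt = L² ∂/∂L (D_LL/L² ∂f/∂L). The sketch below is a deliberately simplified 1-D explicit step with fixed boundary values, meant only to illustrate the operator; it is not the authors' 3-D code, and the grid and diffusion coefficients are illustrative.

```python
import numpy as np

def radial_diffusion_step(f, L, d_ll, dt):
    """One explicit Euler step of the radial diffusion equation
        df/dt = L^2 d/dL ( D_LL / L^2 * df/dL ),
    with fixed-f boundary conditions (a highly simplified 1-D stand-in
    for a full 3-D radiation belt diffusion model)."""
    dL = L[1] - L[0]
    flux = np.zeros(len(f) + 1)
    # D_LL / L^2 * df/dL evaluated at cell interfaces
    d_mid = 0.5 * (d_ll[1:] + d_ll[:-1])
    L_mid = 0.5 * (L[1:] + L[:-1])
    flux[1:-1] = d_mid / L_mid**2 * (f[1:] - f[:-1]) / dL
    f_new = f.copy()
    # divergence of the flux, scaled by L^2, updates interior cells only
    f_new[1:-1] = f[1:-1] + dt * L[1:-1]**2 * (flux[2:-1] - flux[1:-2]) / dL
    return f_new
```

Repeated steps spread a localized enhancement in L, redistributing electrons in the way the abstract attributes to radial diffusion; a constant profile is a steady state of the operator.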
Time domain simulations of preliminary breakdown pulses in natural lightning.
Carlson, B E; Liang, C; Bitzer, P; Christian, H
2015-06-16
Lightning discharge is a complicated process with relevant physical scales spanning many orders of magnitude. In an effort to understand the electrodynamics of lightning and connect physical properties of the channel to observed behavior, we construct a simulation of charge and current flow on a narrow conducting channel embedded in three-dimensional space with the time domain electric field integral equation, the method of moments, and the thin-wire approximation. The method includes approximate treatment of resistance evolution due to lightning channel heating and the corona sheath of charge surrounding the lightning channel. Focusing our attention on preliminary breakdown in natural lightning by simulating stepwise channel extension with a simplified geometry, our simulation reproduces the broad features observed in data collected with the Huntsville Alabama Marx Meter Array. Some deviations in pulse shape details are evident, suggesting future work focusing on the detailed properties of the stepping mechanism. Key Points: preliminary breakdown pulses can be reproduced by simulated channel extension; channel heating and corona sheath formation are crucial to proper pulse shape; extension processes and channel orientation significantly affect observations.
Quantitative Simulation of QARBM Challenge Events During Radiation Belt Enhancements
NASA Astrophysics Data System (ADS)
Li, W.; Ma, Q.; Thorne, R. M.; Bortnik, J.; Chu, X.
2017-12-01
Various physical processes are known to affect energetic electron dynamics in the Earth's radiation belts, but their quantitative effects at different times and locations in space need further investigation. This presentation focuses on discussing the quantitative roles of various physical processes that affect Earth's radiation belt electron dynamics during radiation belt enhancement challenge events (storm-time vs. non-storm-time) selected by the GEM Quantitative Assessment of Radiation Belt Modeling (QARBM) focus group. We construct realistic global distributions of whistler-mode chorus waves, adopt various versions of radial diffusion models (statistical and event-specific), and use the global evolution of other potentially important plasma waves, including plasmaspheric hiss, magnetosonic waves, and electromagnetic ion cyclotron waves, from all available multi-satellite measurements. These state-of-the-art wave properties and distributions on a global scale are used to calculate diffusion coefficients, which are then adopted as inputs to simulate the dynamical electron evolution using a 3D diffusion simulation during the storm-time and non-storm-time acceleration events, respectively. We explore the similarities and differences in the dominant physical processes that cause radiation belt electron dynamics during the storm-time and non-storm-time acceleration events. The quantitative role of each physical process is determined by comparing against the Van Allen Probes electron observations at different energies, pitch angles, and L-MLT regions. This quantitative comparison further indicates instances when quasilinear theory is sufficient to explain the observed electron dynamics and when nonlinear interaction is required to reproduce the energetic electron evolution observed by the Van Allen Probes.
Simulation of process identification and controller tuning for flow control system
NASA Astrophysics Data System (ADS)
Chew, I. M.; Wong, F.; Bono, A.; Wong, K. I.
2017-06-01
The PID controller is undeniably the most popular method used in controlling various industrial processes. The ability to tune the three elements of PID allows the controller to deal with the specific needs of industrial processes. This paper discusses the three elements of control action and improving the robustness of controllers by combining these control actions in various forms. A plant model is simulated using the Process Control Simulator in order to evaluate controller performance. At first, the open-loop response of the plant is studied by applying a step input and collecting the output data from the plant. Then, a FOPDT model of the physical plant is formed using both Matlab-Simulink and the process reaction curve (PRC) method. Next, the controller settings are calculated to find the values of Kc and τi that give satisfactory control in the closed-loop system. The performance of the closed-loop system is then analyzed through set-point tracking and disturbance rejection tests. To optimize the overall physical system performance, a refined tuning of the PID (or detuning) is further conducted to ensure a consistent output of the closed-loop system in reaction to set-point changes and disturbances to the physical model. As a result, PB = 100 (%) and τi = 2.0 (s) are preferably chosen for set-point tracking, while PB = 100 (%) and τi = 2.5 (s) are selected for rejecting the imposed disturbance. In a nutshell, the selection of tuning values likewise depends on the required control objective for the stability performance of the overall physical model.
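The closed-loop evaluation described above can be reproduced in miniature: simulate a first-order-plus-dead-time (FOPDT) plant under an ideal PI controller and inspect the set-point step response. The plant and tuning values below are illustrative assumptions, not the paper's Process Control Simulator model.

```python
import numpy as np

def simulate_pi_fopdt(kp, tau, theta, kc, tau_i, setpoint=1.0,
                      dt=0.05, t_end=30.0):
    """Closed-loop step response of a FOPDT process
        G(s) = Kp * exp(-theta*s) / (tau*s + 1)
    under an ideal PI controller u = Kc*(e + (1/tau_i) * integral(e)).
    Simple Euler integration; returns the output trajectory y(t)."""
    n = int(t_end / dt)
    delay = max(1, int(theta / dt))      # dead time in samples
    y = np.zeros(n)
    u_hist = np.zeros(n)
    integral = 0.0
    for k in range(1, n):
        e = setpoint - y[k - 1]
        integral += e * dt
        u_hist[k] = kc * (e + integral / tau_i)
        u_delayed = u_hist[k - delay] if k >= delay else 0.0
        # first-order process: tau * dy/dt = -y + Kp * u(t - theta)
        y[k] = y[k - 1] + dt * (-y[k - 1] + kp * u_delayed) / tau
    return y
```

With τi equal to the process time constant, the PI zero cancels the process pole and the integral action drives the steady-state offset to zero, which is why the paper's set-point tracking comparison centers on Kc and τi.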
Simplified Physics Based Models Research Topical Report on Task #2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mishra, Srikanta; Ganesh, Priya
We present a simplified-physics based approach, in which only the most important physical processes are modeled, to develop and validate simplified predictive models of CO2 sequestration in deep saline formations. The system of interest is a single vertical well injecting supercritical CO2 into a 2-D layered reservoir-caprock system with variable layer permeabilities. We use a set of well-designed full-physics compositional simulations to understand key processes and parameters affecting pressure propagation and buoyant plume migration. Based on these simulations, we have developed correlations for dimensionless injectivity as a function of the slope of the fractional-flow curve, the variance of layer permeability values, and the nature of the vertical permeability arrangement. The same variables, along with a modified gravity number, can be used to develop a correlation for the total storage efficiency within the CO2 plume footprint. Similar correlations are also developed to predict the average pressure within the injection reservoir and the pressure buildup within the caprock.
Faculty development through simulation-based education in physical therapist education.
Greenwood, Kristin Curry; Ewell, Sara B
2018-01-01
The use of simulation-based education (SBE) in health professions such as physical therapy requires faculty to expand their teaching practice and development. The impact of this teaching on the individual faculty member, and how their teaching process changes or develops, is not fully understood. The purpose of this study was to explore individual physical therapist faculty members' experiences with SBE and how those experiences may have transformed their teaching practice, answering the research questions: How do physical therapist faculty develop through incorporating SBE, and are there commonalities among educators? An interpretive phenomenological analysis approach was used with a small sample of subjects who participated in three individual semi-structured interviews. Interview questions were created through the lens of transformative learning theory to allow faculty transformations to be uncovered. A two-step thematic coding process was conducted across participants to identify commonalities of faculty experiences with SBE in physical therapist education. Credibility and trustworthiness were achieved through member checking and expert external review. Thematic findings were validated with transcript excerpts and research field notes. Eight physical therapist faculty members (25% male) with a range of 3 to 16 years of incorporating SBE shared their individual experiences. Four common themes related to faculty development were identified across the participants: faculty strengthen their professional identity as physical therapists, faculty are affected by their introduction and training with simulation, faculty develop their interprofessional education through SBE, and faculty experiences with SBE facilitate professional growth. Physical therapist educators had similarities in their experiences with SBE that transformed their teaching practice and professional development.
This study provides insight into what physical therapist faculty may experience when adopting SBE.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McMahon, S.
Radiation therapy for the treatment of cancer has been established as a highly precise and effective way to eradicate a localized region of diseased tissue. To achieve further significant gains in the therapeutic ratio, we need to move towards biologically optimized treatment planning. To achieve this goal, we need to understand how the radiation-type dependent patterns of induced energy depositions within the cell (physics) connect via molecular, cellular and tissue reactions to treatment outcome such as tumor control and undesirable effects on normal tissue. Several computational biology approaches have been developed connecting physics to biology. Monte Carlo simulations are the most accurate method to calculate physical dose distributions at the nanometer scale; however, simulations at the DNA scale are slow and repair processes are generally not simulated. Alternative models that rely on the random formation of individual DNA lesions within one or two turns of the DNA have been shown to reproduce the clusters of DNA lesions, including single strand breaks (SSBs) and double strand breaks (DSBs), without the need for detailed track structure simulations. Efficient computational simulations of initial DNA damage induction facilitate computational modeling of DNA repair and other molecular and cellular processes. Mechanistic, multiscale models provide a useful conceptual framework to test biological hypotheses and help connect fundamental information about track structure and dosimetry at the sub-cellular level to dose-response effects on larger scales. In this symposium we will learn about the current state of the art of computational approaches estimating radiation damage at the cellular and sub-cellular scale. How can understanding the physics interactions at the DNA level be used to predict biological outcome? 
We will discuss if and how such calculations are relevant to advancing our understanding of radiation damage and its repair, or whether the underlying biological processes are too complex for a mechanistic approach. Can computer simulations be used to guide future biological research? We will debate the feasibility of explaining biology from a physicist's perspective. Learning Objectives: (1) Understand the potential applications and limitations of computational methods for dose-response modeling at the molecular, cellular and tissue levels; (2) learn about the mechanisms of action underlying the induction, repair and biological processing of damage to DNA and other constituents; (3) understand how effects and processes at one biological scale impact biological processes and outcomes at other scales. Funding: J. Schuemann, NCI/NIH grants; S. McMahon, European Commission FP7 (grant EC FP7 MC-IOF-623630).
Experimental Validation of Various Temperature Models for Semi-Physical Tyre Model Approaches
NASA Astrophysics Data System (ADS)
Hackl, Andreas; Scherndl, Christoph; Hirschberg, Wolfgang; Lex, Cornelia
2017-10-01
With the increasing level of complexity and automation in automotive engineering, the simulation of safety-relevant Advanced Driver Assistance Systems (ADAS) leads to increasing accuracy demands in the description of tyre contact forces. In recent years, with improvements in tyre simulation, the need to cope with tyre temperatures and the resulting changes in tyre characteristics has risen significantly. Therefore, experimental validation of three different temperature model approaches is carried out, discussed and compared in the scope of this article. To evaluate the range of application of the presented approaches with respect to further implementation in semi-physical tyre models, the main focus lies on physical parameterisation. Aside from good modelling accuracy, attention is paid to computational time and the complexity of the parameterisation process. To evaluate this process and discuss the results, measurements of a Hoosier racing tyre 6.0/18.0-10 LCO C2000 from an industrial flat test bench are used. Finally, the simulation results are compared with the measurement data.
Mechanism study and numerical simulation of Uranium nitriding induced by high energy laser
NASA Astrophysics Data System (ADS)
Zhu, Yuan; Xu, Jingjing; Qi, Yanwen; Li, Shengpeng; Zhao, Hui
2018-06-01
Gradients of interfacial tension induced by local heating lead to Marangoni convection, which has a significant effect on surface formation and mass transport in the laser nitriding of uranium. Experimental observation of the underlying processes is very difficult. In the present study, the Marangoni convection was considered, and the computational fluid dynamics (CFD) analysis capability of the FLUENT program was used to determine physical processes such as heat transfer and mass transport. The progress of gas-liquid falling-film desorption was represented by combining a phase-change model with the volume of fluid (VOF) model. The time-dependent distribution of the temperature was derived. Moreover, the concentration and distribution of nitrogen across the laser spot were calculated. The simulation results matched the experimental data. The numerical resolution method provides a better approach for understanding the physical processes and dependencies of coating formation.
Software For Design Of Life-Support Systems
NASA Technical Reports Server (NTRS)
Rudokas, Mary R.; Cantwell, Elizabeth R.; Robinson, Peter I.; Shenk, Timothy W.
1991-01-01
Design Assistant Workstation (DAWN) computer program is prototype of expert software system for analysis and design of regenerative, physical/chemical life-support systems that revitalize air, reclaim water, produce food, and treat waste. Incorporates both conventional software for quantitative mathematical modeling of physical, chemical, and biological processes and expert system offering user stored knowledge about materials and processes. Constructs task tree as it leads user through simulated process, offers alternatives, and indicates where alternative not feasible. Also enables user to jump from one design level to another.
Compound simulator IR radiation characteristics test and calibration
NASA Astrophysics Data System (ADS)
Li, Yanhong; Zhang, Li; Li, Fan; Tian, Yi; Yang, Yang; Li, Zhuo; Shi, Rui
2015-10-01
Hardware-in-the-loop simulation can reproduce, in the testing room, the physical radiation of targets and interference and the interception phase of a product's flight. The simulation of the environment is particularly difficult for high radiation energies and complicated interference models. Here, developments in IR scene generation produced by a fiber array imaging transducer with circumferential lamp spot sources are introduced. The IR simulation capability includes effective simulation of aircraft signatures and point-source IR countermeasures. Two interference point sources can move in random two-dimensional directions. To simulate the interference-release process, the radiation and motion characteristics are tested. Through zero calibration of the simulator's optical axis, the radiation can be well projected onto the product detector. The test and calibration results show that the new compound simulator can be used in hardware-in-the-loop simulation trials.
Sensitivity of air quality simulation to smoke plume rise
Yongqiang Liu; Gary Achtemeier; Scott Goodrick
2008-01-01
Plume rise is the height smoke plumes can reach. This information is needed by air quality models such as the Community Multiscale Air Quality (CMAQ) model to simulate the physical and chemical processes of point-source fire emissions. This study seeks to understand the sensitivity of CMAQ air quality simulations of prescribed burning to plume rise. CMAQ...
Space-filling designs for computer experiments: A review
Joseph, V. Roshan
2016-01-29
Improving the quality of a product/process using a computer simulator is a much less expensive option than real physical testing. However, simulation using computationally intensive computer models can be time-consuming, and therefore directly performing the optimization on the computer simulator can be infeasible. Experimental design and statistical modeling techniques can be used to overcome this problem. This article reviews experimental designs known as space-filling designs that are suitable for computer simulations. In the review, special emphasis is given to a recently developed space-filling design called the maximum projection design. Furthermore, its advantages are illustrated using a simulation conducted for optimizing a milling process.
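A space-filling design of the kind reviewed here can be sketched by generating random Latin hypercubes and keeping the one with the best maximin interpoint distance. This is a simple stand-in criterion, not the maximum projection criterion the review emphasizes; function names are illustrative.

```python
import numpy as np

def latin_hypercube(n, d, rng):
    """One random Latin hypercube design: n points in [0, 1)^d.

    Each column places one point in each of n equal strata, jittered."""
    cells = np.array([rng.permutation(n) for _ in range(d)]).T
    return (cells + rng.random((n, d))) / n

def maximin_lhs(n, d, n_restarts=50, seed=0):
    """Pick, over random restarts, the Latin hypercube with the largest
    minimum pairwise distance (a basic space-filling criterion)."""
    rng = np.random.default_rng(seed)
    best, best_score = None, -np.inf
    for _ in range(n_restarts):
        x = latin_hypercube(n, d, rng)
        dists = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
        np.fill_diagonal(dists, np.inf)       # ignore self-distances
        score = dists.min()                   # worst-case point separation
        if score > best_score:
            best, best_score = x, score
    return best
```

The Latin hypercube structure guarantees good one-dimensional projections, while the maximin selection spreads the points in the full space; refined criteria such as maximum projection extend this idea to all subspace projections at once.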
A New Numerical Simulation technology of Multistage Fracturing in Horizontal Well
NASA Astrophysics Data System (ADS)
Cheng, Ning; Kang, Kaifeng; Li, Jianming; Liu, Tao; Ding, Kun
2017-11-01
Horizontal multi-stage fracturing is recognized as an effective development technology for unconventional oil resources. Geomechanics occupies a very important position in the numerical simulation of hydraulic fracturing; by accounting for its influence, the new approach improves on conventional numerical simulation technology. The new numerical simulation of hydraulic fracturing can more effectively optimize fracturing design and evaluate post-fracturing production. This study is based on a three-dimensional stress and rock-physics parameter model and uses the latest fluid-solid coupling numerical simulation technology to capture the fracture extension process, describe the change of the stress field during fracturing, and finally predict production.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haxton, Wick
2012-03-07
This project was focused on simulations of core-collapse supernovae on parallel platforms. The intent was to address a number of linked issues: the treatment of hydrodynamics and neutrino diffusion in two and three dimensions; the treatment of the underlying nuclear microphysics that governs neutrino transport and neutrino energy deposition; the understanding of the associated nucleosynthesis, including the r-process and neutrino process; the investigation of the consequences of new neutrino phenomena, such as oscillations; and the characterization of the neutrino signal that might be recorded in terrestrial detectors. This was a collaborative effort with Oak Ridge National Laboratory, State University of New York at Stony Brook, University of Illinois at Urbana-Champaign, University of California at San Diego, University of Tennessee at Knoxville, Florida Atlantic University, North Carolina State University, and Clemson. The collaborations tie together experts in hydrodynamics, nuclear physics, computer science, and neutrino physics. The University of Washington contributions to this effort include the further development of techniques to solve the Bloch-Horowitz equation for effective interactions and operators; collaborative efforts on developing a parallel Lanczos code; investigating the nuclear and neutrino physics governing the r-process and neutrino physics; and exploring the effects of new neutrino physics on the explosion mechanism, nucleosynthesis, and terrestrial supernova neutrino detection.
Synthetic Earthquake Statistics From Physical Fault Models for the Lower Rhine Embayment
NASA Astrophysics Data System (ADS)
Brietzke, G. B.; Hainzl, S.; Zöller, G.
2012-04-01
As of today, seismic risk and hazard estimates mostly use purely empirical, stochastic models of earthquake fault systems, tuned specifically to the vulnerable areas of interest. Although such models allow for reasonable risk estimates, they fail to provide a link between the observed seismicity and the underlying physical processes. Solving a state-of-the-art, fully dynamic description of all relevant physical processes related to earthquake fault systems is likely not useful, since it comes with a large number of degrees of freedom, poor constraints on its model parameters, and a huge computational effort. Here, quasi-static and quasi-dynamic physical fault simulators provide a compromise between physical completeness and computational affordability, and aim at providing a link between basic physical concepts and the statistics of seismicity. Within the framework of quasi-static and quasi-dynamic earthquake simulators, we investigate a model of the Lower Rhine Embayment (LRE) that is based upon seismological and geological data. We present and discuss statistics of the spatio-temporal behavior of the generated synthetic earthquake catalogs with respect to simplification (e.g. simple two-fault cases) as well as complication (e.g. hidden faults, geometric complexity, heterogeneities of constitutive parameters).
A domain-decomposed multi-model plasma simulation of collisionless magnetic reconnection
NASA Astrophysics Data System (ADS)
Datta, I. A. M.; Shumlak, U.; Ho, A.; Miller, S. T.
2017-10-01
Collisionless magnetic reconnection is a process relevant to many areas of plasma physics in which energy stored in magnetic fields within highly conductive plasmas is rapidly converted into kinetic and thermal energy. Both in natural phenomena such as solar flares and terrestrial aurora as well as in magnetic confinement fusion experiments, the reconnection process is observed on timescales much shorter than those predicted by a resistive MHD model. As a result, this topic is an active area of research in which plasma models with varying fidelity have been tested in order to understand the proper physics explaining the reconnection process. In this research, a hybrid multi-model simulation employing the Hall-MHD and two-fluid plasma models on a decomposed domain is used to study this problem. The simulation is set up using the WARPXM code developed at the University of Washington, which uses a discontinuous Galerkin Runge-Kutta finite element algorithm and implements boundary conditions between models in the domain to couple their variable sets. The goal of the current work is to determine the parameter regimes most appropriate for each model to maintain sufficient physical fidelity over the whole domain while minimizing computational expense. This work is supported by a Grant from US AFOSR.
Modelling the pelagic nitrogen cycle and vertical particle flux in the Norwegian sea
NASA Astrophysics Data System (ADS)
Haupt, Olaf J.; Wolf, Uli; v. Bodungen, Bodo
1999-02-01
A 1D Eulerian ecosystem model (BIological Ocean Model, BIOM) for the Norwegian Sea was developed to investigate the dynamics of pelagic ecosystems. The BIOM combines six biochemical compartments and simulates the annual nitrogen cycle with a specific focus on the production, modification and sedimentation of particles in the water column. The external forcing and physical framework are based on a simulated annual cycle of global radiation and an annual mixed-layer cycle derived from field data. The vertical resolution of the model is given by an exponential grid with 200 depth layers, allowing specific parameterization of the various sinking velocities, the breakdown of particles and the remineralization processes. The aim of the numerical experiments is to simulate ecosystem dynamics while considering the specific biogeochemical properties of the Norwegian Sea, for example the life cycle of the dominant copepod Calanus finmarchicus. The results of the simulations were validated with field data and are in good agreement for the lower trophic levels of the food web. With increasing complexity of the organisms, the differences between simulated processes and field data increase. The numerical simulations suggest that BIOM is well suited to investigating a physically controlled ecosystem. The simulation of grazing-controlled pelagic ecosystems, like the Norwegian Sea, requires adapting the parameterization to the specific ecosystem features. By seasonally adapting the most sensitive processes, such as the utilization of light by phytoplankton and grazing by zooplankton, the results were greatly improved.
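As an illustrative sketch of the kind of nitrogen-cycle bookkeeping such an ecosystem model performs, the following minimal nutrient-phytoplankton-zooplankton (NPZ) box model conserves total nitrogen by construction. This is not the BIOM parameterization; all rate constants are hypothetical placeholders.

```python
# Minimal nitrogen-based NPZ box model -- an illustrative sketch only,
# not the BIOM described above. All rate constants are hypothetical.

def npz_step(N, P, Z, dt=0.05, vmax=1.0, kN=0.5, g=0.4, kP=0.6,
             mortality=0.05, excretion=0.1):
    """Advance nutrient (N), phytoplankton (P), zooplankton (Z) one step."""
    uptake = vmax * N / (kN + N) * P          # Michaelis-Menten uptake
    grazing = g * P / (kP + P) * Z            # saturating grazing
    remin = mortality * P + excretion * Z     # remineralization back to N
    dN = -uptake + remin
    dP = uptake - grazing - mortality * P
    dZ = grazing - excretion * Z
    return N + dt * dN, P + dt * dP, Z + dt * dZ

N, P, Z = 4.0, 0.5, 0.2
total0 = N + P + Z
for _ in range(1000):
    N, P, Z = npz_step(N, P, Z)
# dN + dP + dZ = 0 at every step, so the three pools always sum to total0.
print(f"N={N:.3f}, P={P:.3f}, Z={Z:.3f}, total={N + P + Z:.3f}")
```

Because sources and sinks cancel pairwise, the nitrogen budget closes exactly; more elaborate models such as the one in the abstract add further compartments and vertical transport but keep the same conservation principle.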
Quantum information processing with superconducting circuits: a review.
Wendin, G
2017-10-01
During the last ten years, superconducting circuits have passed from being interesting physical devices to becoming contenders for near-future useful and scalable quantum information processing (QIP). Advanced quantum simulation experiments have been shown with up to nine qubits, while a demonstration of quantum supremacy with fifty qubits is anticipated in just a few years. Quantum supremacy means that the quantum system can no longer be simulated by the most powerful classical supercomputers. Integrated classical-quantum computing systems are already emerging that can be used for software development and experimentation, even via web interfaces. Therefore, the time is ripe for describing some of the recent development of superconducting devices, systems and applications. As such, the discussion of superconducting qubits and circuits is limited to devices that are proven useful for current or near future applications. Consequently, the centre of interest is the practical applications of QIP, such as computation and simulation in Physics and Chemistry.
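The "quantum supremacy" threshold mentioned above can be made concrete with a back-of-the-envelope memory estimate: a classical state-vector simulation must store 2^n complex amplitudes for n qubits. The function below is a generic illustration, not tied to any particular simulator.

```python
# Why ~50 qubits defeats classical state-vector simulation: memory for
# 2**n complex amplitudes (complex128 = 16 bytes each).

def statevector_bytes(n_qubits, bytes_per_amplitude=16):
    """Memory required for a full state vector of n_qubits."""
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (9, 30, 50):
    gib = statevector_bytes(n) / 2**30
    print(f"{n} qubits -> {gib:,.1f} GiB")
```

At 30 qubits the state vector needs 16 GiB (workstation scale); at 50 qubits it needs 16 PiB, beyond any current supercomputer's memory, which is the intuition behind the supremacy claim.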
Benchmarking sheath subgrid boundary conditions for macroscopic-scale simulations
NASA Astrophysics Data System (ADS)
Jenkins, T. G.; Smithe, D. N.
2015-02-01
The formation of sheaths near metallic or dielectric-coated wall materials in contact with a plasma is ubiquitous, often giving rise to physical phenomena (sputtering, secondary electron emission, etc.) which influence plasma properties and dynamics both near and far from the material interface. In this paper, we use first-principles PIC simulations of such interfaces to formulate a subgrid sheath boundary condition which encapsulates fundamental aspects of the sheath behavior at the interface. Such a boundary condition, based on the capacitive behavior of the sheath, is shown to be useful in fluid simulations wherein sheath scale lengths are substantially smaller than scale lengths for other relevant physical processes (e.g. radiofrequency wavelengths), in that it enables kinetic processes associated with the presence of the sheath to be numerically modeled without explicit resolution of spatial and temporal sheath scales such as electron Debye length or plasma frequency.
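The capacitive sheath picture can be sketched numerically: a common zeroth-order estimate treats the sheath as a vacuum-gap capacitor a few Debye lengths thick. The thickness prefactor and the plasma parameters below are assumptions for illustration, not values from the paper.

```python
import math

# Zeroth-order sheath-capacitance estimate: sheath ~ a few Debye lengths
# of vacuum gap. The factor of 3 Debye lengths is an assumed placeholder.

EPS0 = 8.8541878128e-12   # vacuum permittivity [F/m]
QE = 1.602176634e-19      # elementary charge [C]

def debye_length(n_e, T_e_eV):
    """Electron Debye length [m] for density n_e [m^-3], temperature [eV]."""
    return math.sqrt(EPS0 * T_e_eV * QE / (n_e * QE**2))

def sheath_capacitance_per_area(n_e, T_e_eV, thickness_in_debye=3.0):
    """Capacitance per unit area [F/m^2] of a sheath a few Debye lengths thick."""
    return EPS0 / (thickness_in_debye * debye_length(n_e, T_e_eV))

lam = debye_length(1e18, 10.0)                 # illustrative edge-plasma values
c_a = sheath_capacitance_per_area(1e18, 10.0)
print(f"Debye length: {lam:.3e} m, sheath C/A: {c_a:.3e} F/m^2")
```

The point of the subgrid approach is exactly that such micron-scale lengths need not be resolved on the fluid grid; only the lumped capacitive response enters the boundary condition.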
The distribution of density in supersonic turbulence
NASA Astrophysics Data System (ADS)
Squire, Jonathan; Hopkins, Philip F.
2017-11-01
We propose a model for the statistics of the mass density in supersonic turbulence, which plays a crucial role in star formation and the physics of the interstellar medium (ISM). The model is derived by considering the density to be arranged as a collection of strong shocks of width ∼ M^{-2}, where M is the turbulent Mach number. With two physically motivated parameters, the model predicts all density statistics for M>1 turbulence: the density probability distribution and its intermittency (deviation from lognormality), the density variance-Mach number relation, power spectra and structure functions. For the proposed model parameters, reasonable agreement is seen between model predictions and numerical simulations, albeit within the large uncertainties associated with current simulation results. More generally, the model could provide a useful framework for more detailed analysis of future simulations and observational data. Due to the simple physical motivations for the model in terms of shocks, it is straightforward to generalize to more complex physical processes, which will be helpful in future more detailed applications to the ISM. We see good qualitative agreement between such extensions and recent simulations of non-isothermal turbulence.
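The lognormal baseline that the proposed model's intermittency corrections deviate from is simple to write down: the log-density s = ln(ρ/ρ₀) is Gaussian with variance σ_s² = ln(1 + b²M²), where b is a forcing-dependent parameter (roughly 0.3 to 1). The sketch below uses this standard relation, not the paper's shock-based model.

```python
import math

# Standard lognormal density PDF for supersonic turbulence:
# s = ln(rho/rho0) is Gaussian with sigma_s^2 = ln(1 + b^2 M^2).
# b = 0.5 below is an assumed mixed-forcing value.

def sigma_s(mach, b=0.5):
    """Standard deviation of log-density at turbulent Mach number `mach`."""
    return math.sqrt(math.log(1.0 + (b * mach) ** 2))

def lognormal_pdf(s, mach, b=0.5):
    """Mass-conserving lognormal PDF of s = ln(rho/rho0)."""
    var = sigma_s(mach, b) ** 2
    s0 = -0.5 * var                    # mean shift so that <rho> = rho0
    return math.exp(-((s - s0) ** 2) / (2 * var)) / math.sqrt(2 * math.pi * var)

print(sigma_s(10.0))   # density spread grows with Mach number
```

Intermittent models such as the one in the abstract replace this Gaussian in s with a skewed distribution built from the shock-width scaling, while recovering the same variance-Mach relation in the appropriate limit.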
Geant4 simulations of a wide-angle x-ray focusing telescope
NASA Astrophysics Data System (ADS)
Zhao, Donghua; Zhang, Chen; Yuan, Weimin; Zhang, Shuangnan; Willingale, Richard; Ling, Zhixing
2017-06-01
The rapid development of X-ray astronomy has been made possible by widely deploying X-ray focusing telescopes on board many X-ray satellites. Geant4 is a very powerful toolkit for Monte Carlo simulations and has remarkable abilities to model complex geometrical configurations. However, the library of physical processes available in Geant4 lacks a description of the reflection of X-ray photons at a grazing incident angle which is the core physical process in the simulation of X-ray focusing telescopes. The scattering of low-energy charged particles from the mirror surfaces is another noteworthy process which is not yet incorporated into Geant4. Here we describe a Monte Carlo model of a simplified wide-angle X-ray focusing telescope adopting lobster-eye optics and a silicon detector using the Geant4 toolkit. With this model, we simulate the X-ray tracing, proton scattering and background detection. We find that: (1) the effective area obtained using Geant4 is in agreement with that obtained using Q software with an average difference of less than 3%; (2) X-rays are the dominant background source below 10 keV; (3) the sensitivity of the telescope is better by at least one order of magnitude than that of a coded mask telescope with the same physical dimensions; (4) the number of protons passing through the optics and reaching the detector by Firsov scattering is about 2.5 times that of multiple scattering for the lobster-eye telescope.
Lattice Boltzmann simulations of immiscible displacement process with large viscosity ratios
NASA Astrophysics Data System (ADS)
Rao, Parthib; Schaefer, Laura
2017-11-01
Immiscible displacement is a key physical mechanism involved in enhanced oil recovery and carbon sequestration processes. This multiphase flow phenomenon involves a complex interplay of viscous, capillary, inertial and wettability effects. The lattice Boltzmann (LB) method is an accurate and efficient technique for modeling and simulating multiphase/multicomponent flows, especially in complex flow configurations and media. In this presentation we report numerical simulation results for the displacement process in long, thin channels. The results are based on a new pseudo-potential multicomponent LB model with a multiple-relaxation-time (MRT) collision model and an explicit forcing scheme. We demonstrate that the proposed model is capable of accurately simulating displacement processes involving fluids with a wide range of viscosity ratios (>100), which also leads to viscosity-independent interfacial tension and the reduction of some important numerical artifacts.
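Two dimensionless groups conventionally characterize the displacement regime discussed here: the viscosity ratio M of defending to invading fluid, and the capillary number Ca comparing viscous to interfacial forces. The numbers below are illustrative, not from the presentation.

```python
# Dimensionless groups governing immiscible displacement. Values are
# illustrative placeholders (e.g. a light oil displaced by water).

def viscosity_ratio(mu_defending, mu_invading):
    """M = defending-fluid viscosity / invading-fluid viscosity."""
    return mu_defending / mu_invading

def capillary_number(mu_invading, velocity, interfacial_tension):
    """Ca = viscous forces / interfacial-tension forces."""
    return mu_invading * velocity / interfacial_tension

M = viscosity_ratio(mu_defending=0.1, mu_invading=1e-3)   # Pa s / Pa s
Ca = capillary_number(1e-3, 1e-4, 0.03)                    # Pa s, m/s, N/m
print(M, Ca)
```

M = 100 here corresponds to the high-ratio regime that the abstract says the MRT pseudo-potential model can now reach stably; low Ca with high M is the classically difficult capillary-fingering corner of the phase diagram.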
Progress towards computer simulation of NiH2 battery performance over life
NASA Technical Reports Server (NTRS)
Zimmerman, Albert H.; Quinzio, M. V.
1995-01-01
The long-term performance of rechargeable battery cells has traditionally been verified through life-testing, a procedure that generally requires significant commitments of funding and test resources. In the case of nickel hydrogen battery cells, which have the capability of providing extremely long cycle life, the time and cost required to conduct even accelerated testing have become a serious impediment to transitioning technology improvements into spacecraft applications. The utilization of computer simulations to indicate the changes in performance to be expected in response to design or operating changes in nickel hydrogen cells is therefore a particularly attractive tool in advanced battery development, as well as for verifying performance in different applications. Computer-based simulations of the long-term performance of rechargeable battery cells have typically had very limited success in the past. There are a number of reasons for the lack of progress in this area. First, and probably most important, all battery cells are relatively complex electrochemical systems, in which performance is dictated by a large number of interacting physical and chemical processes. While the complexity alone is a significant part of the problem, in many instances the fundamental chemical and physical processes underlying long-term degradation and its effects on performance have not even been understood. Second, while specific chemical and physical changes within cell components have been associated with degradation, there has been no generalized simulation architecture that enables the chemical and physical structure (and changes therein) to be translated into cell performance. For the nickel hydrogen battery cell, our knowledge of the underlying reactions that control the performance of this cell has progressed to where it clearly is possible to model them.
The recent development of a relatively general cell-modelling approach provides the framework for translating the chemical and physical structure of the components inside a cell into its performance characteristics over its entire cycle life. This report describes our approach to this task in terms of defining those processes deemed critical in controlling performance over life, and the model architecture required to translate the fundamental cell processes into performance profiles.
A Lunar Surface Operations Simulator
NASA Technical Reports Server (NTRS)
Nayar, H.; Balaram, J.; Cameron, J.; Jain, A.; Lim, C.; Mukherjee, R.; Peters, S.; Pomerantz, M.; Reder, L.; Shakkottai, P.;
2008-01-01
The Lunar Surface Operations Simulator (LSOS) is being developed to support planning and design of space missions to return astronauts to the moon. Vehicles, habitats, dynamic and physical processes and related environment systems are modeled and simulated in LSOS to assist in the visualization and design optimization of systems for lunar surface operations. A parametric analysis tool and a data browser were also implemented to provide an intuitive interface to run multiple simulations and review their results. The simulator and parametric analysis capability are described in this paper.
Physical Processes in the MAGO/MTF Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garanin, Sergey F; Reinovsky, Robert E.
2015-03-23
The monograph is devoted to a theoretical discussion of the physical effects that are most significant for an alternative approach to the problem of controlled thermonuclear fusion (CTF): the MAGO/MTF approach. The book includes a description of the approach and its differences from the major CTF systems, magnetic confinement and inertial confinement. General physical methods for simulating the processes in this approach are considered, including plasma transport phenomena and radiation, the theory of transverse collisionless shock waves, and the theory of surface discharges, which is important for this kind of research. Different flows and magnetohydrodynamic plasma instabilities occurring within this approach are also considered. By virtue of the general physical nature of the phenomena considered, the presented results are applicable to a wide range of plasma physics and hydrodynamics processes. The book is intended for plasma physics and hydrodynamics specialists, post-graduate students, and senior physics students.
Black hole feeding and feedback: the physics inside the `sub-grid'
NASA Astrophysics Data System (ADS)
Negri, A.; Volonteri, M.
2017-05-01
Black holes (BHs) are believed to be a key ingredient of galaxy formation. However, the galaxy-BH interplay is challenging to study due to the large dynamical range and complex physics involved. As a consequence, hydrodynamical cosmological simulations normally adopt sub-grid models to track the unresolved physical processes, in particular BH accretion; usually the spatial scale where the BH dominates the hydrodynamical processes (the Bondi radius) is unresolved, and an approximate Bondi-Hoyle accretion rate is used to estimate the growth of the BH. By comparing hydrodynamical simulations at different resolutions (300, 30, 3 pc) using a Bondi-Hoyle approximation to sub-parsec runs with non-parametrized accretion, our aim is to probe how well an approximated Bondi accretion is able to capture the BH accretion physics and the subsequent feedback on the galaxy. We analyse an isolated galaxy simulation that includes cooling, star formation, Type Ia and Type II supernovae, BH accretion and active galactic nuclei feedback (radiation pressure, Compton heating/cooling) where mass, momentum and energy are deposited in the interstellar medium through conical winds. We find that on average the approximated Bondi formalism can lead to both over- and underestimations of the BH growth, depending on resolution and on how the variables entering into the Bondi-Hoyle formalism are calculated.
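The Bondi-Hoyle(-Lyttleton) estimate at the heart of the sub-grid comparison above has a closed form: Ṁ = 4πG²M_BH²ρ / (c_s² + v²)^{3/2}. The sketch below evaluates it for illustrative ambient-gas values, not parameters from the paper.

```python
import math

# Bondi-Hoyle accretion-rate estimate used as a sub-grid prescription in
# galaxy simulations. Input values below are illustrative placeholders.

G = 6.674e-11          # gravitational constant [m^3 kg^-1 s^-2]
MSUN = 1.989e30        # solar mass [kg]

def bondi_hoyle_rate(m_bh, rho, c_s, v_rel):
    """Accretion rate [kg/s] for BH mass m_bh [kg], gas density rho
    [kg/m^3], sound speed c_s and relative velocity v_rel [m/s]."""
    return 4 * math.pi * G**2 * m_bh**2 * rho / (c_s**2 + v_rel**2) ** 1.5

mdot = bondi_hoyle_rate(1e7 * MSUN, 1e-22, 1e4, 0.0)
rate_msun_yr = mdot * 3.15e7 / MSUN
print(f"{rate_msun_yr:.3f} Msun/yr")
```

The abstract's point is that in cosmological runs ρ and c_s entering this formula are averaged over unresolved scales, so the same expression can over- or under-estimate growth depending on resolution and on how those variables are sampled.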
Radiatively driven stratosphere-troposphere interactions near the tops of tropical cloud clusters
NASA Technical Reports Server (NTRS)
Churchill, Dean D.; Houze, Robert A., Jr.
1990-01-01
Results are presented of two numerical simulations of the mechanism involved in the dehydration of air, using the model of Churchill (1988) and Churchill and Houze (1990) which combines the water and ice physics parameterizations and IR and solar-radiation parameterization with a convective adjustment scheme in a kinematic nondynamic framework. One simulation, a cirrus cloud simulation, was to test the Danielsen (1982) hypothesis of a dehydration mechanism for the stratosphere; the other was to simulate the mesoscale updraft in order to test an alternative mechanism for 'freeze-drying' the air. The results show that the physical processes simulated in the mesoscale updraft differ from those in the thin-cirrus simulation. While in the thin-cirrus case, eddy fluxes occur in response to IR radiative destabilization, and, hence, no net transfer occurs between troposphere and stratosphere, the mesoscale updraft case has net upward mass transport into the lower stratosphere.
NASA Astrophysics Data System (ADS)
Tierz, Pablo; Sandri, Laura; Ramona Stefanescu, Elena; Patra, Abani; Marzocchi, Warner; Costa, Antonio; Sulpizio, Roberto
2014-05-01
Explosive volcanoes and, especially, Pyroclastic Density Currents (PDCs) pose an enormous threat to populations living in the surroundings of volcanic areas. Difficulties in the modeling of PDCs are related to (i) the very complex and stochastic physical processes intrinsic to their occurrence, and (ii) a lack of knowledge about how these processes actually form and evolve. This means that deep uncertainties (aleatory in nature due to point (i), and epistemic due to point (ii)) are associated with the study and forecasting of PDCs. Consequently, the assessment of their hazard is better described in terms of probabilistic approaches rather than deterministic ones. In practice, probabilistic hazard from PDCs is assessed by coupling deterministic simulators with statistical techniques that can, eventually, supply probabilities and inform about the uncertainties involved. In this work, some examples of both PDC numerical simulators (Energy Cone and TITAN2D) and uncertainty quantification techniques (Monte Carlo sampling -MC-, Polynomial Chaos Quadrature -PCQ- and Bayesian Linear Emulation -BLE-) are presented, and their advantages, limitations and future potential are underlined. The key point in choosing a specific method hinges on the balance between its computational cost, the physical reliability of the simulator, and the target of the hazard analysis (type of PDCs considered, time-scale selected for the analysis, particular guidelines received from decision-making agencies, etc.). Although current numerical and statistical techniques have brought important advances in probabilistic volcanic hazard assessment for PDCs, some of them may be further applicable to more sophisticated simulators. 
In addition, forthcoming improvements could be focused on three main multidisciplinary directions: 1) Validate the simulators frequently used (through comparison with PDC deposits and other simulators), 2) Decrease simulator runtimes (whether by increasing the knowledge about the physical processes or by doing more efficient programming, parallelization, ...) and 3) Improve uncertainty quantification techniques.
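The coupling of a cheap deterministic simulator with Monte Carlo sampling can be sketched with a toy energy-cone model: the runout distance is L = H / tan(φ) for collapse height H and cone slope φ, and sampling these inputs yields an exceedance probability. The sampling ranges below are hypothetical, not calibrated to any volcano.

```python
import math
import random

# Toy Monte Carlo hazard estimate over an "energy cone"-style PDC model.
# Uniform priors on H [m] and phi [deg] are assumed placeholders.

def runout(height_m, slope_deg):
    """Horizontal runout distance [m] of the energy cone."""
    return height_m / math.tan(math.radians(slope_deg))

def prob_reach(distance_m, n=100_000, seed=1):
    """P(runout >= distance_m) under the assumed priors."""
    rng = random.Random(seed)
    hits = sum(
        runout(rng.uniform(500.0, 3000.0), rng.uniform(8.0, 20.0)) >= distance_m
        for _ in range(n)
    )
    return hits / n

print(prob_reach(5_000.0))
```

Plain MC like this is feasible only because each energy-cone evaluation is trivially cheap; for expensive simulators such as TITAN2D, that is exactly why the abstract turns to quadrature and emulation techniques such as PCQ and BLE.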
NASA Astrophysics Data System (ADS)
Zhang, Qi; Chang, Ming; Zhou, Shengzhen; Chen, Weihua; Wang, Xuemei; Liao, Wenhui; Dai, Jianing; Wu, ZhiYong
2017-11-01
There has been a rapid growth of reactive nitrogen (Nr) deposition over the world in the past decades, and the Pearl River Delta region is one of the areas with a high loading of nitrogen deposition. However, there are still large uncertainties in the study of dry deposition because of the complex physical, chemical, and plant-physiological processes involved. At present, the forest canopy parameterization scheme used in the WRF-Chem model is a single-layer "big leaf" model, and its simulation of radiation transmission and energy balance in the forest canopy is neither detailed nor accurate. The Noah-MP land surface model is based on the Noah land surface model (Noah LSM) and has multiple parameterization options to simulate the energy, momentum, and material interactions of the vegetation-soil-atmosphere system. Therefore, to investigate whether coupling with Noah-MP improves WRF-Chem simulations of nitrogen deposition in forest areas, and to reduce the influence of meteorological simulation biases on the simulated dry deposition velocity, a single-point dry deposition model coupling Noah-MP with the WRF-Chem dry deposition module (WDDM) was used to simulate the deposition velocity (Vd). The model was driven by micro-meteorological observations from the Dinghushan Forest Ecosystem Location Station, and a series of numerical experiments was carried out to identify the key processes influencing the calculation of dry deposition velocity and to examine the effects of various surface physical and plant physiological processes on dry deposition. The model captured the observed Vd well but still underestimated it. The inherent shortcomings of the Wesely scheme applied in WDDM, together with inaccuracies in WDDM's built-in parameters and in the input data for Noah-MP (e.g. LAI), were the key factors causing the underestimation of Vd. Future work is therefore needed to improve the model mechanisms and parameterization.
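The Wesely scheme referred to above rests on a resistance-in-series analogy: Vd = 1 / (Ra + Rb + Rc), with aerodynamic, quasi-laminar sublayer, and canopy/surface resistances. The sketch below shows only this skeleton with invented resistance values; the full scheme computes each resistance from meteorology and land-use tables.

```python
# Resistance-in-series skeleton of a Wesely-type dry-deposition scheme.
# Resistance values [s/m] are illustrative placeholders.

def deposition_velocity(r_a, r_b, r_c):
    """Dry deposition velocity [m/s] from aerodynamic (Ra), quasi-laminar
    sublayer (Rb) and canopy/surface (Rc) resistances [s/m]."""
    return 1.0 / (r_a + r_b + r_c)

vd = deposition_velocity(r_a=30.0, r_b=10.0, r_c=60.0)
print(f"Vd = {vd * 100:.2f} cm/s")   # -> Vd = 1.00 cm/s
```

Because the resistances add, the largest one dominates Vd; the abstract's finding that the canopy treatment is the weak link corresponds to Rc being both dominant and hard to parameterize for a single-layer "big leaf" canopy.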
Remote Sensing Image Quality Assessment Experiment with Post-Processing
NASA Astrophysics Data System (ADS)
Jiang, W.; Chen, S.; Wang, X.; Huang, Q.; Shi, H.; Man, Y.
2018-04-01
This paper briefly describes a post-processing influence assessment experiment comprising three steps: physical simulation, image processing, and image quality assessment. The physical simulation models a sampled imaging system in the laboratory; the imaging-system parameters are measured, and the digital images serving as image-processing input are produced by this imaging system with those same parameters. The gathered optically sampled images are then subjected to three digital image processes: calibration pre-processing, lossy compression at different compression ratios, and image post-processing with different kernels. The image quality assessment method used is just-noticeable-difference (JND) subjective assessment based on ISO 20462; through subjective assessment of the gathered and processed images, the influence of different imaging parameters and of post-processing on image quality can be determined. The six JND subjective-assessment data sets can be validated against each other. The main conclusions are: image post-processing can improve image quality; it can do so even with lossy compression, although image quality improves less at higher compression ratios than at lower ones; and with our post-processing method, image quality is better when the camera MTF lies within a small range.
ERIC Educational Resources Information Center
Fan, Xinxin; Geelan, David; Gillies, Robyn
2018-01-01
This study investigated the effectiveness of a novel inquiry-based instructional sequence using interactive simulations for supporting students' development of conceptual understanding, inquiry process skills and confidence in learning. The study, conducted in Beijing, involved two teachers and 117 students in four classes. The teachers…
NASA Astrophysics Data System (ADS)
Chen, Zhi; Ruan, Shaohong; Swaminathan, Nedunchezhian
2016-07-01
Three-dimensional (3D) unsteady Reynolds-averaged Navier-Stokes simulations of a spark-ignited turbulent methane/air jet flame evolving from ignition to stabilisation are conducted for different jet velocities. A partially premixed combustion model is used involving a correlated joint probability density function and both premixed and non-premixed combustion mode contributions. The 3D simulation results for the temporal evolution of the flame's leading edge are compared with previous two-dimensional (2D) results and experimental data. The comparison shows that the final stabilised flame lift-off height is well predicted by both 2D and 3D computations. However, the transient evolution of the flame's leading edge computed from the 3D simulation agrees reasonably well with experiment, whereas evident discrepancies were found in the previous 2D study. This difference suggests that the third physical dimension plays an important role during the flame's transient evolution. The components of the flame brush's leading-edge displacement speed resulting from reaction, normal diffusion and tangential diffusion are studied at typical stages after ignition in order to further understand the effect of the third physical dimension. Substantial differences are found in the reaction and normal diffusion components between the 2D and 3D simulations, especially in the initial propagation stage. The evolution of reaction progress variable gradients and their interaction with the flow and mixing field in 3D physical space have an important effect on the flame's leading edge propagation.
NASA Astrophysics Data System (ADS)
Hockicko, Peter; Krišťák, Ľuboš; Němec, Miroslav
2015-03-01
Video analysis, using the program Tracker (Open Source Physics), in the educational process introduces a new creative method of teaching physics and makes natural sciences more interesting for students. This way of exploring the laws of nature can amaze students because this illustrative and interactive educational software inspires them to think creatively, improves their performance and helps them in studying physics. This paper deals with increasing the key competencies in engineering by analysing real-life situation videos - physical problems - by means of video analysis and the modelling tools using the program Tracker and simulations of physical phenomena from The Physics Education Technology (PhET™) Project (VAS method of problem tasks). The statistical testing using the t-test confirmed the significance of the differences in the knowledge of the experimental and control groups, which were the result of interactive method application.
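The core of the video-analysis workflow described above can be illustrated in a few lines: fit y(t) = y0 + v0·t − g·t²/2 to (t, y) points of the kind Tracker exports, and read g off the quadratic coefficient. The data below are synthetic; a real session would load Tracker's exported table.

```python
import numpy as np

# Least-squares fit of free-fall kinematics to synthetic "tracked" data,
# illustrating the Tracker workflow. The noise level is an assumption.

t = np.linspace(0.0, 1.0, 21)                       # frame times [s]
y_true = 1.2 + 3.0 * t - 0.5 * 9.81 * t**2          # true trajectory [m]
rng = np.random.default_rng(0)
y = y_true + rng.normal(0.0, 0.001, t.size)         # simulated tracking noise

# Quadratic fit: y = a t^2 + b t + c, so g = -2a.
a, b, c = np.polyfit(t, y, 2)
g_est = -2.0 * a
print(f"g ≈ {g_est:.2f} m/s^2")
```

Having students recover g (and its sensitivity to tracking noise) from their own videos is precisely the kind of exercise the VAS method builds on.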
NASA Astrophysics Data System (ADS)
Yamamoto, H.; Nakajima, K.; Zhang, K.; Nanai, S.
2015-12-01
Powerful numerical codes capable of modeling complex coupled physical and chemical processes have been developed for predicting the fate of CO2 in reservoirs, as well as its potential impacts on groundwater and subsurface environments. However, they are often computationally demanding when solving highly non-linear models at sufficient spatial and temporal resolution. Geological heterogeneity and uncertainty further increase the challenges of modeling. Two-phase flow simulations in heterogeneous media usually require much longer computational times than those in homogeneous media, and uncertainties in reservoir properties may necessitate stochastic simulations with multiple realizations. Recently, massively parallel supercomputers with more than thousands of processors have become available to the scientific and engineering communities. Such supercomputers may attract attention from geoscientists and reservoir engineers for solving large, non-linear models at higher resolution within a reasonable time. However, to make them a useful tool, it is essential to tackle several practical obstacles so that general-purpose reservoir simulators can utilize large numbers of processors effectively. We have implemented massively parallel versions of two TOUGH2-family codes (the multi-phase flow simulator TOUGH2 and the chemically reactive transport simulator TOUGHREACT) on two different types (vector and scalar) of supercomputers with a thousand to tens of thousands of processors. After completing the implementation and extensive tune-up on the supercomputers, the computational performance was measured for three simulations with multi-million-cell grid models, including a simulation of the dissolution-diffusion-convection process, which requires high spatial and temporal resolution to simulate the growth of small convective fingers of CO2-dissolved water into larger ones at reservoir scale. 
The performance measurement confirmed that both simulators exhibit excellent scalability, showing almost linear speedup with the number of processors up to over ten thousand cores. This generally allows us to perform coupled multi-physics (THC) simulations on high-resolution geological models with multi-million-cell grids in a practical time (e.g., less than a second per time step).
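The strong-scaling claim above reduces to simple bookkeeping: relative speedup versus a reference run, scaled so that ideal (linear) scaling equals the core count. The wall times below are invented illustrative numbers, not the measurements from this work.

```python
# Strong-scaling bookkeeping: speedup relative to the smallest run.
# The cores -> wall-time [s] entries are hypothetical placeholders.

runs = {128: 820.0, 1024: 105.0, 10240: 11.5}

def rel_speedup(t_ref, t, ref_cores):
    """Speedup relative to the reference run, scaled so that ideal
    (linear) scaling would equal the core count."""
    return (t_ref / t) * ref_cores

for cores, t in runs.items():
    su = rel_speedup(runs[128], t, 128)
    print(f"{cores:>6} cores: speedup {su:8.0f}  (ideal {cores})")
```

Parallel efficiency is this speedup divided by the core count; values staying near 1 out to ten thousand cores is what "almost linear speedup" means quantitatively.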
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mirocha, Jeff D.; Simpson, Matthew D.; Fast, Jerome D.
Simulations of two periods featuring three consecutive low-level jet (LLJ) events in the US Upper Great Plains during the autumn of 2011 were conducted to explore the impacts of various setup configurations and physical process models on simulated flow parameters within the lowest 200 m above the surface, using the Weather Research and Forecasting (WRF) model. Sensitivities of simulated flow parameters to the horizontal and vertical grid spacing and to the planetary boundary layer (PBL) and land surface model (LSM) physics options were assessed. Data from a Light Detection and Ranging (lidar) system, deployed for the Weather Forecast Improvement Project (WFIP; Finley et al. 2013), were used to evaluate the accuracy of simulated wind speed and direction at 80 m above the surface, as well as their vertical distributions between 120 and 40 m, covering the typical span of contemporary tall wind turbines. All of the simulations qualitatively captured the overall diurnal cycle of wind speed and stratification, producing LLJs during each overnight period; however, large discrepancies from the observations occurred at certain times in each simulation. 54-member ensembles encompassing changes of the above configuration parameters displayed a wide range of simulated vertical distributions of wind speed, direction, and potential temperature, reflecting highly variable representations of stratification during the weakly stable overnight conditions. Root mean square error (RMSE) statistics show that different ensemble members performed better or worse for various simulated parameters at different times, with no clearly superior configuration. Simulations using a PBL parameterization designed specifically for the stable conditions investigated herein provided superior overall simulations of wind speed at 80 m, demonstrating the efficacy of targeting improvements of physical process models in areas of known deficiencies. 
However, the considerable magnitudes of the RMSE values of even the best performing simulations indicate ample opportunities for further improvements.
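The RMSE-based ranking of ensemble members described above can be sketched with a toy example; the wind-speed values below are hypothetical, not WFIP lidar data:

```python
import math

def rmse(predicted, observed):
    """Root mean square error between paired samples."""
    n = len(predicted)
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed)) / n)

# Hypothetical 80 m wind speeds (m/s): observations vs. two ensemble members.
obs = [8.0, 9.5, 11.0, 10.2]
member_a = [7.5, 9.0, 12.0, 10.0]
member_b = [8.2, 10.5, 10.0, 9.0]

scores = {"a": rmse(member_a, obs), "b": rmse(member_b, obs)}
best = min(scores, key=scores.get)  # member with the lowest RMSE
```

Ranking by a single scalar score like this is exactly why different members can "win" for different variables and times, as the abstract notes.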
NASA Astrophysics Data System (ADS)
Guidi, Giovanni; Scannapieco, Cecilia; Walcher, Jakob; Gallazzi, Anna
2016-10-01
We study the effects of applying observational techniques to derive the properties of simulated galaxies, with the aim of making an unbiased comparison between observations and simulations. For our study, we used 15 galaxies simulated in a cosmological context using three different feedback and chemical enrichment models, and compared their z = 0 properties with data from the Sloan Digital Sky Survey (SDSS). We show that the physical properties obtained directly from the simulations without post-processing can be very different from those obtained by mimicking observational techniques. In order to provide simulators with a way to reliably compare their galaxies with SDSS data, for each physical property that we studied - colours, magnitudes, gas and stellar metallicities, mean stellar ages and star formation rates - we give scaling relations that can be easily applied to the values extracted from the simulations; these scalings have in general a high correlation, except for the gas oxygen metallicities. Our simulated galaxies are photometrically similar to galaxies in the blue sequence/green valley, but in general they appear older, more passive and with lower metal content compared to most of the spirals in SDSS. As a careful assessment of the agreement/disagreement with observations is the primary test of the baryonic physics implemented in hydrodynamical codes, our study shows that considering the observational biases in the derivation of the galaxies' properties is of fundamental importance when deciding on the failure/success of a galaxy formation model.
Baseline process description for simulating plutonium oxide production for the PreCalc Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pike, J. A.
Savannah River National Laboratory (SRNL) started a multi-year project, the PreCalc Project, to develop a computational simulation of a plutonium oxide (PuO2) production facility with the objective of studying the fundamental relationships between morphological and physicochemical properties. This report provides a detailed baseline process description to be used by SRNL personnel and collaborators to facilitate the initial design and construction of the simulation. The PreCalc Project team selected the HB-Line Plutonium Finishing Facility as the basis for a nominal baseline process since the facility is operational and significant model validation data can be obtained. The process boundary, as well as process and facility design details necessary for multi-scale, multi-physics models, are provided.
Physical and Mathematical Questions on Signal Processing in Multibase Phase Direction Finders
NASA Astrophysics Data System (ADS)
Denisov, V. P.; Dubinin, D. V.; Meshcheryakov, A. A.
2018-02-01
Questions on improving the accuracy of multiple-base phase direction finders by rejecting anomalously large errors in the process of resolving the measurement ambiguities are considered. A physical basis is derived and calculated relationships characterizing the efficiency of the proposed solutions are obtained. Results of a computer simulation of a three-base direction finder are analyzed, along with field measurements of a three-base direction finder along near-ground paths.
Parameter Uncertainty on AGCM-simulated Tropical Cyclones
NASA Astrophysics Data System (ADS)
He, F.
2015-12-01
This work studies parameter uncertainty in tropical cyclone (TC) simulations in Atmospheric General Circulation Models (AGCMs) using the Reed-Jablonowski TC test case, implemented in the Community Atmosphere Model (CAM). It examines the impact of 24 parameters across the physical parameterization schemes that represent the convection, turbulence, precipitation and cloud processes in AGCMs. The one-at-a-time (OAT) sensitivity analysis method first quantifies their relative importance for TC simulations and identifies the key parameters for six different TC characteristics: intensity, precipitation, longwave cloud radiative forcing (LWCF), shortwave cloud radiative forcing (SWCF), cloud liquid water path (LWP) and ice water path (IWP). Then, 8 physical parameters are chosen and perturbed using the Latin-Hypercube Sampling (LHS) method. The comparison between the OAT ensemble run and the LHS ensemble run shows that the simulated TC intensity is mainly affected by the parcel fractional mass entrainment rate in the Zhang-McFarlane (ZM) deep convection scheme. The nonlinear interactive effect among different physical parameters is negligible for simulated TC intensity. In contrast, this nonlinear interactive effect plays a significant role in the other simulated tropical cyclone characteristics (precipitation, LWCF, SWCF, LWP and IWP) and greatly enlarges their simulated uncertainties. The statistical emulator Extended Multivariate Adaptive Regression Splines (EMARS) is applied to characterize the response functions for the nonlinear effect. Last, we find that the intensity uncertainty caused by physical parameters is comparable in degree to the uncertainty caused by model structure (e.g. grid) and initial conditions (e.g. sea surface temperature, atmospheric moisture). These findings suggest the importance of using the perturbed physics ensemble (PPE) method to revisit tropical cyclone prediction under climate change scenarios.
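The Latin-Hypercube Sampling step described above splits each parameter range into equal-probability strata and uses every stratum exactly once per dimension. A minimal stdlib-only sketch; the parameter names and ranges are illustrative, not the CAM values:

```python
import random

def latin_hypercube(n, bounds, seed=0):
    """Basic Latin hypercube sample: each dimension is split into n
    equal-probability strata, each used exactly once, with strata
    independently shuffled per dimension."""
    rng = random.Random(seed)
    pts = [[0.0] * len(bounds) for _ in range(n)]
    for d, (lo, hi) in enumerate(bounds):
        strata = list(range(n))
        rng.shuffle(strata)
        for i, s in enumerate(strata):
            u = (s + rng.random()) / n      # uniform draw within stratum s
            pts[i][d] = lo + u * (hi - lo)
    return pts

# Hypothetical ranges for two convection-scheme parameters
# (entrainment rate, evaporation efficiency).
design = latin_hypercube(8, [(5.0e-4, 2.0e-3), (0.1, 1.0)])
```

Compared with independent random sampling, this guarantees that the 8 ensemble members cover the full range of every perturbed parameter.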
NASA Astrophysics Data System (ADS)
Sun, Guodong; Mu, Mu
2016-04-01
An important source of uncertainty, which then causes further uncertainty in numerical simulations, resides in the parameters describing physical processes in numerical models. There are many physical parameters in numerical models in the atmospheric and oceanic sciences, and it would cost a great deal to reduce the uncertainties in all of them. Therefore, identifying a subset of relatively more sensitive and important parameters, and reducing the errors in that subset, would be a far more efficient way to reduce the uncertainties involved in simulations. In this context, we present a new approach based on the conditional nonlinear optimal perturbation related to parameter (CNOP-P) method. The approach provides a framework to ascertain the subset of relatively more sensitive and important parameters among the physical parameters. The Lund-Potsdam-Jena (LPJ) dynamical global vegetation model was utilized to test the validity of the new approach. The results imply that nonlinear interactions among parameters play a key role in the uncertainty of numerical simulations in arid and semi-arid regions of China compared to those in northern, northeastern and southern China. The uncertainties in the numerical simulations were reduced considerably by reducing the errors of the subset of relatively more sensitive and important parameters. The results demonstrate that our approach not only offers a new route to identify relatively more sensitive and important physical parameters but also that it is viable to then apply "target observations" to reduce the uncertainties in model parameters.
Simulation of beam-induced plasma in gas-filled rf cavities
Yu, Kwangmin; Samulyak, Roman; Yonehara, Katsuya; ...
2017-03-07
Processes occurring in a radio-frequency (rf) cavity, filled with high-pressure gas and interacting with proton beams, have been studied via advanced numerical simulations. The simulations support the experimental program on the hydrogen gas-filled rf cavity in the MuCool Test Area (MTA) at Fermilab, and broader research on the design of muon cooling devices. SPACE, a 3D electromagnetic particle-in-cell (EM-PIC) code with atomic physics support, was used in the simulation studies. Plasma dynamics in the rf cavity, including the process of neutral gas ionization by proton beams, plasma loading of the rf cavity, and atomic processes in plasma such as electron-ion and ion-ion recombination and electron attachment to dopant molecules, have been studied. Through comparison with experiments in the MTA, the simulations quantified several uncertain plasma properties such as effective recombination rates and the attachment time of electrons to dopant molecules. The simulations have achieved very good agreement with experiments on plasma loading and related processes. Lastly, the experimentally validated code SPACE is capable of predictive simulations of muon cooling devices.
Impact of physical permafrost processes on hydrological change
NASA Astrophysics Data System (ADS)
Hagemann, Stefan; Blome, Tanja; Beer, Christian; Ekici, Altug
2015-04-01
Permafrost, or perennially frozen ground, is an important part of the terrestrial cryosphere; roughly one quarter of Earth's land surface is underlain by permafrost. As it is a thermal phenomenon, its characteristics are highly dependent on climatic factors. The currently observed warming, which is projected to persist during the coming decades due to anthropogenic CO2 input, certainly has effects on the vast permafrost areas of the high northern latitudes. The quantification of these effects, however, is scientifically still an open question. This is partly due to the complexity of the system, where several feedbacks interact between land and atmosphere, sometimes counterbalancing each other. Moreover, until recently, many global circulation models (GCMs) and Earth system models (ESMs) lacked a sufficient representation of permafrost physics in their land surface schemes. Within the European Union FP7 project PAGE21, the land surface scheme JSBACH of the Max Planck Institute for Meteorology ESM (MPI-ESM) has been equipped with representations of the physical processes relevant for permafrost studies. These processes include the effects of freezing and thawing of soil water on both the energy and water cycles, thermal properties depending on soil water and ice contents, and soil moisture movement being influenced by the presence of soil ice. In the present study, we analyse how these permafrost-relevant processes impact projected hydrological changes over northern hemisphere high-latitude land areas. For this analysis, the atmosphere-land part of MPI-ESM, ECHAM6-JSBACH, is driven by prescribed SST and sea ice in an AMIP2-type setup with and without the newly implemented permafrost processes. Observed SST and sea ice for 1979-1999 are used to consider induced changes in the simulated hydrological cycle. In addition, simulated SST and sea ice are taken from an MPI-ESM simulation conducted for CMIP5 following the RCP8.5 scenario.
The corresponding simulations with ECHAM6-JSBACH are used to assess differences in projected hydrological changes induced by the permafrost relevant processes.
NASA Astrophysics Data System (ADS)
Zimmerling, Clemens; Dörr, Dominik; Henning, Frank; Kärger, Luise
2018-05-01
Due to their high mechanical performance, continuous fibre reinforced plastics (CoFRP) are becoming increasingly important for load-bearing structures. In many cases, manufacturing CoFRPs comprises a forming process of textiles. To predict and optimise the forming behaviour of a component, numerical simulations are applied. However, for maximum part quality, both the geometry and the process parameters must match in mutual regard, which in turn requires numerous numerically expensive optimisation iterations. In both textile and metal forming, a lot of research has focused on determining optimum process parameters whilst regarding the geometry as invariable. In this work, a meta-model based approach on component level is proposed that provides a rapid estimation of the formability for variable geometries based on pre-sampled, physics-based draping data. Initially, a geometry recognition algorithm scans the geometry and extracts a set of doubly-curved regions with relevant geometry parameters. If the relevant parameter space is not part of an underlying database, additional samples are drawn via Finite-Element draping simulations according to a suitable design table for computer experiments. Time-saving parallel runs of the physical simulations accelerate the data acquisition. Ultimately, a Gaussian Regression meta-model is built from the database. The method is demonstrated on a box-shaped generic structure. The predicted results are in good agreement with physics-based draping simulations. Since evaluations of the established meta-model are numerically inexpensive, any further design exploration (e.g. robustness analysis or design optimisation) can be performed in short time. It is expected that the proposed method also offers great potential for future applications along virtual process chains: for each process step along the chain, a meta-model can be set up to predict the impact of design variations on manufacturability and part performance. Thus, the method is considered to facilitate a lean and economic part and process design that takes manufacturing effects into account.
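A Gaussian-process (kriging) regression of the kind named above can be sketched in a few lines; this is a generic illustration with hypothetical draping samples (curvature parameter vs. maximum shear angle), not the authors' actual meta-model or data:

```python
import math

def rbf(x1, x2, length=1.0):
    """Squared-exponential covariance between two scalar inputs."""
    return math.exp(-0.5 * ((x1 - x2) / length) ** 2)

def solve(A, b):
    """Gaussian elimination with partial pivoting for small dense systems."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def gp_predict(xs, ys, x_new, length=1.0, noise=1e-6):
    """GP posterior mean at x_new given training pairs (xs, ys)."""
    K = [[rbf(a, b, length) + (noise if i == j else 0.0)
          for j, b in enumerate(xs)] for i, a in enumerate(xs)]
    alpha = solve(K, ys)
    return sum(rbf(x_new, a, length) * w for a, w in zip(xs, alpha))

# Hypothetical pre-sampled draping results: curvature -> max shear angle (deg).
xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [5.0, 12.0, 20.0, 26.0, 30.0]
est = gp_predict(xs, ys, 0.75, length=0.75)  # cheap prediction between samples
```

Once the training samples are pre-computed, every evaluation of `gp_predict` is nearly free, which is what makes the design-exploration loop described above affordable.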
PFLOTRAN Verification: Development of a Testing Suite to Ensure Software Quality
NASA Astrophysics Data System (ADS)
Hammond, G. E.; Frederick, J. M.
2016-12-01
In scientific computing, code verification ensures the reliability and numerical accuracy of a model simulation by comparing the simulation results to experimental data or known analytical solutions. The model is typically defined by a set of partial differential equations with initial and boundary conditions, and verification assesses whether the mathematical model is solved correctly by the software. Code verification is especially important if the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment [Oberkampf and Trucano (2007)]. Justified confidence in a particular computational tool requires clarity in the exercised physics and transparency in its verification process with proper documentation. We present a quality assurance (QA) testing suite developed by Sandia National Laboratories that performs code verification for PFLOTRAN, an open-source, massively parallel subsurface simulator. PFLOTRAN solves systems of generally nonlinear partial differential equations describing multiphase, multicomponent and multiscale reactive flow and transport processes in porous media. PFLOTRAN's QA test suite compares the numerical solutions of benchmark problems in heat and mass transport against known, closed-form, analytical solutions, including documentation of the exercised physical process models implemented in each PFLOTRAN benchmark simulation. The QA test suite development strives to follow the recommendations given by Oberkampf and Trucano (2007), who describe four essential elements of high-quality verification benchmark construction: (1) conceptual description, (2) mathematical description, (3) accuracy assessment, and (4) additional documentation and user information.
Several QA tests within the suite will be presented, including details of the benchmark problems and their closed-form analytical solutions, implementation of benchmark problems in PFLOTRAN simulations, and the criteria used to assess PFLOTRAN's performance in the code verification procedure. References Oberkampf, W. L., and T. G. Trucano (2007), Verification and Validation Benchmarks, SAND2007-0853, 67 pgs., Sandia National Laboratories, Albuquerque, NM.
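A minimal example of the kind of code-verification test described, comparing an explicit finite-difference solution of the 1D heat equation against its closed-form analytical solution. This is a generic textbook benchmark, not one of the PFLOTRAN QA tests:

```python
import math

def heat_fd(nx=21, nt=200, alpha=1.0, t_end=0.05):
    """Explicit FD solution of u_t = alpha * u_xx on [0, 1],
    with u(x, 0) = sin(pi x) and u(0, t) = u(1, t) = 0."""
    dx = 1.0 / (nx - 1)
    dt = t_end / nt
    assert alpha * dt / dx**2 <= 0.5, "explicit-scheme stability limit"
    u = [math.sin(math.pi * i * dx) for i in range(nx)]
    for _ in range(nt):
        u = [0.0] + [u[i] + alpha * dt / dx**2 * (u[i + 1] - 2 * u[i] + u[i - 1])
                     for i in range(1, nx - 1)] + [0.0]
    return u

def heat_exact(x, t, alpha=1.0):
    """Closed-form solution for the same initial/boundary conditions."""
    return math.exp(-alpha * math.pi**2 * t) * math.sin(math.pi * x)

nx, t_end = 21, 0.05
num = heat_fd(nx=nx, t_end=t_end)
err = max(abs(num[i] - heat_exact(i / (nx - 1), t_end)) for i in range(nx))
```

A QA suite turns the final error check into a pass/fail criterion, and typically also verifies that the error shrinks at the expected rate as the grid is refined.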
Brown, Ross; Rasmussen, Rune; Baldwin, Ian; Wyeth, Peta
2012-08-01
Nursing training for an Intensive Care Unit (ICU) is a resource-intensive process. High demands are made on staff, students and physical resources. Interactive, 3D computer simulations, known as virtual worlds, are increasingly being used to supplement training regimes in the health sciences, especially in areas such as complex hospital ward processes. Such worlds have been found to be very useful in maximising the utilisation of training resources. Our aim is to design and develop a novel virtual world application for teaching and training Intensive Care nurses in the approach and method for shift handover, providing an independent but rigorous approach to teaching these important skills. In this paper we present a virtual world simulator for students to practice key steps in handing over the 24/7 care requirements of intensive care patients during the first hour of a shift. We describe the modelling process used to provide a convincing interactive simulation of the handover steps involved. The virtual world provides a practice tool for students to test their analytical skills with scenarios previously provided by simple physical simulations and live on-the-job training. Additional educational benefits include the facilitation of remote learning, high flexibility in study hours and the automatic recording of a reviewable log from each session. To the best of our knowledge, this is a novel and original application of virtual worlds to an ICU handover process. The major outcome of the work was a virtual world environment for training nurses in the shift handover process, designed and developed for use by postgraduate nurses in training. Copyright © 2012 Australian College of Critical Care Nurses Ltd. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Karimabadi, Homa
2012-03-01
Recent advances in simulation technology and hardware are enabling breakthrough science in which many longstanding problems can now be addressed for the first time. In this talk, we focus on kinetic simulations of the Earth's magnetosphere and the magnetic reconnection process, the key mechanism that breaks the protective shield of the Earth's dipole field, allowing the solar wind to enter the Earth's magnetosphere. This leads to so-called space weather, where storms on the Sun can affect space-borne and ground-based technological systems on Earth. The talk consists of three parts: (a) an overview of a new multi-scale simulation technique in which each computational grid is updated based on its own unique timestep, (b) a presentation of a new approach to data analysis that we refer to as Physics Mining, which entails combining data mining and computer vision algorithms with scientific visualization to extract physics from the resulting massive data sets, and (c) a presentation of several recent discoveries in studies of space plasmas, including the role of vortex formation and resulting turbulence in magnetized plasmas.
A physical-based gas-surface interaction model for rarefied gas flow simulation
NASA Astrophysics Data System (ADS)
Liang, Tengfei; Li, Qi; Ye, Wenjing
2018-01-01
Empirical gas-surface interaction models, such as the Maxwell model and the Cercignani-Lampis model, are widely used as the boundary condition in rarefied gas flow simulations. The accuracy of these models in predicting the macroscopic behavior of rarefied gas flows is less satisfactory in some cases, especially highly non-equilibrium ones. Molecular dynamics (MD) simulation can accurately resolve the gas-surface interaction process at the atomic scale, and hence can predict macroscopic behavior accurately; it is, however, too computationally expensive to be applied to real problems. In this work, a statistical physical-based gas-surface interaction model, which complies with the basic relations of a boundary condition, is developed based on the framework of the washboard model. By virtue of its physical basis, the new model is capable of capturing some important relations and trends that the classic empirical models fail to model correctly. As such, the new model is much more accurate than the classic models, while being more efficient than MD simulations. Therefore, it can serve as a more accurate and efficient boundary condition for rarefied gas flow simulations.
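For reference, the classic Maxwell model mentioned above mixes specular and diffuse reflection through an accommodation coefficient. A minimal sampling sketch, with an argon-like molecular mass and purely illustrative values:

```python
import math
import random

K_B = 1.380649e-23  # Boltzmann constant, J/K

def maxwell_reflect(v, t_wall=300.0, accom=0.8, m=6.63e-26, rng=None):
    """Maxwell gas-surface model: with probability (1 - accom) the molecule
    reflects specularly; otherwise it is re-emitted diffusely from a
    wall-temperature Maxwellian. Wall normal is +z; incoming vz < 0."""
    rng = rng or random.Random()
    if rng.random() >= accom:
        return (v[0], v[1], -v[2])                       # specular: flip normal component
    s = math.sqrt(K_B * t_wall / m)                      # thermal speed scale
    vx, vy = rng.gauss(0.0, s), rng.gauss(0.0, s)        # tangential: Gaussian
    vz = s * math.sqrt(-2.0 * math.log(1.0 - rng.random()))  # normal: Rayleigh (flux-weighted)
    return (vx, vy, vz)

rng = random.Random(1)
outgoing = [maxwell_reflect((300.0, 0.0, -400.0), rng=rng) for _ in range(1000)]
```

The washboard-based model of the paper replaces this empirical mixture with scattering derived from a physical surface description, so the code above illustrates only the baseline being improved upon.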
Computer modeling and simulators as part of university training for NPP operating personnel
NASA Astrophysics Data System (ADS)
Volman, M.
2017-01-01
This paper considers aspects of a program for training future nuclear power plant personnel developed by the NPP Department of Ivanovo State Power Engineering University. Computer modeling is used for numerical experiments on the kinetics of nuclear reactors in Mathcad. Simulation modeling is carried out on computer-based and full-scale simulators of a water-cooled power reactor for the simulation of neutron-physical reactor measurements and the start-up and shutdown processes.
NASA Astrophysics Data System (ADS)
Konishi, Toshifumi; Yamane, Daisuke; Matsushima, Takaaki; Masu, Kazuya; Machida, Katsuyuki; Toshiyoshi, Hiroshi
2014-01-01
This paper reports the design and evaluation results of a capacitive CMOS-MEMS sensor that consists of the proposed sensor circuit and a capacitive MEMS device implemented on the circuit. To design a capacitive CMOS-MEMS sensor, a multi-physics simulation of the electromechanical behavior of both the MEMS structure and the sensing LSI was carried out simultaneously. In order to verify the validity of the design, we applied the capacitive CMOS-MEMS sensor to a MEMS accelerometer implemented by a post-CMOS process onto a 0.35-µm CMOS circuit. The experimental results of the CMOS-MEMS accelerometer exhibited good agreement with the simulation results within the input acceleration range between 0.5 and 6 G (1 G = 9.8 m/s²), corresponding to output voltages between 908.6 and 915.4 mV, respectively. Therefore, we have confirmed that our capacitive CMOS-MEMS sensor and the multi-physics simulation will be a beneficial method for realizing integrated CMOS-MEMS technology.
Digital system for structural dynamics simulation
NASA Technical Reports Server (NTRS)
Krauter, A. I.; Lagace, L. J.; Wojnar, M. K.; Glor, C.
1982-01-01
State-of-the-art digital hardware and software were incorporated in a system for the simulation of complex structural dynamic interactions, such as those which occur in rotating structures (engine systems). The system was designed to use an array of processors in which the computation for each physical subelement or functional subsystem is assigned to a single specific processor in the simulator. These node processors are microprogrammed bit-slice microcomputers which function autonomously and can communicate with each other and with a central control minicomputer over parallel digital lines. Inter-processor nearest-neighbor communication busses pass the constants which represent physical constraints and boundary conditions. Each node processor is connected to its six nearest-neighbor node processors to simulate the actual physical interfaces of real substructures. Computer-generated finite element mesh and force models can be developed with the aid of the central control minicomputer. The control computer also oversees the animation of a graphics display system and disk-based mass storage, along with the individual processing elements.
Virtual Observation System for Earth System Model: An Application to ACME Land Model Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Dali; Yuan, Fengming; Hernandez, Benjamin
Investigating and evaluating physical-chemical-biological processes within an Earth system model (ESM) can be very challenging due to the complexity of both model design and software implementation. A virtual observation system (VOS) is presented to enable interactive observation of these processes during system simulation. Based on advanced computing technologies, such as compiler-based software analysis, automatic code instrumentation, and high-performance data transport, the VOS provides run-time observation capability, in-situ data analytics for Earth system model simulation, and model behavior adjustment opportunities through simulation steering. A VOS for a terrestrial land model simulation within the Accelerated Climate Modeling for Energy model is also presented to demonstrate the implementation details and system innovations.
QuVis interactive simulations: tools to support quantum mechanics instruction
NASA Astrophysics Data System (ADS)
Kohnle, Antje
2015-04-01
Quantum mechanics holds a fascination for many students, but its mathematical complexity and counterintuitive results can present major barriers. The QuVis Quantum Mechanics Visualization Project (www.st-andrews.ac.uk/physics/quvis) aims to overcome these issues through the development and evaluation of interactive simulations with accompanying activities for the learning and teaching of quantum mechanics. Over 90 simulations are now available on the QuVis website. One collection of simulations is embedded in the Institute of Physics Quantum Physics website (quantumphysics.iop.org), which consists of freely available resources for an introductory course in quantum mechanics starting from two-level systems. Simulations support model-building by reducing complexity, focusing on fundamental ideas and making the invisible visible. They promote engaged exploration, sense-making and linking of multiple representations, and include high levels of interactivity and direct feedback. Simulations are research-based and evaluation with students informs all stages of the development process. Simulations are iteratively refined using student feedback in individual observation sessions and in-class trials. Evaluation has shown that the simulations can help students learn quantum mechanics concepts at both the introductory and advanced undergraduate level and that students perceive simulations to be beneficial to their learning. Recent activity includes the launch of a new collection of HTML5 simulations that run on both desktop and tablet-based devices and the introduction of a goal and reward structure in simulations through the inclusion of challenges. This presentation will give an overview of the QuVis resources, highlight recent work and outline future plans. QuVis is supported by the UK Institute of Physics, the UK Higher Education Academy and the University of St Andrews.
NASA Astrophysics Data System (ADS)
Koch, Jonas; Nowak, Wolfgang
2013-04-01
At many hazardous waste sites and accidental spills, dense non-aqueous phase liquids (DNAPLs) such as TCE, PCE, or TCA have been released into the subsurface. Once a DNAPL is released into the subsurface, it serves as a persistent source of dissolved-phase contamination. In chronological order, the DNAPL migrates through the porous medium and penetrates the aquifer, it forms a complex pattern of immobile DNAPL saturation, it dissolves into the groundwater and forms a contaminant plume, and it slowly depletes and bio-degrades in the long term. In industrialized countries the number of such contaminated sites is so high that a ranking from most risky to least risky is advisable. Such a ranking helps to decide whether a site needs to be remediated or may be left to natural attenuation. Both the ranking and the design of proper remediation or monitoring strategies require a good understanding of the relevant physical processes and their inherent uncertainty. To this end, we conceptualize a probabilistic simulation framework that estimates probability density functions of mass discharge, source depletion time, and critical concentration values at crucial target locations. Furthermore, it supports the inference of contaminant source architectures from arbitrary site data. As an essential novelty, the mutual dependencies of the key parameters and interacting physical processes are taken into account throughout the whole simulation. In an uncertain and heterogeneous subsurface setting, we identify three key parameter fields: the local velocities, the hydraulic permeabilities and the DNAPL phase saturations. Obviously, these parameters depend on each other during DNAPL infiltration, dissolution and depletion. In order to highlight the importance of these mutual dependencies and interactions, we present results of several model setups in which we vary the physical and stochastic dependencies of the input parameters and simulated processes.
Under these changes, the probability density functions demonstrate strong statistical shifts in their expected values and in their uncertainty. Considering the uncertainties of all key parameters but neglecting their interactions overestimates the output uncertainty. However, consistently using all available physical knowledge when assigning input parameters and simulating all relevant interactions of the involved processes reduces the output uncertainty significantly, back down to useful and plausible ranges. When using our framework in an inverse setting, omitting a parameter dependency within a crucial physical process would lead to physically meaningless identified parameters. Thus, we conclude that the additional complexity we propose is both necessary and adequate. Overall, our framework provides a tool for reliable and plausible prediction, risk assessment, and model-based decision support for DNAPL contaminated sites.
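The probability density functions mentioned above are typically estimated by Monte Carlo sampling over the uncertain input fields. A toy sketch with a stand-in discharge model; all distributions and the model itself are hypothetical, not the framework's coupled DNAPL simulation:

```python
import random
import statistics

def toy_discharge(perm, sat, vel):
    """Hypothetical stand-in for the coupled DNAPL dissolution model:
    mass discharge grows with permeability, saturation and velocity."""
    return perm * sat * vel

def sample_discharge(n=20000, seed=42):
    rng = random.Random(seed)
    return [toy_discharge(rng.lognormvariate(0.0, 0.5),   # permeability field
                          rng.uniform(0.05, 0.35),        # DNAPL saturation
                          rng.lognormvariate(0.0, 0.3))   # local velocity
            for _ in range(n)]

samples = sample_discharge()
mean = statistics.fmean(samples)
p95 = sorted(samples)[int(0.95 * len(samples))]  # empirical 95th percentile
```

In the real framework, the samples would also carry the physical correlations between permeability, saturation and velocity; the paper's point is precisely that drawing them independently, as done here for brevity, inflates the estimated spread.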
LIGHT SOURCE: Physical design of a 10 MeV LINAC for polymer radiation processing
NASA Astrophysics Data System (ADS)
Feng, Guang-Yao; Pei, Yuan-Ji; Wang, Lin; Zhang, Shan-Cai; Wu, Cong-Feng; Jin, Kai; Li, Wei-Min
2009-06-01
In China, polymer radiation processing has become one of the most important processing industries. The radiation processing source may be an electron beam accelerator or a radioactive source. The physical design of an electron beam facility for radiation crosslinking is introduced in this paper because of its much higher dose rate and efficiency. The main part of this facility is a 10 MeV travelling-wave electron linac with a constant-impedance accelerating structure. A start-to-end simulation of the linac is reported in this paper. The codes Opera-3d, Poisson-Superfish and Parmela are used to describe the electromagnetic elements of the accelerator and track the particle distribution from the cathode to the end of the linac. After beam dynamics optimization, the wave phase velocities in the structure have been chosen to be 0.56, 0.9 and 0.999, respectively. Physical parameters of the main elements, such as the DC electron gun, iris-loaded periodic structure, solenoids, etc., are presented. Simulation results prove that the design can satisfy the industrial requirements. The linac is under construction and some components have been finished. Measurements show that they are in good agreement with the design values.
NASA Astrophysics Data System (ADS)
Griffiths, Mike; Fedun, Viktor; Mumford, Stuart; Gent, Frederick
2013-06-01
The Sheffield Advanced Code (SAC) is a fully non-linear MHD code designed for simulations of linear and non-linear wave propagation in gravitationally strongly stratified magnetized plasma. It was developed primarily for the forward modelling of helioseismological processes and for the coupling processes in the solar interior, photosphere, and corona; it is built on the well-known VAC platform that allows robust simulation of the macroscopic processes in gravitationally stratified (non-)magnetized plasmas. The code has no limitations of simulation length in time imposed by complications originating from the upper boundary, nor does it require implementation of special procedures to treat the upper boundaries. SAC inherited its modular structure from VAC, thereby allowing modification to easily add new physics.
NASA Astrophysics Data System (ADS)
da Silva, A. M. R.; de Macêdo, J. A.
2016-06-01
Motivated by technological advances in the educational environment and by the difficulties students experience in learning physics, this article describes the process of elaborating and implementing a hypermedia system for high-school teachers, involving computer simulations for teaching basic concepts of electromagnetism using free tools. With the completion and publication of the project, students and teachers will have a new possibility for interacting with technology in the classroom and in labs.
Advanced graphical user interface for multi-physics simulations using AMST
NASA Astrophysics Data System (ADS)
Hoffmann, Florian; Vogel, Frank
2017-07-01
Numerical modelling of particulate matter has gained much popularity in recent decades. Advanced Multi-physics Simulation Technology (AMST) is a state-of-the-art three-dimensional numerical modelling technique combining the eXtended Discrete Element Method (XDEM) with Computational Fluid Dynamics (CFD) and Finite Element Analysis (FEA) [1]. One major limitation of this code has been the lack of a graphical user interface (GUI), meaning that all pre-processing had to be done directly in an HDF5 file. This contribution presents the first graphical pre-processor developed for AMST.
NASA Astrophysics Data System (ADS)
De Lucia, Marco; Kempka, Thomas; Jatnieks, Janis; Kühn, Michael
2017-04-01
Reactive transport simulations - where geochemical reactions are coupled with the hydrodynamic transport of reactants - are extremely time consuming and suffer from significant numerical issues. Given the high uncertainties inherently associated with the geochemical models, which also constitute the major computational bottleneck, such requirements may seem inappropriate and probably constitute the main limitation to their wide application. A promising way to ease and speed up such coupled simulations is to employ statistical surrogates instead of "full-physics" geochemical models [1]. Data-driven surrogates are reduced models trained on a set of pre-calculated "full-physics" simulations, capturing their principal features while being extremely fast to compute. Model reduction of course comes at the price of a loss of precision; however, this appears justified in the presence of large uncertainties in the parametrization of geochemical processes. This contribution illustrates the integration of surrogates into the flexible simulation framework currently being developed by the authors' research group [2]. The high-level language of choice for obtaining and handling surrogate models is R, which profits from state-of-the-art methods for the statistical analysis of large simulation ensembles. A stand-alone advective mass transport module was furthermore developed in order to add this capability to any multiphase finite-volume hydrodynamic simulator within the simulation framework. We present 2D and 3D case studies benchmarking the performance of surrogates against "full-physics" chemistry in scenarios pertaining to the assessment of geological subsurface utilization. [1] Jatnieks, J., De Lucia, M., Dransch, D., Sips, M.: "Data-driven surrogate model approach for improving the performance of reactive transport simulations.", Energy Procedia 97, 2016, p. 447-453.
[2] Kempka, T., Nakaten, B., De Lucia, M., Nakaten, N., Otto, C., Pohl, M., Chabab [Tillner], E., Kühn, M.: "Flexible Simulation Framework to Couple Processes in Complex 3D Models for Subsurface Utilization Assessment.", Energy Procedia, 97, 2016 p. 494-501.
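The surrogate idea can be illustrated with a minimal sketch (in Python rather than the R tooling named above, and with a cheap analytic function standing in for an expensive "full-physics" geochemical solver, so the function and all rates are hypothetical): a reduced model is fitted once to an ensemble of pre-computed runs and is then almost free to evaluate.

```python
import math

def full_physics(x):
    """Hypothetical stand-in for an expensive 'full-physics' model run."""
    return math.exp(-0.5 * x) * math.sin(2.0 * x) + 0.1 * x

def fit_poly_surrogate(xs, ys, degree=5):
    """Least-squares polynomial surrogate via normal equations (pure Python)."""
    n = degree + 1
    # Normal equations A^T A c = A^T y for the Vandermonde matrix.
    ata = [[sum(x ** (i + j) for x in xs) for j in range(n)] for i in range(n)]
    aty = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(n)]
    # Gaussian elimination with partial pivoting.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(ata[r][col]))
        ata[col], ata[piv] = ata[piv], ata[col]
        aty[col], aty[piv] = aty[piv], aty[col]
        for r in range(col + 1, n):
            fac = ata[r][col] / ata[col][col]
            for c in range(col, n):
                ata[r][c] -= fac * ata[col][c]
            aty[r] -= fac * aty[col]
    coeffs = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = aty[r] - sum(ata[r][c] * coeffs[c] for c in range(r + 1, n))
        coeffs[r] = s / ata[r][r]
    return lambda x: sum(c * x ** i for i, c in enumerate(coeffs))

# "Ensemble" of pre-calculated full-physics runs on x in [0, 2].
train_x = [i / 20.0 for i in range(41)]
train_y = [full_physics(x) for x in train_x]
surrogate = fit_poly_surrogate(train_x, train_y, degree=5)
```

In a real workflow the surrogate would be a statistical model (e.g. regression trees or Gaussian processes) over many geochemical inputs; the polynomial here only demonstrates the train-once, evaluate-cheaply pattern.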
NASA Astrophysics Data System (ADS)
Chen, Dong; Sun, Dihua; Zhao, Min; Zhou, Tong; Cheng, Senlin
2018-07-01
Driving is a typical cyber-physical process that tightly couples the cyber factor of traffic information with the physical components of the vehicles. Moreover, drivers exercise situation awareness while driving, which depends not only on the current traffic state but also extrapolates its changing trend. In this paper, an extended car-following model is proposed to account for drivers' situation awareness. The stability criterion of the proposed model is derived via linear stability analysis. The results show that the stable region of the proposed model is enlarged on the phase diagram compared with previous models. By employing the reductive perturbation method, the modified Korteweg-de Vries (mKdV) equation is obtained; its kink-antikink soliton solution theoretically describes the evolution of traffic jams. Numerical simulations of two typical traffic scenarios are conducted to verify the analytical results. The simulation results demonstrate that drivers' situation awareness plays a key role in traffic flow oscillations and the congestion transition.
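The abstract does not reproduce the model equations, so as a hedged sketch the classical optimal-velocity car-following model, which models of this family extend, can be simulated as follows; all parameter values are illustrative, not those of the proposed model:

```python
import math

def optimal_velocity(headway, v_max=2.0, h_c=2.0):
    """Bando optimal-velocity function: desired speed for a given headway."""
    return v_max * (math.tanh(headway - h_c) + math.tanh(h_c)) / 2.0

def simulate_ring(n_cars=10, road_len=30.0, a=1.5, dt=0.05, steps=4000):
    """Cars on a ring road; each driver relaxes toward the optimal
    velocity of the current headway with sensitivity a."""
    x = [road_len * i / n_cars for i in range(n_cars)]
    v = [0.5] * n_cars
    for _ in range(steps):
        acc = [a * (optimal_velocity((x[(i + 1) % n_cars] - x[i]) % road_len)
                    - v[i])
               for i in range(n_cars)]
        for i in range(n_cars):
            v[i] += acc[i] * dt
            x[i] = (x[i] + v[i] * dt) % road_len
    return v

final_v = simulate_ring()
```

With uniform spacing the fleet relaxes to the optimal velocity of the common headway; extensions such as the situation-awareness term discussed above enter as additional anticipation contributions to the acceleration, which is what enlarges the stable region.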
Recent progress in understanding the eruptions of classical novae
NASA Technical Reports Server (NTRS)
Shara, Michael M.
1988-01-01
Dramatic progress has occurred in the last two decades in understanding the physical processes and events leading up to, and transpiring during, the eruption of a classical nova. The mechanism whereby a white dwarf accreting hydrogen-rich matter from a low-mass main-sequence companion produces a nova eruption has been understood since 1970. The mass-transferring binary stellar configuration leads inexorably to thermonuclear runaways detectable at distances of megaparsecs. Summarized here are the efforts of many researchers in understanding the physical processes which generate nova eruptions; the effects upon nova eruptions of different binary-system parameters (e.g., chemical composition or mass of the white dwarf, different mass accretion rates); the possible metamorphosis from dwarf to classical novae and back again; and observational diagnostics of novae, including X-ray and gamma-ray emission, and the characteristics and distributions of novae in globular clusters and in extragalactic systems. While the thermonuclear-runaway model remains the successful cornerstone of nova simulation, it is now clear that a wide variety of physical processes, and three-dimensional hydrodynamic simulations, will be needed to explain the rich spectrum of behavior observed in erupting novae.
NASA Astrophysics Data System (ADS)
Martin, G. M.; Peyrillé, P.; Roehrig, R.; Rio, C.; Caian, M.; Bellon, G.; Codron, F.; Lafore, J.-P.; Poan, D. E.; Idelkadi, A.
2017-03-01
Vertical and horizontal distributions of diabatic heating in the West African monsoon (WAM) region as simulated by four model families are analyzed in order to assess the physical processes that affect the WAM circulation. For each model family, atmosphere-only runs of their CMIP5 configurations are compared with more recent configurations which are on the development path toward CMIP6. The various configurations of these models exhibit significant differences in their heating/moistening profiles, related to the different representation of physical processes such as boundary layer mixing, convection, large-scale condensation and radiative heating/cooling. There are also significant differences in the models' simulation of WAM rainfall patterns and circulations. The weaker the radiative cooling in the Saharan region, the larger the ascent in the rainband and the more intense the monsoon flow, while the latitude of the rainband is related to heating in the Gulf of Guinea region and on the northern side of the Saharan heat low. Overall, this work illustrates the difficulty experienced by current climate models in representing the characteristics of monsoon systems, but also that we can still use them to understand the interactions between local subgrid physical processes and the WAM circulation. Moreover, our conclusions regarding the relationship between errors in the large-scale circulation of the WAM and the structure of the heating by small-scale processes will motivate future studies and model development.
Simulation of Radiation Damage to Neural Cells with the Geant4-DNA Toolkit
NASA Astrophysics Data System (ADS)
Bayarchimeg, Lkhagvaa; Batmunkh, Munkhbaatar; Belov, Oleg; Lkhagva, Oidov
2018-02-01
To help in understanding the physical and biological mechanisms underlying the effects of cosmic and therapeutic radiation on the central nervous system (CNS), we have developed an original neuron application based on the Geant4 Monte Carlo simulation toolkit, in particular on its biophysical extension Geant4-DNA. The applied simulation technique provides a tool for simulating the physical, physico-chemical and chemical processes (e.g. the production of water radiolysis species in the vicinity of neurons) in a realistic geometrical model of neural cells exposed to ionizing radiation. The present study evaluates the microscopic energy depositions and water radiolysis species yields within the detailed structure of a selected neuron, taking into account its soma, dendrites, axon and spines, following irradiation with carbon and iron ions.
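The scoring step of such a study can be caricatured in a few lines; this is not Geant4-DNA (which transports particles interaction by interaction) but a toy Monte Carlo that tallies randomly placed energy deposits falling inside a spherical "soma"; geometry and deposit spectrum are hypothetical:

```python
import math
import random

def tally_soma_dose(n_events=20000, soma_radius=5.0, box_half=10.0, seed=1):
    """Toy tally: energy deposits scattered uniformly in a box, summed
    separately for those landing inside a spherical 'soma' compartment."""
    rng = random.Random(seed)
    total, in_soma = 0.0, 0.0
    for _ in range(n_events):
        x = rng.uniform(-box_half, box_half)
        y = rng.uniform(-box_half, box_half)
        z = rng.uniform(-box_half, box_half)
        e_dep = rng.expovariate(1.0)   # deposit energy, arbitrary units
        total += e_dep
        if math.sqrt(x * x + y * y + z * z) <= soma_radius:
            in_soma += e_dep
    return in_soma, total

in_soma, total = tally_soma_dose()
fraction = in_soma / total
```

For a uniform deposit field the scored fraction approaches the volume ratio of soma to box; a real neuron application replaces the sphere with the segmented soma/dendrite/axon/spine geometry.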
NASA Astrophysics Data System (ADS)
King, W. E.; Anderson, A. T.; Ferencz, R. M.; Hodge, N. E.; Kamath, C.; Khairallah, S. A.; Rubenchik, A. M.
2015-12-01
The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this paper, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.
NASA Technical Reports Server (NTRS)
Sreekantamurthy, Thammaiah; Hudson, Tyler B.; Hou, Tan-Hung; Grimsley, Brian W.
2016-01-01
Composite cure process induced residual strains and warping deformations in composite components present significant challenges in the manufacturing of advanced composite structure. As a part of the Manufacturing Process and Simulation initiative of the NASA Advanced Composite Project (ACP), research is being conducted on the composite cure process by developing an understanding of the fundamental mechanisms by which the process induced factors influence the residual responses. In this regard, analytical studies have been conducted on the cure process modeling of composite structural parts with varied physical, thermal, and resin flow process characteristics. The cure process simulation results were analyzed to interpret the cure response predictions based on the underlying physics incorporated into the modeling tool. In the cure-kinetic analysis, the model predictions on the degree of cure, resin viscosity and modulus were interpreted with reference to the temperature distribution in the composite panel part and tool setup during autoclave or hot-press curing cycles. In the fiber-bed compaction simulation, the pore pressure and resin flow velocity in the porous media models, and the compaction strain responses under applied pressure were studied to interpret the fiber volume fraction distribution predictions. In the structural simulation, the effect of temperature on the resin and ply modulus, and thermal coefficient changes during curing on predicted mechanical strains and chemical cure shrinkage strains were studied to understand the residual strains and stress response predictions. In addition to computational analysis, experimental studies were conducted to measure strains during the curing of laminated panels by means of optical fiber Bragg grating sensors (FBGs) embedded in the resin impregnated panels. The residual strain measurements from laboratory tests were then compared with the analytical model predictions. 
The paper describes the cure process modeling procedures and residual strain predictions, and discusses pertinent experimental results from the validation studies.
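The cure-kinetic component of such simulations is commonly described by an autocatalytic rate law; a minimal sketch (a Kamal-type model integrated with explicit Euler, using illustrative rate constants that are not those of the NASA study) is:

```python
def simulate_cure(k1=1e-3, k2=5e-3, m=0.5, n=1.5, dt=1.0, t_end=3600.0):
    """Explicit-Euler integration of an autocatalytic (Kamal-type) cure
    model: d(alpha)/dt = (k1 + k2*alpha**m) * (1 - alpha)**n, where alpha
    is the degree of cure. Rate constants are illustrative placeholders,
    not calibrated to any real resin system."""
    alpha, t, history = 0.0, 0.0, []
    while t < t_end:
        rate = (k1 + k2 * alpha ** m) * (1.0 - alpha) ** n
        alpha = min(1.0, alpha + rate * dt)
        t += dt
        history.append(alpha)
    return history

cure = simulate_cure()
```

The predicted degree-of-cure history is what drives the downstream viscosity, modulus, and cure-shrinkage submodels referenced above; in practice k1 and k2 are Arrhenius functions of the autoclave temperature cycle rather than constants.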
An Overview of the State of the Art in Atomistic and Multiscale Simulation of Fracture
NASA Technical Reports Server (NTRS)
Saether, Erik; Yamakov, Vesselin; Phillips, Dawn R.; Glaessgen, Edward H.
2009-01-01
The emerging field of nanomechanics is providing a new focus in the study of the mechanics of materials, particularly in simulating fundamental atomic mechanisms involved in the initiation and evolution of damage. Simulating fundamental material processes using first principles in physics strongly motivates the formulation of computational multiscale methods to link macroscopic failure to the underlying atomic processes from which all material behavior originates. This report gives an overview of the state of the art in applying concurrent and sequential multiscale methods to analyze damage and failure mechanisms across length scales.
Hand controller commonality evaluation process
NASA Technical Reports Server (NTRS)
Stuart, Mark A.; Bierschwale, John M.; Wilmington, Robert P.; Adam, Susan C.; Diaz, Manuel F.; Jensen, Dean G.
1990-01-01
A hand controller evaluation process has been developed to determine the appropriate hand controller configurations for supporting remotely controlled devices. These devices include remote manipulator systems (RMS), dexterous robots, and remotely-piloted free flyers. Standard interfaces were developed to evaluate six different hand controllers in three test facilities including dynamic computer simulations, kinematic computer simulations, and physical simulations. The hand controllers under consideration were six degree-of-freedom (DOF) position and rate minimaster and joystick controllers, and three-DOF rate controllers. Task performance data, subjective comments, and anthropometric data obtained during tests were used for controller configuration recommendations to the SSF Program.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ely, Geoffrey P.
2013-10-31
This project uses dynamic rupture simulations to investigate high-frequency seismic energy generation. The relevant phenomena (frictional breakdown, shear heating, effective normal-stress fluctuations, material damage, etc.) controlling rupture are strongly interacting and span many orders of magnitude in spatial scale, requiring high-resolution simulations that couple disparate physical processes (e.g., elastodynamics, thermal weakening, pore-fluid transport, and heat conduction). Compounding the computational challenge, we know that natural faults are not planar, but instead have roughness that can be approximated by power laws, potentially leading to large, multiscale fluctuations in normal stress. The capacity to perform 3D rupture simulations that couple these processes will provide guidance for constructing appropriate source models for high-frequency ground motion simulations. The improved rupture models from our multi-scale dynamic rupture simulations will be used to conduct physics-based (3D waveform-modeling-based) probabilistic seismic hazard analysis (PSHA) for California. These calculations will provide numerous important seismic hazard results, including a state-wide extended earthquake rupture forecast with rupture variations for all significant events, a synthetic seismogram catalog for thousands of scenario events, and more than 5000 physics-based seismic hazard curves for California.
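The final step, turning annual exceedance rates into hazard curves, rests on the standard Poissonian occurrence assumption of PSHA; a minimal sketch with entirely hypothetical rates:

```python
import math

def hazard_curve(rates, t_years=50.0):
    """Convert annual exceedance rates (per ground-motion level) into
    probabilities of exceedance in t_years, assuming Poissonian
    earthquake occurrence: P = 1 - exp(-rate * t)."""
    return {level: 1.0 - math.exp(-rate * t_years)
            for level, rate in rates.items()}

# Hypothetical annual rates of exceeding peak ground acceleration levels (g).
annual_rates = {0.1: 2e-2, 0.2: 5e-3, 0.4: 1e-3, 0.8: 1e-4}
poe_50yr = hazard_curve(annual_rates)
```

Physics-based PSHA replaces the empirical attenuation relations behind the annual rates with rates derived from synthetic waveform catalogs, but the rate-to-probability conversion sketched here is unchanged.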
A Global Three-Dimensional Radiation Hydrodynamic Simulation of a Self-Gravitating Accretion Disk
NASA Astrophysics Data System (ADS)
Phillipson, Rebecca; Vogeley, Michael S.; McMillan, Stephen; Boyd, Patricia
2018-01-01
We present three-dimensional, radiation hydrodynamic simulations of initially thin accretion disks with self-gravity using the grid-based code PLUTO. We produce simulated light curves and spectral energy distributions and compare to observational data of X-ray binary (XRB) and active galactic nuclei (AGN) variability. These simulations are of interest for modeling the role of radiation in accretion physics across decades of mass and frequency. In particular, the characteristics of the time variability in various bandwidths can probe the timescales over which different physical processes dominate the accretion flow. For example, in the case of some XRBs, superorbital periods much longer than the companion orbital period have been observed. Smoothed particle hydrodynamics (SPH) calculations have shown that irradiation-driven warping could be the mechanism underlying these long periods. In the case of AGN, irradiation-driven warping is also predicted to occur in addition to strong outflows originating from thermal and radiation pressure driving forces, which are important processes in understanding feedback and star formation in active galaxies. We compare our simulations to various toy models via traditional time series analysis of our synthetic and observed light curves.
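The time-series step, recovering characteristic periods such as superorbital cycles from a light curve, can be sketched with a plain DFT periodogram (the signal below is synthetic; real analyses of unevenly sampled light curves would typically use a Lomb-Scargle method instead):

```python
import math

def periodogram(signal, dt):
    """Discrete power spectrum of an evenly sampled light curve
    (pure-Python DFT, O(N^2); fine for a short demonstration series)."""
    n = len(signal)
    mean = sum(signal) / n
    powers = []
    for k in range(1, n // 2):
        re = sum((s - mean) * math.cos(2 * math.pi * k * i / n)
                 for i, s in enumerate(signal))
        im = sum((s - mean) * math.sin(2 * math.pi * k * i / n)
                 for i, s in enumerate(signal))
        powers.append((k / (n * dt), (re * re + im * im) / n))
    return powers

# Synthetic light curve: a short 'orbital' period (10) plus a stronger,
# longer 'superorbital' one (100); both periods are illustrative.
dt, n = 1.0, 400
lc = [math.sin(2 * math.pi * i * dt / 10.0)
      + 2.0 * math.sin(2 * math.pi * i * dt / 100.0) for i in range(n)]
spec = periodogram(lc, dt)
best_freq = max(spec, key=lambda p: p[1])[0]
```

The dominant peak lands at the superorbital frequency (0.01 here), which is the kind of signature used to compare warped-disk simulations against observed XRB light curves.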
Motor Imagery in Asperger Syndrome: Testing Action Simulation by the Hand Laterality Task
Conson, Massimiliano; Mazzarella, Elisabetta; Frolli, Alessandro; Esposito, Dalila; Marino, Nicoletta; Trojano, Luigi; Massagli, Angelo; Gison, Giovanna; Aprea, Nellantonio; Grossi, Dario
2013-01-01
Asperger syndrome (AS) is a neurodevelopmental condition within the Autism Spectrum Disorders (ASD) characterized by specific difficulties in social interaction, communication and behavioural control. In recent years, it has been suggested that ASD is related to a dysfunction of action simulation processes, but studies employing imitation or action observation tasks provided mixed results. Here, we addressed action simulation processes in adolescents with AS by means of a motor imagery task, the classical hand laterality task (to decide whether a rotated hand image is left or right); mental rotation of letters was also evaluated. As a specific marker of action simulation in hand rotation, we assessed the so-called biomechanical effect, that is the advantage for judging hand pictures showing physically comfortable versus physically awkward positions. We found the biomechanical effect in typically-developing participants but not in participants with AS. Overall performance on both hand laterality and letter rotation tasks, instead, did not differ in the two groups. These findings demonstrated a specific alteration of motor imagery skills in AS. We suggest that impaired mental simulation and imitation of goal-less movements in ASD could be related to shared cognitive mechanisms. PMID:23894683
Schematic driven silicon photonics design
NASA Astrophysics Data System (ADS)
Chrostowski, Lukas; Lu, Zeqin; Flückiger, Jonas; Pond, James; Klein, Jackson; Wang, Xu; Li, Sarah; Tai, Wei; Hsu, En Yao; Kim, Chan; Ferguson, John; Cone, Chris
2016-03-01
Electronic circuit designers commonly start their design process with a schematic, namely an abstract representation of the physical circuit. In integrated photonics on the other hand, it is very common for the design to begin at the physical component level. In order to build large integrated photonic systems, it is crucial to design using a schematic-driven approach. This includes simulations based on schematics, schematic-driven layout, layout versus schematic verification, and post-layout simulations. This paper describes such a design framework implemented using Mentor Graphics and Lumerical Solutions design tools. In addition, we describe challenges in silicon photonics related to manufacturing, and how these can be taken into account in simulations and how these impact circuit performance.
Semi-physical simulation test for micro CMOS star sensor
NASA Astrophysics Data System (ADS)
Yang, Jian; Zhang, Guang-jun; Jiang, Jie; Fan, Qiao-yun
2008-03-01
A star sensor must be extensively tested before launch. Testing a star sensor is a complicated process demanding much time and resources. Even observing the sky from the ground is a challenging and time-consuming job, requiring complicated and expensive equipment as well as a suitable time and location, and it is prone to interference from weather. Moreover, not all stars distributed across the sky can be observed by this testing method. Semi-physical simulation in the laboratory reduces the testing cost and helps to debug, analyze and evaluate the star sensor system while the model is being developed. The test system is composed of an optical platform, a star field simulator, a star field simulator computer, the star sensor and the central data processing computer. The test system simulates starlight with high accuracy and good parallelism, and creates static or dynamic images in the FOV (field of view), so the test conditions are close to observing the real sky. With this system, the test of a micro star tracker designed by Beijing University of Aeronautics and Astronautics has been performed successfully. Several indices, including full-sky autonomous star identification time, attitude update frequency and attitude precision, meet the design requirements of the star sensor. The error sources of the testing system are also analyzed. It is concluded that the testing system is cost-saving and efficient, and contributes to optimizing the embedded algorithms, shortening the development cycle and improving the engineering design process.
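At the core of a star-field simulator is the mapping from star directions to detector coordinates; a minimal pinhole-camera sketch (focal length, pixel pitch and detector center are illustrative values, not those of the described system):

```python
import math

def project_star(unit_vec, focal_len_mm=50.0, pixel_pitch_mm=0.01,
                 center=(512.0, 512.0)):
    """Pinhole projection of a star direction (unit vector in the sensor
    frame, boresight along +z) onto detector pixel coordinates."""
    x, y, z = unit_vec
    if z <= 0.0:
        return None                       # star behind the image plane
    u = center[0] + focal_len_mm * (x / z) / pixel_pitch_mm
    v = center[1] + focal_len_mm * (y / z) / pixel_pitch_mm
    return (u, v)

boresight = project_star((0.0, 0.0, 1.0))
off_axis = project_star((math.sin(math.radians(0.1)), 0.0,
                         math.cos(math.radians(0.1))))
```

A full simulator additionally renders each projected star as a defocused spot with catalog-derived brightness so that the sensor's centroiding and identification algorithms see realistic images.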
Parameterization Interactions in Global Aquaplanet Simulations
NASA Astrophysics Data System (ADS)
Bhattacharya, Ritthik; Bordoni, Simona; Suselj, Kay; Teixeira, João.
2018-02-01
Global climate simulations rely on parameterizations of physical processes that have scales smaller than the resolved ones. In the atmosphere, these parameterizations represent moist convection, boundary layer turbulence and convection, cloud microphysics, longwave and shortwave radiation, and the interaction with the land and ocean surface. These parameterizations can generate different climates involving a wide range of interactions among parameterizations and between the parameterizations and the resolved dynamics. To gain a simplified understanding of a subset of these interactions, we perform aquaplanet simulations with the global version of the Weather Research and Forecasting (WRF) model employing a range (in terms of properties) of moist convection and boundary layer (BL) parameterizations. Significant differences are noted in the simulated precipitation amounts, its partitioning between convective and large-scale precipitation, as well as in the radiative impacts. These differences arise from the way the subcloud physics interacts with convection, both directly and through various pathways involving the large-scale dynamics and the boundary layer, convection, and clouds. A detailed analysis of the profiles of the different tendencies (from the different physical processes) for both potential temperature and water vapor is performed. While different combinations of convection and boundary layer parameterizations can lead to different climates, a key conclusion of this study is that similar climates can be simulated with model versions that are different in terms of the partitioning of the tendencies: the vertically distributed energy and water balances in the tropics can be obtained with significantly different profiles of large-scale, convection, and cloud microphysics tendencies.
Feel, Imagine and Learn!--Haptic Augmented Simulation and Embodied Instruction in Physics Learning
ERIC Educational Resources Information Center
Han, In Sook
2010-01-01
The purpose of this study was to investigate the potentials and effects of an embodied instructional model in abstract concept learning. This embodied instructional process included haptic augmented educational simulation as an instructional tool to provide perceptual experiences as well as further instruction to activate those previous…
Further development of a global pollution model for CO, CH4, and CH2O
NASA Technical Reports Server (NTRS)
Peters, L. K.
1975-01-01
Global tropospheric pollution models are developed that describe the transport and the physical and chemical processes occurring between the principal sources and sinks of CH4 and CO. Results are given of long term static chemical kinetic computer simulations and preliminary short term dynamic simulations.
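A drastically simplified version of such a budget is a one-box model coupling CH4 oxidation to CO production; all rates below are illustrative placeholders, not values from the transport-resolving model of the report:

```python
def simulate_ch4_co(years=50.0, dt=0.01, s_ch4=0.5, k_ch4=0.1, k_co=2.0):
    """One-box sketch of a coupled CH4/CO budget: CH4 is emitted at a
    constant rate s_ch4 and oxidized (first-order rate k_ch4, per year)
    to CO, which is removed with rate k_co. Units are arbitrary and all
    rate values are hypothetical."""
    ch4, co, t = 0.0, 0.0, 0.0
    while t < years:
        d_ch4 = s_ch4 - k_ch4 * ch4
        d_co = k_ch4 * ch4 - k_co * co
        ch4 += d_ch4 * dt
        co += d_co * dt
        t += dt
    return ch4, co

ch4, co = simulate_ch4_co()
```

Both species approach steady states set by source/sink balance (s_ch4/k_ch4 and k_ch4*CH4/k_co); the global models described above resolve the same chemistry spatially and add transport between boxes.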
Physical environment virtualization for human activities recognition
NASA Astrophysics Data System (ADS)
Poshtkar, Azin; Elangovan, Vinayak; Shirkhodaie, Amir; Chan, Alex; Hu, Shuowen
2015-05-01
Human activity recognition research relies heavily on extensive datasets to verify and validate the performance of activity recognition algorithms. However, obtaining real datasets is expensive and highly time-consuming. A physics-based virtual simulation can accelerate the development of context-based human activity recognition algorithms and techniques by generating relevant training and testing videos simulating diverse operational scenarios. In this paper, we discuss in detail the requisite capabilities of a virtual environment to aid as a test bed for evaluating and enhancing activity recognition algorithms. To demonstrate the numerous advantages of virtual environment development, a newly developed virtual environment simulation modeling (VESM) environment is presented here to generate calibrated multisource imagery datasets suitable for the development and testing of recognition algorithms for context-based human activities. The VESM environment serves as a versatile test bed to generate a vast amount of realistic data for training and testing of sensor processing algorithms. To demonstrate the effectiveness of the VESM environment, we present various simulated scenarios and processed results to infer proper semantic annotations from the high-fidelity imagery data for human-vehicle activity recognition under different operational contexts.
A compact physical model for the simulation of pNML-based architectures
NASA Astrophysics Data System (ADS)
Turvani, G.; Riente, F.; Plozner, E.; Schmitt-Landsiedel, D.; Breitkreutz-v. Gamm, S.
2017-05-01
Among emerging technologies, perpendicular nanomagnetic logic (pNML) appears very promising because of its capability to combine logic and memory in the same device, its scalability, 3D integration and low power consumption. Recently, full adder (FA) structures clocked by a global magnetic field have been experimentally demonstrated, and detailed characterizations of the switching process governing the domain wall (DW) nucleation probability Pnuc and nucleation time tnuc have been performed. However, the design of pNML architectures represents a crucial point in the study of this technology, with a remarkable impact on the reliability of pNML structures. Here, we present a compact model, developed in VHDL, which enables the simulation of complex pNML architectures while taking critical physical parameters into account. These parameters have been extracted from experiments, fitted by the corresponding physical equations and encapsulated into the proposed model. Within it, magnetic structures are decomposed into a few basic elements (nucleation centers, nanowires, inverters, etc.), each represented by the corresponding physical description. To validate the model, we redesigned a FA and compared our simulation results to the experiment. With this compact model of pNML devices we have envisioned a new methodology which makes it possible to simulate and test the physical behavior of complex architectures at very low computational cost.
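The experimentally characterized nucleation behavior is commonly captured by a thermally activated (Arrhenius-type) switching law; a hedged sketch in Python (the compact model itself is in VHDL, and all parameter values below are placeholders, not the fitted pNML data):

```python
import math

KB = 1.380649e-23  # Boltzmann constant, J/K

def nucleation_probability(pulse_s, field_mT, tau0_s=1e-9,
                           e0_j=1.0e-19, h_sw_mT=100.0, temp_k=300.0):
    """Thermally activated DW nucleation: tau = tau0 * exp(E(H)/kT), with
    an energy barrier that falls linearly to zero at the switching field
    h_sw_mT. P_nuc for a clocking pulse of length pulse_s follows as
    1 - exp(-pulse/tau). All parameters are illustrative placeholders."""
    barrier = e0_j * max(0.0, 1.0 - field_mT / h_sw_mT)
    tau = tau0_s * math.exp(barrier / (KB * temp_k))
    return 1.0 - math.exp(-pulse_s / tau)

p_low = nucleation_probability(1e-6, 60.0)   # weak clocking field
p_high = nucleation_probability(1e-6, 95.0)  # near the switching field
```

Encapsulating such fitted probability laws in each basic element (nucleation center, nanowire, inverter) is what lets a compact model reproduce device statistics without micromagnetic simulation.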
The QuakeSim Project: Numerical Simulations for Active Tectonic Processes
NASA Technical Reports Server (NTRS)
Donnellan, Andrea; Parker, Jay; Lyzenga, Greg; Granat, Robert; Fox, Geoffrey; Pierce, Marlon; Rundle, John; McLeod, Dennis; Grant, Lisa; Tullis, Terry
2004-01-01
In order to develop a solid earth science framework for the understanding and study of active tectonic and earthquake processes, this task develops simulation and analysis tools to study the physics of earthquakes using state-of-the-art modeling, data manipulation, and pattern recognition technologies. We develop clearly defined, accessible data formats and code protocols as inputs to the simulations. These are adapted to high-performance computers because the solid earth system is extremely complex and nonlinear, resulting in computationally intensive problems with millions of unknowns. With these tools it will be possible to construct the more complex models and simulations necessary to develop hazard assessment systems critical for reducing future losses from major earthquakes.
On Efficient Multigrid Methods for Materials Processing Flows with Small Particles
NASA Technical Reports Server (NTRS)
Thomas, James (Technical Monitor); Diskin, Boris; Harik, VasylMichael
2004-01-01
Multiscale modeling of materials requires simulations of multiple levels of structural hierarchy. The computational efficiency of numerical methods becomes a critical factor for simulating large physical systems with highly disparate length scales. Multigrid methods are known for their superior efficiency in representing/resolving different levels of physical detail. The efficiency is achieved by interactively employing different discretizations on different scales (grids). To assist optimization of manufacturing conditions for materials processing with numerous particles (e.g., dispersion of particles, controlling flow viscosity and clusters), a new multigrid algorithm has been developed for the case of multiscale modeling of flows with small particles that have various length scales. The optimal efficiency of the algorithm is crucial for accurate predictions of the effect of processing conditions (e.g., pressure and velocity gradients) on the local flow fields that control the formation of various microstructures or clusters.
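The mechanism multigrid relies on, coarse grids correcting smooth error cheaply while fine-grid smoothing handles local detail, can be shown with a generic textbook two-grid cycle for the 1D Poisson problem (this is a pedagogical sketch, not the particle-flow algorithm of the report):

```python
import math

def jacobi(u, f, h, sweeps):
    """Weighted-Jacobi smoothing for -u'' = f (zero Dirichlet ends)."""
    n = len(u)
    for _ in range(sweeps):
        new = u[:]
        for i in range(1, n - 1):
            new[i] = (2.0 / 3.0) * 0.5 * (u[i - 1] + u[i + 1] + h * h * f[i]) \
                     + (1.0 / 3.0) * u[i]
        u = new
    return u

def coarse_solve(f, h):
    """Exact tridiagonal (Thomas) solve of -u'' = f, zero Dirichlet ends."""
    n, m = len(f), len(f) - 2
    diag = [2.0] * m
    rhs = [h * h * f[i + 1] for i in range(m)]
    for i in range(1, m):                  # forward elimination
        diag[i] -= 1.0 / diag[i - 1]
        rhs[i] += rhs[i - 1] / diag[i - 1]
    u = [0.0] * n
    u[m] = rhs[m - 1] / diag[m - 1]
    for i in range(m - 2, -1, -1):         # back substitution
        u[i + 1] = (rhs[i] + u[i + 2]) / diag[i]
    return u

def two_grid(u, f, h):
    """Pre-smooth, restrict residual, solve coarse, prolong, post-smooth."""
    n = len(u)
    u = jacobi(u, f, h, 3)
    r = [0.0] * n
    for i in range(1, n - 1):              # residual r = f - A u
        r[i] = f[i] - (2 * u[i] - u[i - 1] - u[i + 1]) / (h * h)
    nc = (n - 1) // 2 + 1
    rc = [0.0] * nc                        # full-weighting restriction
    for i in range(1, nc - 1):
        rc[i] = 0.25 * r[2 * i - 1] + 0.5 * r[2 * i] + 0.25 * r[2 * i + 1]
    ec = coarse_solve(rc, 2 * h)
    for i in range(1, n - 1):              # linear prolongation + correction
        if i % 2 == 0:
            u[i] += ec[i // 2]
        else:
            u[i] += 0.5 * (ec[i // 2] + ec[i // 2 + 1])
    return jacobi(u, f, h, 3)

# Solve -u'' = pi^2 sin(pi x) on [0, 1], whose exact solution is sin(pi x).
n, h = 65, 1.0 / 64
f = [math.pi ** 2 * math.sin(math.pi * i / 64) for i in range(n)]
u = [0.0] * n
for _ in range(10):
    u = two_grid(u, f, h)
```

A full multigrid V-cycle applies the same idea recursively instead of solving the coarse grid directly; the particle-flow algorithm above adds further levels tied to the particles' own length scales.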
Observing fermionic statistics with photons in arbitrary processes
Matthews, Jonathan C. F.; Poulios, Konstantinos; Meinecke, Jasmin D. A.; Politi, Alberto; Peruzzo, Alberto; Ismail, Nur; Wörhoff, Kerstin; Thompson, Mark G.; O'Brien, Jeremy L.
2013-01-01
Quantum mechanics defines two classes of particles-bosons and fermions-whose exchange statistics fundamentally dictate quantum dynamics. Here we develop a scheme that uses entanglement to directly observe the correlated detection statistics of any number of fermions in any physical process. This approach relies on sending each of the entangled particles through identical copies of the process; by controlling a single phase parameter in the entangled state, the correlated detection statistics can be continuously tuned between bosonic and fermionic statistics. We implement this scheme via two entangled photons shared across the polarisation modes of a single photonic chip to directly mimic the fermion, boson and intermediate behaviour of two particles undergoing a continuous-time quantum walk. The ability to simulate fermions with photons is likely to have applications for verifying boson scattering and for observing particle correlations in analogue simulation using any physical platform that can prepare the entangled state prescribed here. PMID:23531788
Thought Experiments in Physics Problem-solving: On Intuition and Imagistic Simulation
ERIC Educational Resources Information Center
Georgiou, Andreas
2005-01-01
This study is part of a larger research agenda, which includes future doctoral study, aiming to investigate the psychological processes of thought experiments. How do thought-experimenters establish relations between their imaginary worlds and the physical one? How does a technique devoid of new sensory input result to new empirical knowledge? In…
Mechanics Simulations in Second Life
ERIC Educational Resources Information Center
Black, Kelly
2010-01-01
This paper examines the use of the 3-D virtual world Second Life to explore basic mechanics in physics. In Second Life, students can create scripts that take advantage of a virtual physics engine in order to conduct experiments that focus on specific phenomena. The paper explores two particular examples of this process: (1) the movement of an…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-03
... WIPP PA process culminates in a series of computer simulations that model the physical attributes of... and Processes LWA Land Withdrawal Act MSHA Mine Safety and Health Administration NMED New Mexico... Agency's technical review process was to determine whether, with the new design, the WIPP adequately...
NASA Astrophysics Data System (ADS)
Gao, Jie; Zheng, Jianrong; Zhao, Yinghui
2017-08-01
With the rapid development of LNG vehicles in China, operator training and assessment of operating skills cannot be carried out on the actual equipment because of the high-pressure, flammable and explosive characteristics of Vehicle Gas Cylinders. This paper presents the overall design and procedures of an LNG Vehicle Gas Cylinder filling simulation system based on semi-physical simulation technology, elaborates the realization of the practical analog machine, the data acquisition and control system and the computer software, and introduces the design process of the equipment simulation model in detail. With the designed assessment system for the Vehicle Gas Cylinder, the operator experiences the operation and visual effects of actual cylinder filling, while the software automatically records the operations and results, achieving operator training and assessment of operating skills on mobile special equipment.
Strategies for Large Scale Implementation of a Multiscale, Multiprocess Integrated Hydrologic Model
NASA Astrophysics Data System (ADS)
Kumar, M.; Duffy, C.
2006-05-01
Distributed models simulate hydrologic state variables in space and time while taking into account the heterogeneities in terrain, surface and subsurface properties, and meteorological forcings. The computational cost and complexity associated with these models increase with their tendency to accurately simulate the large number of interacting physical processes at fine spatio-temporal resolution in a large basin. A hydrologic model run on a coarse spatial discretization of the watershed with a limited number of physical processes imposes a lighter computational load, but this negatively affects the accuracy of the model results and restricts physical realization of the problem. So it is imperative to have an integrated modeling strategy (a) which can be universally applied at various scales in order to study the tradeoffs between computational complexity (determined by spatio-temporal resolution), accuracy and predictive uncertainty in relation to various approximations of physical processes, (b) which can be applied at adaptively different spatial scales in the same domain by taking into account the local heterogeneity of topography and hydrogeologic variables, and (c) which is flexible enough to incorporate a different number and approximation of process equations depending on model purpose and computational constraints. An efficient implementation of this strategy becomes all the more important for the Great Salt Lake river basin, which is relatively large (~89,000 sq. km) and complex in terms of hydrologic and geomorphic conditions. Also, the types and time scales of the hydrologic processes that dominate different parts of the basin differ. Part of the snowmelt runoff generated in the Uinta Mountains infiltrates and contributes as base flow to the Great Salt Lake over a time scale of decades to centuries. The adaptive strategy helps capture the steep topographic and climatic gradient along the Wasatch front.
Here we present the aforesaid modeling strategy along with an associated hydrologic modeling framework which facilitates a seamless, computationally efficient and accurate integration of the process model with the data model. The flexibility of this framework allows multiscale, multiresolution, adaptive refinement/de-refinement and nested modeling simulations with the least computational burden. However, performing these simulations and the related calibration of these models over a large basin at higher spatio-temporal resolutions is computationally intensive and requires increasing computing power. With the advent of parallel processing architectures, high computing performance can be achieved by parallelizing existing serial integrated-hydrologic-model code. This translates to running the same model simulation on a network of a large number of processors, thereby reducing the time needed to obtain a solution. The paper also discusses the implementation of the integrated model on parallel processors, the mapping of the problem onto a multi-processor environment, methods to incorporate coupling between hydrologic processes using interprocessor communication models, the model data structure, and parallel numerical algorithms to obtain high performance.
Icing simulation: A survey of computer models and experimental facilities
NASA Technical Reports Server (NTRS)
Potapczuk, M. G.; Reinmann, J. J.
1991-01-01
A survey of the current methods for simulation of the response of an aircraft or aircraft subsystem to an icing encounter is presented. The topics discussed include computer code modeling of aircraft icing and performance degradation, an evaluation of experimental facility simulation capabilities, and ice protection system evaluation tests in simulated icing conditions. Current research focused on upgrading the simulation fidelity of both experimental and computational methods is discussed. The need for increased understanding of the physical processes governing ice accretion, ice shedding, and iced airfoil aerodynamics is examined.
Integrating physically based simulators with Event Detection Systems: Multi-site detection approach.
Housh, Mashor; Ohar, Ziv
2017-03-01
The Fault Detection (FD) problem in control theory concerns monitoring a system to identify when a fault has occurred. Two approaches to FD can be distinguished: signal-processing-based FD and model-based FD. The former develops algorithms to infer faults directly from sensors' readings, while the latter uses a simulation model of the real system to analyze the discrepancy between sensors' readings and the values expected from the simulation model. Most contamination Event Detection Systems (EDSs) for water distribution systems have followed signal-processing-based FD, which relies on analyzing the signals from monitoring stations independently of each other, rather than evaluating all stations simultaneously within an integrated network. In this study, we show that a model-based EDS, which utilizes physically based water quality and hydraulic simulation models, can outperform the signal-processing-based EDS. We also show that the model-based EDS can facilitate the development of a Multi-Site EDS (MSEDS), which analyzes the data from all the monitoring stations simultaneously within an integrated network. The advantage of the joint analysis in the MSEDS is expressed by increased detection accuracy (more true positive alarms and fewer false alarms) and shorter detection time. Copyright © 2016 Elsevier Ltd. All rights reserved.
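The model-based, multi-site detection logic described above can be sketched in a few lines. This is a hedged illustration with invented thresholds, noise levels and station values, not the authors' algorithm: compare each station's reading to the value the hydraulic/water-quality simulation predicts and pool the normalized residuals across all stations.

```python
import numpy as np

def detect_event(readings, expected, sigma, z_thresh=3.0):
    """Model-based, multi-site event detection sketch: pool normalized
    residuals between measured and simulation-predicted values over all
    monitoring stations. Threshold and data are illustrative assumptions,
    not the authors' algorithm.
    """
    residuals = (readings - expected) / sigma      # per-station z-scores
    joint_score = float(np.sqrt(np.mean(residuals ** 2)))  # pooled RMS
    return joint_score > z_thresh, joint_score

# Single-site logic would test each station alone; the pooled score can
# catch weak but consistent deviations that no single station would flag.
expected = np.ones(3)          # values the simulation model predicts
sigma = np.full(3, 0.05)       # assumed sensor noise per station
normal_flag, _ = detect_event(np.array([1.02, 0.98, 1.01]), expected, sigma)
event_flag, _ = detect_event(np.array([1.20, 1.25, 1.18]), expected, sigma)
```

Pooling residuals is what gives the multi-site scheme its edge: a deviation too small to trip any single station's alarm still raises the joint score when it appears consistently across the network.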
The methods used for simulating aerosol physical and chemical processes in a new air pollution modeling system are discussed and analyzed. Such processes include emissions, nucleation, coagulation, reversible chemistry, condensation, dissolution, evaporation, irreversible chem...
Microphysics in Multi-scale Modeling System with Unified Physics
NASA Technical Reports Server (NTRS)
Tao, Wei-Kuo
2012-01-01
Recently, a multi-scale modeling system with unified physics was developed at NASA Goddard. It consists of (1) a cloud-resolving model (the Goddard Cumulus Ensemble model, GCE), (2) a regional-scale model (the NASA-unified Weather Research and Forecasting model, WRF), (3) a coupled CRM and global model (the Goddard Multi-scale Modeling Framework, MMF), and (4) a land modeling system. The same microphysical processes, long- and short-wave radiative transfer, land processes, and explicit cloud-radiation and cloud-land surface interactive processes are applied across this multi-scale modeling system. The modeling system has been coupled with a multi-satellite simulator to use NASA high-resolution satellite data to identify the strengths and weaknesses of the cloud and precipitation processes simulated by the model. In this talk, a review of developments and applications of the multi-scale modeling system will be presented. In particular, the microphysics development and its performance within the multi-scale modeling system will be presented.
How Monte Carlo heuristics aid to identify the physical processes of drug release kinetics.
Lecca, Paola
2018-01-01
We implement a Monte Carlo heuristic algorithm to model drug release from a solid dosage form. We show that with Monte Carlo simulations it is possible to identify and explain the causes of the unsatisfactory predictive power of current drug release models. It is well known that the power-law and exponential models, as well as those derived from or inspired by them, accurately reproduce only the first 60% of the release curve of a drug from a dosage form. In this study, using Monte Carlo simulation approaches, we show that these models fit almost the entire release profile quite accurately when the release kinetics is not governed by the coexistence of different physico-chemical mechanisms. We show that the accuracy of the traditional models is comparable with that of Monte Carlo heuristics when these heuristics approximate and oversimplify the phenomenology of drug release. This observation suggests developing and using novel Monte Carlo simulation heuristics able to describe the complexity of the release kinetics, and consequently to generate data more similar to those observed in real experiments. Implementing Monte Carlo simulation heuristics of the drug release phenomenology may be much more straightforward and efficient than hypothesizing and implementing complex mathematical models of the physical processes involved in drug release from scratch. Identifying and understanding through simulation heuristics which processes of this phenomenology reproduce the observed data, and then formalizing them in mathematics, may make it possible to avoid time-consuming, trial-and-error-based regression procedures. Three bullet points highlight the customization of the procedure. •An efficient heuristic based on Monte Carlo methods for simulating drug release from a solid dosage form is presented. 
It specifies the model of the physical process in a simple but accurate way in the formula of the Monte Carlo Micro Step (MCS) time interval.•Given the experimentally observed curve of drug release, we point out how Monte Carlo heuristics can be integrated into an evolutionary algorithmic approach to infer the mode of the MCS best fitting the observed data, and thus the observed release kinetics.•The software implementing the method is written in R, the most widely used free language in the bioinformatics community.
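A minimal sketch of the kind of Monte Carlo heuristic discussed above — our own toy lattice model, not the paper's MCS formulation: drug particles random-walk out of a slab matrix, and the cumulative fraction escaped traces the release curve.

```python
import random

def simulate_release(n_particles=2000, depth=20, n_steps=4000, seed=1):
    """Toy Monte Carlo model of diffusive drug release from a slab matrix.

    Each particle starts at a random depth and performs an unbiased random
    walk; it counts as released once it reaches the surface (position 0).
    The lattice, parameters and names are illustrative assumptions, not
    the paper's MCS heuristics.
    """
    rng = random.Random(seed)
    positions = [rng.randint(1, depth) for _ in range(n_particles)]
    released = 0
    curve = []  # cumulative fraction released after each step
    for _ in range(n_steps):
        remaining = []
        for x in positions:
            x += rng.choice((-1, 1))
            if x <= 0:
                released += 1                    # particle escapes the matrix
            else:
                remaining.append(min(x, depth))  # reflect at the far wall
        positions = remaining
        curve.append(released / n_particles)
    return curve

curve = simulate_release()
# Early release is roughly sqrt(t)-like; the late part saturates, which is
# the regime where simple power-law fits are known to break down.
```

Even this crude heuristic reproduces the qualitative shape that the abstract discusses: an initial power-law-like rise followed by saturation as the matrix empties.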
An Example-Based Brain MRI Simulation Framework.
He, Qing; Roy, Snehashis; Jog, Amod; Pham, Dzung L
2015-02-21
The simulation of magnetic resonance (MR) images plays an important role in the validation of image analysis algorithms, such as image segmentation, due to the lack of sufficient ground truth in real MR images. Previous work on MRI simulation has focused on explicitly modeling the MR image formation process. However, because of the overwhelming complexity of MR acquisition, these simulations must involve simplifications and approximations that can result in visually unrealistic simulated images. In this work, we describe an example-based simulation framework, which uses an "atlas" consisting of an MR image and its anatomical models derived from a hard segmentation. The relationships between the MR image intensities and its anatomical models are learned using a patch-based regression that implicitly models the physics of MR image formation. Given the anatomical models of a new brain, a new MR image can be simulated using the learned regression. This approach has been extended to also simulate intensity inhomogeneity artifacts based on a statistical model of the training data. Results show that the example-based MRI simulation method is capable of simulating different image contrasts and is robust to different choices of atlas. The simulated images resemble real MR images more closely than simulations produced by a physics-based model.
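The example-based idea can be illustrated with a toy nearest-neighbor stand-in for the patch-based regression. All function names, the tiny two-tissue "atlas" and the 1-NN rule are our own assumptions for the sketch, not the paper's learned regression:

```python
import numpy as np

def simulate_from_atlas(atlas_labels, atlas_image, target_labels, patch=3):
    """Toy example-based simulation: for every pixel of the target label
    map, find the atlas location whose label patch matches best and copy
    that atlas intensity. A 1-nearest-neighbor stand-in for the paper's
    patch-based regression; all names here are illustrative assumptions.
    """
    pad = patch // 2
    al = np.pad(atlas_labels, pad, mode='edge')
    tl = np.pad(target_labels, pad, mode='edge')
    # "Training": collect atlas label patches and their center intensities.
    h, w = atlas_labels.shape
    keys, vals = [], []
    for i in range(h):
        for j in range(w):
            keys.append(al[i:i + patch, j:j + patch].ravel())
            vals.append(atlas_image[i, j])
    keys, vals = np.array(keys, float), np.array(vals, float)
    # "Simulation": the nearest label patch decides the synthetic intensity.
    out = np.zeros_like(target_labels, float)
    for i in range(target_labels.shape[0]):
        for j in range(target_labels.shape[1]):
            q = tl[i:i + patch, j:j + patch].ravel().astype(float)
            out[i, j] = vals[np.argmin(((keys - q) ** 2).sum(axis=1))]
    return out

# Two-"tissue" toy atlas: label 1 maps to bright, label 0 to dark.
atlas_lab = np.array([[0, 0, 1, 1]] * 4)
atlas_img = np.where(atlas_lab == 1, 100.0, 10.0)
target_lab = np.array([[1, 1, 0, 0]] * 4)
sim = simulate_from_atlas(atlas_lab, atlas_img, target_lab)
```

The point of the design, as in the abstract, is that the intensity-label relationship is learned from the atlas rather than derived from an explicit physical model of MR acquisition.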
DOE Office of Scientific and Technical Information (OSTI.GOV)
Radhakrishnan, Balasubramaniam; Fattebert, Jean-Luc; Gorti, Sarma B.
Additive Manufacturing (AM) refers to a process by which digital three-dimensional (3-D) design data is used to build up a component by depositing material layer by layer. United Technologies Corporation (UTC) is currently involved in the fabrication and certification of several AM aerospace structural components made from aerospace materials. This is accomplished by using optimized process parameters determined through numerous design-of-experiments (DOE)-based studies. Certification of these components is broadly recognized as a significant challenge, with long lead times, very expensive new product development cycles and very high energy consumption. Because of these challenges, United Technologies Research Center (UTRC), together with UTC business units, has been developing and validating an advanced physics-based process model. The specific goal is to develop a physics-based framework of an AM process and reliably predict fatigue properties of built-up structures based on detailed solidification microstructures. Microstructures are predicted using process control parameters including energy source power, scan velocity, deposition pattern, and powder properties. The multi-scale multi-physics model requires solution and coupling of the governing physics that will allow prediction of the thermal field and enable solution at the microstructural scale. The state-of-the-art approach to solving these problems requires a huge computational framework, and this kind of resource is only available within academia and national laboratories. The project utilized the parallel phase-field codes at Oak Ridge National Laboratory (ORNL) and Lawrence Livermore National Laboratory (LLNL), along with the high-performance computing (HPC) capabilities existing at the two labs, to demonstrate the simulation of multiple dendrite growth in three dimensions (3-D). 
The LLNL code AMPE was used to implement the UTRC phase-field model that was previously developed for a model binary alloy, and the simulation results were compared against the UTRC simulation results, followed by extension of the UTRC model to simulate multiple dendrite growth in 3-D. The ORNL MEUMAPPS code was used to simulate dendritic growth in a model ternary alloy with the same equilibrium solidification range as the Ni-base alloy 718, using realistic model parameters, including thermodynamic integration with a Calphad-based model for the ternary alloy. Implementation of the UTRC model in AMPE met with several numerical and parametric issues that were resolved, and good agreement between the simulation results obtained by the two codes was demonstrated for two-dimensional (2-D) dendrites. 3-D dendrite growth was then demonstrated with the AMPE code using nondimensional parameters obtained in 2-D simulations. Multiple dendrite growth in 2-D and 3-D was demonstrated using ORNL's MEUMAPPS code with simple thermal boundary conditions. MEUMAPPS was then modified to incorporate the complex, time-dependent thermal boundary conditions obtained by UTRC's thermal modeling of single-track AM experiments to drive the phase-field simulations. The results were in good agreement with UTRC's experimental measurements.
NASA Astrophysics Data System (ADS)
Massmann, J.; Nagel, T.; Bilke, L.; Böttcher, N.; Heusermann, S.; Fischer, T.; Kumar, V.; Schäfers, A.; Shao, H.; Vogel, P.; Wang, W.; Watanabe, N.; Ziefle, G.; Kolditz, O.
2016-12-01
As part of the German site selection process for a high-level nuclear waste repository, different repository concepts in the geological candidate formations rock salt, claystone and crystalline rock are being discussed. An open assessment of these concepts using numerical simulations requires physical models capturing the individual particularities of each rock type and associated geotechnical barrier concept to a comparable level of sophistication. In a joint work group of the Helmholtz Centre for Environmental Research (UFZ) and the German Federal Institute for Geosciences and Natural Resources (BGR), scientists of the UFZ are developing and implementing multiphysical process models while BGR scientists apply them to large-scale analyses. The advances in simulation methods for waste repositories are incorporated into the open-source code OpenGeoSys. Here, recent application-driven progress in this context is highlighted. A robust implementation of visco-plasticity with temperature-dependent properties into a framework for the thermo-mechanical analysis of rock salt will be shown. The model enables the simulation of heat transport along with its consequences for the elastic response as well as for primary and secondary creep or the occurrence of dilatancy in the repository near field. Transverse isotropy, non-isothermal hydraulic processes and their coupling to mechanical stresses are taken into account for the analysis of repositories in claystone. These processes are also considered in the near-field analyses of engineered barrier systems, including the swelling/shrinkage of the bentonite material. The temperature-dependent saturation evolution around the heat-emitting waste container is described by different multiphase flow formulations. For all mentioned applications, we illustrate the workflow from model development and implementation, through verification and validation, to repository-scale application simulations using methods of high-performance computing.
Molecular dynamics simulations through GPU video games technologies
Loukatou, Styliani; Papageorgiou, Louis; Fakourelis, Paraskevas; Filntisi, Arianna; Polychronidou, Eleftheria; Bassis, Ioannis; Megalooikonomou, Vasileios; Makałowski, Wojciech; Vlachakis, Dimitrios; Kossida, Sophia
2016-01-01
Bioinformatics is the scientific field that focuses on the application of computer technology to the management of biological information. Over the years, bioinformatics applications have been used to store, process and integrate biological and genetic information, using a wide range of methodologies. One of the techniques most widely used to understand the physical movements of atoms and molecules is molecular dynamics (MD). MD is an in silico method to simulate the physical motions of atoms and molecules under certain conditions. It has become a strategic technique and now plays a key role in many areas of the exact sciences, such as chemistry, biology, physics and medicine. Due to their complexity, MD calculations can require enormous amounts of computer memory and time, and therefore their execution has been a big problem. Despite the huge computational cost, molecular dynamics has traditionally been implemented on computers built around a central processing unit (CPU). Graphics processing unit (GPU) computing technology was first designed with the goal of improving video games, by rapidly creating and displaying images in a frame buffer such as a screen. The hybrid GPU-CPU implementation, combined with parallel computing, is a novel technology for performing a wide range of calculations. GPUs have been proposed and used to accelerate many scientific computations including MD simulations. Herein, we describe the new methodologies developed initially for video games and how they are now applied in MD simulations. PMID:27525251
A Goddard Multi-Scale Modeling System with Unified Physics
NASA Technical Reports Server (NTRS)
Tao, W.K.; Anderson, D.; Atlas, R.; Chern, J.; Houser, P.; Hou, A.; Lang, S.; Lau, W.; Peters-Lidard, C.; Kakar, R.;
2008-01-01
Numerical cloud-resolving models (CRMs), which are based on the non-hydrostatic equations of motion, have been extensively applied to cloud-scale and mesoscale processes during the past four decades. Recent GEWEX Cloud System Study (GCSS) model comparison projects have indicated that CRMs agree with observations in simulating various types of clouds and cloud systems from different geographic locations. Cloud-resolving models now provide statistical information useful for developing more realistic physically based parameterizations for climate models and numerical weather prediction models. It is also expected that Numerical Weather Prediction (NWP) and regional-scale models can be run at grid sizes similar to cloud-resolving models through nesting techniques. Current and future NASA satellite programs can provide cloud, precipitation, aerosol and other data at very fine spatial and temporal scales. Using these satellite data to improve the understanding of the physical processes that are responsible for the variation in global and regional climate and hydrological systems requires a coupled global circulation model (GCM) and cloud-scale model (termed a super-parameterization or multi-scale modeling framework, MMF). The use of a GCM will enable global coverage, and the use of a CRM will allow for better and more sophisticated physical parameterization. NASA satellites and field campaigns can provide initial conditions as well as validation through the Earth Satellite simulators. At Goddard, we have developed a multi-scale modeling system with unified physics. The modeling system consists of a coupled GCM-CRM (or MMF), a state-of-the-art Weather Research and Forecasting model (WRF) and a cloud-resolving model (the Goddard Cumulus Ensemble model). In these models, the same microphysical schemes (2ICE, several 3ICE), radiation (including explicitly calculated cloud optical properties), and surface models are applied. 
In addition, a comprehensive unified Earth Satellite simulator has been developed at GSFC, which is designed to fully utilize the multi-scale modeling system. A brief review of the multi-scale modeling system with unified physics/simulator and examples is presented in this article.
Simulation of Ejecta Production and Mixing Process of Sn Sample under shock loading
NASA Astrophysics Data System (ADS)
Wang, Pei; Chen, Dawei; Sun, Haiquan; Ma, Dongjun
2017-06-01
Ejection may occur when a strong shock wave releases at the free surface of a metal, forming an ejecta of high-speed particulate matter that further mixes with the surrounding gas. Ejecta production and the subsequent mixing process remain among the most difficult unresolved problems in shock physics and have many important engineering applications in imploding compression science. The present paper introduces a methodology for the theoretical modeling and numerical simulation of this complex ejection and mixing process. The ejecta production is decoupled from the particle mixing process, and the ejecta state can be obtained by direct numerical simulation of the evolution of initial defects on the metal surface. The particle mixing process can then be simulated and resolved by a two-phase gas-particle model which uses the aforementioned ejecta state as the initial condition. A preliminary ejecta experiment on a planar Sn metal sample has validated the feasibility of the proposed methodology.
Addressing spatial scales and new mechanisms in climate impact ecosystem modeling
NASA Astrophysics Data System (ADS)
Poulter, B.; Joetzjer, E.; Renwick, K.; Ogunkoya, G.; Emmett, K.
2015-12-01
Climate change impacts on vegetation distributions are typically addressed using either an empirical approach, such as a species distribution model (SDM), or process-based methods, for example, dynamic global vegetation models (DGVMs). Each approach has its own benefits and disadvantages. For example, an SDM is constrained by data and few parameters, but does not include adaptation or acclimation processes or other ecosystem feedbacks that may act to mitigate or enhance climate effects. Alternatively, a DGVM includes many mechanisms relating plant growth and disturbance to climate, but simulations are costly to perform at high spatial resolution and large uncertainty remains in a variety of fundamental physical processes. To address these issues, here, we present two DGVM-based case studies in which (i) high-resolution (1 km) simulations are being performed for vegetation in the Greater Yellowstone Ecosystem using a biogeochemical forest gap model, LPJ-GUESS, and (ii) new mechanisms for simulating tropical tree mortality are being introduced. High-resolution DGVM simulations require not only computing and reorganizing code, but also a consideration of scaling issues in vegetation dynamics and stochasticity, and in disturbance and migration. New mechanisms for simulating forest mortality must consider hydraulic limitations and carbon reserves and their interactions in source-sink dynamics and in controlling water potentials. Improving DGVM approaches by addressing spatial-scale challenges and integrating new approaches for estimating forest mortality will provide new insights more relevant for land management, and possibly reduce uncertainty by representing physical processes in ways more directly comparable to experimental and observational evidence.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hansen, E.A.; Smed, P.F.; Bryndum, M.B.
The paper describes the numerical program PIPESIN, which simulates the behavior of a pipeline placed on an erodible seabed. PIPEline Seabed INteraction, from installation until a stable pipeline-seabed configuration has developed, is simulated in the time domain, including all important physical processes. The program is the result of the joint research project "Free Span Development and Self-lowering of Offshore Pipelines," sponsored by the EU and a group of companies and carried out by the Danish Hydraulic Institute and Delft Hydraulics. The basic modules of PIPESIN are described. The description of the scouring processes has been based on, and verified through, physical model tests carried out as part of the research project. The program simulates a section of the pipeline (typically 500 m) in the time domain, the main input being time series of the waves and current. The main results include predictions of the onset of free spans, their length distribution, their variation in time, and the lowering of the pipeline as a function of time.
NASA Technical Reports Server (NTRS)
Knox, James Clinton
2016-01-01
The 1-D axially dispersed plug flow model is a mathematical model widely used for the simulation of adsorption processes. Lumped mass transfer coefficients, such as the Glueckauf linear driving force (LDF) term and the axial dispersion coefficient, are generally obtained by fitting simulation results to experimental breakthrough test data. An approach is introduced where these parameters, along with the only free parameter in the energy balance equations, are individually fit to specific test data that isolate the appropriate physics. It is shown that with this approach the model provides excellent simulation results for the CO2 on zeolite 5A sorbent/sorbate system; however, for the H2O on zeolite 5A system, non-physical deviations from constant pattern behavior occur when fitting dispersive experimental results with a large axial dispersion coefficient. A method has also been developed that determines a priori what values of the LDF and axial dispersion terms will result in non-physical simulation results for a specific sorbent/sorbate system when using the one-dimensional axially dispersed plug flow model. A relationship is derived between the steepness of the adsorption equilibrium isotherm, as indicated by the distribution factor, the magnitude of the axial dispersion and mass transfer coefficients, and the resulting non-physical behavior. This relationship is intended to provide a guide for avoiding non-physical behavior by limiting the magnitude of the axial dispersion term on the basis of the mass transfer coefficient and distribution factor.
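The Glueckauf LDF term mentioned above can be sketched as a one-line relaxation law. The constants below are illustrative assumptions, not fitted values from the breakthrough tests:

```python
def ldf_uptake(q_star, k_ldf, dt, n_steps):
    """Glueckauf linear driving force (LDF) approximation:
        dq/dt = k_LDF * (q* - q),
    i.e. the adsorbed loading q relaxes toward its equilibrium value q*.
    Explicit-Euler sketch with illustrative, not fitted, constants.
    """
    q, history = 0.0, []
    for _ in range(n_steps):
        q += k_ldf * (q_star - q) * dt
        history.append(q)
    return history

# Analytically q(t) = q* (1 - exp(-k_LDF * t)): exponential approach to
# equilibrium, which is what gets fitted against breakthrough data.
hist = ldf_uptake(q_star=2.5, k_ldf=0.01, dt=1.0, n_steps=1000)
```

In the full 1-D axially dispersed plug flow model, this lumped term replaces the detailed intraparticle diffusion physics, which is exactly why its fitted magnitude relative to the axial dispersion coefficient can push the solution into the non-physical regime the abstract describes.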
NASA Technical Reports Server (NTRS)
Tao, Wei Kuo; Chen, C.-S.; Jia, Y.; Baker, D.; Lang, S.; Wetzel, P.; Lau, W. K.-M.
2001-01-01
Several heavy precipitation episodes occurred over Taiwan from August 10 to 13, 1994. Precipitation patterns and characteristics are quite different between the precipitation events that occurred on August 10 and 11 and those on August 12 and 13. In Part I (Chen et al. 2001), the environmental situation and precipitation characteristics are analyzed using the EC/TOGA data, ground-based radar data, surface rainfall patterns, surface wind data, and upper-air soundings. In this study (Part II), the Penn State/NCAR Mesoscale Model (MM5) is used to study the precipitation characteristics of these heavy precipitation events. Various physical processes (schemes) developed at NASA Goddard Space Flight Center (i.e., a cloud microphysics scheme, a radiative transfer model, and a land-soil-vegetation surface model) have recently been implemented into the MM5. These physical packages are described in the paper. Two-way interactive nested grids are used with horizontal resolutions of 45, 15 and 5 km. The model results indicated that cloud physics, land surface and radiation processes generally do not change the location (horizontal distribution) of heavy precipitation. The Goddard 3-class ice scheme produced more rainfall than the 2-class scheme. The Goddard multi-broad-band radiative transfer model reduced precipitation compared to a one-broad-band (emissivity) radiation model. The Goddard land-soil-vegetation surface model also reduced the rainfall compared to a simple surface model in which the surface temperature is computed from a surface energy budget following the "force-restore" method. However, model runs including all Goddard physical processes enhanced precipitation significantly for both cases. The results from these runs are in better agreement with observations. Despite improved simulations using different physical schemes, there are still some deficiencies in the model simulations. Some potential problems are discussed. 
Sensitivity tests (removing either terrain or radiative processes) are performed to identify the physical processes that determine the precipitation patterns and characteristics of heavy rainfall events. These sensitivity tests indicated that terrain can play a major role in determining the exact location of both precipitation events. The terrain can also play a major role in determining the intensity of precipitation for both events; however, it has a large impact on one event and a smaller one on the other. The radiative processes are also important for determining the precipitation patterns for one case but not the other. The radiative processes can also affect the total rainfall in both cases to different extents.
King, W. E.; Anderson, A. T.; Ferencz, R. M.; ...
2015-12-29
The production of metal parts via laser powder bed fusion additive manufacturing is growing exponentially. However, the transition of this technology from production of prototypes to production of critical parts is hindered by a lack of confidence in the quality of the part. Confidence can be established via a fundamental understanding of the physics of the process. It is generally accepted that this understanding will be increasingly achieved through modeling and simulation. However, there are significant physics, computational, and materials challenges stemming from the broad range of length and time scales and temperature ranges associated with the process. In this study, we review the current state of the art and describe the challenges that need to be met to achieve the desired fundamental understanding of the physics of the process.
Yang, Tao; Sezer, Hayri; Celik, Ismail B.; ...
2015-06-02
In the present paper, a physics-based procedure combining experiments and multi-physics numerical simulations is developed for the overall analysis of SOFC operational diagnostics and performance predictions. In this procedure, essential information for the fuel cell is first extracted by utilizing empirical polarization analysis in conjunction with experiments, and then refined by multi-physics numerical simulations via simultaneous analysis and calibration of the polarization curve and impedance behavior. The performance at different utilization cases and operating currents is also predicted to confirm the accuracy of the proposed model. It is demonstrated that, with the present electrochemical model, three air/fuel flow conditions are needed to produce a complete data set for better understanding of the processes occurring within SOFCs. After calibration against button cell experiments, the methodology can be used to assess the performance of a planar cell without further calibration. The proposed methodology would accelerate the calibration process and improve the efficiency of design and diagnostics.
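The empirical polarization analysis mentioned above can be illustrated with a generic polarization-curve model: cell voltage equals open-circuit voltage minus activation, ohmic, and concentration losses. The sketch below is a minimal stand-in with invented parameter values; it is not the electrochemical model of the paper.

```python
import math

def cell_voltage(i, e_oc=1.05, b=0.05, i0=0.02, r_ohm=0.25, i_lim=2.0, c=0.05):
    """Generic fuel-cell polarization curve (all parameter values illustrative).

    i      : current density (normalized units)
    e_oc   : open-circuit voltage
    The three loss terms are the usual activation, ohmic, and
    concentration overpotentials.
    """
    eta_act = b * math.asinh(i / (2.0 * i0))      # activation overpotential
    eta_ohm = r_ohm * i                           # ohmic loss
    eta_conc = -c * math.log(1.0 - i / i_lim)     # concentration loss (i < i_lim)
    return e_oc - eta_act - eta_ohm - eta_conc
```

Fitting the few free parameters of such a curve to measured voltage/current pairs is the kind of calibration step the procedure automates.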
An Integrated Simulation Module for Cyber-Physical Automation Systems †
Ferracuti, Francesco; Freddi, Alessandro; Monteriù, Andrea; Prist, Mariorosario
2016-01-01
The integration of Wireless Sensors Networks (WSNs) into Cyber Physical Systems (CPSs) is an important research problem to solve in order to increase the performance, safety, reliability and usability of wireless automation systems. Due to the complexity of real CPSs, emulators and simulators are often used to replace the real control devices and physical connections during the development stage. The most widespread simulators are free, open source, expandable, flexible and fully integrated into mathematical modeling tools; however, the connection at a physical level and the direct interaction with the real process via the WSN are only marginally tackled; moreover, the simulated wireless sensor motes are not able to generate the analogue output typically required for control purposes. A new simulation module for the control of a wireless cyber-physical system is proposed in this paper. The module integrates the COntiki OS JAva Simulator (COOJA), a cross-level wireless sensor network simulator, and the LabVIEW system design software from National Instruments. The proposed software module has been called “GILOO” (Graphical Integration of Labview and cOOja). It allows one to develop and to debug control strategies over the WSN both using virtual or real hardware modules, such as the National Instruments Real-Time Module platform, the CompactRio, the Supervisory Control And Data Acquisition (SCADA), etc. To test the proposed solution, we decided to integrate it with one of the most popular simulators, i.e., the Contiki OS, and wireless motes, i.e., the Sky mote. As a further contribution, the Contiki Sky DAC driver and a new “Advanced Sky GUI” have been proposed and tested in the COOJA Simulator in order to provide the possibility to develop control over the WSN. To test the performance of the proposed GILOO software module, several experimental tests have been performed, and interesting preliminary results are reported. 
The GILOO module has been applied to a smart home mock-up where a networked control has been developed for the LED lighting system. PMID:27164109
NASA Astrophysics Data System (ADS)
Zhang, G.; Chen, F.; Gan, Y.
2017-12-01
Assessing and mitigating uncertainties in the Noah-MP land-model simulations over the Tibet Plateau region. Guo Zhang, Fei Chen, and Yanjun Gan (State Key Laboratory of Severe Weather, Chinese Academy of Meteorological Sciences, Beijing, China; Fei Chen also at the National Center for Atmospheric Research, Boulder, Colorado, USA). Uncertainties in the Noah with multiparameterization (Noah-MP) land surface model were assessed through physics ensemble simulations for four sparsely vegetated sites located in the Tibetan Plateau region. Those simulations were evaluated using observations at the four sites during the third Tibetan Plateau Experiment (TIPEX III). The impacts on the physics-ensemble simulations of uncertainties in the precipitation data used as forcing, and in parameterizations of sub-processes such as soil organic matter and the rhizosphere, are identified using two different methods: natural selection and Tukey's test. This study attempts to answer the following questions: 1) what is the relative contribution of precipitation-forcing uncertainty to the overall uncertainty range of Noah-MP simulations at those sites, as compared to that at a moister and more densely vegetated site; 2) what are the most sensitive physical parameterizations for those sites; and 3) can we identify the parameterizations that need to be improved? The investigation was conducted by evaluating the simulated seasonal evolution of soil temperature, soil moisture, and surface heat fluxes through a number of Noah-MP ensemble simulations.
Casino physics in the classroom
NASA Astrophysics Data System (ADS)
Whitney, Charles A.
1986-12-01
This article describes a seminar on the elements of probability and random processes that is computer centered and focuses on Monte Carlo simulations of processes such as coin flips, random walks on a lattice, and the behavior of photons and atoms in a gas. Representative computer programs are also described.
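A minimal version of the Monte Carlo exercises mentioned (coin flips and a one-dimensional lattice random walk) might look like the following sketch; the function names and parameters are illustrative, not the seminar's actual programs.

```python
import random

def coin_flip_fraction(n_flips, seed=0):
    """Estimate the fraction of heads in n_flips fair coin tosses."""
    rng = random.Random(seed)
    heads = sum(rng.random() < 0.5 for _ in range(n_flips))
    return heads / n_flips

def random_walk_msd(n_walks, n_steps, seed=0):
    """Mean squared displacement of 1-D lattice random walks.

    For an unbiased walk the result should approach n_steps.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_walks):
        x = 0
        for _ in range(n_steps):
            x += rng.choice((-1, 1))  # one lattice step left or right
        total += x * x
    return total / n_walks
```

Comparing the simulated mean squared displacement against the theoretical value n_steps is exactly the kind of exercise that connects "casino" randomness to diffusion physics.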
Simulation of plasma loading of high-pressure RF cavities
NASA Astrophysics Data System (ADS)
Yu, K.; Samulyak, R.; Yonehara, K.; Freemire, B.
2018-01-01
Muon beam-induced plasma loading of radio-frequency (RF) cavities filled with high-pressure hydrogen gas with a 1% dry air dopant has been studied via numerical simulations. The electromagnetic code SPACE, which resolves the relevant atomic physics processes, including ionization by the muon beam, electron attachment to dopant molecules, and electron-ion and ion-ion recombination, has been used. Simulation studies have been performed over the range of parameters typical of practical muon cooling channels.
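The listed atomic physics processes (beam ionization, attachment to dopant molecules, and electron-ion and ion-ion recombination) can be written as coupled rate equations. The sketch below integrates a normalized version with forward Euler; all coefficients are invented for illustration and are not the rates used in SPACE.

```python
def evolve_plasma(n_steps=1000, dt=0.01, S=1.0, k_att=0.1,
                  beta_ei=0.01, beta_ii=0.02):
    """Normalized rate equations for beam-induced plasma densities.

    n_e: electrons, n_p: positive ions, n_n: negative ions; all start at 0.
    S is the beam ionization source, k_att the attachment rate,
    beta_ei / beta_ii the electron-ion / ion-ion recombination coefficients.
    """
    n_e = n_p = n_n = 0.0
    for _ in range(n_steps):
        d_e = S - k_att * n_e - beta_ei * n_e * n_p          # source, attachment, e-ion recomb.
        d_p = S - beta_ei * n_e * n_p - beta_ii * n_n * n_p  # source, both recombinations
        d_n = k_att * n_e - beta_ii * n_n * n_p              # attachment feeds negative ions
        n_e += dt * d_e
        n_p += dt * d_p
        n_n += dt * d_n
    return n_e, n_p, n_n
```

Note that the equations conserve charge by construction: the positive-ion density always equals the sum of electron and negative-ion densities.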
Featured Image: The Simulated Collapse of a Core
NASA Astrophysics Data System (ADS)
Kohler, Susanna
2016-11-01
This stunning snapshot is from a simulation of a core-collapse supernova. Despite having been studied for many decades, the mechanism driving the explosions of core-collapse supernovae is still an area of active research. Extremely complex simulations such as this one represent best efforts to include as many realistic physical processes as is currently computationally feasible. In this study led by Luke Roberts (a NASA Einstein Postdoctoral Fellow at Caltech at the time), a core-collapse supernova is modeled long-term in fully 3D simulations that include the effects of general relativity, radiation hydrodynamics, and even neutrino physics. The authors use these simulations to examine the evolution of a supernova after its core bounce. To read more about the team's findings (and see more awesome images from their simulations), check out the paper below! Citation: Luke F. Roberts et al 2016 ApJ 831 98. doi:10.3847/0004-637X/831/1/98
ERIC Educational Resources Information Center
Gates, Alexander E.
2017-01-01
A simulated physical model of volcanic processes using a glass art studio greatly enhanced enthusiasm and learning among urban, middle- to high-school aged, largely underrepresented minority students in Newark, New Jersey. The collaboration of a geoscience department with a glass art studio to create a science, technology, engineering, arts, and…
ERIC Educational Resources Information Center
Riggi, Simone; La Rocca, Paola; Riggi, Francesco
2011-01-01
GEANT4 simulations of the processes affecting the transport and collection of optical photons generated inside a scintillation detector were carried out, with the aim to complement the educational material offered by textbooks to third-year physics undergraduates. Two typical situations were considered: a long scintillator strip with and without a…
USDA-ARS?s Scientific Manuscript database
Classical, one-dimensional, mobile bed, sediment-transport models simulate vertical channel adjustment, raising or lowering cross-section node elevations to simulate erosion or deposition. This approach does not account for bank erosion processes including toe scour and mass failure. In many systems...
Neurological evidence linguistic processes precede perceptual simulation in conceptual processing.
Louwerse, Max; Hutchinson, Sterling
2012-01-01
There is increasing evidence from response time experiments that language statistics and perceptual simulations both play a role in conceptual processing. In an EEG experiment we compared neural activity in cortical regions commonly associated with linguistic processing and visual perceptual processing to determine to what extent symbolic and embodied accounts of cognition applied. Participants were asked to determine the semantic relationship of word pairs (e.g., sky - ground) or to determine their iconic relationship (i.e., if the presentation of the pair matched their expected physical relationship). A linguistic bias was found toward the semantic judgment task and a perceptual bias was found toward the iconicity judgment task. More importantly, conceptual processing involved activation in brain regions associated with both linguistic and perceptual processes. When comparing the relative activation of linguistic cortical regions with perceptual cortical regions, the effect sizes for linguistic cortical regions were larger than those for the perceptual cortical regions early in a trial with the reverse being true later in a trial. These results map upon findings from other experimental literature and provide further evidence that processing of concept words relies both on language statistics and on perceptual simulations, whereby linguistic processes precede perceptual simulation processes.
NASA Astrophysics Data System (ADS)
Tong, Qiujie; Wang, Qianqian; Li, Xiaoyang; Shan, Bin; Cui, Xuntai; Li, Chenyu; Peng, Zhong
2016-11-01
To satisfy real-time and generality requirements, a laser target simulator for a semi-physical simulation system based on the RTX + LabWindows/CVI platform is proposed in this paper. Compared with the upper/lower-computer simulation platform architecture used in most current real-time systems, this system has better maintainability and portability. It runs on the Windows platform, using the Windows RTX real-time extension subsystem to guarantee real-time performance, combined with a reflective memory network to complete real-time tasks such as computing the simulation model, transmitting the simulation data, and maintaining real-time communication. The real-time tasks of the simulation system run under the RTSS process. At the same time, LabWindows/CVI is used to build a graphical interface and to handle the non-real-time tasks in the simulation process, such as man-machine interaction and the display and storage of simulation data, which run under the Win32 process. Through the design of RTX shared memory and a task scheduling algorithm, data interaction between the real-time RTSS tasks and the non-real-time Win32 tasks is accomplished. The experimental results show that this system has strong real-time performance, high stability, and high simulation accuracy, as well as good human-computer interaction.
A combustion model of vegetation burning in "Tiger" fire propagation tool
NASA Astrophysics Data System (ADS)
Giannino, F.; Ascoli, D.; Sirignano, M.; Mazzoleni, S.; Russo, L.; Rego, F.
2017-11-01
In this paper, we propose a semi-physical model for the burning of vegetation in a wildland fire. The main physical-chemical processes involved in fire spreading are modelled through a set of ordinary differential equations, which describe the combustion process as linearly related to the consumption of fuel. The water evaporation process from leaves and wood is also considered. Mass and energy balance equations are written for the fuel (leaves and wood), assuming that the combustion process is homogeneous in space. The model is developed with the final aim of simulating large-scale wildland fires that spread over a heterogeneous landscape while keeping the computational cost very low.
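In the spirit of the model described (fuel consumption linear in the remaining fuel, water evaporation, and a lumped energy balance), a dimensionless sketch with invented coefficients could read:

```python
def burn(t_end=50.0, dt=0.01, m_fuel=1.0, m_water=0.2, T=1.5,
         T_ign=1.0, k_burn=0.05, k_evap=0.1,
         heat_yield=8.0, heat_evap=1.0, cooling=0.5, T_amb=0.3):
    """Euler integration of dimensionless fuel/water/temperature ODEs.

    Burning and evaporation are active only above the ignition
    temperature T_ign; if heat losses win, the fire extinguishes.
    All coefficients are illustrative, not the paper's calibrated values.
    """
    for _ in range(int(t_end / dt)):
        burning = k_burn * m_fuel if T > T_ign else 0.0   # linear in remaining fuel
        evap = k_evap * m_water if T > T_ign else 0.0     # water evaporation
        m_fuel -= dt * burning
        m_water -= dt * evap
        # Energy balance: combustion heat minus evaporation heat minus losses.
        T += dt * (heat_yield * burning - heat_evap * evap
                   - cooling * (T - T_amb))
    return m_fuel, m_water, T
```

Coupling many such local cells through heat exchange with their neighbours is what turns a point combustion model into a spatial fire-propagation tool.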
Application of nuclear physics in medical physics and nuclear medicine
NASA Astrophysics Data System (ADS)
Hoehr, Cornelia
2016-09-01
Nuclear physics has a long history of influencing and advancing medical fields. At TRIUMF we use the applications of nuclear physics to diagnose several diseases via medical isotopes and treat cancer by using proton beams. The Life Science division has a long history of producing Positron Emission Tomography (PET) isotopes, but we are also investigating the production of SPECT and PET isotopes that face potential shortages for clinical operation or are otherwise of limited access to chemists, biologists and medical researchers. New targets are being developed, aided by a simulation platform that investigates the nuclear, thermodynamic, and chemical processes inside a target under proton irradiation. Simulations also aid in the development of new beam-shaping devices for TRIUMF's Proton Therapy facility, Canada's only proton therapy facility, as well as new treatment testing systems. Both promise improved treatment delivery for cancer patients.
ERIC Educational Resources Information Center
Hockicko, Peter; Krišták, Luboš; Nemec, Miroslav
2015-01-01
Video analysis, using the program Tracker (Open Source Physics), in the educational process introduces a new creative method of teaching physics and makes natural sciences more interesting for students. This way of exploring the laws of nature can amaze students because this illustrative and interactive educational software inspires them to think…
Numerical simulation of complex part manufactured by selective laser melting process
NASA Astrophysics Data System (ADS)
Van Belle, Laurent
2017-10-01
The Selective Laser Melting (SLM) process, belonging to the family of Additive Manufacturing (AM) technologies, enables parts to be built layer by layer from metallic powder and a CAD model. The physical phenomena that occur in the process raise the same issues as conventional welding: thermal gradients generate significant residual stresses and distortions in the parts. Moreover, large and complex manufactured parts accentuate these undesirable effects. It is therefore essential for manufacturers to gain a better understanding of the process and to ensure reliable production of parts with high added value. This paper focuses on the simulation of manufacturing a turbine by the SLM process in order to calculate residual stresses and distortions. Numerical results will be presented.
Nucleosynthesis in Core-Collapse Supernovae
NASA Astrophysics Data System (ADS)
Stevenson, Taylor Shannon; Viktoria Ohstrom, Eva; Harris, James Austin; Hix, William R.
2018-01-01
The nucleosynthesis which occurs in core-collapse supernovae (CCSN) is one of the most important sources of elements in the universe. Elements from Oxygen through Iron come predominantly from supernovae, and contributions of heavier elements are also possible through processes like the weak r-process, the gamma process and the light element primary process. The composition of the ejecta depends on the mechanism of the explosion, thus simulations of high physical fidelity are needed to explore what elements and isotopes CCSN can contribute to Galactic Chemical Evolution. We will analyze the nucleosynthesis results from self-consistent CCSN simulations performed with CHIMERA, a multi-dimensional neutrino radiation-hydrodynamics code. Much of our understanding of CCSN nucleosynthesis comes from parameterized models, but unlike CHIMERA these fail to address essential physics, including turbulent flow/instability and neutrino-matter interaction. We will present nucleosynthesis predictions for the explosion of a 9.6 solar mass first generation star, relying both on results of the 160 species nuclear reaction network used in CHIMERA within this model and on post-processing with a more extensive network. The lowest mass iron core-collapse supernovae, like this model, are distinct from their more massive brethren, with their explosion mechanism and nucleosynthesis being more like electron capture supernovae resulting from Oxygen-Neon white dwarfs. We will highlight the differences between the nucleosynthesis in this model and more massive supernovae. The inline 160 species network is a feature unique to CHIMERA, making this the most sophisticated model to date for a star of this type. We will discuss the need and mechanism to extrapolate the post-processing to times post-simulation and analyze the uncertainties this introduces for supernova nucleosynthesis. 
We will also compare the results from the inline 160 species network to the post-processing results to study further uncertainties introduced by post-processing. This work is supported by the U.S. Department of Energy, Office of Nuclear Physics, and the National Science Foundation Nuclear Theory Program (PHY-1516197).
NASA Astrophysics Data System (ADS)
Martizzi, Davide; Teyssier, Romain; Moore, Ben; Wentz, Tina
2012-06-01
The spatial distribution of matter in clusters of galaxies is mainly determined by the dominant dark matter component; however, physical processes involving baryonic matter are able to modify it significantly. We analyse a set of 500 pc resolution cosmological simulations of a cluster of galaxies with mass comparable to Virgo, performed with the AMR code RAMSES. We compare the mass density profiles of the dark, stellar and gaseous matter components of the cluster that result from different assumptions for the subgrid baryonic physics and galaxy formation processes. First, the prediction of a gravity-only N-body simulation is compared to that of a hydrodynamical simulation with standard galaxy formation recipes, and then all results are compared to a hydrodynamical simulation which includes thermal active galactic nucleus (AGN) feedback from supermassive black holes (SMBHs). We find the usual effects of overcooling and adiabatic contraction in the run with standard galaxy formation physics, but very different results are found when implementing SMBHs and AGN feedback. Star formation is strongly quenched, producing lower stellar densities throughout the cluster, and much less cold gas is available for star formation at low redshifts. At redshift z = 0 we find a flat density core of radius 10 kpc in both the dark and stellar matter density profiles. We speculate on the possible formation mechanisms able to produce such cores and we conclude that they can be produced through the coupling of different processes: (I) dynamical friction from the decay of black hole orbits during galaxy mergers; (II) AGN-driven gas outflows producing fluctuations of the gravitational potential causing the removal of collisionless matter from the central region of the cluster; (III) adiabatic expansion in response to the slow expulsion of gas from the central region of the cluster during the quiescent mode of AGN activity.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spong, D.A.
The design techniques and physics analysis of modern stellarator configurations for magnetic fusion research rely heavily on high performance computing and simulation. Stellarators, which are fundamentally 3-dimensional in nature, offer significantly more design flexibility than more symmetric devices such as the tokamak. By varying the outer boundary shape of the plasma, a variety of physics features, such as transport, stability, and heating efficiency can be optimized. Scientific visualization techniques are an important adjunct to this effort as they provide a necessary ergonomic link between the numerical results and the intuition of the human researcher. The authors have developed a variety of visualization techniques for stellarators which both facilitate the design optimization process and allow the physics simulations to be more readily understood.
Computer Simulation of Electron Positron Annihilation Processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, y
2003-10-02
With the launching of the Next Linear Collider coming closer and closer, there is a pressing need for physicists to develop a fully-integrated computer simulation of the e+e- annihilation process at a center-of-mass energy of 1 TeV. A simulation program acts as the template for future experiments. Either new physics will be discovered, or current theoretical uncertainties will shrink due to more accurate higher-order radiative correction calculations. The existence of an efficient and accurate simulation will help us understand the new data and validate (or veto) some of the theoretical models developed to explain new physics. It should handle well the interfaces between different sectors of physics, e.g., interactions happening at the parton level well above the QCD scale, which are described by perturbative QCD, and interactions happening at a much lower energy scale, which combine partons into hadrons. It should also achieve competitive speed in real time as the complexity of the simulation increases. This thesis contributes some tools that will be useful for the development of such simulation programs. We begin our study with the development of a new Monte Carlo algorithm intended to perform efficiently in selecting weight-1 events when multiple parameter dimensions are strongly correlated. The algorithm first seeks to model the peaks of the distribution by features, adapting these features to the function using the EM algorithm. The representation of the distribution provided by these features is then improved using the VEGAS algorithm for the Monte Carlo integration. The two strategies mesh neatly into an effective multi-channel adaptive representation. We then present a new algorithm for the simulation of parton shower processes in high energy QCD. We want to find an algorithm which is free of negative weights, produces its output as a set of exclusive events, and whose total rate exactly matches the full Feynman amplitude calculation. 
Our strategy is to create the whole QCD shower as a tree structure generated by a multiple Poisson process. Working with the whole shower allows us to include correlations between gluon emissions from different sources. QCD destructive interference is controlled by the implementation of "angular ordering," as in the HERWIG Monte Carlo program. We discuss methods for systematic improvement of the approach to include higher order QCD effects.
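The weight-1 event selection being improved here builds on classic von Neumann rejection sampling. A minimal single-channel version is sketched below; the thesis's contribution is an adaptive, multi-channel refinement of this basic idea, and the target density and envelope used here are purely illustrative.

```python
import random

def sample_unit_weight(target, f_max, n_events, seed=1):
    """Accept/reject draws from `target` on [0, 1].

    Every accepted event carries weight 1. The efficiency is the ratio
    of the integral of `target` to f_max; it degrades when f_max is a
    poor envelope of a peaked, correlated distribution, which motivates
    adaptive multi-channel methods such as the one in the thesis.
    """
    rng = random.Random(seed)
    events = []
    while len(events) < n_events:
        x = rng.random()                       # propose uniformly
        if rng.random() * f_max <= target(x):  # accept with prob target(x)/f_max
            events.append(x)
    return events
```

For example, sampling the normalized density f(x) = 3x^2 with envelope f_max = 3 yields events whose mean tends to 3/4.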
Computer simulation of surface and film processes
NASA Technical Reports Server (NTRS)
Tiller, W. A.; Halicioglu, M. T.
1983-01-01
Adequate computer methods, based on interactions between discrete particles, provide information leading to an atomic level understanding of various physical processes. The success of these simulation methods, however, is related to the accuracy of the potential energy function representing the interactions among the particles. The development of a potential energy function for crystalline SiO2 forms that can be employed in lengthy computer modelling procedures was investigated. In many of the simulation methods which deal with discrete particles, semiempirical two body potentials were employed to analyze energy and structure related properties of the system. Many body interactions are required for a proper representation of the total energy for many systems. Many body interactions for simulations based on discrete particles are discussed.
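A semiempirical two-body potential of the kind discussed can be illustrated with the familiar Lennard-Jones 12-6 form; the SiO2 work described uses different, material-specific functions and, as noted, adds many-body terms.

```python
def lennard_jones(r, epsilon=1.0, sigma=1.0):
    """Pairwise 12-6 potential; minimum of depth -epsilon at r = 2**(1/6)*sigma."""
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 * sr6 - sr6)

def total_pair_energy(positions, epsilon=1.0, sigma=1.0):
    """Sum the two-body potential over all distinct pairs (1-D positions)."""
    energy = 0.0
    n = len(positions)
    for i in range(n):
        for j in range(i + 1, n):
            energy += lennard_jones(abs(positions[i] - positions[j]),
                                    epsilon, sigma)
    return energy
```

Energy-minimizing or time-evolving a particle set under such a sum of pair terms is the basic pattern of the discrete-particle simulations the paper builds on.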
NASA Astrophysics Data System (ADS)
Prudden, R.; Arribas, A.; Tomlinson, J.; Robinson, N.
2017-12-01
The Unified Model is a numerical model of the atmosphere used at the UK Met Office (and numerous partner organisations, including the Korea Meteorological Administration, the Australian Bureau of Meteorology and the US Air Force) for both weather and climate applications. Specifically, dynamical models such as the Unified Model are now a central part of weather forecasting. Starting from basic physical laws, these models make it possible to predict events such as storms before they have even begun to form. The Unified Model can be simply described as having two components: one solves the Navier-Stokes equations (usually referred to as the "dynamics"); the other solves relevant sub-grid physical processes (usually referred to as the "physics"). Running weather forecasts requires substantial computing resources (for example, the UK Met Office operates the largest operational High Performance Computer in Europe), and roughly 50% of the cost of a typical simulation is spent in the "dynamics" and 50% in the "physics". There is therefore a strong incentive to reduce the cost of weather forecasts, and Machine Learning is a possible option because, once trained, a machine learning model is often much faster to run than a full simulation. This is the motivation for a technique called model emulation: the idea is to build a fast statistical model which closely approximates a far more expensive simulation. In this paper we discuss the use of Machine Learning as an emulator to replace the "physics" component of the Unified Model. Various approaches and options will be presented, and the implications for further model development, operational running of forecasting systems, development of data assimilation schemes, and development of ensemble prediction techniques will be discussed.
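Model emulation as described can be sketched by fitting a cheap surrogate to samples of an expensive routine. Here a toy function stands in for a sub-grid physics scheme and a least-squares polynomial for the trained statistical model; nothing below reflects the Unified Model's actual schemes.

```python
import numpy as np

def expensive_physics(x):
    """Stand-in for a costly sub-grid physics parameterization."""
    return np.sin(3.0 * x) * np.exp(-0.5 * x)

# "Training": sample the expensive model over its input range,
# then fit a cheap polynomial emulator to those samples.
x_train = np.linspace(0.0, 2.0, 200)
y_train = expensive_physics(x_train)
emulator = np.poly1d(np.polyfit(x_train, y_train, deg=9))

# Validation on held-out points: the emulator should closely track
# the expensive routine while being far cheaper to evaluate.
x_test = np.linspace(0.05, 1.95, 50)
err = np.max(np.abs(emulator(x_test) - expensive_physics(x_test)))
```

In an operational setting the surrogate would be a trained ML model, the inputs would be grid-column states, and the validation step is what decides whether the emulator is safe to swap in for the full scheme.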
Atmospheric microphysical experiments on an orbital platform
NASA Technical Reports Server (NTRS)
Eaton, L. R.
1974-01-01
The Zero-Gravity Atmospheric Cloud Physics Laboratory is a Shuttle/Spacelab payload which will be capable of performing a large range of microphysics experiments. This facility will complement terrestrial cloud physics research by allowing many experiments to be performed which cannot be accomplished within the confines of a terrestrial laboratory. This paper reviews the general Cloud Physics Laboratory concept and the experiment scope. The experimental constraints are given along with details of the proposed equipment. Examples of appropriate experiments range from three-dimensional simulation of the earth and planetary atmosphere and of ocean circulation to cloud electrification processes and the effects of atmospheric pollution materials on microphysical processes.
ERIC Educational Resources Information Center
Brembs, Bjorn; de Ibarra, Natalie Hempel
2006-01-01
We have used a genetically tractable model system, the fruit fly "Drosophila melanogaster" to study the interdependence between sensory processing and associative processing on learning performance. We investigated the influence of variations in the physical and predictive properties of color stimuli in several different operant-conditioning…
Fast Particle Methods for Multiscale Phenomena Simulations
NASA Technical Reports Server (NTRS)
Koumoutsakos, P.; Wray, A.; Shariff, K.; Pohorille, Andrew
2000-01-01
We are developing particle methods oriented at improving computational modeling capabilities for multiscale physical phenomena in: (i) high Reynolds number unsteady vortical flows, (ii) particle-laden and interfacial flows, and (iii) molecular dynamics studies of nanoscale droplets and studies of the structure, functions, and evolution of the earliest living cell. The unifying computational approach involves particle methods implemented on parallel computer architectures. The inherent adaptivity, robustness and efficiency of particle methods makes them a multidisciplinary computational tool capable of bridging the gap between micro-scale and continuum flow simulations. Using efficient tree data structures, multipole expansion algorithms, and improved particle-grid interpolation, particle methods allow for simulations using millions of computational elements, making possible the resolution of a wide range of length and time scales of these important physical phenomena. The current challenges in these simulations are: (i) the proper formulation of particle methods at the molecular and continuum levels for the discretization of the governing equations; (ii) the resolution of the wide range of time and length scales governing the phenomena under investigation; (iii) the minimization of numerical artifacts that may interfere with the physics of the systems under consideration; and (iv) the parallelization of processes such as tree traversal and grid-particle interpolations. We are conducting simulations using vortex methods, molecular dynamics and smooth particle hydrodynamics, exploiting their unifying concepts such as: the solution of the N-body problem on parallel computers, highly accurate particle-particle and grid-particle interpolations, parallel FFTs, and the formulation of processes such as diffusion in the context of particle methods. This approach enables us to transcend seemingly unrelated areas of research.
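At its simplest, the N-body problem underlying these particle methods is direct pairwise force summation; tree data structures and multipole expansions exist to cut the O(N^2) cost, but the direct form (with softening and G = 1, both illustrative choices) is:

```python
import math

def pairwise_accelerations(positions, masses, eps=1e-3):
    """Direct O(N^2) gravitational accelerations in units with G = 1.

    positions: list of (x, y, z) tuples; masses: list of floats.
    The softening length eps avoids the singularity at zero separation.
    """
    n = len(positions)
    acc = [[0.0, 0.0, 0.0] for _ in range(n)]
    for i in range(n):
        xi, yi, zi = positions[i]
        for j in range(n):
            if i == j:
                continue
            dx = positions[j][0] - xi
            dy = positions[j][1] - yi
            dz = positions[j][2] - zi
            r2 = dx * dx + dy * dy + dz * dz + eps * eps
            inv_r3 = masses[j] / (r2 * math.sqrt(r2))  # m_j / r^3 (softened)
            acc[i][0] += dx * inv_r3
            acc[i][1] += dy * inv_r3
            acc[i][2] += dz * inv_r3
    return acc
```

Tree codes replace the inner loop over all j with a traversal that lumps distant groups of particles into single multipole terms, which is what makes million-element simulations tractable.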
Physical habitat simulation system reference manual: version II
Milhous, Robert T.; Updike, Marlys A.; Schneider, Diane M.
1989-01-01
There are four major components of a stream system that determine the productivity of the fishery (Karr and Dudley 1978). These are: (1) flow regime, (2) physical habitat structure (channel form, substrate distribution, and riparian vegetation), (3) water quality (including temperature), and (4) energy inputs from the watershed (sediments, nutrients, and organic matter). The complex interaction of these components determines the primary production, secondary production, and fish population of the stream reach. The basic components and interactions needed to simulate fish populations as a function of management alternatives are illustrated in Figure I.1. The assessment process utilizes a hierarchical and modular approach combined with computer simulation techniques. The modular components represent the "building blocks" for the simulation. Physical habitat is a function of flow and, therefore, varies in quality and quantity over the range of the flow regime. The conceptual framework of the Incremental Methodology and guidelines for its application are described in "A Guide to Stream Habitat Analysis Using the Instream Flow Incremental Methodology" (Bovee 1982). Simulation of physical habitat is accomplished using the physical structure of the stream and streamflow. The modification of physical habitat by temperature and water quality is analyzed separately from physical habitat simulation. Temperature in a stream varies with the seasons, local meteorological conditions, stream network configuration, and the flow regime; thus, the temperature influences on habitat must be analyzed on a stream system basis. Water quality under natural conditions is strongly influenced by climate and the geological materials, with the result that there is considerable natural variation in water quality. When human activities are added, the possible range of water quality becomes considerably larger.
Consequently, water quality must also be analyzed on a stream system basis. Such analysis is outside the scope of this manual, which concentrates on simulation of physical habitat based on depth, velocity, and a channel index. The results from PHABSIM can be used alone or with a series of habitat time series programs that generate monthly or daily habitat time series from the Weighted Usable Area versus streamflow table produced by the habitat simulation programs, together with streamflow time series data. Monthly and daily streamflow time series may be obtained from USGS gages near the study site or as the output of river system management models.
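The final step described above, turning a Weighted Usable Area versus streamflow table into a habitat time series, amounts to a table lookup with interpolation. A minimal sketch with a hypothetical WUA table and made-up flow values, not real PHABSIM output:

```python
import numpy as np

# Hypothetical WUA-versus-discharge table from a habitat simulation run
# (flows in cfs, Weighted Usable Area in ft^2 per 1000 ft of stream).
flows_table = np.array([100.0, 250.0, 500.0, 1000.0, 2000.0])
wua_table = np.array([1200.0, 2600.0, 3100.0, 2400.0, 1500.0])

def habitat_time_series(monthly_flows):
    """Convert a streamflow time series (e.g. from a USGS gage) into a
    habitat time series by linear interpolation in the WUA table."""
    return np.interp(monthly_flows, flows_table, wua_table)

monthly = [180.0, 400.0, 900.0]
print(habitat_time_series(monthly))
```

Note that WUA typically peaks at an intermediate flow, as in the table above, so both very low and very high flows reduce usable habitat.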
NASA Astrophysics Data System (ADS)
Wimer, N. T.; Mackoweicki, A. S.; Poludnenko, A. Y.; Hoffman, C.; Daily, J. W.; Rieker, G. B.; Hamlington, P.
2017-12-01
Results are presented from a joint computational and experimental research effort focused on understanding and characterizing wildland fire spread at small scales (roughly 1 m to 1 mm) using direct numerical simulations (DNS) with chemical kinetics mechanisms that have been calibrated using data from high-speed laser diagnostics. The simulations are intended to directly resolve, with high physical accuracy, all small-scale fluid dynamic and chemical processes relevant to wildland fire spread. The high fidelity of the simulations is enabled by the calibration and validation of DNS sub-models using data from high-speed laser diagnostics. These diagnostics have the capability to measure temperature and chemical species concentrations, and are used here to characterize evaporation and pyrolysis processes in wildland fuels subjected to an external radiation source. The chemical kinetics code CHEMKIN-PRO is used to study and reduce complex reaction mechanisms for water removal, pyrolysis, and gas phase combustion during solid biomass burning. Simulations are then presented for a gaseous pool fire coupled with the resulting multi-step chemical reaction mechanisms, and the results are connected to the fundamental structure and spread of wildland fires. It is anticipated that the combined computational and experimental approach of this research effort will provide unprecedented access to information about chemical species, temperature, and turbulence during the entire pyrolysis, evaporation, ignition, and combustion process, thereby permitting more complete understanding of the physics that must be represented by coarse-scale numerical models of wildland fire spread.
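Reduced pyrolysis sub-models of the kind described above often collapse to first-order Arrhenius kinetics for solid mass loss. A minimal sketch with illustrative parameters (not CHEMKIN-PRO-calibrated values):

```python
import numpy as np

R = 8.314  # universal gas constant, J/(mol K)

def arrhenius(A, Ea, T):
    """Arrhenius rate constant k = A * exp(-Ea / (R T))."""
    return A * np.exp(-Ea / (R * T))

def pyrolyze(m0, A, Ea, T, dt, steps):
    """First-order solid mass loss dm/dt = -k(T) m, forward Euler."""
    m = m0
    for _ in range(steps):
        m -= dt * arrhenius(A, Ea, T) * m
    return m

# Illustrative single-step biomass pyrolysis at a fixed 700 K:
# A = 1e8 1/s, Ea = 120 kJ/mol are placeholder values.
m = pyrolyze(m0=1.0, A=1e8, Ea=1.2e5, T=700.0, dt=1e-3, steps=5000)
print(0.0 < m < 1.0)  # True: partial mass loss after 5 s
```

Mechanism reduction replaces many such coupled reactions with a few effective ones whose A and Ea are fit to the detailed chemistry or, as here, to laser-diagnostic data.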
Theoretical Technology Research for the International Solar Terrestrial Physics (ISTP) Program
NASA Technical Reports Server (NTRS)
Ashour-Abdalla, Maha; Curtis, Steve (Technical Monitor)
2002-01-01
During the last four years the UCLA (University of California, Los Angeles) IGPP (Institute of Geophysics and Planetary Physics) Space Plasma Simulation Group has continued its theoretical effort to develop a Mission Oriented Theory (MOT) for the International Solar Terrestrial Physics (ISTP) program. This effort has been based on a combination of approaches: analytical theory, large-scale kinetic (LSK) calculations, global magnetohydrodynamic (MHD) simulations, and self-consistent plasma kinetic (SCK) simulations. These models have been used to formulate a global interpretation of local measurements made by the ISTP spacecraft. The regions of application of the MOT cover most of the magnetosphere: the solar wind, the low- and high-latitude magnetospheric boundary, the near-Earth and distant magnetotail, and the auroral region. The most recent investigations include: plasma processes in the electron foreshock, the response of the magnetospheric cusp, particle entry into the magnetosphere, sources of observed distribution functions in the magnetotail, transport of oxygen ions, self-consistent evolution of the magnetotail, substorm studies, effects of explosive reconnection, and auroral acceleration simulations. A complete list of the activities completed under the grant follows.
NASA Technical Reports Server (NTRS)
Mocko, David M.; Sud, Y. C.; Einaudi, Franco (Technical Monitor)
2000-01-01
Present-day climate models produce large climate drifts that interfere with the climate signals simulated in modelling studies. The simplifying assumptions of the physical parameterization of snow and ice processes lead to large biases in the annual cycles of surface temperature, evapotranspiration, and the water budget, which in turn cause erroneous land-atmosphere interactions. Since land processes are vital for climate prediction, and snow and snowmelt processes have been shown to affect Indian monsoons and North American rainfall and hydrology, special attention is now being given to cold land processes and their influence on the simulated annual cycle in GCMs. The snow model of the SSiB land-surface model being used at Goddard has evolved from a unified single snow-soil layer interacting with a deep soil layer through a force-restore procedure to a two-layer snow model atop a ground layer separated by a snow-ground interface. When the snow cover is deep, force-restore occurs within the snow layers. However, several other simplifying assumptions, such as homogeneous snow cover, an empirical depth-related surface albedo, snowmelt and melt-freeze in the diurnal cycles, and neglect of the latent heat of soil freezing and thawing, still remain as nagging problems. Several important influences of these assumptions will be discussed with the goal of improving them to better simulate snowmelt and meltwater hydrology. Nevertheless, the current snow model (Mocko and Sud, 2000, submitted) better simulates cold land processes as compared to the original SSiB. This was confirmed against observations of soil moisture, runoff, and snow cover in global GSWP (Sud and Mocko, 1999) and point-scale Valdai simulations over seasonal snow regions. New results from the current SSiB snow model from the 10-year PILPS 2e intercomparison in northern Scandinavia will be presented.
Kassiopeia: a modern, extensible C++ particle tracking package
DOE Office of Scientific and Technical Information (OSTI.GOV)
Furse, Daniel; Groh, Stefan; Trost, Nikolaus
The Kassiopeia particle tracking framework is an object-oriented software package using modern C++ techniques, written originally to meet the needs of the KATRIN collaboration. Kassiopeia features a new algorithmic paradigm for particle tracking simulations which targets experiments containing complex geometries and electromagnetic fields, with high priority put on calculation efficiency, customizability, extensibility, and ease-of-use for novice programmers. To solve Kassiopeia's target physics problem the software is capable of simulating particle trajectories governed by arbitrarily complex differential equations of motion, continuous physics processes that may in part be modeled as terms perturbing that equation of motion, stochastic processes that occur in flight such as bulk scattering and decay, and stochastic surface processes occurring at interfaces, including transmission and reflection effects. This entire set of computations takes place against the backdrop of a rich geometry package which serves a variety of roles, including initialization of electromagnetic field simulations and the support of state-dependent algorithm-swapping and behavioral changes as a particle's state evolves. Thanks to the very general approach taken by Kassiopeia it can be used by other experiments facing similar challenges when calculating particle trajectories in electromagnetic fields. It is publicly available at https://github.com/KATRIN-Experiment/Kassiopeia.
Kassiopeia: a modern, extensible C++ particle tracking package
NASA Astrophysics Data System (ADS)
Furse, Daniel; Groh, Stefan; Trost, Nikolaus; Babutzka, Martin; Barrett, John P.; Behrens, Jan; Buzinsky, Nicholas; Corona, Thomas; Enomoto, Sanshiro; Erhard, Moritz; Formaggio, Joseph A.; Glück, Ferenc; Harms, Fabian; Heizmann, Florian; Hilk, Daniel; Käfer, Wolfgang; Kleesiek, Marco; Leiber, Benjamin; Mertens, Susanne; Oblath, Noah S.; Renschler, Pascal; Schwarz, Johannes; Slocum, Penny L.; Wandkowsky, Nancy; Wierman, Kevin; Zacher, Michael
2017-05-01
The Kassiopeia particle tracking framework is an object-oriented software package using modern C++ techniques, written originally to meet the needs of the KATRIN collaboration. Kassiopeia features a new algorithmic paradigm for particle tracking simulations which targets experiments containing complex geometries and electromagnetic fields, with high priority put on calculation efficiency, customizability, extensibility, and ease-of-use for novice programmers. To solve Kassiopeia's target physics problem the software is capable of simulating particle trajectories governed by arbitrarily complex differential equations of motion, continuous physics processes that may in part be modeled as terms perturbing that equation of motion, stochastic processes that occur in flight such as bulk scattering and decay, and stochastic surface processes occurring at interfaces, including transmission and reflection effects. This entire set of computations takes place against the backdrop of a rich geometry package which serves a variety of roles, including initialization of electromagnetic field simulations and the support of state-dependent algorithm-swapping and behavioral changes as a particle’s state evolves. Thanks to the very general approach taken by Kassiopeia it can be used by other experiments facing similar challenges when calculating particle trajectories in electromagnetic fields. It is publicly available at https://github.com/KATRIN-Experiment/Kassiopeia.
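The core of any such tracking code is numerical integration of the equation of motion. A minimal non-relativistic sketch using a standard RK4 step for the Lorentz force; Kassiopeia itself supports far more general equations, integrators, and perturbing terms than this:

```python
import numpy as np

def lorentz_accel(v, E, B, q_over_m):
    """Acceleration of a charged particle: (q/m)(E + v x B)."""
    return q_over_m * (E + np.cross(v, B))

def rk4_step(x, v, dt, E, B, q_over_m):
    """One RK4 step of dx/dt = v, dv/dt = (q/m)(E + v x B)."""
    k1v = lorentz_accel(v, E, B, q_over_m);              k1x = v
    k2v = lorentz_accel(v + 0.5*dt*k1v, E, B, q_over_m); k2x = v + 0.5*dt*k1v
    k3v = lorentz_accel(v + 0.5*dt*k2v, E, B, q_over_m); k3x = v + 0.5*dt*k2v
    k4v = lorentz_accel(v + dt*k3v, E, B, q_over_m);     k4x = v + dt*k3v
    x_new = x + (dt/6.0)*(k1x + 2*k2x + 2*k3x + k4x)
    v_new = v + (dt/6.0)*(k1v + 2*k2v + 2*k3v + k4v)
    return x_new, v_new

# Uniform B along z: the particle gyrates, and with E = 0 the magnetic
# force does no work, so |v| should be conserved by the integrator.
E = np.zeros(3); B = np.array([0.0, 0.0, 1.0])
x = np.zeros(3); v = np.array([1.0, 0.0, 0.0])
for _ in range(1000):
    x, v = rk4_step(x, v, 1e-3, E, B, q_over_m=1.0)
print(abs(np.linalg.norm(v) - 1.0) < 1e-9)  # True
```

Stochastic in-flight processes (scattering, decay) are typically layered on top of such a deterministic stepper by sampling interaction points along the integrated trajectory.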
Shprits, Yuri Y.; Kellerman, Adam C.; Drozdov, Alexander Y.; ...
2015-11-19
Our study focused on understanding the coupling between different electron populations in the inner magnetosphere and the various physical processes that determine the evolution of electron fluxes at different energies. Observations during the 17 March 2013 storm and simulations with the newly developed Versatile Electron Radiation Belt-4D (VERB-4D) code are presented. This analysis of the drift trajectories of the energetic and relativistic electrons shows that electron trajectories at transitional energies, with a first invariant on the scale of ~100 MeV/G, may resemble ring current or relativistic electron trajectories depending on the level of geomagnetic activity. Simulations with the VERB-4D code including convection, radial diffusion, and energy diffusion are presented. Sensitivity simulations including various physical processes show how different acceleration mechanisms contribute to the energization of energetic electrons at transitional energies. In particular, the range of energies where inward transport is strongly influenced by both convection and radial diffusion is studied. Our results of the 4-D simulations are compared to Van Allen Probes observations at a range of energies including the source, seed, and core populations of the energetic and relativistic electrons in the inner magnetosphere.
The ATLAS Simulation Infrastructure
Aad, G.; Abbott, B.; Abdallah, J.; ...
2010-09-25
The simulation software for the ATLAS Experiment at the Large Hadron Collider is being used for large-scale production of events on the LHC Computing Grid. This simulation requires many components, from the generators that simulate particle collisions to the packages simulating the response of the various detectors and triggers. All of these components come together under the ATLAS simulation infrastructure. In this paper, that infrastructure is discussed, including the components supporting the detector description, interfacing the event generation, and combining the GEANT4 simulation of the response of the individual detectors. Also described are the tools allowing the software validation, performance testing, and the validation of the simulated output against known physics processes.
Computing the apparent centroid of radar targets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, C.E.
1996-12-31
A high-frequency multibounce radar scattering code was used as a simulation platform for demonstrating an algorithm to compute the apparent radar centroid (ARC) of specific radar targets. To illustrate this simulation process, several target models were used. Simulation results for a sphere model were used to determine the errors of approximation associated with the simulation, thereby verifying the process. The severity of glint-induced tracking errors was also illustrated using a model of an F-15 aircraft. It was shown, in a deterministic manner, that the ARC of a target can fall well outside its physical extent. Finally, the apparent radar centroid simulation based on a ray casting procedure is well suited for use on most massively parallel computing platforms and could lead to the development of a near real-time radar tracking simulation for applications such as endgame fuzing, survivability, and vulnerability analyses using specific radar targets and fuze algorithms.
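The glint effect described above, the apparent centroid falling outside the target's physical extent, can be reproduced with a simple coherent two-scatterer model. This is a textbook illustration under a 1-D far-field assumption, not the ray-casting code of the abstract:

```python
import numpy as np

def apparent_centroid(positions, amplitudes, phases):
    """Apparent centroid of coherent point scatterers (1-D far field),
    obtained from the phase-front gradient of the summed return."""
    s = np.sum(amplitudes * np.exp(1j * phases))
    sx = np.sum(amplitudes * positions * np.exp(1j * phases))
    return np.real(sx / s)

# Two nearly cancelling scatterers at +/-1: near-destructive
# interference drives the apparent centroid far outside [-1, 1].
pos = np.array([-1.0, 1.0])
amp = np.array([1.0, 0.9])
centroid = apparent_centroid(pos, amp, phases=np.array([0.0, 0.95 * np.pi]))
print(abs(centroid) > 1.0)  # True: outside the physical extent
```

For two scatterers this reduces to the classical glint formula: the error grows as the returns approach equal amplitude and opposite phase, which is exactly the tracking hazard the F-15 study quantifies.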
Modeling socio-cultural processes in network-centric environments
NASA Astrophysics Data System (ADS)
Santos, Eunice E.; Santos, Eugene, Jr.; Korah, John; George, Riya; Gu, Qi; Kim, Keumjoo; Li, Deqing; Russell, Jacob; Subramanian, Suresh
2012-05-01
The major focus in the field of modeling & simulation for network-centric environments has been on the physical layer, while making simplifications for the human-in-the-loop. However, the human element has a big impact on the capabilities of network-centric systems. Taking into account the socio-behavioral aspects of processes such as team building and group decision-making is critical to realistically modeling and analyzing system performance. Modeling socio-cultural processes is a challenge because of the complexity of the networks, dynamism in the physical and social layers, feedback loops, and uncertainty in the modeling data. We propose an overarching framework to represent, model, and analyze various socio-cultural processes within network-centric environments. The key innovation in our methodology is to simultaneously model the dynamism in both the physical and social layers while providing functional mappings between them. We represent socio-cultural information such as friendships, professional relationships, and temperament by leveraging the Culturally Infused Social Network (CISN) framework. The notion of intent is used to relate the underlying socio-cultural factors to observed behavior. We model intent using Bayesian Knowledge Bases (BKBs), a probabilistic reasoning network, which can represent incomplete and uncertain socio-cultural information. We leverage previous work on a network performance modeling framework called Network-Centric Operations Performance and Prediction (N-COPP) to incorporate dynamism in various aspects of the physical layer, such as node mobility and transmission parameters. We validate our framework by simulating a suitable scenario, incorporating relevant factors and providing analyses of the results.
Physical explosion analysis in heat exchanger network design
NASA Astrophysics Data System (ADS)
Pasha, M.; Zaini, D.; Shariff, A. M.
2016-06-01
The failure of shell and tube heat exchangers is extensively experienced by the chemical process industries. This failure can create a loss of production for a long time duration. Moreover, loss of containment through a heat exchanger could potentially lead to a credible event such as fire, explosion, or toxic release. There is a need to analyse the possible worst-case effect originating from the loss of containment of the heat exchanger at the early design stage. Physical explosion analysis during heat exchanger network design is presented in this work. The Baker and Prugh explosion models are deployed for assessing the explosion effect. Microsoft Excel was integrated with a process design simulator through object linking and embedding (OLE) automation for this analysis. Aspen HYSYS V8.0 was used as the simulation platform in this work. A typical heat exchanger network of a steam reforming and shift conversion process is presented as a case study. This analysis shows that the overpressure generated by the physical explosion of each heat exchanger can be estimated in a more precise manner by using the Prugh model. The present work could potentially assist the design engineer in identifying the critical heat exchanger in the network at the preliminary design stage.
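A common first step in such physical explosion models is estimating the burst energy of the compressed gas inventory. A minimal sketch using Brode's equation, with an assumed TNT heat of detonation for blast scaling; the full Baker and Prugh methods add refinements (scaled-distance curves, vessel-shape and near-field corrections) not shown here:

```python
# Hedged sketch: Brode's equation for the energy released by sudden
# expansion of compressed gas, plus TNT-equivalence for scaling laws.
# The TNT heat of detonation (4.5 MJ/kg) is an assumed typical value.

def brode_energy(p_burst, p_ambient, volume, gamma=1.4):
    """Burst energy (J): E = (P1 - P0) * V / (gamma - 1)."""
    return (p_burst - p_ambient) * volume / (gamma - 1.0)

def tnt_equivalent_mass(energy_j, h_tnt=4.5e6):
    """TNT-equivalent mass (kg) used in blast scaling correlations."""
    return energy_j / h_tnt

# Example: a 2 m^3 shell ruptures at 20 bar against 1 bar ambient.
E = brode_energy(p_burst=20e5, p_ambient=1e5, volume=2.0)
print(E, tnt_equivalent_mass(E))
```

With the TNT-equivalent mass in hand, overpressure at a given standoff distance is then read from scaled-distance correlations, which is where the Baker and Prugh treatments diverge.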
WE-D-204-02: Errors and Process Improvements in Radiation Therapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fontenla, D.
2016-06-15
Speakers in this session will present overviews and details of a specific rotation or feature of their Medical Physics Residency Program that is particularly exceptional and noteworthy. The featured rotations include foundational topics executed with exceptional acumen and innovative educational rotations perhaps not commonly found in Medical Physics Residency Programs. A site-specific clinical rotation will be described, where the medical physics resident follows the physician and medical resident for two weeks into patient consultations, simulation sessions, target contouring sessions, planning meetings with dosimetry, patient follow-up visits, and tumor boards, to gain insight into the thought processes of the radiation oncologist. An incident learning rotation will be described, where the resident learns about and practices evaluating clinical errors and investigates process improvements for the clinic. The residency environment at a Canadian medical physics residency program will be described, where the training and interactions with radiation oncology residents are integrated. Finally, the first-month rotation will be described, where the medical physics resident rotates through the clinical areas including simulation, dosimetry, and treatment units, gaining an overview of the clinical flow and meeting all the clinical staff to begin the residency program. This session will be of particular interest to residency programs that are interested in adopting or adapting these curricular ideas and to residency candidates who want to learn about programs already employing innovative practices. Learning Objectives: To learn about exceptional and innovative clinical rotations or program features within existing Medical Physics Residency Programs. To understand how to adopt/adapt innovative curricular designs into your own Medical Physics Residency Program, if appropriate.
Simulation of plasma loading of high-pressure RF cavities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, K.; Samulyak, R.; Yonehara, K.
2018-01-11
Muon beam-induced plasma loading of radio-frequency (RF) cavities filled with high-pressure hydrogen gas with a 1% dry air dopant has been studied via numerical simulations. The electromagnetic code SPACE, which resolves relevant atomic physics processes, including ionization by the muon beam, electron attachment to dopant molecules, and electron-ion and ion-ion recombination, has been used. Simulation studies have also been performed in the range of parameters typical for practical muon cooling channels.
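The competition between beam ionization, attachment on dopant molecules, and recombination described above can be sketched as a zero-dimensional rate equation for the electron density. All coefficients below are illustrative placeholders, not SPACE inputs:

```python
# Hedged sketch: dn/dt = S - eta*n - beta*n^2, where S is the
# beam-ionization source, eta the attachment rate to dopant molecules,
# and beta the electron-ion recombination coefficient.

def evolve_electron_density(S, eta, beta, n0=0.0, dt=1e-9, steps=2000):
    """Integrate the electron-density rate equation with forward Euler."""
    n = n0
    for _ in range(steps):
        n += dt * (S - eta * n - beta * n * n)
    return n

# With attachment dominant, the density saturates near S/eta, which is
# the plasma-loading equilibrium the dopant is meant to keep low.
n_final = evolve_electron_density(S=1e18, eta=1e7, beta=1e-13)
print(n_final / (1e18 / 1e7))  # close to 1.0 at saturation
```

Raising the attachment rate (more dopant) lowers the equilibrium electron density and hence the RF power drained by the plasma, which is the design lever the air dopant provides.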
Trident: A Universal Tool for Generating Synthetic Absorption Spectra from Astrophysical Simulations
NASA Astrophysics Data System (ADS)
Hummels, Cameron B.; Smith, Britton D.; Silvia, Devin W.
2017-09-01
Hydrodynamical simulations are increasingly able to accurately model physical systems on stellar, galactic, and cosmological scales; however, the utility of these simulations is often limited by our ability to directly compare them with the data sets produced by observers: spectra, photometry, etc. To address this problem, we have created trident, a Python-based open-source tool for post-processing hydrodynamical simulations to produce synthetic absorption spectra and related data. trident can (i) create absorption-line spectra for any trajectory through a simulated data set, mimicking both background quasar and down-the-barrel configurations; (ii) reproduce the spectral characteristics of common instruments like the Cosmic Origins Spectrograph; (iii) operate across the ultraviolet, optical, and infrared using customizable absorption-line lists; (iv) trace simulated physical structures directly to spectral features; (v) approximate the presence of ion species absent from the simulation outputs; (vi) generate column density maps for any ion; and (vii) provide support for all major astrophysical hydrodynamical codes. trident was originally developed to aid in the interpretation of observations of the circumgalactic medium and intergalactic medium, but it remains a general tool applicable in other contexts.
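The basic operation of any synthetic-spectrum tool of this kind is converting absorber properties along a sightline into optical depth and normalized flux. A minimal sketch with purely Gaussian (Doppler-only) line profiles and hypothetical absorbers, far simpler than trident's actual machinery:

```python
import numpy as np

def gaussian_tau(wavelength, line_center, tau0, b_kms):
    """Doppler-broadened optical depth profile of a single absorber."""
    c_kms = 2.998e5
    dv = c_kms * (wavelength - line_center) / line_center  # velocity offset
    return tau0 * np.exp(-(dv / b_kms) ** 2)

def synthetic_spectrum(wavelengths, absorbers):
    """Normalized flux = exp(-sum of optical depths along the sightline).
    Each absorber is (line center [Angstrom], central tau, b [km/s])."""
    tau = np.zeros_like(wavelengths)
    for center, tau0, b in absorbers:
        tau += gaussian_tau(wavelengths, center, tau0, b)
    return np.exp(-tau)

# Two hypothetical absorbers near a 1215.67 Angstrom line.
wl = np.linspace(1210.0, 1222.0, 600)
flux = synthetic_spectrum(wl, [(1215.67, 2.0, 30.0), (1217.5, 0.5, 20.0)])
print(flux.min() < 0.2 and abs(flux[0] - 1.0) < 1e-3)  # True
```

Instrument emulation (item ii above) then amounts to convolving this ideal flux with a line-spread function and resampling onto the detector's wavelength grid.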
Lagrangian particles with mixing. I. Simulating scalar transport
NASA Astrophysics Data System (ADS)
Klimenko, A. Y.
2009-06-01
The physical similarity and mathematical equivalence of continuous diffusion and particle random walks form one of the cornerstones of modern physics and the theory of stochastic processes. The randomly walking particles do not need to possess any properties other than location in physical space. However, particles used in many models dealing with simulating turbulent transport and turbulent combustion do possess a set of scalar properties, and mixing between particle properties is performed to reflect the dissipative nature of the diffusion processes. We show that continuous scalar transport and diffusion can be accurately specified by means of localized mixing between randomly walking Lagrangian particles with scalar properties, and we assess errors associated with this scheme. Particles with scalar properties and localized mixing represent an alternative formulation for the process, which is selected to represent the continuous diffusion. Simulating diffusion by Lagrangian particles with mixing involves three main competing requirements: minimizing stochastic uncertainty, minimizing bias introduced by numerical diffusion, and preserving independence of particles. These requirements are analyzed for two limiting cases: mixing between two particles and mixing between a large number of particles. The problem of possible dependences between particles is the most complicated. This problem is analyzed using a coupled chain of equations that has similarities with the Bogoliubov-Born-Green-Kirkwood-Yvon (BBGKY) chain in statistical physics. Dependences between particles can be significant in close proximity of the particles, resulting in a reduced rate of mixing. This work develops further the ideas introduced in a previously published letter [Phys. Fluids 19, 031702 (2007)]. Paper I of this work is followed by Paper II [Phys. Fluids 21, 065102 (2009)], where modeling of turbulent reacting flows by Lagrangian particles with localized mixing is specifically considered.
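The scheme analyzed above, a random walk in space plus localized mixing of scalar properties between nearby particles, can be sketched in a few lines. The mixing rule below is a simple Curl-style partial exchange and all parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def step(x, phi, D, dt, mix_radius, mix_extent):
    """One step: random walk in space, then localized pairwise mixing
    of the scalar phi between randomly paired nearby particles."""
    x = x + rng.normal(0.0, np.sqrt(2.0 * D * dt), size=x.shape)
    order = rng.permutation(len(x))
    for i, j in zip(order[::2], order[1::2]):
        if abs(x[i] - x[j]) < mix_radius:       # localized mixing only
            mean = 0.5 * (phi[i] + phi[j])
            phi[i] += mix_extent * (mean - phi[i])  # partial (Curl-style) mix
            phi[j] += mix_extent * (mean - phi[j])
    return x, phi

# An initial step profile in phi diffuses; pairwise mixing conserves
# the particle-mean of phi because each pair moves toward its own mean.
x = rng.uniform(0.0, 1.0, 2000)
phi = np.where(x < 0.5, 1.0, 0.0)
mean0 = phi.mean()
for _ in range(50):
    x, phi = step(x, phi, D=1e-3, dt=1e-2, mix_radius=0.05, mix_extent=0.5)
print(abs(phi.mean() - mean0) < 1e-8)  # True: mean scalar conserved
```

The competing requirements in the abstract appear directly here: a small mixing radius localizes mixing but raises stochastic noise, while repeated mixing of the same neighbors introduces exactly the inter-particle dependence the paper analyzes.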
Simulation of the GEM detector for BM@N experiment
NASA Astrophysics Data System (ADS)
Baranov, Dmitriy; Rogachevsky, Oleg
2017-03-01
The Gas Electron Multiplier (GEM) detector is one of the basic parts of the BM@N experiment included in the NICA project. A simulation model that takes into account the features of the signal generation process in an ionization GEM chamber is presented in this article. Proper parameters for the simulation were extracted from data obtained with the help of Garfield++ (a toolkit for the detailed simulation of particle detectors). As a result, we are able to generate clusters in the layers of the micro-strip readout that correspond to clusters retrieved from a real physics experiment.
NASA Astrophysics Data System (ADS)
Wang, Kelu; Li, Xin; Zhang, Xiaobo
2018-03-01
The power dissipation maps of Ti-25Al-15Nb alloy were constructed by using the compression test data. A method is proposed to predict the distribution and variation of power dissipation coefficient in hot forging process using both the dynamic material model and finite element simulation. Using the proposed method, the change characteristics of the power dissipation coefficient are simulated and predicted. The effectiveness of the proposed method was verified by comparing the simulation results with the physical experimental results.
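In the dynamic material model, the power dissipation coefficient is computed from the strain-rate sensitivity m of the flow stress as eta = 2m/(m+1). A minimal sketch with synthetic flow-stress data; the Ti-25Al-15Nb compression data of the paper are not reproduced here:

```python
import numpy as np

def strain_rate_sensitivity(stress, strain_rate):
    """m = d(ln sigma)/d(ln strain_rate), from tabulated flow stress."""
    return np.gradient(np.log(stress), np.log(strain_rate))

def dissipation_efficiency(m):
    """Dynamic-material-model power dissipation coefficient
    eta = 2m/(m+1): the fraction of input power dissipated by
    microstructural processes relative to an ideal linear dissipator."""
    return 2.0 * m / (m + 1.0)

# Illustrative flow stress (MPa) at three strain rates (1/s):
# sigma ~ rate^0.2 gives m = 0.2 everywhere, hence eta = 1/3.
rates = np.array([0.01, 0.1, 1.0])
stress = 300.0 * rates**0.2
eta = dissipation_efficiency(strain_rate_sensitivity(stress, rates))
print(eta)  # each entry close to 1/3
```

Evaluating eta on a grid of temperature and strain rate yields the power dissipation map; coupling it to finite element fields, as the paper proposes, maps eta through the forging die instead of through test conditions only.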
NASA Astrophysics Data System (ADS)
Peishu, Zong; Jianping, Tang; Shuyu, Wang; Lingyun, Xie; Jianwei, Yu; Yunqian, Zhu; Xiaorui, Niu; Chao, Li
2017-08-01
The parameterization of physical processes is one of the critical elements in properly simulating the regional climate over eastern China. It is essential to conduct detailed analyses of the effect of physical parameterization schemes on regional climate simulation, to provide more reliable regional climate change information. In this paper, we evaluate the 25-year (1983-2007) summer monsoon climate characteristics of precipitation and surface air temperature by using the regional spectral model (RSM) with different physical schemes. The ensemble results using the reliability ensemble averaging (REA) method are also assessed. The results show that the RSM model has the capacity to reproduce the spatial patterns, the variations, and the temporal tendency of surface air temperature and precipitation over eastern China, and that it tends to perform better over the Yangtze River basin and South China. The impact of different physical schemes on RSM simulations is also investigated. Generally, the CLD3 cloud water prediction scheme tends to produce larger precipitation because of its overestimation of the low-level moisture. The systematic biases derived from the KF2 cumulus scheme are larger than those from the RAS scheme. The scale-selective bias correction (SSBC) method improves the simulation of the temporal and spatial characteristics of surface air temperature and precipitation and improves the simulated circulation. The REA ensemble results show significant improvement in simulating the temperature and precipitation distributions, with much higher correlation coefficients and lower root-mean-square errors. The REA result of selected experiments is better than that of nonselected experiments, indicating the necessity of choosing better ensemble samples for ensemble averaging.
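The reliability-ensemble-averaging step can be sketched as a bias-based weighting of ensemble members. This is a simplified version with made-up numbers; the full REA method of Giorgi and Mearns also weights by model convergence, which is omitted here:

```python
import numpy as np

def rea_weights(simulations, observed, eps=1e-6):
    """Simplified REA-style reliability weights: members with smaller
    absolute mean bias against observations get larger weight."""
    bias = np.abs(simulations.mean(axis=1) - observed.mean())
    w = 1.0 / (bias + eps)
    return w / w.sum()

def rea_average(simulations, observed):
    """Reliability-weighted ensemble average of the member fields."""
    return rea_weights(simulations, observed) @ simulations

# Three members simulating a 4-point precipitation field (mm/day);
# member 0 is least biased, so it dominates the weighted average.
obs = np.array([5.0, 6.0, 7.0, 8.0])
sims = np.array([[5.1, 6.1, 7.0, 8.1],
                 [7.0, 8.0, 9.0, 10.0],
                 [3.0, 4.0, 5.0, 6.0]])
avg = rea_average(sims, obs)
print(np.allclose(avg, obs, atol=0.2))  # True: avg tracks the best member
```

The "selected experiments" finding in the abstract corresponds to truncating this weighting: dropping low-weight members entirely rather than letting small weights dilute the average.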
A Multi-scale Modeling System with Unified Physics to Study Precipitation Processes
NASA Astrophysics Data System (ADS)
Tao, W. K.
2017-12-01
In recent years, exponentially increasing computer power has extended Cloud Resolving Model (CRM) integrations from hours to months, and the number of computational grid points from less than a thousand to close to ten million. Three-dimensional models are now more prevalent. Much attention is devoted to precipitating cloud systems where the crucial 1-km scales are resolved in horizontal domains as large as 10,000 km in two dimensions, and 1,000 x 1,000 km2 in three dimensions. Cloud resolving models now provide statistical information useful for developing more realistic physically based parameterizations for climate models and numerical weather prediction (NWP) models. It is also expected that NWP and mesoscale models can be run at grid sizes similar to cloud resolving models through nesting techniques. Recently, a multi-scale modeling system with unified physics was developed at NASA Goddard. It consists of (1) a cloud-resolving model (the Goddard Cumulus Ensemble model, GCE), (2) a regional-scale model (the NASA-unified Weather Research and Forecasting model, WRF), and (3) a coupled CRM and global model (the Goddard Multi-scale Modeling Framework, MMF). The same microphysical processes, long- and short-wave radiative transfer, land processes, and the explicit cloud-radiation and cloud-land surface interactive processes are applied in this multi-scale modeling system. This modeling system has been coupled with a multi-satellite simulator to use NASA high-resolution satellite data to identify the strengths and weaknesses of cloud and precipitation processes simulated by the model. In this talk, a review of developments and applications of the multi-scale modeling system will be presented. In particular, results from using the multi-scale modeling system to study precipitation processes and their sensitivity to model resolution and microphysics schemes will be presented. How the multi-satellite simulator can be used to improve the simulation of precipitation processes will also be discussed.
Ebel, B.A.; Mirus, B.B.; Heppner, C.S.; VanderKwaak, J.E.; Loague, K.
2009-01-01
Distributed hydrologic models capable of simulating fully-coupled surface water and groundwater flow are increasingly used to examine problems in the hydrologic sciences. Several techniques are currently available to couple the surface and subsurface; the two most frequently employed approaches are first-order exchange coefficients (a.k.a., the surface conductance method) and enforced continuity of pressure and flux at the surface-subsurface boundary condition. The effort reported here examines the parameter sensitivity of simulated hydrologic response for the first-order exchange coefficients at a well-characterized field site using the fully coupled Integrated Hydrology Model (InHM). This investigation demonstrates that the first-order exchange coefficients can be selected such that the simulated hydrologic response is insensitive to the parameter choice, while simulation time is considerably reduced. Alternatively, the ability to choose a first-order exchange coefficient that intentionally decouples the surface and subsurface facilitates concept-development simulations to examine real-world situations where the surface-subsurface exchange is impaired. While the parameters comprising the first-order exchange coefficient cannot be directly estimated or measured, the insensitivity of the simulated flow system to these parameters (when chosen appropriately) combined with the ability to mimic actual physical processes suggests that the first-order exchange coefficient approach can be consistent with a physics-based framework. Copyright © 2009 John Wiley & Sons, Ltd.
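In its simplest form, the first-order exchange (surface conductance) coupling described above reduces to a flux proportional to the head difference across the surface-subsurface interface. A minimal sketch (the coefficient and head values are hypothetical illustrations, not InHM's actual formulation):

```python
def exchange_flux(h_surf, h_sub, k_e):
    """First-order exchange flux between surface and subsurface.

    q = k_e * (h_surf - h_sub): positive q means water moves from the
    surface into the subsurface. k_e [1/s] lumps the interface
    conductance; its composition is what the paper's sensitivity
    analysis targets.
    """
    return k_e * (h_surf - h_sub)

# Hypothetical heads [m]; a large k_e enforces near-continuity of head
# (tight coupling), while a tiny k_e intentionally decouples the domains.
h_s, h_g = 1.2, 0.8
tight = exchange_flux(h_s, h_g, k_e=1e2)   # strong coupling
loose = exchange_flux(h_s, h_g, k_e=1e-6)  # effectively decoupled
```

Choosing `k_e` large enough that the simulated response becomes insensitive to its exact value is the regime the paper identifies as consistent with a physics-based framework.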
NASA Astrophysics Data System (ADS)
Mani, N. J.; Waliser, D. E.; Jiang, X.
2014-12-01
While the boreal summer monsoon intraseasonal variability (BSISV) exerts a profound influence on the south Asian monsoon, the capability of present-day dynamical models to simulate and predict the BSISV is still limited. The global model evaluation project on the vertical structure and diabatic processes of the Madden-Julian Oscillation (MJO) is a joint venture, coordinated by the Working Group on Numerical Experimentation (WGNE) MJO Task Force and the GEWEX Atmospheric System Study (GASS) program, for assessing model deficiencies in simulating the ISV and for improving our understanding of the underlying processes. In this study, the simulation of the northward-propagating BSISV is investigated in 26 climate models, with special focus on the vertical diabatic heating structure and clouds. Following the lines of inquiry the MJO Task Force has pursued for the eastward-propagating MJO, we utilize previously proposed and newly developed model performance metrics and process diagnostics and apply them to the global climate model simulations of the BSISV.
NASA Astrophysics Data System (ADS)
Deng, Shaoyong; Zhang, Shiqiang; He, Minbo; Zhang, Zheng; Guan, Xiaowei
2017-05-01
The positive-branch confocal unstable resonator with an inhomogeneous gain medium was studied for a commonly used high-energy DF laser system. The fast-changing process of the resonator's eigenmodes was coupled with the slow-changing process of the thermal deformation of the cavity mirrors. The influence of the thermal deformation of the cavity mirrors on the outcoupled beam quality and on the transmission loss of high-frequency components of the high-energy laser was computed. The simulations were performed with programs written in MATLAB and the GLAD software, combining the finite-element method with the Fox-Li iteration algorithm. Effects of thermal distortion, misalignment of the cavity mirrors, and the inhomogeneous distribution of the gain medium were introduced to simulate the real physical conditions of the laser cavity. The wavefront distribution and beam quality (including the RMS of the wavefront, power in the bucket, Strehl ratio, diffraction limit β, position of the beam spot center, spot size, and far-field intensity distribution) of the distorted outcoupled beam were studied. The conclusions of the simulation agree with the experimental results. This work provides a reference for the wavefront correction range of the adaptive optics system in the internal beam path.
Statistical variances of diffusional properties from ab initio molecular dynamics simulations
NASA Astrophysics Data System (ADS)
He, Xingfeng; Zhu, Yizhou; Epstein, Alexander; Mo, Yifei
2018-12-01
Ab initio molecular dynamics (AIMD) simulation is widely employed in studying diffusion mechanisms and in quantifying diffusional properties of materials. However, AIMD simulations are often limited to a few hundred atoms and a short, sub-nanosecond physical timescale, which leads to models that include only a limited number of diffusion events. As a result, the diffusional properties obtained from AIMD simulations are often plagued by poor statistics. In this paper, we re-examine the process to estimate diffusivity and ionic conductivity from the AIMD simulations and establish the procedure to minimize the fitting errors. In addition, we propose methods for quantifying the statistical variance of the diffusivity and ionic conductivity from the number of diffusion events observed during the AIMD simulation. Since an adequate number of diffusion events must be sampled, AIMD simulations should be sufficiently long and can only be performed on materials with reasonably fast diffusion. We chart the ranges of materials and physical conditions that can be accessible by AIMD simulations in studying diffusional properties. Our work provides the foundation for quantifying the statistical confidence levels of diffusion results from AIMD simulations and for correctly employing this powerful technique.
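The statistical issue described above can be illustrated with a toy random-hop model standing in for an AIMD trajectory: a diffusivity estimated from the mean squared displacement scatters less as more diffusion events are sampled. A sketch (hop length, time step, and particle counts are arbitrary, and the simple `D = MSD / (2*dim*t)` estimator stands in for the paper's more careful fitting procedure):

```python
import numpy as np

rng = np.random.default_rng(0)

def diffusivity(n_ions, n_steps, hop_len=2.0, dt=1.0, dim=3):
    """Tracer diffusivity from a random-hop model, D = MSD / (2*dim*t).

    Each "ion" makes one hop of fixed length per step in a random
    direction, mimicking discrete diffusion events in an AIMD run.
    """
    steps = rng.standard_normal((n_steps, n_ions, dim))
    steps *= hop_len / np.linalg.norm(steps, axis=2, keepdims=True)
    disp = steps.sum(axis=0)                  # net displacement per ion
    msd = (disp ** 2).sum(axis=1).mean()      # mean squared displacement
    return msd / (2 * dim * n_steps * dt)

# The spread of the estimate shrinks as more diffusion events are
# sampled, which is the statistical variance the paper quantifies:
d_few = [diffusivity(n_ions=5, n_steps=200) for _ in range(40)]
d_many = [diffusivity(n_ions=500, n_steps=200) for _ in range(40)]
spread_few = np.std(d_few) / np.mean(d_few)
spread_many = np.std(d_many) / np.mean(d_many)
```

For this model the true diffusivity is `hop_len**2 / (2 * dim * dt)`, so the scatter of the estimates around that value can be checked directly.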
UFMulti: A new parallel processing software system for HEP
NASA Astrophysics Data System (ADS)
Avery, Paul; White, Andrew
1989-12-01
UFMulti is a multiprocessing software package designed for general-purpose high energy physics applications, including physics and detector simulation, data reduction, and DST physics analysis. The system is particularly well suited for installations where several workstations or computers are connected through a local area network (LAN). The initial configuration of the software is currently running on VAX/VMS machines, with a planned extension to ULTRIX, using the new RISC CPUs from Digital, in the near future.
Random walk, diffusion and mixing in simulations of scalar transport in fluid flows
NASA Astrophysics Data System (ADS)
Klimenko, A. Y.
2008-12-01
Physical similarity and mathematical equivalence of continuous diffusion and particle random walk form one of the cornerstones of modern physics and the theory of stochastic processes. In many applied models used in simulation of turbulent transport and turbulent combustion, mixing between particles is used to reflect the influence of the continuous diffusion terms in the transport equations. We show that the continuous scalar transport and diffusion can be accurately specified by means of mixing between randomly walking Lagrangian particles with scalar properties and assess errors associated with this scheme. This gives an alternative formulation for the stochastic process which is selected to represent the continuous diffusion. This paper focuses on statistical errors and deals with relatively simple cases, where one-particle distributions are sufficient for a complete description of the problem.
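The scheme discussed above, in which mixing between randomly walking Lagrangian particles stands in for the continuous diffusion terms, can be sketched in one dimension. The pairwise (Curl-type) mixing step below is an illustrative stand-in for the paper's mixing model, with all parameters arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)

def walk_and_mix(n, steps, dt, d_walk, mix_rate):
    """Randomly walking particles carrying a scalar, with pairwise mixing.

    Positions take Brownian steps of std sqrt(2*D*dt); each step, a
    fraction mix_rate of randomly paired particles replaces its scalar
    values by the pair mean. Mixing conserves the scalar mean exactly
    while dissipating scalar variance, which is how the diffusion term
    acts on the scalar in such particle methods.
    """
    x = np.zeros(n)
    phi = rng.standard_normal(n)       # initial scalar fluctuations
    for _ in range(steps):
        x += rng.standard_normal(n) * np.sqrt(2.0 * d_walk * dt)
        pairs = rng.permutation(n).reshape(-1, 2)
        i, j = pairs[: int(mix_rate * n / 2)].T
        mean = 0.5 * (phi[i] + phi[j])
        phi[i] = mean
        phi[j] = mean
    return x, phi

x, phi = walk_and_mix(n=20_000, steps=200, dt=0.01, d_walk=0.5, mix_rate=0.2)
```

After the run, the position variance grows as 2*D*t (the random walk reproduces diffusion in physical space), while the scalar variance has been dissipated by mixing with the mean preserved.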
Warrington, Steven J; Beeson, Michael S; Fire, Frank L
2013-05-01
Emergency medicine residents use simulation training for many reasons, such as gaining experience with critically ill patients and becoming familiar with disease processes. Residents frequently criticize simulation training with current high-fidelity mannequins because of the poor quality of the physical exam findings present, such as auscultatory findings, which may lead them down an alternate diagnostic or therapeutic pathway. Recently, wireless remotely programmed stethoscopes (simulation stethoscopes) have been developed that allow wireless transmission of any sound to a stethoscope receiver, improving the fidelity of the physical examination and of the simulation case. Following institutional review committee approval, 14 PGY1-3 emergency medicine residents were assessed during 2 simulation-based cases using pre-defined scoring anchors on multiple actions, such as communication skills and treatment decisions (Appendix 1). Each case involved a patient presenting with dyspnea requiring management based on physical examination findings. One case was a patient with an exacerbation of heart failure; the other was a patient with a tension pneumothorax. Each resident was randomized into a case associated with the simulation stethoscope. Following the cases, residents were asked to fill out an evaluation questionnaire. Residents perceived the physical exam findings to be most realistic in the case using the simulation stethoscope (13/14, 93%). Residents also preferred the simulation stethoscope as an adjunct to the case (13/14, 93%), and they rated the simulation stethoscope case as having significantly more realistic auscultatory findings (4.4/5 vs. 3.0/5, difference of means 1.4, p=0.0007). Average resident scores were significantly better in the simulation stethoscope-associated case (2.5/3 vs. 2.3/3, difference of means 0.2, p=0.04). There was no considerable difference in the total time taken per case.
A simulation stethoscope may be a useful adjunct to current emergency medicine simulation-based training. Residents both preferred the use of the simulation stethoscope and perceived physical exam findings to be more realistic, leading to improved fidelity. Potential sources of bias include the small population, narrow scoring range, and the lack of blinding. Further research, focusing on use for resident assessment and clinical significance with a larger population and blinding of graders, is needed.
IR characteristic simulation of city scenes based on radiosity model
NASA Astrophysics Data System (ADS)
Xiong, Xixian; Zhou, Fugen; Bai, Xiangzhi; Yu, Xiyu
2013-09-01
Reliable modeling of the thermal infrared (IR) signatures of real-world city scenes is required for signature management of civil and military platforms. Traditional modeling methods generally assume that scene objects are individual entities during the physical processes occurring in the infrared range. In reality, however, the physical scene involves convective and conductive interactions between objects, as well as radiative interactions between them. A method based on a radiosity model, which describes these complex effects, has been developed to enable an accurate simulation of the radiance distribution of city scenes. First, the physical processes affecting the IR characteristics of city scenes were described. Second, heat balance equations were formed by combining the atmospheric conditions, shadow maps, and the geometry of the scene. Finally, a finite difference method was used to calculate the kinetic temperature of object surfaces. A radiosity model was introduced to describe the scattering of radiation between surface elements in the scene. By synthesizing the radiance distribution of objects in the infrared range, we obtain the IR characteristics of the scene. Real infrared images and model predictions are shown and compared. The results demonstrate that this method can realistically simulate the IR characteristics of city scenes; it effectively reproduces infrared shadow effects and the radiative interactions between objects.
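The radiosity balance underlying such a model, B_i = E_i + rho_i * sum_j F_ij B_j, can be solved directly as a linear system once the view factors are known. A minimal sketch for a three-patch enclosure; the view factors, reflectances, and emissions are illustrative values, not the paper's city-scene geometry:

```python
import numpy as np

# Radiosity balance for three diffuse surface patches:
#   B = E + diag(rho) @ F @ B, i.e. (I - diag(rho) F) B = E.
F = np.array([[0.0, 0.6, 0.4],        # view factors (row sums <= 1)
              [0.3, 0.0, 0.7],
              [0.2, 0.8, 0.0]])
rho = np.array([0.3, 0.5, 0.2])       # surface reflectances
E = np.array([100.0, 20.0, 0.0])      # emitted radiance [W/m^2]

# Since rho < 1 and rows of F sum to <= 1, the system is well-posed.
B = np.linalg.solve(np.eye(3) - rho[:, None] * F, E)
```

Each patch's radiosity B exceeds its own emission E because of the reflected contributions from the other patches, which is exactly the inter-object radiative interaction the abstract emphasizes.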
Physically based modeling of bedrock incision by abrasion, plucking, and macroabrasion
NASA Astrophysics Data System (ADS)
Chatanantavet, Phairot; Parker, Gary
2009-11-01
Many important insights into the dynamic coupling among climate, erosion, and tectonics in mountain areas have derived from several numerical models of the past few decades which include descriptions of bedrock incision. However, many questions regarding incision processes and morphology of bedrock streams still remain unanswered. A more mechanistically based incision model is needed as a component to study landscape evolution. Major bedrock incision processes include (among other mechanisms) abrasion by bed load, plucking, and macroabrasion (a process of fracturing of the bedrock into pluckable sizes mediated by particle impacts). The purpose of this paper is to develop a physically based model of bedrock incision that includes all three processes mentioned above. To build the model, we start by developing a theory of abrasion, plucking, and macroabrasion mechanisms. We then incorporate hydrology, the evaluation of boundary shear stress, capacity transport, an entrainment relation for pluckable particles, a routing model linking in-stream sediment and hillslopes, a formulation for alluvial channel coverage, a channel width relation, Hack's law, and Exner equation into the model so that we can simulate the evolution of bedrock channels. The model successfully simulates various features of bed elevation profiles of natural bedrock rivers under a variety of input or boundary conditions. The results also illustrate that knickpoints found in bedrock rivers may be autogenic in addition to being driven by base level fall and lithologic changes. This supports the concept that bedrock incision by knickpoint migration may be an integral part of normal incision processes. The model is expected to improve the current understanding of the linkage among physically meaningful input parameters, the physics of incision process, and morphological changes in bedrock streams.
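For contrast with the three-process closure above, the most common simpler alternative is the detachment-limited stream-power law E = K A^m S^n. The sketch below uses that generic law (not the paper's abrasion/plucking/macroabrasion model) only to illustrate the basic numerics of a bedrock profile responding to a base-level fall, with a knickpoint migrating upstream; all parameter values are illustrative:

```python
import numpy as np

nx = 101
x = np.linspace(0.0, 10_000.0, nx)                  # distance upstream [m]
dx = x[1] - x[0]
area = 1e7 * np.maximum(1.0 - x / 12_000.0, 0.05)   # drainage area [m^2]
z = 1e-3 * x                                        # graded initial profile [m]
z[0] = -20.0                                        # sudden base-level fall

K, m, n, dt = 4e-5, 0.5, 1.0, 50.0                  # erodibility, exponents, step [yr]
for _ in range(1000):
    slope = np.maximum((z[1:] - z[:-1]) / dx, 0.0)  # slope toward the outlet
    z[1:] -= dt * K * area[1:] ** m * slope ** n    # erode; outlet stays fixed
knick = int(np.argmax(np.diff(z)))                  # steepest reach = knickpoint
```

With n = 1 the law behaves as an upstream-advecting wave with celerity K A^m, so the slope break introduced at the outlet propagates into the profile rather than staying put, the base-level-driven end-member against which the paper's autogenic knickpoints are contrasted.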
Numerical simulation and analysis for low-frequency rock physics measurements
NASA Astrophysics Data System (ADS)
Dong, Chunhui; Tang, Genyang; Wang, Shangxu; He, Yanxiao
2017-10-01
In recent years, several experimental methods have been introduced to measure the elastic parameters of rocks in the relatively low-frequency range, such as differential acoustic resonance spectroscopy (DARS) and stress-strain measurement. It is necessary to verify the validity and feasibility of the applied measurement method and to quantify the sources and levels of measurement error. Relying solely on the laboratory measurements, however, we cannot evaluate the complete wavefield variation in the apparatus. Numerical simulations of elastic wave propagation, on the other hand, are used to model the wavefield distribution and physical processes in the measurement systems, and to verify the measurement theory and analyze the measurement results. In this paper we provide a numerical simulation method to investigate the acoustic waveform response of the DARS system and the quasi-static responses of the stress-strain system, both of which use axisymmetric apparatus. We applied this method to parameterize the properties of the rock samples, the sample locations and the sensor (hydrophone and strain gauges) locations and simulate the measurement results, i.e. resonance frequencies and axial and radial strains on the sample surface, from the modeled wavefield following the physical experiments. Rock physical parameters were estimated by inversion or direct processing of these data, and showed a perfect match with the true values, thus verifying the validity of the experimental measurements. Error analysis was also conducted for the DARS system with 18 numerical samples, and the sources and levels of error are discussed. In particular, we propose an inversion method for estimating both density and compressibility of these samples. The modeled results also showed fairly good agreement with the real experiment results, justifying the effectiveness and feasibility of our modeling method.
Hydrological and water quality processes simulation by the integrated MOHID model
NASA Astrophysics Data System (ADS)
Epelde, Ane; Antiguedad, Iñaki; Brito, David; Eduardo, Jauch; Neves, Ramiro; Sauvage, Sabine; Sánchez-Pérez, José Miguel
2016-04-01
Different modelling approaches have been used in recent decades to study the water quality degradation caused by non-point source pollution. In this study, the MOHID fully distributed, physics-based model has been employed to simulate hydrological processes and nitrogen dynamics in a nitrate-vulnerable zone: the Alegria River watershed (Basque Country, Northern Spain). The results of this study indicate that the MOHID code is suitable for simulating hydrological processes at the watershed scale, as the model performs satisfactorily at simulating discharge (NSE: 0.74 and 0.76 during the calibration and validation periods, respectively). The agronomical component of the code allowed the simulation of agricultural practices, which led to adequate crop yield simulation in the model. Furthermore, the nitrogen exportation also shows satisfactory performance (NSE: 0.64 and 0.69 during the calibration and validation periods, respectively). While the lack of field measurements does not allow an in-depth evaluation of the nutrient cycling processes, the MOHID model simulates annual denitrification within the general ranges established for agricultural watersheds (in this study, 9 kg N ha-1 year-1). In addition, the model coherently simulates the spatial distribution of the denitrification process, which is directly linked to the simulated hydrological conditions: the highest rates are located near the discharge zone of the aquifer and where the aquifer thickness is low. These results demonstrate the strength of this model for simulating watershed-scale hydrological processes, as well as crop production and the water quality degradation derived from agricultural activity (considering both nutrient exportation and nutrient cycling processes).
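The NSE values quoted above are the Nash-Sutcliffe efficiency, a standard skill score for hydrological simulations. A small self-contained sketch (the discharge series is hypothetical, invented only to exercise the formula):

```python
def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations.

    NSE = 1 is a perfect fit; NSE = 0 means the model is no better than
    predicting the observed mean; values around 0.7-0.8, such as the
    0.74-0.76 reported for discharge, are conventionally satisfactory.
    """
    obs_mean = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    ss_tot = sum((o - obs_mean) ** 2 for o in obs)
    return 1.0 - sse / ss_tot

# Hypothetical daily discharges [m^3/s]:
obs = [3.1, 4.0, 6.5, 9.2, 7.4, 5.0, 3.8]
sim = [2.8, 4.3, 6.0, 8.7, 7.9, 5.4, 3.5]
nse = nash_sutcliffe(obs, sim)
```

Because the denominator is the variance of the observations, NSE penalizes a model more heavily on flashy, high-variance catchments than a plain sum of squared errors would.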
Overview of the CLIC detector and its physics potential
NASA Astrophysics Data System (ADS)
Ström, Rickard
2017-12-01
The CLIC detector and physics study (CLICdp) is an international collaboration that investigates the physics potential of the Compact Linear Collider (CLIC). CLIC is a high-energy electron-positron collider under development, aiming for centre-of-mass energies from a few hundred GeV to 3 TeV. In addition to physics studies based on full Monte Carlo simulations of signal and background processes, CLICdp performs cutting-edge hardware R&D. This contribution presents recent CLICdp results from physics prospect studies, with an emphasis on Higgs studies. Additionally, the new CLIC detector model and the recently updated CLIC baseline staging scenario are presented.
A Method for Functional Task Alignment Analysis of an Arthrocentesis Simulator.
Adams, Reid A; Gilbert, Gregory E; Buckley, Lisa A; Nino Fong, Rodolfo; Fuentealba, I Carmen; Little, Erika L
2018-05-16
During simulation-based education, simulators are subjected to procedures composed of a variety of tasks and processes. Simulators should functionally represent a patient in response to the physical actions of these tasks. The aim of this work was to describe a method for determining whether a simulator has sufficient functional task alignment (FTA) to be used in a simulation. Potential performance checklist items were gathered from published arthrocentesis guidelines and aggregated into a performance checklist using Lawshe's method. An expert panel used this performance checklist and an FTA analysis questionnaire to evaluate a simulator's ability to respond to the physical actions required by the performance checklist. Thirteen items, from a pool of 39, were included on the performance checklist. Experts had mixed reviews of the simulator's FTA and its suitability for use in simulation. Unexpectedly, some positive FTA was found for several tasks where the simulator lacked functionality. By developing a detailed list of the specific tasks required to complete a clinical procedure, and surveying experts on the simulator's response to those actions, educators can gain insight into a simulator's clinical accuracy and suitability. The unexpected positive FTA ratings of functional deficits suggest that further revision of the survey method is required.
An assessment of coupling algorithms for nuclear reactor core physics simulations
Hamilton, Steven; Berrill, Mark; Clarno, Kevin; ...
2016-04-01
This paper evaluates the performance of multiphysics coupling algorithms applied to a light water nuclear reactor core simulation. The simulation couples the k-eigenvalue form of the neutron transport equation with heat conduction and subchannel flow equations. We compare Picard iteration (block Gauss–Seidel) to Anderson acceleration and multiple variants of preconditioned Jacobian-free Newton–Krylov (JFNK). The performance of the methods is evaluated over a range of energy group structures and core power levels. A novel physics-based approximation to a Jacobian-vector product has been developed to mitigate the impact of expensive on-line cross section processing steps. Furthermore, numerical simulations demonstrating the efficiency of JFNK and Anderson acceleration relative to standard Picard iteration are performed on a 3D model of a nuclear fuel assembly. Both criticality (k-eigenvalue) and critical boron search problems are considered.
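The contrast between Picard iteration and Anderson acceleration can be seen on a toy fixed-point problem. The depth-1 Anderson update below mixes the last two iterates with a least-squares weight on the residuals; this is a generic sketch on an invented linear map, not the paper's coupled neutronics/thermal-hydraulics solver:

```python
import numpy as np

def picard(g, x0, tol=1e-10, maxit=500):
    """Plain fixed-point (Picard) iteration x <- g(x)."""
    x = x0
    for k in range(1, maxit + 1):
        x_new = g(x)
        if np.linalg.norm(x_new - x) < tol:
            return x_new, k
        x = x_new
    return x, maxit

def anderson_m1(g, x0, tol=1e-10, maxit=500):
    """Depth-1 Anderson acceleration of the same fixed-point map."""
    x_prev = x0
    f_prev = g(x0) - x0
    x = x0 + f_prev                     # first step is a Picard step
    for k in range(2, maxit + 1):
        f = g(x) - x                    # current residual
        if np.linalg.norm(f) < tol:
            return x, k
        df = f - f_prev
        d2 = df @ df
        alpha = (f @ df) / d2 if d2 > 0 else 0.0   # least-squares weight
        x_next = (1 - alpha) * (x + f) + alpha * (x_prev + f_prev)
        x_prev, f_prev, x = x, f, x_next
    return x, maxit

# Toy linear "coupled physics" update; true solution x* = (I - A)^-1 b.
A = np.array([[0.9, 0.05], [0.05, 0.8]])
b = np.array([1.0, 2.0])
g = lambda x: A @ x + b
x_p, it_p = picard(g, np.zeros(2))
x_a, it_a = anderson_m1(g, np.zeros(2))
```

Because the spectral radius of A is close to 1, plain Picard converges slowly, while the Anderson update extrapolates through the slow mode and reaches the same fixed point in far fewer g-evaluations, which mirrors the efficiency gap the paper measures.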
Ablation dynamics - from absorption to heat accumulation/ultra-fast laser matter interaction
NASA Astrophysics Data System (ADS)
Kramer, Thorsten; Remund, Stefan; Jäggi, Beat; Schmid, Marc; Neuenschwander, Beat
2018-05-01
Ultra-short laser radiation is used in manifold industrial applications today. Although state-of-the-art laser sources provide an average power of 10-100 W at repetition rates of up to several megahertz, most applications do not benefit from it. On the one hand, the processing speed is limited to some hundred millimeters per second by the dynamics of mechanical axes or galvanometric scanners. On the other hand, high repetition rates require consideration of new physical effects, such as heat accumulation and shielding, that can reduce process efficiency. For ablation processes, efficiency can be expressed by the specific removal rate: ablated volume per unit time and average power. Analyzing the specific removal rate for different laser parameters (such as average power, repetition rate, or pulse duration) and process parameters (such as scanning speed or material) can be used to find the best operating point for microprocessing applications. Analytical models and molecular dynamics simulations based on the so-called two-temperature model reveal the causes of these limiting physical effects. The findings of the models and simulations can be used to optimize processing strategies.
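The two-temperature model mentioned above treats the electron and lattice subsystems as separate heat baths coupled by an electron-phonon constant. A zero-dimensional explicit-Euler sketch, with order-of-magnitude parameters for a generic metal (not fitted values, and omitting the spatial heat diffusion of the full model):

```python
import numpy as np

def two_temperature(t_end=20e-12, dt=1e-15):
    """Zero-dimensional two-temperature model, explicit Euler in time.

    Electrons absorb a 100 fs Gaussian pulse and transfer energy to the
    lattice via the coupling constant g. Parameters are illustrative
    order-of-magnitude values for a metal.
    """
    gamma = 70.0          # electron heat capacity Ce = gamma*Te [J m^-3 K^-2]
    cl = 2.5e6            # lattice heat capacity [J m^-3 K^-1]
    g = 2.6e17            # electron-phonon coupling [W m^-3 K^-1]
    te = tl = 300.0
    t = 0.0
    while t < t_end:
        # Gaussian volumetric source, peak 4e21 W/m^3, centered at 0.5 ps
        s = 4e21 * np.exp(-(((t - 0.5e-12) / 100e-15) ** 2))
        te += dt * (s - g * (te - tl)) / (gamma * te)
        tl += dt * g * (te - tl) / cl
        t += dt
    return te, tl

te, tl = two_temperature()
```

The electron temperature spikes during the pulse because its heat capacity is tiny, then relaxes onto the lattice over a few picoseconds; after 20 ps the two temperatures have essentially equilibrated, which is the regime where pulse-to-pulse heat accumulation starts to matter at megahertz repetition rates.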
The physics of proton therapy.
Newhauser, Wayne D; Zhang, Rui
2015-04-21
The physics of proton therapy has advanced considerably since it was proposed in 1946. Today analytical equations and numerical simulation methods are available to predict and characterize many aspects of proton therapy. This article reviews the basic aspects of the physics of proton therapy, including proton interaction mechanisms, proton transport calculations, the determination of dose from therapeutic and stray radiations, and shielding design. The article discusses underlying processes as well as selected practical experimental and theoretical methods. We conclude by briefly speculating on possible future areas of research of relevance to the physics of proton therapy.
A Vision on the Status and Evolution of HEP Physics Software Tools
DOE Office of Scientific and Technical Information (OSTI.GOV)
Canal, P.; Elvira, D.; Hatcher, R.
2013-07-28
This paper represents the vision of the members of the Fermilab Scientific Computing Division's Computational Physics Department (SCD-CPD) on the status and the evolution of various HEP software tools such as the Geant4 detector simulation toolkit, the Pythia and GENIE physics generators, and the ROOT data analysis framework. The goal of this paper is to contribute ideas to the Snowmass 2013 process toward the composition of a unified document on the current status and potential evolution of the physics software tools which are essential to HEP.
Kim, Jaewook; Woo, Sung Sik; Sarpeshkar, Rahul
2018-04-01
The analysis and simulation of complex interacting biochemical reaction pathways in cells is important in all of systems biology and medicine. Yet, the dynamics of even a modest number of noisy or stochastic coupled biochemical reactions is extremely time consuming to simulate. In large part, this is because of the expensive cost of random number and Poisson process generation and the presence of stiff, coupled, nonlinear differential equations. Here, we demonstrate that we can amplify inherent thermal noise in chips to emulate randomness physically, thus alleviating these costs significantly. Concurrently, molecular flux in thermodynamic biochemical reactions maps to thermodynamic electronic current in a transistor such that stiff nonlinear biochemical differential equations are emulated exactly in compact, digitally programmable, highly parallel analog "cytomorphic" transistor circuits. For even small-scale systems involving just 80 stochastic reactions, our 0.35-μm BiCMOS chips yield a 311× speedup in the simulation time of Gillespie's stochastic algorithm over COPASI, a fast biochemical-reaction software simulator that is widely used in computational biology; they yield a 15 500× speedup over equivalent MATLAB stochastic simulations. The chip emulation results are consistent with these software simulations over a large range of signal-to-noise ratios. Most importantly, our physical emulation of Poisson chemical dynamics does not involve any inherently sequential processes and updates such that, unlike prior exact simulation approaches, they are parallelizable, asynchronous, and enable even more speedup for larger-size networks.
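Gillespie's direct method, which the chip above accelerates, draws an exponential waiting time from the total propensity and then fires one reaction. A sketch for the simplest possible network, a single first-order decay A -> 0 (rate constant and molecule counts are arbitrary):

```python
import random

random.seed(7)

def gillespie_decay(n0, k, t_end):
    """Gillespie's direct method for the single reaction A -> 0.

    The propensity is a = k*n; waiting times between firings are
    exponential with rate a, and each firing removes one molecule.
    It is this Poisson timing that the chip emulates with amplified
    thermal noise instead of pseudo-random number generation.
    """
    n, t, history = n0, 0.0, [(0.0, n0)]
    while n > 0:
        a = k * n
        t += random.expovariate(a)     # exponential waiting time
        if t > t_end:
            break
        n -= 1
        history.append((t, n))
    return n, history

n_final, hist = gillespie_decay(n0=1000, k=1.0, t_end=2.0)
```

The ensemble mean of `n_final` follows the deterministic solution n0*exp(-k*t_end), roughly 135 molecules here, with Poisson-scale fluctuations around it; the sequential draw-then-update structure is exactly the bottleneck the abstract says the parallel analog hardware avoids.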
Image-Based Reconstruction and Analysis of Dynamic Scenes in a Landslide Simulation Facility
NASA Astrophysics Data System (ADS)
Scaioni, M.; Crippa, J.; Longoni, L.; Papini, M.; Zanzi, L.
2017-12-01
The application of image processing and photogrammetric techniques to the dynamic reconstruction of landslide simulations in a scaled-down facility is described. The simulations are also used for active-learning purposes: students are helped to understand how physical processes happen and what kinds of observations may be obtained from a sensor network. In particular, the use of digital images to obtain multi-temporal information is presented. On the one hand, using a multi-view sensor setup based on four synchronized GoPro 4 Black® cameras, a 4D (3D spatial position plus time) reconstruction of the dynamic scene is obtained through the composition of several 3D models derived from dense image matching. The final textured 4D model allows one to revisit a completed experiment in dynamic and interactive mode at any time. On the other hand, a digital image correlation (DIC) technique has been used to track surface point displacements in the image sequence obtained from the camera in front of the simulation facility. While the 4D model provides a qualitative description and documentation of the experiment, the DIC analysis outputs quantitative information, such as local point displacements and velocities, to be related to physical processes and to other observations. All the hardware and software equipment adopted for the photogrammetric reconstruction is based on low-cost and open-source solutions.
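The core of a DIC analysis like the one above is matching an image subset between frames by maximizing a correlation score. An integer-pixel sketch using zero-normalized cross-correlation on a synthetic speckle image (real DIC codes add subpixel interpolation and subset deformation, which are omitted here):

```python
import numpy as np

rng = np.random.default_rng(3)

def dic_shift(ref, cur, max_shift=5):
    """Integer-pixel digital image correlation by exhaustive NCC search.

    Slides a window over +/- max_shift pixels of the current frame and
    returns the (dy, dx) displacement that maximizes zero-normalized
    cross-correlation with the reference subset.
    """
    h, w = ref.shape
    s = max_shift
    tpl = ref[s:h - s, s:w - s]
    tpl = (tpl - tpl.mean()) / tpl.std()
    best, best_d = -2.0, (0, 0)
    for dy in range(-s, s + 1):
        for dx in range(-s, s + 1):
            win = cur[s + dy:h - s + dy, s + dx:w - s + dx]
            win = (win - win.mean()) / win.std()
            score = (tpl * win).mean()
            if score > best:
                best, best_d = score, (dy, dx)
    return best_d

# Synthetic speckle image translated by a known (3, -2) pixel shift:
img = rng.random((64, 64))
shifted = np.roll(np.roll(img, 3, axis=0), -2, axis=1)
dy, dx = dic_shift(img, shifted)
```

Applied frame by frame, the recovered displacements divided by the frame interval give the surface point velocities that are compared with the other sensor observations.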
Entangling spin-spin interactions of ions in individually controlled potential wells
NASA Astrophysics Data System (ADS)
Wilson, Andrew; Colombe, Yves; Brown, Kenton; Knill, Emanuel; Leibfried, Dietrich; Wineland, David
2014-03-01
Physical systems that cannot be modeled with classical computers appear in many different branches of science, including condensed-matter physics, statistical mechanics, high-energy physics, atomic physics and quantum chemistry. Despite impressive progress on the control and manipulation of various quantum systems, implementation of scalable devices for quantum simulation remains a formidable challenge. As one approach to scalability in simulation, here we demonstrate an elementary building-block of a configurable quantum simulator based on atomic ions. Two ions are trapped in separate potential wells that can individually be tailored to emulate a number of different spin-spin couplings mediated by the ions' Coulomb interaction together with classical laser and microwave fields. We demonstrate deterministic tuning of this interaction by independent control of the local wells and emulate a particular spin-spin interaction to entangle the internal states of the two ions with 0.81(2) fidelity. Extension of the building-block demonstrated here to a 2D-network, which ion-trap micro-fabrication processes enable, may provide a new quantum simulator architecture with broad flexibility in designing and scaling the arrangement of ions and their mutual interactions. This research was funded by the Office of the Director of National Intelligence (ODNI), Intelligence Advanced Research Projects Activity (IARPA), ONR, and the NIST Quantum Information Program.
NASA Astrophysics Data System (ADS)
Vautard, Robert; Christidis, Nikolaos; Ciavarella, Andrew; Alvarez-Castro, Carmen; Bellprat, Omar; Christiansen, Bo; Colfescu, Ioana; Cowan, Tim; Doblas-Reyes, Francisco; Eden, Jonathan; Hauser, Mathias; Hegerl, Gabriele; Hempelmann, Nils; Klehmet, Katharina; Lott, Fraser; Nangini, Cathy; Orth, René; Radanovics, Sabine; Seneviratne, Sonia I.; van Oldenborgh, Geert Jan; Stott, Peter; Tett, Simon; Wilcox, Laura; Yiou, Pascal
2018-04-01
A detailed analysis is carried out to assess the skill of the HadGEM3-A global atmospheric model in simulating extreme temperatures, precipitation and storm surges in Europe, with a view to attributing such events to human influence. The analysis is based on an ensemble of 15 atmospheric simulations forced with observed sea surface temperature over the 54-year period 1960-2013. These simulations, together with dual simulations without human influence in the forcing, are intended for use in weather and climate event attribution. The analysis investigates the main processes leading to extreme events, including atmospheric circulation patterns, their links with temperature extremes, and land-atmosphere and troposphere-stratosphere interactions. It also compares observed and simulated variability, trends and generalized extreme value parameters for temperature and precipitation. One of the most striking findings is the ability of the model to capture North Atlantic atmospheric weather regimes as obtained from a cluster analysis of sea level pressure fields. The model also reproduces the main observed weather patterns responsible for temperature and precipitation extremes. However, biases are found in many physical processes. Slightly excessive drying may be the cause of overestimated summer interannual variability and too-intense heat waves, especially in central and northern Europe; however, this does not seem to hinder proper simulation of summer temperature trends. Cold extremes appear well simulated, as do the underlying blocking frequency and stratosphere-troposphere interactions. Extreme precipitation amounts are overestimated and too variable. The atmospheric conditions leading to storm surges were also examined in the Baltic region: the simulated weather conditions do not appear to produce strong enough storm surges, but the winds were found to be in very good agreement with reanalyses.
The performance in reproducing atmospheric weather patterns indicates that biases mainly originate from local and regional physical processes. This makes local bias adjustment meaningful for climate change attribution.
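As an illustration of the extreme-value fitting such assessments rely on, a method-of-moments fit of the Gumbel distribution (the zero-shape member of the GEV family) to block maxima can be sketched as below. The study itself fits full GEV parameters; this simplified sketch, with illustrative values, only conveys the idea of estimating distribution parameters and return levels from annual maxima.

```python
import math
import random

EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def fit_gumbel(block_maxima):
    """Method-of-moments Gumbel fit: scale = s*sqrt(6)/pi,
    location = mean - gamma*scale."""
    n = len(block_maxima)
    mean = sum(block_maxima) / n
    var = sum((x - mean) ** 2 for x in block_maxima) / (n - 1)
    scale = math.sqrt(6.0 * var) / math.pi
    loc = mean - EULER_GAMMA * scale
    return loc, scale

def return_level(loc, scale, T):
    """T-block return level: the quantile exceeded on average once per T blocks."""
    return loc - scale * math.log(-math.log(1.0 - 1.0 / T))

# Synthetic block maxima from a known Gumbel(loc=10, scale=2) distribution,
# drawn by inverse-CDF sampling.
rng = random.Random(0)
sample = [10.0 - 2.0 * math.log(-math.log(rng.random())) for _ in range(20000)]
loc, scale = fit_gumbel(sample)
```

Comparing parameters fitted to observed and simulated maxima is one way to quantify the biases in extremes that the abstract describes.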
Computational Modeling of Hydrodynamics and Scour around Underwater Munitions
NASA Astrophysics Data System (ADS)
Liu, X.; Xu, Y.
2017-12-01
Munitions deposited in water bodies pose a serious threat to human health, safety, and the environment. It is thus imperative to predict the motion and final resting state of underwater munitions. A multitude of physical processes are involved, including turbulent flows, sediment transport, granular material mechanics, six-degree-of-freedom motion of the munition, and potential liquefaction. A clear understanding of this unique physical setting is currently lacking; consequently, reliable predictions are extremely hard to make. In this work, we present computational modeling of two important processes, hydrodynamics and scour, around munition objects. Other physical processes are also considered in our comprehensive model but are not shown in this talk. To properly model the dynamics of the deforming bed and the motion of the object, an immersed boundary method is implemented in the open-source CFD package OpenFOAM. Fixed-bed and scour cases are simulated and compared with laboratory experiments. Future work on this project will implement the coupling between all the physical processes.
α-induced reaction cross section measurements on 197Au
NASA Astrophysics Data System (ADS)
Szücs, Tamás; Gyürky, György; Halász, Zoltán; Kiss, Gábor Gy.; Fülöp, Zsolt
2018-01-01
The γ-process is responsible for creating the majority of the isotopes of heavier elements on the proton-rich side of the valley of stability. However, γ-process simulations fail to reproduce the measured solar system abundances of these isotopes. The discrepancy may lie in the poorly constrained astrophysical scenarios in which the process takes place, or in insufficiently known nuclear physics input. To improve the latter, α-induced reaction cross section measurements on 197Au were carried out at Atomki. This dataset provides new experimental information that can later be used to validate the theoretical cross section calculations employed in γ-process simulations.
Through-process modelling of texture and anisotropy in AA5182
NASA Astrophysics Data System (ADS)
Crumbach, M.; Neumann, L.; Goerdeler, M.; Aretz, H.; Gottstein, G.; Kopp, R.
2006-07-01
A through-process texture and anisotropy prediction for AA5182 sheet production, from hot rolling through cold rolling and annealing, is reported. Thermo-mechanical process data predicted by the finite element method (FEM) package T-Pack, based on the LARSTRAN software, were fed into a combination of physics-based microstructure models for deformation texture (GIA), work hardening (3IVM), nucleation texture (ReNuc), and recrystallization texture (StaRT). The final simulated sheet texture was fed into an FEM simulation of cup drawing employing a new concept of interactively updated texture-based yield locus predictions. The modelling results for texture development and anisotropy were compared to experimental data. The applicability to other alloys and processes is discussed.
NASA Astrophysics Data System (ADS)
Klejment, Piotr; Kosmala, Alicja; Foltyn, Natalia; Dębski, Wojciech
2017-04-01
The earthquake focus is the point where a rock under external stress starts to fracture. Understanding earthquake nucleation and dynamics thus requires understanding the fracturing of brittle materials. This, however, remains a continuing problem and an enduring challenge for geoscience. In spite of significant progress, we still do not fully understand the failure of rock materials under the extreme stress concentrations found in natural conditions. One reason is that the available information about natural or induced seismic events is still insufficient for a precise description of the physical processes in seismic foci. One possibility for improving this situation is numerical simulation, a powerful tool of contemporary physics. For this purpose we used an advanced implementation of the Discrete Element Method (DEM), which calculates the physical properties of materials represented as assemblies of a great number of mutually interacting particles. We analyze the possibility of using DEM to describe materials during the so-called Brazilian test, a standard method for obtaining the tensile strength of brittle materials. One of the primary reasons for conducting such simulations is to measure macroscopic parameters of the rock sample. We report our efforts to describe the fracturing process during the Brazilian test from a microscopic point of view and to give insight into the physical processes preceding material failure.
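The macroscopic parameter such simulations target follows from the standard Brazilian-test relation, σ_t = 2P/(πDt), for a disc of diameter D and thickness t failing at peak load P. A minimal helper, with illustrative values that are not taken from these simulations:

```python
import math

def brazilian_tensile_strength(peak_load_N, diameter_m, thickness_m):
    """Indirect tensile strength from a Brazilian (diametral compression)
    test: sigma_t = 2 P / (pi * D * t), valid for a linear-elastic disc
    loaded along its diameter."""
    return 2.0 * peak_load_N / (math.pi * diameter_m * thickness_m)

# Example: a 50 mm diameter, 25 mm thick disc failing at 10 kN.
sigma_t = brazilian_tensile_strength(10e3, 0.050, 0.025)
print(f"tensile strength = {sigma_t / 1e6:.2f} MPa")
```

In a DEM setting, P is read off as the peak of the simulated load-displacement curve, which links the microscopic particle interactions to this macroscopic strength value.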
NASA Astrophysics Data System (ADS)
Fang, Y.; Huang, M.; Liu, C.; Li, H.; Leung, L. R.
2013-11-01
Physical and biogeochemical processes regulate soil carbon dynamics and CO2 flux to and from the atmosphere, influencing global climate change. Integration of these processes into Earth system models (e.g., community land models (CLMs)), however, currently faces three major challenges: (1) extensive effort is required to modify model structures and rewrite computer programs to incorporate new or updated processes as new knowledge is generated; (2) the computational cost of simulating biogeochemical processes in land models is prohibitive, owing to large variations in the rates of biogeochemical processes; and (3) various mathematical representations of biogeochemical processes exist to capture different aspects of the fundamental mechanisms, but systematic evaluation of these representations is difficult, if not impossible. To address these challenges, we propose a new computational framework for easily incorporating physical and biogeochemical processes into land models. The framework consists of a new biogeochemical module, the Next Generation BioGeoChemical Module (NGBGC), version 1.0, with a generic algorithm and a reaction database, so that new and updated processes can be incorporated into land models without manually setting up the ordinary differential equations to be solved numerically. The reaction database describes the flow of nutrients through the plants, litter, and soil of terrestrial ecosystems. This framework facilitates effective comparison of biogeochemical cycles in an ecosystem simulated using different conceptual models under the same land modeling framework. The approach was first implemented in CLM and benchmarked against simulations from the original CLM-CN code. A case study was then provided to demonstrate the advantages of using the new approach to incorporate a phosphorus cycle into CLM.
To our knowledge, the phosphorus-incorporated CLM is a new model that can be used to simulate phosphorus limitation on the productivity of terrestrial ecosystems. The method presented here could in theory be applied to simulate biogeochemical cycles in other Earth system models.
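The generic-algorithm idea — processes stored as reaction-database entries from which the ODE right-hand side is assembled automatically — can be sketched as follows. This is a toy illustration of the pattern, not the NGBGC code; the pool names, rate law, and integrator are assumptions.

```python
import numpy as np

def assemble_rhs(reactions):
    """Build the ODE right-hand side dy/dt from a reaction database, so
    new processes are added as data rather than as new code.
    Each reaction: (rate_constant, reactant_indices, {pool: stoichiometry})."""
    def rhs(t, y):
        dydt = np.zeros_like(y)
        for k, reactants, stoich in reactions:
            rate = k
            for i in reactants:
                rate *= y[i]          # simple mass-action kinetics
            for i, coeff in stoich.items():
                dydt[i] += coeff * rate
        return dydt
    return rhs

# Toy pools 0 -> 1 (decomposition) -> 2 (mineralization); swapping a process
# means editing this list, not rewriting the solver.
reactions = [(0.5, [0], {0: -1, 1: +1}),
             (0.2, [1], {1: -1, 2: +1})]
rhs = assemble_rhs(reactions)
y = np.array([1.0, 0.0, 0.0])
dt = 0.01
for _ in range(1000):                  # explicit Euler from t = 0 to t = 10
    y = y + dt * rhs(0.0, y)
```

Because every reaction conserves mass by construction of its stoichiometry, the total over all pools stays constant — a useful sanity check when comparing alternative conceptual models in such a framework.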
NASA Astrophysics Data System (ADS)
Kouznetsova, I.; Gerhard, J. I.; Mao, X.; Barry, D. A.; Robinson, C.; Brovelli, A.; Harkness, M.; Fisher, A.; Mack, E. E.; Payne, J. A.; Dworatzek, S.; Roberts, J.
2008-12-01
A detailed model to simulate trichloroethene (TCE) dechlorination in anaerobic groundwater systems has been developed and implemented through PHAST, a robust and flexible geochemical modeling platform. The approach is comprehensive but retains flexibility such that models of varying complexity can be used to simulate TCE biodegradation in the vicinity of nonaqueous phase liquid (NAPL) source zones. The complete model considers a full suite of biological (e.g., dechlorination, fermentation, sulfate and iron reduction, electron donor competition, toxic inhibition, pH inhibition), physical (e.g., flow and mass transfer) and geochemical processes (e.g., pH modulation, gas formation, mineral interactions). Example simulations with the model demonstrated that the feedback between biological, physical, and geochemical processes is critical. Successful simulation of a thirty-two-month column experiment with site soil, complex groundwater chemistry, and exhibiting both anaerobic dechlorination and endogenous respiration, provided confidence in the modeling approach. A comprehensive suite of batch simulations was then conducted to estimate the sensitivity of predicted TCE degradation to the 36 model input parameters. A local sensitivity analysis was first employed to rank the importance of parameters, revealing that 5 parameters consistently dominated model predictions across a range of performance metrics. A global sensitivity analysis was then performed to evaluate the influence of a variety of full parameter data sets available in the literature. The modeling study was performed as part of the SABRE (Source Area BioREmediation) project, a public/private consortium whose charter is to determine if enhanced anaerobic bioremediation can result in effective and quantifiable treatment of chlorinated solvent DNAPL source areas. 
The modelling conducted has provided valuable insight into the complex interactions between processes in the evolving biogeochemical systems, particularly at the laboratory scale.
NASA Astrophysics Data System (ADS)
Li, Xiaoyi; Gao, Hui; Soteriou, Marios C.
2017-08-01
Atomization of extremely high-viscosity liquids is of interest for many applications in the aerospace, automotive, pharmaceutical, and food industries. While detailed atomization measurements face significant challenges, high-fidelity numerical simulations offer the ability to explore the atomization details comprehensively. In this work, a previously validated high-fidelity first-principles simulation code, HiMIST, is used to simulate high-viscosity liquid jet atomization in crossflow. The code is used to perform a parametric study of the atomization process over a wide range of Ohnesorge numbers (Oh = 0.004-2) and Weber numbers (We = 10-160). Direct comparisons between the present study and previously published low-viscosity jet-in-crossflow results are performed. The effects of viscous damping and retardation on jet penetration, liquid surface instabilities, ligament formation and breakup, and subsequent droplet formation are investigated. Complex variations in near-field and far-field jet penetration with increasing Oh at different We are observed and linked with the underlying jet deformation and breakup physics. A transition in breakup regimes and an increase in droplet size with increasing Oh are observed, mostly consistent with literature reports. The detailed simulations elucidate a distinctive edge-ligament-breakup-dominated process with long-surviving ligaments at higher Oh, as opposed to a two-stage edge-stripping/column-fracture process at lower Oh. The trend of decreasing column deflection with increasing We is reversed as Oh increases. A predominantly unimodal droplet size distribution is predicted at higher Oh, in contrast to the bimodal distribution at lower Oh. Neither Rayleigh-Taylor nor Kelvin-Helmholtz linear stability theory can easily be applied to interpret the distinct edge breakup process, and further study of the underlying physics is needed.
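The two dimensionless groups that span the parametric study are straightforward to evaluate: We = ρv²L/σ compares inertia to surface tension, and Oh = μ/√(ρσL) compares viscosity to inertial and capillary forces. The fluid values below are illustrative only, not the study's conditions.

```python
import math

def weber(rho, v, L, sigma):
    """Weber number: inertia vs. surface tension, We = rho * v**2 * L / sigma."""
    return rho * v ** 2 * L / sigma

def ohnesorge(mu, rho, sigma, L):
    """Ohnesorge number: viscosity vs. inertia and surface tension,
    Oh = mu / sqrt(rho * sigma * L)."""
    return mu / math.sqrt(rho * sigma * L)

# Water-like jet: 1 mm diameter at 10 m/s (illustrative properties).
rho, mu, sigma, d, v = 1000.0, 1.0e-3, 0.072, 1.0e-3, 10.0
print(f"We = {weber(rho, v, d, sigma):.0f}, Oh = {ohnesorge(mu, rho, sigma, d):.4f}")
```

Note that Oh is independent of velocity, which is why viscosity (through Oh) and injection conditions (through We) can be varied as separate axes of the parameter sweep.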
Simulation based analysis of laser beam brazing
NASA Astrophysics Data System (ADS)
Dobler, Michael; Wiethop, Philipp; Schmid, Daniel; Schmidt, Michael
2016-03-01
Laser beam brazing is a well-established joining technology in car body manufacturing, with main applications in the joining of divided tailgates and of roof and side panels. A key advantage of laser-brazed joints is the seam's visual quality, which satisfies the highest requirements. However, the laser beam brazing process is very complex, and the process dynamics are only partially understood. In order to gain deeper knowledge of the process, to determine optimal process parameters and to test process variants, a transient three-dimensional simulation model of laser beam brazing is developed. The model takes into account energy input, heat transfer, and the fluid and wetting dynamics that lead to the formation of the brazing seam. The simulation model is validated by metallographic analysis and thermocouple measurements for different parameter sets of the brazing process. The results show that the multi-physical simulation model can not only be used to gain insight into the laser brazing process but also offers the possibility of process optimization in industrial applications. The model's capability to determine optimal process parameters is demonstrated for the laser power: small deviations in the energy input can affect the brazing results significantly. The simulation model is therefore used to analyze the effect of the lateral laser beam position on the energy input and the resulting brazing seam.
WASTE CONDITIONING FOR TANK HEEL TRANSFER
DOE Office of Scientific and Technical Information (OSTI.GOV)
M.A. Ebadian, Ph.D.
1999-01-01
This report summarizes the research carried out at Florida International University's Hemispheric Center for Environmental Technology (FIU-HCET) for the fiscal year 1998 (FY98) under the Tank Focus Area (TFA) project ''Waste Conditioning for Tank Slurry Transfer.'' The objective of this project is to determine the effect of chemical and physical properties on the waste conditioning process and transfer. The research focused on building a waste conditioning experimental facility to test different slurry simulants under different conditions, and on analyzing their chemical and physical properties. This investigation provides experimental data and analysis results that can make the tank waste conditioning process more efficient, improve the transfer system, and inform future modifications to the waste conditioning and transfer system. A waste conditioning experimental facility was built in order to test slurry simulants. The facility consists of a slurry vessel with several accessories for parameter control and sampling. The vessel also has a lid system with a shaft-mounted propeller connected to an air motor. In addition, a circulation system is connected to the slurry vessel for simulant cooling and heating. Emphasis was placed on experimental data collection and analysis of the chemical and physical properties of the tank slurry simulants. For this, one waste slurry simulant (Fernald) was developed, and another two simulants (SRS and Hanford) obtained from DOE sites were used. These simulants, composed of water, soluble metal salts, and insoluble solid particles, represent the actual radioactive waste slurries from different DOE sites. The chemical and physical properties analyzed include density, viscosity, pH, settling rate, and solubility. These analyses were performed on samples obtained from different experiments conducted at room temperature but with different mixing times and intensities.
The experimental results indicate that the viscosity of the slurries follows the Bingham plastic model, especially as the solids concentration is increased; at low concentrations the slurries may behave as Newtonian fluids. The three simulants follow a similar settling-rate behavior, which can be described as a combination of one or more decaying exponential curves: the particle settling rate of the simulants decreases exponentially with time. The pH range for the three simulants was from 8 to 13 at all concentrations. The SRS simulant showed the highest pH, around 12; the other two simulants, Hanford and Fernald, had about the same pH range, from 3 to 9. When comparing the solubility of the three simulants at the same concentration, the SRS simulant showed the highest solubility, followed by the Hanford and Fernald simulants, in that order. Further work is scheduled for next year (FY99), when other parameters such as simulant particle size distribution, particle shape, and crystallization behavior will be studied. The same tests performed this period will also be repeated at different temperatures for data comparison.
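The Bingham plastic model referred to above, τ = τ₀ + μₚ·γ̇, is linear in the shear rate, so its two parameters (yield stress τ₀ and plastic viscosity μₚ) can be recovered from rheometer data by linear least squares. The sketch below uses synthetic data, not the report's measurements.

```python
import numpy as np

def fit_bingham(shear_rate, shear_stress):
    """Least-squares fit of the Bingham plastic model
    tau = tau0 + mu_p * gamma_dot, returning (tau0, mu_p)."""
    A = np.vstack([np.ones_like(shear_rate), shear_rate]).T
    (tau0, mu_p), *_ = np.linalg.lstsq(A, shear_stress, rcond=None)
    return tau0, mu_p

# Synthetic rheometer data generated from tau0 = 5 Pa, mu_p = 0.1 Pa·s.
gd = np.array([10.0, 20.0, 50.0, 100.0, 200.0])   # shear rates, 1/s
tau = 5.0 + 0.1 * gd                               # shear stresses, Pa
tau0, mu_p = fit_bingham(gd, tau)
```

A fitted τ₀ near zero indicates Newtonian behavior, which matches the report's observation that the low-concentration slurries may behave as Newtonian fluids.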
NASA Astrophysics Data System (ADS)
Nyckowiak, Jedrzej; Lesny, Jacek; Haas, Edwin; Juszczak, Radoslaw; Kiese, Ralf; Butterbach-Bahl, Klaus; Olejnik, Janusz
2014-05-01
Modeling nitrous oxide emissions from soil is very complex. Many different biological and chemical processes take place in soils and determine the amount of nitrous oxide emitted. Additionally, biogeochemical models contain many detailed factors which may determine fluxes and other simulated variables. We used the LandscapeDNDC model to simulate N2O emissions, crop yields and soil physical properties for mineral cultivated soils in Poland. Nitrous oxide emissions were modeled for fields with winter wheat, winter rye, spring barley, triticale, potatoes and alfalfa. Simulations were carried out for the plots of the Brody arable experimental station of Poznan University of Life Science in western Poland and covered the period 2003-2012. The model's accuracy and efficiency were determined by comparing simulation results with measurements of nitrous oxide emissions (obtained with static chambers) from about 40 field campaigns. Because N2O emissions depend strongly on temperature and soil water content, we also compared simulated soil temperature and soil water content at 10 cm depth with the daily measured values of these driving variables. We also compared simulated yield quantities for each experimental plot with the yields measured over the period 2003-2012. We conclude that the LandscapeDNDC model is capable of simulating soil N2O emissions, crop yields and soil physical properties with satisfactory accuracy and efficiency.
NASA Astrophysics Data System (ADS)
Liu, C.; Yang, X.; Bailey, V. L.; Bond-Lamberty, B. P.; Hinkle, C.
2013-12-01
Mathematical representations of hydrological and biogeochemical processes in soil, plant, aquatic, and atmospheric systems vary with scale. Process-rich models are typically used to describe hydrological and biogeochemical processes at the pore and small scales, while empirical, correlation-based approaches are often used at the watershed and regional scales. A major challenge for multi-scale modeling is that water flow, biogeochemical processes, and reactive transport are described by different physical laws and/or expressions at different scales. For example, flow is governed by the Navier-Stokes equations at the pore scale in soils, by Darcy's law in soil columns and aquifers, and by the Navier-Stokes equations again in open water bodies (ponds, lakes, rivers) and in the atmospheric surface layer. This research explores whether the physical laws at different scales and in different physical domains can be unified into a unified multi-scale model (UMSM) to systematically investigate the cross-scale, cross-domain behavior of fundamental processes. This presentation will discuss our research on the concept, mathematical equations, and numerical execution of the UMSM. Three-dimensional, multi-scale hydrological processes at the Disney Wilderness Preservation (DWP) site, Florida, are used as an example application of the UMSM. In this research, the UMSM was used to simulate hydrological processes in rooting zones at the pore and small scales, including water migration in soils under saturated and unsaturated conditions, root-induced hydrological redistribution, and the role of rooting-zone biogeochemical properties (e.g., root exudates and microbial mucilage) in water storage and wetting/draining. The small-scale simulation results were used to estimate effective water retention properties in soil columns, which were superimposed on the bulk soil water retention properties at the DWP site.
The UMSM, parameterized from the smaller-scale simulations, was then used to simulate coupled flow and moisture migration in saturated and unsaturated soils, surface water-groundwater exchange, and surface water flow in streams and lakes at the DWP site under dynamic precipitation conditions. Laboratory measurements of soil hydrological and biogeochemical properties are used to parameterize the UMSM at the small scales, and field measurements are used to evaluate it.
NASA Astrophysics Data System (ADS)
Blázquez, M.; Egizabal, A.; Unzueta, I.
2014-08-01
The LIFE+ project SIRENA (Simulation of the release of nanomaterials from consumer products for environmental exposure assessment, LIFE11 ENV/ES/596) has set up a Technological Surveillance System (TSS) to trace technical references worldwide related to nanocomposites and to release from nanocomposites. So far a total of seventy-three items of different natures (from peer-reviewed articles to presentations and contributions to congresses) have been selected and classified as "nanomaterials release simulation technologies". In the present document, different approaches for simulating different life cycle stages through the physical degradation of polymer nanocomposites at laboratory scale are assessed. In the absence of a reference methodology, comparison of the different protocols used remains a challenge.
Badal, Andreu; Badano, Aldo
2009-11-01
Monte Carlo simulations of radiation transport are computationally intensive and may require long computing times. The authors introduce a new paradigm for the acceleration of Monte Carlo simulations: the use of a graphics processing unit (GPU) as the main computing device instead of a central processing unit (CPU). A GPU-based Monte Carlo code that simulates photon transport in a voxelized geometry with the accurate physics models from PENELOPE has been developed using the CUDA™ programming model (NVIDIA Corporation, Santa Clara, CA). An outline of the new code and a sample x-ray imaging simulation with an anthropomorphic phantom are presented. A remarkable 27-fold speedup was obtained using a GPU compared to a single-core CPU. The reported results show that GPUs are currently a good alternative to CPUs for the simulation of radiation transport. Since GPU performance is currently increasing at a faster pace than CPU performance, the advantages of GPU-based software are likely to become more pronounced in the future.
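In its simplest analog form, the photon-transport sampling that such codes accelerate reduces to drawing exponential free paths between interactions. The toy CPU sketch below, for a purely absorbing slab, is an assumption-laden illustration of the principle, not the PENELOPE physics or the authors' GPU code; each photon history is independent, which is exactly what makes the workload map well onto GPU threads.

```python
import math
import random

def photon_transmission(mu_per_cm, thickness_cm, n_photons=100_000, seed=42):
    """Analog Monte Carlo estimate of the fraction of photons crossing a
    purely absorbing slab; the distance to the first interaction is
    sampled from an exponential distribution with rate mu."""
    rng = random.Random(seed)
    transmitted = 0
    for _ in range(n_photons):
        path = rng.expovariate(mu_per_cm)   # free path to first interaction
        if path > thickness_cm:
            transmitted += 1                # photon exits without interacting
    return transmitted / n_photons

est = photon_transmission(mu_per_cm=0.2, thickness_cm=5.0)
analytic = math.exp(-0.2 * 5.0)   # Beer-Lambert: exp(-mu * x)
```

Real codes add scattering, energy loss, and secondary particles to each history, but the embarrassingly parallel structure is unchanged.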
Statistically Modeling I-V Characteristics of CNT-FET with LASSO
NASA Astrophysics Data System (ADS)
Ma, Dongsheng; Ye, Zuochang; Wang, Yan
2017-08-01
With the advent of the Internet of Things (IoT), the need to study new materials and devices for various applications is increasing. Traditionally, compact models for transistors are built on the basis of physics, but physical models are expensive to develop and need a long time to be adjusted for non-ideal effects. When the application of a novel device is not yet certain, or its manufacturing process is not mature, deriving generalized, accurate physical models is arduous; statistical modeling is a promising alternative because of its data-driven nature and fast implementation. In this paper, a classical statistical regression method, LASSO, is used to model the I-V characteristics of a CNT-FET, and a pseudo-PMOS inverter simulation based on the trained model is implemented in Cadence. The normalized relative mean square prediction error of the trained model against experimental sample data, together with the simulation results, shows that the model is acceptable for static digital circuit simulation. Such a modeling methodology can be extended to other devices.
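LASSO itself is standard: it minimizes a squared-error loss plus an L1 penalty that drives unimportant coefficients exactly to zero. A minimal cyclic coordinate-descent implementation on synthetic sparse data (not the paper's CNT-FET measurements or feature basis) illustrates the soft-thresholding update:

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """LASSO by cyclic coordinate descent: minimizes
    (1/2n)||y - Xw||^2 + lam * ||w||_1 using soft-thresholding."""
    n, p = X.shape
    w = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ w + X[:, j] * w[j]      # residual excluding feature j
            rho = X[:, j] @ r / n
            # Soft-threshold: small correlations are zeroed out entirely.
            w[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return w

# Synthetic sparse regression: only 2 of 6 candidate features matter.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 6))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1]
w = lasso_cd(X, y, lam=0.1)
```

For device modeling, the columns of X would be basis functions of the terminal voltages (e.g., polynomial terms), and the sparsity of w is what keeps the fitted I-V model compact enough for circuit simulation.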
Louis R. Iverson; Frank R. Thompson; Stephen Matthews; Matthew Peters; Anantha Prasad; William D. Dijak; Jacob Fraser; Wen J. Wang; Brice Hanberry; Hong He; Maria Janowiak; Patricia Butler; Leslie Brandt; Chris Swanston
2016-01-01
Context. Species distribution models (SDM) establish statistical relationships between the current distribution of species and key attributes whereas process-based models simulate ecosystem and tree species dynamics based on representations of physical and biological processes. TreeAtlas, which uses DISTRIB SDM, and Linkages and LANDIS PRO, process...
A physics based method for combining multiple anatomy models with application to medical simulation.
Zhu, Yanong; Magee, Derek; Ratnalingam, Rishya; Kessel, David
2009-01-01
We present a physics-based approach to the construction of anatomy models by combining components from different sources: different image modalities, protocols, and patients. Given an initial anatomy, a mass-spring model is generated which mimics the physical properties of the solid anatomy components. This helps maintain valid spatial relationships between the components, as well as the validity of their shapes. A combination step either replaces or modifies an existing component, or inserts a new one. The external forces that deform the model components to fit the new shape are estimated from Gradient Vector Flow and Distance Transform maps. We demonstrate the applicability and validity of the described approach in the area of medical simulation by showing the processes of non-rigid surface alignment, component replacement, and component insertion.
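A mass-spring update of the kind described — Hookean spring forces preserving shape, with damping for stability — can be sketched as one explicit integration step. The parameterization below (unit masses, explicit Euler, scalar stiffness) is an assumption for illustration, not the authors' model, which also adds the externally estimated deformation forces.

```python
import numpy as np

def mass_spring_step(pos, vel, edges, rest_len, k, dt, damping=0.1):
    """One explicit-Euler step of a mass-spring system with unit masses.

    pos, vel : (n, 3) arrays of node positions and velocities
    edges    : list of (i, j) node-index pairs connected by springs
    rest_len : rest length per edge; k : spring stiffness
    """
    force = -damping * vel                    # simple velocity damping
    for (i, j), L0 in zip(edges, rest_len):
        d = pos[j] - pos[i]
        dist = np.linalg.norm(d)
        if dist == 0.0:
            continue
        # Hooke's law along the spring direction: restores the rest length.
        f = k * (dist - L0) * d / dist
        force[i] += f
        force[j] -= f
    vel = vel + dt * force
    pos = pos + dt * vel
    return pos, vel

# Two nodes 2 m apart with rest length 1 m are pulled toward each other.
pos = np.array([[0.0, 0.0, 0.0], [2.0, 0.0, 0.0]])
vel = np.zeros((2, 3))
pos, vel = mass_spring_step(pos, vel, [(0, 1)], [1.0], k=10.0, dt=0.01)
```

In the paper's setting, the external GVF- and distance-transform-derived forces would be added into `force` before integration, so the springs act as the regularizer that keeps component shapes and spatial relationships valid.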
WEPP Model applications for evaluations of best management practices
D. C. Flanagan; W. J. Elliott; J. R. Frankenberger; C. Huang
2010-01-01
The Water Erosion Prediction Project (WEPP) model is a process-based erosion prediction technology for application to small watersheds and hillslope profiles, under agricultural, forested, rangeland, and other land management conditions. Developed by the United States Department of Agriculture (USDA) over the past 25 years, WEPP simulates many of the physical processes...
Advances in Integrated Computational Materials Engineering "ICME"
NASA Astrophysics Data System (ADS)
Hirsch, Jürgen
The methods of Integrated Computational Materials Engineering that were developed and successfully applied for aluminium have been constantly improved. The main aspects and recent advances of integrated material and process modeling are simulations of material properties, such as strength and formability, and of the specific microstructure evolution during processing (rolling, extrusion, annealing) under the influence of material constitution and process variations, through the production chain down to the final application. Examples are discussed for the through-process simulation of microstructures and related properties of aluminium sheet, including DC ingot casting, pre-heating and homogenization, hot and cold rolling, and final annealing. New results are included on the simulation of solution annealing and age hardening of 6xxx alloys for automotive applications. Physically based quantitative descriptions and computer-assisted evaluation methods are new ICME means of integrating simulation tools into customer applications as well, for example for heat-affected zones in the welding of age-hardening alloys. Estimating the effect of specific elements introduced by growing recycling volumes, required also for high-end aluminium products, is likewise discussed, being of special interest to the aluminium-producing industries.
Thermomechanical Simulation of the Splashing of Ceramic Droplets on a Rigid Substrate
NASA Astrophysics Data System (ADS)
Bertagnolli, Mauro; Marchese, Maurizio; Jacucci, Gianni; St. Doltsinis, Ioannis; Noelting, Swen
1997-05-01
Finite element simulation techniques have been applied to the spreading of single ceramic liquid droplets impacting on a flat cold surface under plasma-spraying conditions. The goal of the present investigation is to predict the geometrical form of the splat as a function of technological process parameters, such as initial temperature and velocity, and to follow the thermal field developing in the droplet up to solidification. A non-linear finite element programming system has been used to model the complex physical phenomena involved in the impact process. The Lagrangean description of the motion of the viscous melt in the drops, constrained by surface tension and the developing contact with the target, has been coupled to an analysis of transient thermal phenomena that also accounts for solidification of the material. The study covers a parameter spectrum drawn from experimental data of technological relevance. The significance of the process parameters for the most pronounced physical phenomena is discussed, as are the consequences of the modelling choices; the issue of solidification and the effect of partially unmelted material are also considered.
Experiments on Dust Grain Charging
NASA Technical Reports Server (NTRS)
Abbas, M. N.; Craven, P. D.; Spann, J. F.; Tankosic, D.; LeClair, A.; West, E. A.
2004-01-01
Dust particles in various astrophysical environments are charged by a variety of mechanisms, generally involving collisional processes with other charged particles and photoelectric emission induced by UV radiation from nearby sources. The sign and magnitude of the particle charge are determined by the competition between charging by UV radiation and charging by collisions with charged particles. Knowledge of the particle charges and equilibrium potentials is important for understanding a number of physical processes. The charge of a dust grain is thus a fundamental parameter that influences the physics of dusty plasmas, processes in the interplanetary and interstellar media, interstellar dust clouds, planetary rings, cometary environments, and the outer atmospheres of planets. In this paper we present some results of experiments on the charging of dust grains, carried out in a laboratory facility capable of levitating micron-size dust grains in an electrodynamic balance in simulated space environments. The charging/discharging experiments were carried out by exposing the dust grains to energetic electron beams and UV radiation. Photoelectric efficiencies and yields of micron-size dust grains of SiO2 and lunar simulants obtained from NASA-JSC will be presented.
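The equilibrium potential described above follows from a current balance on the grain. A minimal sketch of that balance, assuming a simplified orbital-motion-limited-style model (negatively charged grain, Boltzmann-suppressed electron collection, constant photoemission, ion collection neglected) with purely illustrative current magnitudes:

```python
import math

def net_current(phi, Te_eV=2.0, Ie0=1.0, Iph=0.1):
    """Net current to a grain at surface potential phi (volts).

    For a negative grain, electron collection is Boltzmann-suppressed
    while all photoelectrons escape.  Currents are in arbitrary units.
    """
    Ie = -Ie0 * math.exp(phi / Te_eV)   # collected electrons (negative charge)
    return Ie + Iph                     # photoemission adds positive current

def equilibrium_potential(lo=-20.0, hi=0.0, **kw):
    """Bisection for the potential at which the net current vanishes."""
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if net_current(lo, **kw) * net_current(mid, **kw) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

phi_eq = equilibrium_potential()
# Analytic check for this model: phi = Te * ln(Iph / Ie0)
```

With photoemission weaker than electron collection, the grain settles at a negative potential; in this simplified model the bisection result matches the closed form phi = Te ln(Iph/Ie0).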
NASA Astrophysics Data System (ADS)
Nellist, C.; Dinu, N.; Gkougkousis, E.; Lounis, A.
2015-06-01
The LHC accelerator complex will be upgraded between 2020 and 2022 to the High-Luminosity LHC, considerably increasing the statistics available for the various physics analyses. To operate under these challenging new conditions and maintain excellent performance in track reconstruction and vertex location, the ATLAS pixel detector must be substantially upgraded, and a full replacement is expected. Processing techniques for novel pixel designs are optimised through the characterisation of test structures in a clean room and through simulations with Technology Computer-Aided Design (TCAD). A method for studying non-perpendicular tracks through a pixel device is discussed. A comparison of TCAD simulations with Secondary Ion Mass Spectrometry (SIMS) measurements, used to investigate the doping profiles of the structures and validate the simulation process, is also presented.
Numerical simulation of rock fragmentation during cutting by conical picks under confining pressure
NASA Astrophysics Data System (ADS)
Li, Xuefeng; Wang, Shibo; Ge, Shirong; Malekian, Reza; Li, Zhixiong
2017-12-01
In this article, the effect of confining pressure on the rock fragmentation process during cutting was investigated by numerical simulation with the discrete element method (DEM). Four kinds of sandstone with different physical properties were simulated in rock cutting models under different confining pressures. The rock fragmentation process, the cutting force, and the specific energy under different confining pressures were analyzed. With increasing confining pressure and rock strength, the vertical propagation of cracks was restrained. The rock samples were compacted and strengthened by the confining pressure, resulting in an increase in the cutting force. The specific energy of rock cutting increased linearly with the confining pressure ratio.
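Specific energy here is the standard cutting metric: the work done by the cutting force per unit volume of rock removed. A minimal sketch with illustrative numbers (not values from the study):

```python
def specific_energy(mean_cutting_force_N, cutting_distance_m, volume_removed_m3):
    """Specific energy (J/m^3): work done by the mean cutting force
    over the cut, divided by the volume of rock removed."""
    return mean_cutting_force_N * cutting_distance_m / volume_removed_m3

# Illustrative: a 5 kN mean force over a 1 m cut removing 2.5e-3 m^3
# of rock gives a specific energy of 2 MJ/m^3.
se = specific_energy(5e3, 1.0, 2.5e-3)
```

A higher confining pressure raises the cutting force without a proportional gain in removed volume, which is why the study observes the specific energy climbing with the confining pressure ratio.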
NASA Astrophysics Data System (ADS)
Nassiri, Ali; Vivek, Anupam; Abke, Tim; Liu, Bert; Lee, Taeseon; Daehn, Glenn
2017-06-01
Numerical simulations of high-velocity impact welding are extremely challenging due to the coupled physics and highly dynamic nature of the process. Thus, conventional mesh-based numerical methodologies are not able to accurately model the process owing to the excessive mesh distortion close to the interface of two welded materials. A simulation platform was developed using smoothed particle hydrodynamics, implemented in a parallel architecture on a supercomputer. Then, the numerical simulations were compared to experimental tests conducted by vaporizing foil actuator welding. The close correspondence of the experiment and modeling in terms of interface characteristics allows the prediction of local temperature and strain distributions, which are not easily measured.
Multiscale simulation of molecular processes in cellular environments.
Chiricotto, Mara; Sterpone, Fabio; Derreumaux, Philippe; Melchionna, Simone
2016-11-13
We describe recent advances in studying biological systems via multiscale simulations. Our scheme is based on a coarse-grained representation of the macromolecules and a mesoscopic description of the solvent. The dual technique handles the particles, the aqueous solvent, and their mutual exchange of forces, resulting in a stable and accurate methodology that allows biosystems of unprecedented size to be simulated. This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'. © 2016 The Author(s).
Low-Frequency Waves in HF Heating of the Ionosphere
NASA Astrophysics Data System (ADS)
Sharma, A. S.; Eliasson, B.; Milikh, G. M.; Najmi, A.; Papadopoulos, K.; Shao, X.; Vartanyan, A.
2016-02-01
Ionospheric heating experiments have enabled an exploration of the ionosphere as a large-scale natural laboratory for the study of many plasma processes. These experiments inject high-frequency (HF) radio waves using high-power transmitters and an array of ground- and space-based diagnostics. This chapter discusses the excitation and propagation of low-frequency waves in HF heating of the ionosphere. The theoretical aspects and the associated models and simulations, and the results from experiments, mostly from the HAARP facility, are presented together to provide a comprehensive interpretation of the relevant plasma processes. The chapter presents the plasma model of the ionosphere for describing the physical processes during HF heating, the numerical code, and the simulations of the excitation of low-frequency waves by HF heating. It then gives the simulations of the high-latitude ionosphere and mid-latitude ionosphere. The chapter also briefly discusses the role of kinetic processes associated with wave generation.
Multi-scale Modeling of Arctic Clouds
NASA Astrophysics Data System (ADS)
Hillman, B. R.; Roesler, E. L.; Dexheimer, D.
2017-12-01
The presence and properties of clouds are critically important to the radiative budget in the Arctic, but clouds are notoriously difficult to represent in global climate models (GCMs). The challenge stems partly from a disconnect between the scales at which these models are formulated and the scales of the physical processes important to the formation of clouds (e.g., convection and turbulence). Because of this, these processes are parameterized in large-scale models. Over the past decades, new approaches have been explored in which a cloud-system-resolving model (CSRM), or in the extreme a large eddy simulation (LES), is embedded in each gridcell of a traditional GCM to replace the cloud and convective parameterizations and explicitly simulate more of these important processes. This approach is attractive in that it allows more explicit simulation of small-scale processes while also allowing interaction between the small and large scales. The goal of this study is to quantify the performance of this framework in simulating Arctic clouds relative to a traditional global model, and to explore the limitations of such a framework using coordinated high-resolution (eddy-resolving) simulations. Simulations from the global model are compared with satellite retrievals of cloud fraction partitioned by cloud phase from CALIPSO, and limited-area LES simulations are compared with ground-based and tethered-balloon measurements from the ARM Barrow and Oliktok Point measurement facilities.
Evaluation of tocopherol recovery through simulation of molecular distillation process.
Moraes, E B; Batistella, C B; Alvarez, M E Torres; Filho, Rubens Maciel; Maciel, M R Wolf
2004-01-01
The DISMOL simulator was used to determine the best possible operating conditions to guide future experimental work. This simulator requires several physical-chemical properties that are often very difficult to determine because of the complexity of the components involved. They must therefore be obtained through correlations and/or predictive methods in order to characterize the system and perform the calculations. The first step is to obtain simulation results for a system that can later be validated with experimental data. Implementing the necessary parameters of complex systems in the simulator is a difficult task. In this work, we aimed to determine these properties in order to evaluate tocopherol (vitamin E) recovery using the DISMOL simulator. The raw material used was the crude deodorizer distillate of soya oil. With this procedure, it is possible to determine the best operating conditions for experimental work and to evaluate the process for the separation of new systems by analyzing the profiles obtained from these simulations for the falling-film molecular distillator.
Study on the CFD simulation of refrigerated container
NASA Astrophysics Data System (ADS)
Arif Budiyanto, Muhammad; Shinoda, Takeshi; Nasruddin
2017-10-01
The objective of this study is to perform a Computational Fluid Dynamics (CFD) simulation of a refrigerated container in a container port. A refrigerated container is a thermal cargo container constructed with insulated walls to carry perishable goods. The CFD simulation was carried out on a cross-section of the container walls to predict the surface temperatures of the refrigerated container and to estimate its cooling load. The simulation model is based on the solution of the partial differential equations governing the fluid flow and heat transfer processes. The physical heat-transfer processes considered in this simulation are solar radiation from the sun, heat conduction in the container walls, heat convection at the container surfaces, and thermal radiation among the solid surfaces. The simulation model was validated using surface temperatures at the center points of each container wall, obtained from measurements in a previous study. The results show that the surface temperatures of the simulation model are in good agreement with the measurement data on all container walls.
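The heat-transfer processes listed above combine into a single steady-state energy balance at the outer wall surface: absorbed solar radiation plus convection from ambient air must equal thermal radiation to the surroundings plus conduction through the insulated wall. A minimal sketch of that balance, with illustrative coefficients and temperatures rather than values from the study:

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def surface_balance(Ts, G=600.0, alpha=0.7, h=10.0, Ta=303.0,
                    eps=0.9, Tsky=283.0, U=0.4, Tin=253.0):
    """Net heat flux (W/m^2) at the outer wall surface: absorbed solar
    plus convection from ambient air, minus thermal radiation to the
    sky and conduction through the insulated wall to the cargo space.
    All parameter values are illustrative assumptions."""
    return (alpha * G + h * (Ta - Ts)
            - eps * SIGMA * (Ts**4 - Tsky**4)
            - U * (Ts - Tin))

def solve_surface_temp(lo=250.0, hi=400.0):
    """Bisection for the surface temperature (K) with zero net flux."""
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if surface_balance(lo) * surface_balance(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

Ts = solve_surface_temp()  # outer surface temperature, K
```

Under these assumed conditions the sunlit surface settles near 321 K, well above the 303 K ambient air, which is what drives the cooling load the study estimates.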
NASA Technical Reports Server (NTRS)
Salas, Manuel D.
2007-01-01
The research program of the aerodynamics, aerothermodynamics and plasmadynamics discipline of NASA's Hypersonic Project is reviewed. Details are provided for each of its three components: 1) development of physics-based models of non-equilibrium chemistry, surface catalytic effects, turbulence, transition and radiation; 2) development of advanced simulation tools to enable increased spatial and time accuracy, increased geometrical complexity, grid adaptation, increased physical-processes complexity, uncertainty quantification and error control; and 3) establishment of experimental databases from ground and flight experiments to develop better understanding of high-speed flows and to provide data to validate and guide the development of simulation tools.
Comparison between Magnetopause and Magnetotail Reconnection Processes
NASA Astrophysics Data System (ADS)
Walker, R. J.; Lapenta, G.; Berchem, J.; El-Alaoui, M.
2017-12-01
For the past two years the Magnetospheric Multiscale (MMS) mission has returned detailed observations of reconnection at Earth's dayside magnetopause, and its apogee has now moved into the magnetotail to enable investigations of reconnection in the plasma sheet. We have been using a combination of global magnetohydrodynamic (MHD) simulation and particle-in-cell (PIC) simulation to model the physics of the reconnection process in both regions. In these calculations, we first use the MHD simulation to model the overall magnetospheric configuration and then carry out a large implicit PIC simulation, using the resulting MHD state to set the initial and boundary conditions. In this presentation, we review the similarities and differences found between the physical processes involved in reconnection in the two regions. For instance, similar crescent-shaped distribution functions have been both observed and found in simulations of reconnection at the magnetopause and in the tail current sheet. Likewise, kinetic simulations have shown that the agyrotropy (non-gyrotropy) of the electron distribution function is the cleanest indicator of the location of the electron diffusion region (EDR) in both regions. There are also significant differences between the two regions. These are mostly related to the fact that the separatrices differ, because the plasma density is asymmetric across the dayside magnetopause and smaller electric and guide fields are present on the night side. For instance, the jetting plasmas from reconnection in the tail form dipolarization fronts where energy exchange occurs, while flux transfer events (flux ropes) form on the magnetopause and then move away from the reconnection site without forming dipolarization fronts. However, many uncertainties remain. For example, strong waves associated with the reconnection are found in the EDR at both places, but it is not understood whether the kinetic mechanisms leading to the waves are the same or different.
Opticks: GPU Optical Photon Simulation for Particle Physics using NVIDIA® OptiX™
NASA Astrophysics Data System (ADS)
Blyth, Simon C.
2017-10-01
Opticks is an open source project that integrates the NVIDIA OptiX GPU ray tracing engine with Geant4 toolkit based simulations. Massive parallelism brings drastic performance improvements, with optical photon simulation speedup expected to exceed 1000 times Geant4 when using workstation GPUs. Optical photon simulation time becomes effectively zero compared to the rest of the simulation. Optical photons from scintillation and Cherenkov processes are allocated, generated and propagated entirely on the GPU, minimizing transfer overheads and allowing CPU memory usage to be restricted to optical photons that hit photomultiplier tubes or other photon detectors. Collecting hits into standard Geant4 hit collections then allows the rest of the simulation chain to proceed unmodified. Optical physics processes of scattering, absorption, scintillator reemission and boundary processes are implemented in CUDA OptiX programs based on the Geant4 implementations. Wavelength-dependent material and surface properties as well as inverse cumulative distribution functions for reemission are interleaved into GPU textures, providing fast interpolated property lookup or wavelength generation. Geometry is provided to OptiX in the form of CUDA programs that return bounding boxes for each primitive and ray geometry intersection positions. Some critical parts of the geometry, such as photomultiplier tubes, have been implemented analytically, with the remainder being tessellated. OptiX handles the creation and application of a choice of acceleration structures, such as bounding volume hierarchies, and the transparent use of multiple GPUs. OptiX supports interoperation with OpenGL and CUDA Thrust, which has enabled unprecedented visualisations of photon propagations to be developed, using OpenGL geometry shaders to provide interactive time scrubbing and CUDA Thrust photon indexing to enable interactive history selection.
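The wavelength-generation step described above is ordinary inverse-CDF sampling: the cumulative distribution of the reemission spectrum is tabulated once, and each photon then needs only a single interpolated lookup, which is why the tables map naturally onto GPU textures. A minimal CPU sketch with a hypothetical spectrum (not data from Opticks):

```python
import numpy as np

# Tabulated (hypothetical) reemission spectrum on a wavelength grid.
wavelengths = np.linspace(380.0, 550.0, 200)             # nm
spectrum = np.exp(-0.5 * ((wavelengths - 430.0) / 25.0) ** 2)

# Build the inverse CDF once; sampling then costs one interpolated
# lookup per photon, which is what makes a GPU texture a good fit.
cdf = np.cumsum(spectrum)
cdf = (cdf - cdf[0]) / (cdf[-1] - cdf[0])                # normalise to [0, 1]

rng = np.random.default_rng(0)
u = rng.random(100_000)                                  # uniform deviates
sampled = np.interp(u, cdf, wavelengths)                 # inverse-CDF lookup
```

Each uniform deviate maps through the tabulated inverse CDF to a wavelength drawn from the spectrum, so the sampled distribution peaks near the spectrum's maximum.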
NASA Astrophysics Data System (ADS)
Yao, Zhixiong; Tang, Youmin; Chen, Dake; Zhou, Lei; Li, Xiaojing; Lian, Tao; Ul Islam, Siraj
2016-12-01
This study examines the possible impacts of coupling processes on simulations of the Indian Ocean Dipole (IOD). Emphasis is placed on the atmospheric model resolution and physics. Five experiments were conducted for this purpose, including one control run of the ocean-only model and four coupled experiments using two different versions of the Community Atmosphere Model (CAM4 and CAM5) at two different resolutions. The results show that the control run could effectively simulate various features of the IOD. The coupled experiments run at the higher resolution yielded a more realistic IOD period and intensity than their counterparts at the lower resolution. The coupled experiments using CAM5 generally showed better simulation skill for the tropical Indian Ocean SST climatology and phase-locking than those using CAM4, but the wind anomalies were stronger and the IOD period was longer in the former experiments than in the latter. In all coupled experiments, the IOD intensity was much stronger than the observed intensity, which is attributable to the wind-thermocline depth feedback and the thermocline depth-subsurface temperature feedback. The CAM5 physics seems beneficial for the simulation of summer rainfall over the eastern equatorial Indian Ocean, and the CAM4 physics tends to produce smaller biases over the western equatorial Indian Ocean, whereas the higher resolution tends to generate unrealistically strong meridional winds. The IOD-ENSO relationship was captured reasonably well in the coupled experiments, with improvements in CAM5 relative to CAM4. However, the teleconnections of the IOD-Indian summer monsoon and ENSO-Indian summer monsoon were not realistically simulated in any of the experiments.
NASA Technical Reports Server (NTRS)
Denning, Peter J.
1990-01-01
Strong artificial intelligence claims that conscious thought can arise in computers containing the right algorithms, even though none of the programs or components of those computers understands what is going on. As proof, it asserts that brains are finite webs of neurons, each with a definite function governed by the laws of physics; this web has a set of equations that can be solved (or simulated) by a sufficiently powerful computer. Strong AI claims the Turing test as a criterion of success. A recent debate in Scientific American concludes that the Turing test is not sufficient, but leaves intact the underlying premise that thought is a computable process. The recent book by Roger Penrose, however, offers a sharp challenge, arguing that the laws of quantum physics may govern mental processes and that these laws may not be computable. In every area of mathematics and physics, Penrose finds evidence of nonalgorithmic human activity and concludes that mental processes are inherently more powerful than computational processes.
Improving Permafrost Hydrology Prediction Through Data-Model Integration
NASA Astrophysics Data System (ADS)
Wilson, C. J.; Andresen, C. G.; Atchley, A. L.; Bolton, W. R.; Busey, R.; Coon, E.; Charsley-Groffman, L.
2017-12-01
The CMIP5 Earth System Models were unable to adequately predict the fate of the 16 GT of permafrost carbon in a warming climate due to poor representation of Arctic ecosystem processes. The DOE Office of Science Next Generation Ecosystem Experiment (NGEE-Arctic) project aims to reduce uncertainty in the Arctic carbon cycle and its impact on the Earth's climate system through improved representation of the coupled physical, chemical, and biological processes that drive how much buried carbon will be converted to CO2 and CH4, how fast this will happen, which form will dominate, and the degree to which increased plant productivity will offset increased soil carbon emissions. These processes fundamentally depend on the permafrost thaw rate and its influence on surface and subsurface hydrology through thermal erosion, land subsidence, and changes to groundwater flow pathways as soil, bedrock, and alluvial pore ice and massive ground ice melt. LANL and its NGEE colleagues are co-developing data and models to better understand controls on permafrost degradation and improve prediction of the evolution of permafrost and its impact on Arctic hydrology. The LANL Advanced Terrestrial Simulator (ATS) was built on a state-of-the-art HPC software framework to enable the first fully coupled three-dimensional surface-subsurface thermal-hydrology and land-surface-deformation simulations of the evolution of the physical Arctic environment. Here we show how field data, including hydrology, snow, vegetation, geochemistry, and soil properties, are informing the development and application of the ATS to improve understanding of controls on permafrost stability and permafrost hydrology. The ATS is being used to inform parameterizations of complex coupled physical, ecological, and biogeochemical processes for implementation in the DOE ACME land model, to better predict the role of changing Arctic hydrology in the global climate system. LA-UR-17-26566.
Schnek: A C++ library for the development of parallel simulation codes on regular grids
NASA Astrophysics Data System (ADS)
Schmitz, Holger
2018-05-01
A large number of algorithms across the field of computational physics are formulated on grids with a regular topology. We present Schnek, a library that enables fast development of parallel simulations on regular grids. Schnek contains a number of easy-to-use modules that greatly reduce the amount of administrative code in large-scale simulation codes. The library provides an interface for reading simulation setup files with a hierarchical structure. The structure of the setup file is translated into a hierarchy of simulation modules that the developer can specify. The reader parses and evaluates mathematical expressions and initialises variables or grid data. This enables developers to write modular and flexible simulation codes with minimal effort. The library defines regular grids of arbitrary dimension, together with mechanisms for specifying physical domain sizes, grid staggering, and ghost cells on these grids. Ghost cells can be exchanged between neighbouring processes using MPI with a simple interface. The grid data can easily be written into HDF5 files using serial or parallel I/O.
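The ghost-cell exchange mentioned above is the core pattern of any regular-grid domain decomposition: each subdomain carries an extra layer of cells mirroring its neighbour's edge values, so stencil operations near the boundary see valid data. A minimal single-process sketch of the pattern (Schnek performs the same exchange between MPI ranks; the names here are illustrative, not Schnek's API):

```python
import numpy as np

def exchange_ghost_cells(local_grids):
    """Fill one layer of ghost cells along a 1-D chain of subdomains.

    Each entry of `local_grids` is an array whose first and last cells
    are ghost cells holding the neighbouring subdomain's edge values.
    In a real code each subdomain lives on its own MPI rank and the
    copies below become sends/receives; here everything is in-process.
    """
    for left, right in zip(local_grids, local_grids[1:]):
        left[-1] = right[1]    # my right ghost <- neighbour's first interior cell
        right[0] = left[-2]    # neighbour's left ghost <- my last interior cell

# Two subdomains of a 1-D grid, one ghost cell on each side.
a = np.array([0.0, 1.0, 2.0, 0.0])   # interior values 1, 2
b = np.array([0.0, 3.0, 4.0, 0.0])   # interior values 3, 4
exchange_ghost_cells([a, b])
# a's right ghost now holds 3.0 and b's left ghost holds 2.0
```

After the exchange, a three-point stencil evaluated at either subdomain's boundary cell reads the correct neighbouring value instead of stale data.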
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCaskey, Alexander J.
There is a lack of state-of-the-art HPC simulation tools for simulating general quantum computing. Furthermore, there are no real software tools that integrate current quantum computers into existing classical HPC workflows. This product, the Quantum Virtual Machine (QVM), solves this problem by providing an extensible framework for pluggable virtual, or physical, quantum processing units (QPUs). It enables the execution of low level quantum assembly codes and returns the results of such executions.
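As a sketch of what executing low-level quantum circuits on a virtual QPU amounts to, the following minimal statevector simulator runs a two-qubit Bell-state circuit. The gate set and layout are illustrative and are not the QVM's actual API:

```python
import numpy as np

# Hadamard gate; qubit 0 is the most significant bit of the state index.
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)

def apply_gate(state, gate, target, n_qubits):
    """Apply a single-qubit gate to `target` of an n-qubit statevector."""
    state = state.reshape([2] * n_qubits)
    state = np.moveaxis(state, target, 0)
    state = np.tensordot(gate, state, axes=([1], [0]))
    state = np.moveaxis(state, 0, target)
    return state.reshape(-1)

def apply_cnot(state, control, target, n_qubits):
    """Apply CNOT by swapping amplitude pairs where the control bit is 1."""
    state = state.copy()
    for i in range(len(state)):
        bits = [(i >> (n_qubits - 1 - q)) & 1 for q in range(n_qubits)]
        if bits[control] == 1 and bits[target] == 0:
            j = i ^ (1 << (n_qubits - 1 - target))
            state[i], state[j] = state[j], state[i]
    return state

# |00> -> H on qubit 0 -> CNOT(0, 1) gives the Bell state (|00>+|11>)/sqrt(2)
psi = np.zeros(4)
psi[0] = 1.0
psi = apply_gate(psi, H, 0, 2)
psi = apply_cnot(psi, 0, 1, 2)
```

Equal amplitudes on |00> and |11> after this circuit are the standard entanglement sanity check one would run against any pluggable backend.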
Simulation of the halite dielectric spectrum in the infrared region
NASA Astrophysics Data System (ADS)
Aryomin, I. E.
2013-07-01
In this paper, we consider the practical efficiency of simulating the real frequency characteristic of the complex permittivity of a NaCl halite crystal, observed in the frequency range in which elastic ionic polarization processes are established. In the computational experiments, use was made of a cybernetic equation for the permittivity, as well as the classical, corpuscular, and originally modified models of the physical phenomena considered.
WE-G-BRA-04: The Development of a Virtual Reality Dosimetry Training Platform for Physics Training.
Beavis, A; Ward, J
2012-06-01
Recently there has been a great deal of interest in the application of simulation methodologies to training. We have previously developed a Virtual Environment for Radiotherapy Training, VERT, which simulates a fully interactive and functional Linac. Patient and plan data can be accessed across a DICOM interface, allowing the treatment process to be simulated. Here we present a newly developed range of Physics equipment which allows the user to undertake realistic QC processes. Five devices are available: 1) scanning water phantom, 2) 'solid water' QC block/ion chamber, 3) light/radiation field coincidence phantom, 4) laser alignment phantom and 5) water-based calibration phantom with reference-class and 'departmental' ion chambers. The devices were created to operate realistically and function as expected; each has an associated control screen which provides control and feedback information. The dosimetric devices respond appropriately to the beam qualities available on the Linac. Geometrical characteristics of the Linac, e.g. isocentre integrity, laser calibration and jaw calibrations, can have random errors introduced in order to enable the user to learn and observe fault conditions. In the calibration module, appropriate factors for temperature and pressure must be set to correct for the simulated ambient room conditions. The dosimetric devices can be used to characterise the Linac beams. Depth doses with Dmax of 15 mm/29 mm and d10 of 67%/77% respectively were measured for 10 cm square 6/15 MV beams. The Quality Indices (TPR20/10 ratios) can be measured as 0.668 and 0.761 respectively. At a simple level the tools can be used to demonstrate beam divergence or the effect of the inverse square law; they are also designed to be used to simulate the calibration of a new ion chamber. We have developed a novel set of tools that allow education in Physics processes via simulation training in our virtual environment.
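The temperature and pressure factors mentioned for the calibration module implement the standard air-density correction for a vented ionisation chamber. A minimal sketch, assuming reference conditions of 20 °C and 101.325 kPa as in common codes of practice (not necessarily the values VERT uses):

```python
def k_tp(temp_c, pressure_kpa, ref_temp_c=20.0, ref_pressure_kpa=101.325):
    """Temperature-pressure correction for a vented ionisation chamber:
    scales the reading back to the air density at reference conditions."""
    return ((273.2 + temp_c) / (273.2 + ref_temp_c)) * (ref_pressure_kpa / pressure_kpa)

# At reference conditions the correction is unity; warmer, lower-pressure
# air is less dense, so the chamber reading is scaled up.
k = k_tp(22.0, 99.0)
```

Getting this factor wrong is exactly the kind of fault condition the simulated calibration module lets a trainee observe safely.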
Both authors are founders and directors of Vertual Ltd, a spin-out company that exists to commercialise the results of the research work presented in this abstract. © 2012 American Association of Physicists in Medicine.
NASA Astrophysics Data System (ADS)
Urata, Yumi; Kuge, Keiko; Kase, Yuko
2015-02-01
Phase transitions of pore water have never been considered in dynamic rupture simulations with thermal pressurization (TP), although they may control TP. From numerical simulations of dynamic rupture propagation including TP, in the absence of any water phase transition process, we predict that frictional heating and TP are likely to change liquid pore water into supercritical water for a strike-slip fault under depth-dependent stress. This phase transition causes changes of a few orders of magnitude in viscosity, compressibility, and thermal expansion among physical properties of water, thus affecting the diffusion of pore pressure. Accordingly, we perform numerical simulations of dynamic ruptures with TP, considering physical properties that vary with the pressure and temperature of pore water on a fault. To observe the effects of the phase transition, we assume uniform initial stress and no fault-normal variations in fluid density and viscosity. The results suggest that the varying physical properties decrease the total slip in cases with high stress at depth and small shear zone thickness. When fault-normal variations in fluid density and viscosity are included in the diffusion equation, they activate TP much earlier than the phase transition. As a consequence, the total slip becomes greater than that in the case with constant physical properties, eradicating the phase transition effect. Varying physical properties do not affect the rupture velocity, irrespective of the fault-normal variations. Thus, the phase transition of pore water has little effect on dynamic ruptures. Fault-normal variations in fluid density and viscosity may play a more significant role.
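The transition discussed above hinges on where fault-zone pressure-temperature conditions sit relative to water's critical point (647.096 K, 22.064 MPa). A minimal classifier for just that distinction, with illustrative fault conditions (sub-critical liquid/vapour discrimination would need the full saturation curve and is omitted):

```python
# Critical point of ordinary water (IAPWS values).
T_CRIT_K = 647.096
P_CRIT_MPA = 22.064

def water_phase_region(temp_k, pressure_mpa):
    """Coarse classification flagging only the supercritical region.
    Distinguishing liquid from vapour below the critical point would
    require the saturation curve and is out of scope here."""
    if temp_k > T_CRIT_K and pressure_mpa > P_CRIT_MPA:
        return "supercritical"
    return "subcritical"

# Illustrative: pore water frictionally heated to ~700 K at a depth
# where pore pressure is ~70 MPa lies in the supercritical region.
region = water_phase_region(700.0, 70.0)
```

Once both thresholds are crossed, viscosity, compressibility, and thermal expansion change by orders of magnitude, which is what feeds back into the pore-pressure diffusion in the simulations.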
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spentzouris, Panagiotis (Fermilab); Cary, John
The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modelling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single-physics process modeling (covered under SciDAC-1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors.
Semantic Information Processing of Physical Simulation Based on Scientific Concept Vocabulary Model
NASA Astrophysics Data System (ADS)
Kino, Chiaki; Suzuki, Yoshio; Takemiya, Hiroshi
Scientific Concept Vocabulary (SCV) has been developed to realize the Cognitive-methodology-based Data Analysis System (CDAS), which supports researchers in analyzing large-scale data efficiently and comprehensively. SCV is an information model for processing semantic information in physics and engineering. In the SCV model, all semantic information is related to the underlying data and algorithms. Consequently, SCV enables a data analysis system to recognize the meaning of execution results output from a numerical simulation. This method has allowed a data analysis system to extract important information from a scientific viewpoint. Previous research has shown that SCV is able to describe simple scientific indices and scientific perceptions. However, it is difficult to describe complex scientific perceptions with the currently proposed SCV. In this paper, a new data structure for SCV is proposed in order to describe scientific perceptions in more detail. Additionally, a prototype of the new model has been constructed and applied to actual numerical simulation data. The results show that the new SCV is able to describe more complex scientific perceptions.
NASA Astrophysics Data System (ADS)
Moore, J. K.
2016-02-01
The efficiency of the biological pump is influenced by complex interactions between chemical, biological, and physical processes. The efficiency of export out of surface waters and down through the water column to the deep ocean has been linked to a number of factors including biota community composition, production of mineral ballast components, physical aggregation and disaggregation processes, and ocean oxygen concentrations. I will examine spatial patterns in the export ratio and the efficiency of the biological pump at the global scale using the Community Earth System Model (CESM). There are strong spatial variations in the export efficiency as simulated by the CESM, which are strongly correlated with new nutrient inputs to the euphotic zone and their impacts on phytoplankton community structure. I will compare CESM simulations that include dynamic, variable export ratios driven by the phytoplankton community structure, with simulations that impose a near-constant export ratio to examine the effects of export efficiency on nutrient and surface chlorophyll distributions. The model predicted export ratios will also be compared with recent satellite-based estimates.
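The export ratio compared across these simulations is simply export production out of the euphotic zone divided by net primary production. A minimal sketch with illustrative numbers (not CESM output):

```python
def export_ratio(export_production, net_primary_production):
    """e-ratio: fraction of net primary production exported out of
    the euphotic zone as sinking organic matter (dimensionless)."""
    return export_production / net_primary_production

# Illustrative values in mol C m^-2 yr^-1: regions dominated by large
# phytoplankton tend to export a larger fraction of production than
# picoplankton-dominated subtropical gyres.
bloom_ratio = export_ratio(3.0, 10.0)
gyre_ratio = export_ratio(0.5, 10.0)
```

Imposing a near-constant value of this ratio everywhere, versus letting it vary with simulated community structure, is exactly the contrast the CESM experiments described above are designed to probe.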
NASA Technical Reports Server (NTRS)
Moehlmann, D.; Kochan, H.
1992-01-01
The Space Simulator of the German Aerospace Research Establishment (DLR) at Cologne, formerly used for testing satellites, has been, since 1987, the central unit within the research sub-program 'Comet Simulation' (KOSI). The KOSI team has investigated physical processes relevant to comets and their surfaces. As a byproduct, we gained experience in sample handling under simulated space conditions. In broadening the scope of the research activities of the DLR Institute of Space Simulation, an extension to 'Laboratory Planetology' is planned. Following the KOSI experiments, a Mars surface simulation with realistic minerals and surface soil in a suitable environment (temperature, pressure, and CO2 atmosphere) is foreseen as the next step. Here, our main interest is centered on the thermophysical properties of the Martian surface and on energy transport (and related gas transport) through the surface. These laboratory simulation activities can be related to space missions as typical pre-mission and during-mission support of experiment design and operations (simulation in parallel). Post-mission experiments for confirmation and interpretation of results are of great value. The physical dimensions of the Space Simulator (a cylinder of about 2.5 m diameter and 5 m length) allow for testing and qualification of experimental hardware under realistic Martian conditions.
The simulated clinical environment: Cognitive and emotional impact among undergraduates.
Tremblay, Marie-Laurence; Lafleur, Alexandre; Leppink, Jimmie; Dolmans, Diana H J M
2017-02-01
Simulated clinical immersion (SCI) is used in undergraduate healthcare programs to expose the learner to real-life situations in authentic simulated clinical environments. For novices, the environment in which the simulation occurs can be distracting and stressful, hence potentially compromising learning. This study aims to determine whether SCI (with environment) imposes greater extraneous cognitive load and stress on undergraduate pharmacy students than simulated patients (SP) (without environment). It also aims to explore how features of the simulated environment influence students' perception of learning. In this mixed-methods study, 143 undergraduate pharmacy students experienced both SCI and SP in a crossover design. After the simulations, participants rated their cognitive load and emotions. Thirty-five students met in focus groups to explore their perception of learning in simulation. Intrinsic and extraneous cognitive load and stress scores in SCI were significantly but modestly higher compared to SP. Qualitative findings revealed that the physical environment in SCI generated more stress and affected students' focus. In SP, students concentrated on clinical reasoning. SCI stimulated a focus on data collection but impeded in-depth problem-solving processes. The physical environment in simulation influences what and how students learn. SCI was reported as more cognitively demanding than SP. Our findings emphasize the need for the development of adapted instructional design guidelines in simulation for novices.
Simulation methods supporting homologation of Electronic Stability Control in vehicle variants
NASA Astrophysics Data System (ADS)
Lutz, Albert; Schick, Bernhard; Holzmann, Henning; Kochem, Michael; Meyer-Tuve, Harald; Lange, Olav; Mao, Yiqin; Tosolin, Guido
2017-10-01
Vehicle simulation has a long tradition in the automotive industry as a powerful supplement to physical vehicle testing. In the field of Electronic Stability Control (ESC) systems, the simulation process is well established to support ESC development and application by suppliers and Original Equipment Manufacturers (OEMs). The latest regulation of the United Nations Economic Commission for Europe, UN/ECE-R 13, also allows for simulation-based homologation. This extends the use of simulation from ESC development to homologation. This paper gives an overview of the simulation methods, processes, and tools used for the homologation of ESC in vehicle variants. The paper first describes the generic homologation process according to the European regulations (UN/ECE-R 13H, UN/ECE-R 13/11) and the U.S. Federal Motor Vehicle Safety Standard (FMVSS 126). Subsequently, the ESC system is explained, as well as the generic application and release process on the supplier and OEM side. To apply the simulation methods, the ESC development and application process needs to be adapted for virtual vehicles. The simulation environment, consisting of the vehicle model, ESC model, and simulation platform, is explained in detail with some exemplary use cases. In the final section, examples of simulation-based ESC homologation in vehicle variants are shown for passenger cars, light trucks, heavy trucks, and trailers. This paper aims to give a state-of-the-art account of the simulation methods supporting the homologation of ESC systems in vehicle variants. The described approach and the lessons learned can also serve as a reference for the future extended use of simulation-supported releases of the ESC system, up to the development and release of driver assistance systems.
Three Dimensional Projection Environment for Molecular Design and Surgical Simulation
2011-08-01
bypasses the cumbersome meshing process. The deformation model is only comprised of mass nodes, which are generated by sampling the object volume before... force should minimize the penetration volume, the haptic feedback force is derived directly. Additionally, a post-processing technique is developed to... render distinct physical tissue properties across different interaction areas. The proposed approach does not require any pre-processing and is
Parrish, Clyde F
2003-12-01
A series of workshops was sponsored by the Physical Science Division of NASA's Office of Biological and Physical Research to address operational gravity-compliant in-situ resource utilization and life support technologies. Workshop participants explored a Mars simulation study on Devon Island, Canada; the processing of carbon dioxide in regenerative life support systems; space tourism; rocket technology; plant growth research for closed ecological systems; and propellant extraction from planetary regoliths.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Danilovic, S.; Solanki, S. K.; Barthol, P.
Ellerman bombs (EBs) are signatures of magnetic reconnection, which is an important physical process in the solar atmosphere. How and where they occur is a subject of debate. In this paper, we analyze Sunrise/IMaX data, along with 3D MHD simulations that aim to reproduce the exact scenario proposed for the formation of these features. Although the observed event seems to be more dynamic and violent than the simulated one, the simulations clearly confirm the basic scenario for the production of EBs. The simulations also reveal the full complexity of the underlying process. The simulated observations show that the Fe i 525.02 nm line gives no information on the height where reconnection takes place. It can only give clues about the heating in the aftermath of the reconnection. However, the information on the magnetic field vector and velocity at this spatial resolution is extremely valuable because it shows what numerical models miss and how they can be improved.
The Numerical Propulsion System Simulation: A Multidisciplinary Design System for Aerospace Vehicles
NASA Technical Reports Server (NTRS)
Lytle, John K.
1999-01-01
Advances in computational technology and in physics-based modeling are making large-scale, detailed simulations of complex systems possible within the design environment. For example, the integration of computing, communications, and aerodynamics has reduced the time required to analyze major propulsion system components from days and weeks to minutes and hours. This breakthrough has enabled the detailed simulation of major propulsion system components to become a routine part of the design process and to provide the designer with critical information about the components early in the design process. This paper describes the development of the Numerical Propulsion System Simulation (NPSS), a multidisciplinary system of analysis tools that is focused on extending the simulation capability from components to the full system. This will provide the product developer with a "virtual wind tunnel" that will reduce the number of hardware builds and tests required during the development of advanced aerospace propulsion systems.
NASA Technical Reports Server (NTRS)
Drozda, Tomasz G.; Axdahl, Erik L.; Cabell, Karen F.
2014-01-01
With the increasing costs of physics experiments and the simultaneous increase in the availability and maturity of computational tools, it is not surprising that computational fluid dynamics (CFD) is playing an increasingly important role, not only in post-test investigations, but also in the early stages of experimental planning. This paper describes a CFD-based effort executed in close collaboration between computational fluid dynamicists and experimentalists to develop a virtual experiment during the early planning stages of the Enhanced Injection and Mixing project at NASA Langley Research Center. This project aims to investigate supersonic combustion ramjet (scramjet) fuel injection and mixing physics, improve the understanding of underlying physical processes, and develop enhancement strategies and functional relationships relevant to flight Mach numbers greater than 8. The purpose of the virtual experiment was to provide flow field data to aid in the design of the experimental apparatus and the in-stream rake probes, to verify the nonintrusive measurements based on NO-PLIF, and to perform pre-test analysis of quantities obtainable from the experiment and CFD. The approach also allowed the joint team to develop common data processing and analysis tools, and to test research ideas. The virtual experiment consisted of a series of Reynolds-averaged simulations (RAS). These simulations included the facility nozzle, the experimental apparatus with a baseline strut injector, and the test cabin. Pure helium and helium-air mixtures were used to determine the efficacy of different inert gases to model hydrogen injection. The results of the simulations were analyzed by computing mixing efficiency, total pressure recovery, and stream thrust potential. As the experimental effort progresses, the simulation results will be compared with the experimental data to calibrate the modeling constants present in the CFD and validate simulation fidelity.
CFD will also be used to investigate different injector concepts, improve understanding of the flow structure and flow physics, and develop functional relationships. Both RAS and large eddy simulations (LES) are planned for post-test analysis of the experimental data.
A generic biogeochemical module for earth system models
NASA Astrophysics Data System (ADS)
Fang, Y.; Huang, M.; Liu, C.; Li, H.-Y.; Leung, L. R.
2013-06-01
Physical and biogeochemical processes regulate soil carbon dynamics and the CO2 flux to and from the atmosphere, influencing global climate change. Integration of these processes into earth system models (e.g., the Community Land Model, CLM), however, currently faces three major challenges: (1) extensive effort is required to modify model structures and rewrite computer programs to incorporate new or updated processes as new knowledge is generated; (2) the computational cost of simulating biogeochemical processes in land models is prohibitive, due to large variations in the rates of biogeochemical processes; and (3) various mathematical representations of biogeochemical processes exist to capture different aspects of the fundamental mechanisms, but systematic evaluation of these different representations is difficult, if not impossible. To address these challenges, we propose a new computational framework to easily incorporate physical and biogeochemical processes into land models. The new framework consists of a new biogeochemical module with a generic algorithm and a reaction database, so that new and updated processes can be incorporated into land models without the need to manually set up the ordinary differential equations to be solved numerically. The reaction database describes nutrient flow through the terrestrial ecosystem in plants, litter, and soil. This framework facilitates effective comparison studies of biogeochemical cycles in an ecosystem using different conceptual models under the same land modeling framework. The approach was first implemented in CLM and benchmarked against simulations from the original CLM-CN code. A case study was then provided to demonstrate the advantages of using the new approach to incorporate a phosphorus cycle into the CLM model.
To our knowledge, the phosphorus-incorporated CLM is a new model that can be used to simulate phosphorus limitation on the productivity of terrestrial ecosystems.
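The generic-algorithm-plus-reaction-database idea can be sketched as follows. The species names, stoichiometric coefficients, and rate laws below are invented toy values for illustration, not the actual CLM reaction network:

```python
# Toy reaction database: each entry pairs a stoichiometry map
# (species -> coefficient) with a rate law over the current state.
SPECIES = ["litter_C", "soil_C", "CO2"]
REACTIONS = [
    # litter decomposition: 1 litter_C -> 0.7 soil_C + 0.3 CO2
    ({"litter_C": -1.0, "soil_C": 0.7, "CO2": 0.3},
     lambda y: 0.10 * y["litter_C"]),
    # heterotrophic respiration: 1 soil_C -> 1 CO2
    ({"soil_C": -1.0, "CO2": 1.0},
     lambda y: 0.02 * y["soil_C"]),
]
INDEX = {sp: i for i, sp in enumerate(SPECIES)}

def rhs(state):
    """Assemble dy/dt generically from the database; no ODE is written
    by hand, so adding a process means adding a database entry."""
    y = dict(zip(SPECIES, state))
    dydt = [0.0] * len(SPECIES)
    for stoich, rate in REACTIONS:
        r = rate(y)
        for sp, coeff in stoich.items():
            dydt[INDEX[sp]] += coeff * r
    return dydt

def integrate(state, dt=0.1, n=100):
    """Forward-Euler stepping (a real module would use a stiff solver)."""
    state = list(state)
    for _ in range(n):
        state = [s + dt * d for s, d in zip(state, rhs(state))]
    return state

final = integrate([100.0, 50.0, 0.0])
print(dict(zip(SPECIES, (round(v, 2) for v in final))))
```

Because every reaction's stoichiometry sums to zero, total carbon is conserved automatically, a property the generic assembly gets for free.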
A Multi-Scale Integrated Approach to Representing Watershed Systems: Significance and Challenges
NASA Astrophysics Data System (ADS)
Kim, J.; Ivanov, V. Y.; Katopodes, N.
2013-12-01
A range of processes associated with supplying services and goods to human society originates at the watershed level. Predicting watershed response to forcing conditions is of high interest for many practical societal problems; however, it remains challenging due to two significant properties of watershed systems: connectivity and non-linearity. Connectivity implies that disturbances arising at larger scales will necessarily propagate and affect local-scale processes; their local effects consequently influence other processes, often through nonlinear relationships. Physically based, process-scale modeling is needed to understand and properly assess non-linear effects among watershed processes. We have developed an integrated model simulating hydrological processes, flow dynamics, erosion, and sediment transport, tRIBS-OFM-HRM (Triangulated Irregular Network-based Real-time Integrated Basin Simulator - Overland Flow Model - Hairsine and Rose Model). This coupled model offers the advantage of exploring the hydrological effects of watershed physical factors such as topography, vegetation, and soil, as well as their feedback mechanisms. Several examples investigating the effects of vegetation on flow movement, the role of the soil substrate on sediment dynamics, and the driving role of topography in morphological processes are illustrated. We show how this comprehensive modeling tool can help understand interconnections and nonlinearities of the physical system, e.g., how vegetation affects hydraulic resistance depending on slope, vegetation cover fraction, discharge, and bed roughness; how the soil substrate condition impacts erosion processes, with a non-unique characteristic at the scale of a zero-order catchment; and how topographic changes affect spatial variations of morphologic variables.
Due to the feedback and compensatory nature of mechanisms operating in different watershed compartments, our conclusion is that a key to representing watershed systems lies in an integrated, interdisciplinary approach, whereby a physically based model is used for assessments and evaluations associated with future changes in land use, climate, and ecosystems.
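The vegetation-resistance interplay mentioned above can be sketched with a generic area-weighted Manning blend. Both the composition scheme and the coefficient values are assumptions for illustration, not necessarily the formulation used in tRIBS-OFM:

```python
def effective_manning_n(n_bed, n_veg, cover_fraction):
    """Composite roughness as an area-weighted blend of bed and
    vegetation Manning coefficients. One of several published
    composition schemes, assumed here for illustration."""
    return (1.0 - cover_fraction) * n_bed + cover_fraction * n_veg

def manning_velocity(n, depth, slope):
    """Manning's equation for wide, shallow overland flow (SI units):
    v = (1/n) * h^(2/3) * S^(1/2)."""
    return depth ** (2.0 / 3.0) * slope ** 0.5 / n

# Same 5 cm flow depth on a 1% slope: dense cover slows the flow
v_bare = manning_velocity(effective_manning_n(0.03, 0.15, 0.0), 0.05, 0.01)
v_veg = manning_velocity(effective_manning_n(0.03, 0.15, 0.8), 0.05, 0.01)
print(round(v_bare, 3), round(v_veg, 3))
```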
On simulated annealing phase transitions in phylogeny reconstruction.
Strobl, Maximilian A R; Barker, Daniel
2016-08-01
Phylogeny reconstruction with global criteria is NP-complete or NP-hard, hence in general requires a heuristic search. We investigate the powerful, physically inspired, general-purpose heuristic simulated annealing, applied to phylogeny reconstruction. Simulated annealing mimics the physical process of annealing, where a liquid is gently cooled to form a crystal. During the search, periods of elevated specific heat occur, analogous to physical phase transitions. These simulated annealing phase transitions play a crucial role in the outcome of the search. Nevertheless, they have received comparatively little attention, for phylogeny or other optimisation problems. We analyse simulated annealing phase transitions during searches for the optimal phylogenetic tree for 34 real-world multiple alignments. In the same way in which melting temperatures differ between materials, we observe distinct specific heat profiles for each input file. We propose that this reflects differences in the search landscape and can serve as a measure of problem difficulty and of the suitability of the algorithm's parameters. We discuss application in algorithmic optimisation and as a diagnostic to assess parameterisation before computationally costly, large phylogeny reconstructions are launched. Whilst the focus here lies on phylogeny reconstruction under maximum parsimony, it is plausible that our results are more widely applicable to optimisation procedures in science and industry. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
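The specific-heat diagnostic described above can be sketched on a toy problem. The one-dimensional cost function below merely stands in for a parsimony score; the temperature schedule and neighbour move are illustrative assumptions:

```python
import math, random

def anneal_with_specific_heat(energy, neighbour, x0, temps, sweeps=2000, seed=1):
    """Simulated annealing that records the specific heat
    C(T) = (<E^2> - <E>^2) / T^2 at each temperature; peaks in C(T)
    are the search's analogue of physical phase transitions."""
    rng = random.Random(seed)
    x, e = x0, energy(x0)
    profile = []
    for T in temps:
        samples = []
        for _ in range(sweeps):
            cand = neighbour(x, rng)
            de = energy(cand) - e
            # Metropolis acceptance rule
            if de <= 0 or rng.random() < math.exp(-de / T):
                x, e = cand, e + de
            samples.append(e)
        mean = sum(samples) / len(samples)
        var = sum((s - mean) ** 2 for s in samples) / len(samples)
        profile.append((T, var / T ** 2))
    return x, profile

# Toy 1-D rugged landscape standing in for a tree-score surface
energy = lambda x: (x - 3.0) ** 2 + 2.0 * math.sin(5.0 * x)
neighbour = lambda x, rng: x + rng.uniform(-0.5, 0.5)
best, profile = anneal_with_specific_heat(energy, neighbour, 0.0,
                                          [4.0, 2.0, 1.0, 0.5, 0.1])
```

Plotting `profile` over a dense temperature schedule would reveal the peaks that the paper interprets as phase transitions.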
SIGNUM: A Matlab, TIN-based landscape evolution model
NASA Astrophysics Data System (ADS)
Refice, A.; Giachetta, E.; Capolongo, D.
2012-08-01
Several numerical landscape evolution models (LEMs) have been developed to date, and many are available as open source code. Most are written in efficient programming languages such as Fortran or C, but often require additional coding effort to plug into more user-friendly data analysis and/or visualization tools to ease interpretation and scientific insight. In this paper, we present an effort to port a common core of accepted physical principles governing landscape evolution directly into a high-level language and data analysis environment such as Matlab. SIGNUM (an acronym for Simple Integrated Geomorphological Numerical Model) is an independent and self-contained Matlab, TIN-based landscape evolution model, built to simulate topography development at various space and time scales. SIGNUM is presently capable of simulating hillslope processes such as linear and nonlinear diffusion, fluvial incision into bedrock, spatially varying surface uplift (which can be used to simulate changes in base level, thrusting and faulting), as well as effects of climate change. Although based on accepted and well-known processes and algorithms in its present version, it is built with a modular structure that makes it easy to modify and upgrade the simulated physical processes to suit virtually any user's needs. The code is conceived as an open-source project, and is thus an ideal tool for both research and didactic purposes, thanks to the high-level nature of the Matlab environment and its popularity in the scientific community. In this paper the simulation code is presented together with some simple examples of surface evolution, and guidelines for the development of new modules and algorithms are proposed.
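Linear hillslope diffusion, one of the processes listed above, can be sketched on a regular raster grid (SIGNUM itself operates on a TIN and is written in Matlab; a Python raster version is used here only to keep the sketch compact). Grid size, diffusivity, and boundary treatment are illustrative assumptions:

```python
import numpy as np

def diffuse(z, kappa=0.01, dx=1.0, dt=1.0, steps=500):
    """Explicit linear hillslope diffusion dz/dt = kappa * lap(z) on a
    periodic raster; stable while kappa*dt/dx**2 <= 0.25."""
    z = np.array(z, dtype=float)
    for _ in range(steps):
        lap = (np.roll(z, 1, 0) + np.roll(z, -1, 0) +
               np.roll(z, 1, 1) + np.roll(z, -1, 1) - 4.0 * z) / dx ** 2
        z += dt * kappa * lap
    return z

# A sharp peak relaxes into a smooth mound; diffusion moves mass
# downslope but (with periodic boundaries) conserves it exactly
z0 = np.zeros((32, 32))
z0[16, 16] = 100.0
z1 = diffuse(z0)
print(z1.max() < z0.max(), abs(z1.sum() - z0.sum()) < 1e-6)
```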
Tangible Landscape: Cognitively Grasping the Flow of Water
NASA Astrophysics Data System (ADS)
Harmon, B. A.; Petrasova, A.; Petras, V.; Mitasova, H.; Meentemeyer, R. K.
2016-06-01
Complex spatial forms like topography can be challenging to understand, much less intentionally shape, given the heavy cognitive load of visualizing and manipulating 3D form. Spatiotemporal processes like the flow of water over a landscape are even more challenging to understand and intentionally direct as they are dependent upon their context and require the simulation of forces like gravity and momentum. This cognitive work can be offloaded onto computers through 3D geospatial modeling, analysis, and simulation. Interacting with computers, however, can also be challenging, often requiring training and highly abstract thinking. Tangible computing - an emerging paradigm of human-computer interaction in which data is physically manifested so that users can feel it and directly manipulate it - aims to offload this added cognitive work onto the body. We have designed Tangible Landscape, a tangible interface powered by an open source geographic information system (GRASS GIS), so that users can naturally shape topography and interact with simulated processes with their hands in order to make observations, generate and test hypotheses, and make inferences about scientific phenomena in a rapid, iterative process. Conceptually Tangible Landscape couples a malleable physical model with a digital model of a landscape through a continuous cycle of 3D scanning, geospatial modeling, and projection. We ran a flow modeling experiment to test whether tangible interfaces like this can effectively enhance spatial performance by offloading cognitive processes onto computers and our bodies. We used hydrological simulations and statistics to quantitatively assess spatial performance. We found that Tangible Landscape enhanced 3D spatial performance and helped users understand water flow.
NASA Astrophysics Data System (ADS)
Opitz, Florian; Treffinger, Peter
2016-04-01
Electric arc furnaces (EAF) are complex industrial plants whose actual behavior depends upon numerous factors. Due to its energy-intensive operation, the EAF process has always been subject to optimization efforts. For these reasons, several models have been proposed in the literature to analyze and predict different modes of operation. Most of these models focus on the processes inside the vessel itself. The present paper introduces a dynamic, physics-based model of a complete EAF plant, consisting of four subsystems: vessel, electric system, electrode regulation, and off-gas system. Furthermore, the solid phase is not treated as homogeneous; instead, a simple spatial discretization is employed. Hence it is possible to simulate the energy input from electric arcs and fossil fuel burners depending on the state of the melting progress. The model is implemented in the object-oriented, equation-based language Modelica. The simulation results are compared to literature data.
Metal Big Area Additive Manufacturing: Process Modeling and Validation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simunovic, Srdjan; Nycz, Andrzej; Noakes, Mark W
Metal Big Area Additive Manufacturing (mBAAM) is a new additive manufacturing (AM) technology for printing large-scale 3D objects. mBAAM is based on the gas metal arc welding process and uses a continuous feed of welding wire to manufacture an object. An electric arc forms between the wire and the substrate, which melts the wire and deposits a bead of molten metal along the predetermined path. In general, the welding process parameters and local conditions determine the shape of the deposited bead. The sequence of bead deposition and the corresponding thermal history of the manufactured object determine the long-range effects, such as thermally induced distortions and residual stresses. Therefore, the resulting performance or final properties of the manufactured object depend on its geometry and the deposition path, in addition to the basic welding process parameters. Physical testing is critical for gaining the necessary knowledge for quality prints, but traversing the process parameter space in order to develop an optimized build strategy for each new design is impractical by purely experimental means. Computational modeling and optimization may accelerate the development of a build process strategy and save time and resources. Because computational modeling provides these opportunities, we have developed a physics-based Finite Element Method (FEM) simulation framework and numerical models to support the development and design of the mBAAM process. In this paper, we performed a sequentially coupled heat transfer and stress analysis for predicting the final deformation of a small rectangular structure printed using mild steel welding wire. Using the new simulation technologies, material was progressively added into the FEM simulation as the arc weld traversed the build path.
In the sequentially coupled heat transfer and stress analysis, the heat transfer analysis was performed to calculate the temperature evolution, which was then used in a stress analysis to evaluate the residual stresses and distortions. In this formulation, we assume that the physics is directionally coupled, i.e., the effect of the stress of the component on the temperatures is negligible. The experiment instrumentation (measurement types, sensor types, sensor locations, sensor placements, measurement intervals) and the measurements are presented. The temperatures and distortions from the simulations show good correlation with experimental measurements. Ongoing modeling work is also briefly discussed.
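The one-way sequential coupling can be sketched in one dimension: solve the thermal problem first, then feed its temperature change into a stress estimate. The mild-steel-like constants, geometry, and fully constrained-bar assumption below are generic illustrations, not the paper's calibrated model:

```python
import numpy as np

def heat_then_stress(n=50, steps=400, k=0.2, alpha=1.2e-5, E=210e9, t_amb=25.0):
    """One-way sequential coupling: an explicit 1-D cooling solve,
    whose temperature change then drives a thermal-stress estimate
    sigma = -E * alpha * dT for a fully constrained bar."""
    T = np.full(n, 1500.0)          # freshly deposited bead, deg C
    T[0] = T[-1] = t_amb            # ends held at ambient
    T0 = T.copy()
    for _ in range(steps):          # explicit scheme, stable for k <= 0.5
        T[1:-1] += k * (T[2:] - 2.0 * T[1:-1] + T[:-2])
        T[0] = T[-1] = t_amb
    dT = T - T0                     # cooling => negative dT
    sigma = -E * alpha * dT         # constrained cooling => tension
    return T, sigma

T, sigma = heat_then_stress()
print(T.max() < 1500.0, sigma.max() > 0.0)
```

The directional-coupling assumption appears only implicitly: the stress step reads the temperatures, but nothing feeds back into the thermal solve.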
Principles of magnetohydrodynamic simulation in space plasmas
NASA Technical Reports Server (NTRS)
Sato, T.
1985-01-01
Attention is given to the philosophical as well as physical principles that are essential to the establishment of MHD simulation studies for solar plasma research, assuming the capabilities of state-of-the-art computers and emphasizing the importance of 'local' MHD simulation. Solar-terrestrial plasma space is divided into several elementary regions where a macroscopic elementary energy conversion process could conceivably occur; the local MHD simulation is defined as self-contained in each of the regions. The importance of, and the difficulties associated with, the boundary condition are discussed in detail. The roles of diagnostics and of the finite difference method are noted.
Monte Carlo Simulation of the Rapid Crystallization of Bismuth-Doped Silicon
NASA Technical Reports Server (NTRS)
Jackson, Kenneth A.; Gilmer, George H.; Temkin, Dmitri E.
1995-01-01
In this Letter we report Ising model simulations of the growth of alloys which predict quite different behavior near and far from equilibrium. Our simulations reproduce the phenomenon which has been termed 'solute trapping,' where concentrations of solute, which are far in excess of the equilibrium concentrations, are observed in the crystal after rapid crystallization. This phenomenon plays an important role in many processes which involve first order phase changes which take place under conditions far from equilibrium. The underlying physical basis for it has not been understood, but these Monte Carlo simulations provide a powerful means for investigating it.
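The velocity dependence of solute trapping can be sketched with a far simpler kinetic model than the paper's Ising simulations. The Aziz-style partition coefficient and every parameter value below are assumptions for illustration only:

```python
import random

def solid_concentration(velocity, k_eq=0.1, v_d=1.0, c_liq=0.05,
                        layers=20000, seed=7):
    """Each crystallizing site keeps a solute atom with probability
    k(v) * c_liq, where the effective partition coefficient
    k(v) = (k_eq + v/v_d) / (1 + v/v_d) rises from its equilibrium
    value toward 1 as growth outruns interface diffusion
    (an Aziz-style interpolation; all values assumed)."""
    rng = random.Random(seed)
    k_v = (k_eq + velocity / v_d) / (1.0 + velocity / v_d)
    trapped = sum(1 for _ in range(layers) if rng.random() < k_v * c_liq)
    return trapped / layers

slow = solid_concentration(0.01)    # near equilibrium: c_s ~ k_eq * c_liq
fast = solid_concentration(100.0)   # rapid growth: c_s -> c_liq (trapping)
print(slow < fast)
```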
Integration of Tuyere, Raceway and Shaft Models for Predicting Blast Furnace Process
NASA Astrophysics Data System (ADS)
Fu, Dong; Tang, Guangwu; Zhao, Yongfu; D'Alessio, John; Zhou, Chenn Q.
2018-06-01
A novel modeling strategy is presented for simulating the blast furnace iron-making process. The relevant physical and chemical phenomena take place across a wide range of length and time scales, so three models are developed to simulate different regions of the blast furnace: the tuyere model, the raceway model, and the shaft model. This paper focuses on the integration of the three models to predict the entire blast furnace process. Mapping of output and input between models and an iterative scheme are developed to establish communication between the models. The effects of tuyere operation and burden distribution on blast furnace fuel efficiency are investigated numerically. The integration of different models provides a way to realistically simulate the blast furnace by improving the modeling resolution of local phenomena and minimizing model assumptions.
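The iterative exchange between models can be sketched as a fixed-point loop. The closed-form stand-in models below are hypothetical; the real tuyere, raceway, and shaft models are full CFD codes exchanging mapped fields rather than scalars:

```python
def couple(model_a, model_b, x0, tol=1e-8, max_iter=100):
    """Generic iterative hand-off: model A's output feeds model B,
    whose output feeds back to A, until the exchanged interface
    value reaches a fixed point."""
    x = x0
    for i in range(1, max_iter + 1):
        y = model_a(x)        # e.g. raceway state from tuyere output
        x_new = model_b(y)    # e.g. shaft model updates the interface
        if abs(x_new - x) < tol:
            return x_new, i
        x = x_new
    raise RuntimeError("coupling iteration did not converge")

# Contractive toy models with fixed point x* = 1.0
a = lambda x: 0.5 * x + 1.0
b = lambda y: 0.5 * y + 0.25
x_star, iters = couple(a, b, 0.0)
print(round(x_star, 6), iters)
```

Convergence here is guaranteed because the composed map is contractive; coupled CFD models generally need under-relaxation for the same property.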
Enhanced Verification Test Suite for Physics Simulation Codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kamm, J R; Brock, J S; Brandon, S T
2008-10-10
This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations. The key points of this document are: (1) Verification deals with mathematical correctness of the numerical algorithms in a code, while validation deals with physical correctness of a simulation in a regime of interest. This document is about verification. (2) The current seven-problem Tri-Laboratory Verification Test Suite, which has been used for approximately five years at the DOE WP laboratories, is limited. (3) Both the methodology for and the technology used in verification analysis have evolved and been improved since the original test suite was proposed. (4) The proposed test problems are in three basic areas: (a) hydrodynamics; (b) transport processes; and (c) dynamic strength of materials. (5) For several of the proposed problems we provide a 'strong sense verification benchmark', consisting of (i) a clear mathematical statement of the problem with sufficient information to run a computer simulation, (ii) an explanation of how the code result and benchmark solution are to be evaluated, and (iii) a description of the acceptance criterion for simulation code results. (6) It is proposed that the set of verification test problems with which any particular code is evaluated include some of the problems described in this document. Analysis of the proposed verification test problems constitutes part of a necessary, but not sufficient, step that builds confidence in physics and engineering simulation codes.
More complicated test cases, including physics models of greater sophistication or other physics regimes (e.g., energetic material response, magneto-hydrodynamics), would represent a scientifically desirable complement to the fundamental test cases discussed in this report. The authors believe that this document can be used to enhance the verification analyses undertaken at the DOE WP Laboratories and, thus, to improve the quality, credibility, and usefulness of the simulation codes that are analyzed with these problems.
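A core calculation behind such verification benchmarks is computing the observed order of accuracy from solutions on systematically refined grids. The sketch below uses manufactured values, not one of the suite's actual problems:

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r=2.0):
    """Observed order of accuracy from solutions on three grids refined
    by a constant factor r: p = ln(e32/e21) / ln(r), the standard
    code-verification metric. A second-order scheme should give p ~ 2."""
    e32 = f_coarse - f_medium
    e21 = f_medium - f_fine
    return math.log(abs(e32 / e21)) / math.log(r)

# Manufactured data: exact value 1.0 with error C*h^2 on grids h, h/2, h/4
exact, C = 1.0, 0.3
f_c, f_m, f_f = (exact + C * h ** 2 for h in (0.4, 0.2, 0.1))
p = observed_order(f_c, f_m, f_f)
print(round(p, 3))  # -> 2.0
```

When the computed p matches the scheme's theoretical order against an exact or benchmark solution, the discretization is behaving as designed.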
Introducing Multisensor Satellite Radiance-Based Evaluation for Regional Earth System Modeling
NASA Technical Reports Server (NTRS)
Matsui, T.; Santanello, J.; Shi, J. J.; Tao, W.-K.; Wu, D.; Peters-Lidard, C.; Kemp, E.; Chin, M.; Starr, D.; Sekiguchi, M.;
2014-01-01
Earth System modeling has become more complex, and its evaluation using satellite data has also become more difficult due to model and data diversity. Therefore, the fundamental methodology of using satellite direct measurements with instrumental simulators should be addressed especially for modeling community members lacking a solid background of radiative transfer and scattering theory. This manuscript introduces principles of multisatellite, multisensor radiance-based evaluation methods for a fully coupled regional Earth System model: NASA-Unified Weather Research and Forecasting (NU-WRF) model. We use a NU-WRF case study simulation over West Africa as an example of evaluating aerosol-cloud-precipitation-land processes with various satellite observations. NU-WRF-simulated geophysical parameters are converted to the satellite-observable raw radiance and backscatter under nearly consistent physics assumptions via the multisensor satellite simulator, the Goddard Satellite Data Simulator Unit. We present varied examples of simple yet robust methods that characterize forecast errors and model physics biases through the spatial and statistical interpretation of various satellite raw signals: infrared brightness temperature (Tb) for surface skin temperature and cloud top temperature, microwave Tb for precipitation ice and surface flooding, and radar and lidar backscatter for aerosol-cloud profiling simultaneously. Because raw satellite signals integrate many sources of geophysical information, we demonstrate user-defined thresholds and a simple statistical process to facilitate evaluations, including the infrared-microwave-based cloud types and lidar/radar-based profile classifications.
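Radiance-based evaluation rests on comparing models and sensors in radiance space rather than retrieved-parameter space. A minimal sketch of the Planck-law conversion between radiance and infrared brightness temperature Tb follows, using a monochromatic approximation at an assumed 11-micron window channel (real sensors integrate over a spectral response function):

```python
import math

H = 6.62607015e-34    # Planck constant, J s
C = 2.99792458e8      # speed of light, m s^-1
K = 1.380649e-23      # Boltzmann constant, J K^-1

def planck_radiance(T, wl):
    """Blackbody spectral radiance B(wl, T) in W m^-2 sr^-1 m^-1."""
    return 2.0 * H * C ** 2 / wl ** 5 / (math.exp(H * C / (wl * K * T)) - 1.0)

def brightness_temperature(L, wl):
    """Invert Planck's law: the temperature of a blackbody emitting the
    observed monochromatic radiance L at wavelength wl."""
    return H * C / (wl * K) / math.log(1.0 + 2.0 * H * C ** 2 / (wl ** 5 * L))

wl = 11e-6            # assumed 11-micron IR window channel
tb = brightness_temperature(planck_radiance(288.0, wl), wl)
print(round(tb, 6))   # round-trips to 288.0 K
```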
Simulating Coupling Complexity in Space Plasmas: First Results from a new code
NASA Astrophysics Data System (ADS)
Kryukov, I.; Zank, G. P.; Pogorelov, N. V.; Raeder, J.; Ciardo, G.; Florinski, V. A.; Heerikhuisen, J.; Li, G.; Petrini, F.; Shematovich, V. I.; Winske, D.; Shaikh, D.; Webb, G. M.; Yee, H. M.
2005-12-01
The development of codes that embrace 'coupling complexity' via the self-consistent incorporation of multiple physical scales and multiple physical processes in models has been identified by the NRC Decadal Survey in Solar and Space Physics as a crucial development in simulation/modeling technology for the coming decade. The National Science Foundation, through its Information Technology Research (ITR) Program, is supporting our efforts to develop a new class of computational code for plasmas and neutral gases that integrates multiple scales and multiple physical processes and descriptions. We are developing a highly modular, parallelized, scalable code that incorporates multiple scales by synthesizing three simulation technologies: 1) computational fluid dynamics (hydrodynamics or magnetohydrodynamics, MHD) for the large-scale plasma; 2) direct Monte Carlo simulation of atoms/neutral gas; and 3) transport code solvers to model highly energetic particle distributions. We are constructing the code so that a fourth simulation technology, hybrid simulations for microscale structures and particle distributions, can be incorporated in future work, but for the present this aspect will be addressed at a test-particle level. This synthesis will provide a computational tool that will greatly advance our understanding of the physics of neutral and charged gases. Besides making major advances in basic plasma physics and neutral gas problems, this project will address three Grand Challenge space physics problems that reflect our research interests: 1) to develop a temporal global heliospheric model which includes the interaction of solar and interstellar plasma with neutral populations (hydrogen, helium, etc., and dust), test-particle kinetic pickup ion acceleration at the termination shock, anomalous cosmic ray production, and interaction with galactic cosmic rays, while incorporating the time variability of the solar wind and the solar cycle.
2) To develop a coronal mass ejection and interplanetary shock propagation model for the inner and outer heliosphere, including, at a test-particle level, wave-particle interactions and particle acceleration at traveling shock waves and compression regions. 3) To develop an advanced Geospace General Circulation Model (GGCM) capable of realistically modeling space weather events, in particular the interaction with CMEs and geomagnetic storms. Furthermore, by implementing scalable run-time supports and sophisticated off- and on-line prediction algorithms, we anticipate important advances in the development of automatic and intelligent system software to optimize a wide variety of 'embedded' computations on parallel computers. Finally, public domain MHD and hydrodynamic codes had a transforming effect on space and astrophysics. We expect that our new generation, open source, public domain multi-scale code will have a similar transformational effect in a variety of disciplines, opening up new classes of problems to physicists and engineers alike.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pope, G.A.; Lake, L.W.; Sepehrnoori, K.
1988-11-01
The objective of this research is to develop, validate, and apply a comprehensive chemical flooding simulator for chemical recovery processes involving surfactants, polymers, and alkaline chemicals in various combinations. This integrated program includes laboratory experiments, physical property modelling, scale-up theory, and numerical analysis as necessary and integral components of the simulation activity. Development, testing, and application of the chemical flooding simulator (UTCHEM) to a wide variety of laboratory and reservoir problems involving tracers, polymers, polymer gels, surfactants, and alkaline agents has continued. Improvements in both the physical-chemical and numerical aspects of UTCHEM have been made which enhance its versatility, accuracy, and speed. Supporting experimental studies during the past year include relative permeability and trapping of microemulsion, tracer flow studies, oil recovery in cores using alcohol-free surfactant slugs, and microemulsion viscosity measurements. These have enabled model improvement and simulator testing. Another code, called PROPACK, has also been developed as a preprocessor for UTCHEM. Specifically, it is used to evaluate input to UTCHEM by computing and plotting key physical properties such as phase behavior and interfacial tension.
Beyond Hydrodynamic Modeling of AGN Heating in Galaxy Clusters
NASA Astrophysics Data System (ADS)
Yang, Hsiang-Yi Karen
Clusters of galaxies hold a unique position in hierarchical structure formation - they are both powerful cosmological probes and excellent astrophysical laboratories. Accurate modeling of cluster properties is crucial for reducing systematic uncertainties in cluster cosmology. However, theoretical modeling of the intracluster medium (ICM) has long suffered from the "cooling-flow problem" - clusters with short central cooling times, or cool cores (CCs), are predicted to host massive inflows of gas that are not observed. Feedback from active galactic nuclei (AGN) is by far the most promising heating mechanism to counteract radiative cooling. Recent hydrodynamic simulations have made remarkable progress in reproducing the properties of CCs. However, there remain two major questions that cannot be probed using purely hydrodynamic models: (1) what are the roles of cosmic rays (CRs)? (2) how is the existing picture altered when the ICM is modeled as a weakly collisional plasma? We propose to move beyond the limitations of pure hydrodynamics and progress toward a complete understanding of how AGN jet-inflated bubbles interact with their surroundings and provide heat to the ICM. Our objectives are to: (1) understand how CR-dominated bubbles heat the ICM; (2) understand bubble evolution and sound-wave dissipation in the ICM under different assumptions about plasma properties, e.g., collisionality of the ICM, with or without anisotropic transport processes; (3) develop a subgrid model of AGN heating that can be adopted in cosmological simulations, based on state-of-the-art isolated simulations. We will use a combination of analytical calculations and idealized simulations to advance our understanding of each individual physical process. We will then perform the first three-dimensional (3D) magnetohydrodynamic (MHD) simulations of self-regulated AGN feedback with relevant CR and anisotropic transport processes in order to quantify the amount and distribution of heating from the AGN. 
Our proposed work will elucidate the poorly understood CR and anisotropic transport processes in the weakly collisional ICM and shed light on the long-standing mystery of AGN heating in CC clusters. Our investigation, which incorporates plasma effects into fluid models and provides physical foundation for cosmological simulations, will serve as an important bridge between physics on both micro and macro scales. This study will enable robust modeling of the radio-mode feedback of AGN in cosmological simulations of cluster and galaxy formation. It will also directly impact observational studies of clusters including NASA missions such as Chandra, XMM-Newton, Astro-H/Hitomi, Fermi, HST, and Planck.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Weizhao; Ren, Huaqing; Wang, Zequn
2016-10-19
An integrated computational materials engineering method is proposed in this paper for analyzing the design and preforming process of woven carbon fiber composites. The goal is to reduce the cost and time needed for the mass production of structural composites. It integrates simulation methods from the micro-scale to the macro-scale to capture the behavior of the composite material in the preforming process. In this way, the time-consuming and costly physical experiments and prototypes in the development of the manufacturing process can be circumvented. This method contains three parts: the micro-scale representative volume element (RVE) simulation to characterize the material; the metamodeling algorithm to generate the constitutive equations; and the macro-scale preforming simulation to predict the behavior of the composite material during forming. The results show the potential of this approach as a guide to the design of composite materials and their manufacturing process.
Contributions of the ARM Program to Radiative Transfer Modeling for Climate and Weather Applications
NASA Technical Reports Server (NTRS)
Mlawer, Eli J.; Iacono, Michael J.; Pincus, Robert; Barker, Howard W.; Oreopoulos, Lazaros; Mitchell, David L.
2016-01-01
Accurate climate and weather simulations must account for all relevant physical processes and their complex interactions. Each of these atmospheric, ocean, and land processes must be considered on an appropriate spatial and temporal scale, which imposes a substantial computational burden on these simulations. One especially critical physical process is the flow of solar and thermal radiant energy through the atmosphere, which controls planetary heating and cooling and drives the large-scale dynamics that move energy from the tropics toward the poles. Radiation calculations are therefore essential for climate and weather simulations, but are themselves quite complex even without considering the effects of variable and inhomogeneous clouds. Clear-sky radiative transfer calculations have to account for thousands of absorption lines due to water vapor, carbon dioxide, and other gases, which are irregularly distributed across the spectrum and have shapes dependent on pressure and temperature. The line-by-line (LBL) codes that treat these details have a far greater computational cost than global models can afford. Therefore, the crucial requirement for accurate radiation calculations in climate and weather prediction models must be satisfied by fast solar and thermal radiation parameterizations whose high level of accuracy has been demonstrated through extensive comparisons with LBL codes. See attachment for continuation.
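The contrast the abstract draws between LBL codes and fast parameterizations can be illustrated with a toy clear-sky calculation: summing line shapes on a fine spectral grid and applying Beer-Lambert extinction. This is a minimal sketch with invented line parameters, not real spectroscopy or any ARM/RRTM code.

```python
import math

def lbl_transmission(nu_grid, lines, path):
    """Toy 'line-by-line' clear-sky transmission: accumulate Lorentz-shaped
    absorption coefficients from each line at every grid point, then apply
    Beer-Lambert, T = exp(-k * path). All numbers are illustrative."""
    trans = []
    for nu in nu_grid:
        k = 0.0
        for nu0, strength, halfwidth in lines:
            # Lorentz profile (pressure broadening); halfwidth in cm^-1
            k += strength * halfwidth / (math.pi * ((nu - nu0) ** 2 + halfwidth ** 2))
        trans.append(math.exp(-k * path))
    return trans

# Two hypothetical absorption lines; transmission dips near the line centers.
lines = [(1000.0, 0.5, 2.0), (1010.0, 0.2, 1.0)]
grid = [990.0 + 0.5 * i for i in range(81)]
t = lbl_transmission(grid, lines, path=1.0)
```

A parameterization replaces this per-line loop (thousands of lines in reality) with a few precomputed band-averaged coefficients, which is where the speedup over LBL codes comes from.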
Data-Informed Large-Eddy Simulation of Coastal Land-Air-Sea Interactions
NASA Astrophysics Data System (ADS)
Calderer, A.; Hao, X.; Fernando, H. J.; Sotiropoulos, F.; Shen, L.
2016-12-01
The study of atmospheric flows in coastal areas has not been fully addressed due to the complex processes emerging from land-air-sea interactions, e.g., abrupt changes in land topography, strong current shear, wave shoaling, and depth-limited wave breaking. The available computational tools that have been applied to study such littoral regions are mostly based on open-ocean assumptions, which often do not yield reliable solutions. The goal of the present study is to better understand some of these near-shore processes, employing advanced computational tools developed in our research group. Our computational framework combines a large-eddy simulation (LES) flow solver for atmospheric flows, a sharp-interface immersed boundary method that can handle real complex topographies (Calderer et al., J. Comp. Physics 2014), and a phase-resolved, depth-dependent wave model (Yang and Shen, J. Comp. Physics 2011). Using real measured data from the FRF station in Duck, North Carolina, we validate and demonstrate the predictive capabilities of the present computational framework, whose results are in overall good agreement with the measured data under different wind-wave scenarios. We also analyse the effects of some of the complex processes captured by our simulation tools.
Coupling of Noah-MP and the High Resolution CI-WATER ADHydro Hydrological Model
NASA Astrophysics Data System (ADS)
Moreno, H. A.; Goncalves Pureza, L.; Ogden, F. L.; Steinke, R. C.
2014-12-01
ADHydro is a physics-based, high-resolution, distributed hydrological model suitable for simulating large watersheds in a massively parallel computing environment. It simulates important processes such as: rainfall and infiltration, snowfall and snowmelt in complex terrain, vegetation and evapotranspiration, soil heat flux and freezing, overland flow, channel flow, groundwater flow and water management. For the vegetation and evapotranspiration processes, ADHydro uses the validated community land surface model (LSM) Noah-MP. Noah-MP uses multiple options for key land-surface hydrology and was developed to facilitate climate predictions with physically based ensembles. This presentation discusses the lessons learned in coupling Noah-MP to ADHydro. Noah-MP is delivered with a main driver program and not as a library with a clear interface to be called from other codes. This required some investigation to determine the correct functions to call and the appropriate parameter values. ADHydro runs Noah-MP as a point process on each mesh element and provides initialization and forcing data for each element. Modeling data are acquired from various sources including the Soil Survey Geographic Database (SSURGO), the Weather Research and Forecasting (WRF) model, and internal ADHydro simulation states. Despite these challenges in coupling Noah-MP to ADHydro, the use of Noah-MP provides the benefits of a supported community code.
Dependence of Snowmelt Simulations on Scaling of the Forcing Processes (Invited)
NASA Astrophysics Data System (ADS)
Winstral, A. H.; Marks, D. G.; Gurney, R. J.
2009-12-01
The spatial organization and scaling relationships of snow distribution in mountain environs are ultimately dependent on the controlling processes. These processes include interactions between weather, topography, vegetation, snow state, and seasonally dependent radiation inputs. In large-scale snow modeling it is vital to know these dependencies to obtain accurate predictions while reducing computational costs. This study examined the scaling characteristics of the forcing processes and the dependence of distributed snowmelt simulations on their scaling. A base model simulation characterized these processes at 10 m resolution over a 14.0 km2 basin with an elevation range of 1474-2244 masl. Each of the major processes affecting snow accumulation and melt - precipitation, wind speed, solar radiation, thermal radiation, temperature, and vapor pressure - was independently degraded to 1 km resolution. Seasonal and event-specific results were analyzed. Results indicated that scale effects on melt vary by process and weather conditions. The dependence of melt simulations on the scaling of solar radiation fluxes also had a seasonal component. These process-based scaling characteristics should remain static through time because they are based on physical considerations. As such, these results not only provide guidance for current modeling efforts, but are also well suited to predicting how potential climate changes will affect the heterogeneity of mountain snow distributions.
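The "degrading to 1 km" step described above is, in the simplest case, a block average of the fine-resolution forcing field. A minimal sketch, assuming plain block averaging (the study's actual aggregation scheme may differ):

```python
def degrade(field, factor):
    """Degrade a 2D forcing field to coarser resolution by averaging
    factor x factor blocks, e.g. factor=100 to go from 10 m to 1 km.
    Edge blocks are averaged over whatever cells remain."""
    n, m = len(field), len(field[0])
    coarse = []
    for i in range(0, n, factor):
        row = []
        for j in range(0, m, factor):
            block = [field[a][b]
                     for a in range(i, min(i + factor, n))
                     for b in range(j, min(j + factor, m))]
            row.append(sum(block) / len(block))
        coarse.append(row)
    return coarse

# Tiny example: a 4x4 gradient field degraded by a factor of 2.
fine = [[float(i + j) for j in range(4)] for i in range(4)]
coarse = degrade(fine, 2)  # -> 2x2 field of block means
```

Note that block averaging preserves the mean of a forcing but removes its sub-block variance, which is exactly why nonlinear processes like melt respond differently to degraded inputs.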
Trident: A Universal Tool for Generating Synthetic Absorption Spectra from Astrophysical Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hummels, Cameron B.; Smith, Britton D.; Silvia, Devin W.
Hydrodynamical simulations are increasingly able to accurately model physical systems on stellar, galactic, and cosmological scales; however, the utility of these simulations is often limited by our ability to directly compare them with the data sets produced by observers: spectra, photometry, etc. To address this problem, we have created trident, a Python-based open-source tool for post-processing hydrodynamical simulations to produce synthetic absorption spectra and related data. trident can (i) create absorption-line spectra for any trajectory through a simulated data set, mimicking both background-quasar and down-the-barrel configurations; (ii) reproduce the spectral characteristics of common instruments like the Cosmic Origins Spectrograph; (iii) operate across the ultraviolet, optical, and infrared using customizable absorption-line lists; (iv) trace simulated physical structures directly to spectral features; (v) approximate the presence of ion species absent from the simulation outputs; (vi) generate column density maps for any ion; and (vii) provide support for all major astrophysical hydrodynamical codes. trident was originally developed to aid in the interpretation of observations of the circumgalactic medium and intergalactic medium, but it remains a general tool applicable in other contexts.
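The core idea behind a synthetic absorption spectrum can be illustrated without the trident package itself: an ion column density along a sightline sets an optical depth with a Doppler-broadened line profile, and the observed flux is exp(-tau). This is a conceptual stdlib sketch with illustrative numbers, not trident's actual API.

```python
import math

def synthetic_absorption(wavelengths, line_center, column_density,
                         sigma_cross, b_width):
    """Sketch of absorption-line synthesis: Gaussian (Doppler) optical-depth
    profile tau(lambda) = sigma * N * exp(-((lambda - lambda0)/b)^2),
    normalized flux = exp(-tau). Parameters are hypothetical."""
    flux = []
    for lam in wavelengths:
        profile = math.exp(-((lam - line_center) / b_width) ** 2)
        tau = sigma_cross * column_density * profile
        flux.append(math.exp(-tau))
    return flux

# Wavelength grid around H I Lyman-alpha (Angstroms); tau = 1 at line center.
waves = [1215.0 + 0.067 * i for i in range(21)]
f = synthetic_absorption(waves, 1215.67, 1e14, 1e-14, 0.2)
```

A tool like trident layers onto this the sightline extraction from the simulation grid, realistic line lists, and instrument line-spread functions.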
Risk Assessment and Scaling for the SLS LH2 ET
NASA Technical Reports Server (NTRS)
Hafiychuk, Halyna; Ponizovskaya-Devine, Ekaterina; Luchinsky, Dmitry; Khasin, Michael; Osipov, Viatcheslav V.; Smelyanskiy, Vadim N.
2012-01-01
In this report the main physical processes in the LH2 tank during prepress and rocket flight are studied. The goal of this investigation is to analyze possible hazards and to make a risk assessment of proposed LH2 tank designs for SLS with 5 engines (the situation with 4 engines is less critical). For the analysis we use the multinode model (MNM) developed by us and presented in a separate report, as well as 3D ANSYS simulations. We carry out simulation and theoretical analysis of physical processes such as (i) accumulation of bubbles in LH2 during the replenish stage and their collapse in the liquid during prepress; (ii) condensation-evaporation at the liquid-vapor interface and tank wall; (iii) heating of the liquid near the interface and wall due to condensation and environmental heat; (iv) injection of hot He during prepress and of hot GH2 during flight; (v) mixing and cooling of the injected gases due to heat transfer between the gases, liquid, and tank wall. We analyze the effects of these physical processes on the thermo- and fluid gas dynamics in the ullage and on the temperature stratification in the liquid, and assess the associated hazards. Special emphasis is placed on scaling predictions for the larger SLS LH2 tank.
NASA Astrophysics Data System (ADS)
Borrás, E.; Ródenas, M.; Vera, T.; Muñoz, A.
2015-12-01
Atmospheric particulate matter has a large impact on climate, the biosphere, and human health. Its study is complex because a large number of species are present at low concentrations and evolve continuously in time, and because their behaviour is not easily separated from meteorology and transport processes. Closed systems have been proposed to isolate specific reactions, pollutants, or products and to control the oxidizing environment. High-volume simulation chambers, such as the EUropean PHOtoREactor (EUPHORE), are an essential tool for simulating atmospheric photochemical reactions. This communication describes the latest results on the reactivity of prominent atmospheric pollutants and the subsequent particulate matter formation. Specific experiments focused on organic aerosols have been conducted at the EUPHORE photo-reactor. The use of on-line instrumentation, supported by off-line techniques, has provided well-defined reaction profiles and physical properties, and up to 300 different species have been determined in particulate matter. The application fields include the degradation of anthropogenic and biogenic pollutants and pesticides under several atmospheric conditions, studying their contribution to the formation of secondary organic aerosols (SOA). The studies performed at EUPHORE have improved the mechanistic understanding of atmospheric degradation processes and the knowledge of the chemical and physical properties of the atmospheric particulate matter formed during these processes.
NASA Astrophysics Data System (ADS)
Yeoh, S. K.; Li, Z.; Goldstein, D. B.; Varghese, P. L.; Trafton, L. M.; Levin, D. A.
2014-12-01
The Enceladus ice/vapor plume not only accounts for various features observed in the Saturnian system, such as the E-ring, the narrow neutral H2O torus, and Enceladus' own bright albedo, but also raises exciting new possibilities, including the existence of liquid water on Enceladus. Therefore, understanding the plume and its physics is important. Here we assume that the plume arises from flow expansion within multiple narrow subsurface cracks connected to reservoirs of liquid water underground, and simulate this expanding flow from the underground reservoir out to several Enceladus radii where Cassini data are available for comparison. The direct simulation Monte Carlo (DSMC) method is used to simulate the subsurface and near-field collisional regions, and a free-molecular model is used to propagate the plume out into the far field. We include the following physical processes in our simulations: the flow interaction with the crack walls, grain condensation from the vapor phase, non-equilibrium effects (e.g. freezing of molecular internal energy modes), the interaction between the vapor and the ice grains, the gravitational fields of Enceladus and Saturn, and Coriolis and centrifugal forces (due to motion in a non-inertial reference frame). The end result is a plume model that includes the relevant physics of the flow from the underground source out to where Cassini measurements are taken. We have made certain assumptions about the channel geometry and reservoir conditions. The model is constrained using various available Cassini data (particularly those of INMS, CDA and UVIS) to understand the plume physics as well as to estimate the vapor and grain production rates and their temporal variability.
NASA Astrophysics Data System (ADS)
Mert, A.
2016-12-01
The main motivation of this study is the impending occurrence of a catastrophic earthquake along the Prince Island Fault (PIF) in the Marmara Sea and the disaster risk around the Marmara region, especially in İstanbul. This study provides the results of a physically based Probabilistic Seismic Hazard Analysis (PSHA) methodology, using broad-band strong ground motion simulations, for sites within the Marmara region, Turkey, due to possible large earthquakes throughout the PIF segments in the Marmara Sea. The methodology is called physically based because it depends on the physical processes of earthquake rupture and wave propagation to simulate earthquake ground motion time histories. We include the effects of all earthquakes of considerable magnitude. To generate the high-frequency (0.5-20 Hz) part of the broadband earthquake simulation, real small-magnitude earthquakes recorded by a local seismic array are used as Empirical Green's Functions (EGF). For frequencies below 0.5 Hz, the simulations are obtained using Synthetic Green's Functions (SGF), which are synthetic seismograms calculated by an explicit 2D/3D elastic finite-difference wave propagation routine. Using a range of rupture scenarios for all considerable-magnitude earthquakes throughout the PIF segments, we provide a hazard calculation for frequencies of 0.1-20 Hz. The physically based PSHA used here follows the same procedure as conventional PSHA, except that conventional PSHA utilizes point sources or a series of point sources to represent earthquakes, whereas this approach utilizes full rupture of earthquakes along faults. Further, conventional PSHA predicts ground-motion parameters using empirical attenuation relationships, whereas this approach calculates synthetic seismograms for all magnitudes of earthquakes to obtain ground-motion parameters. PSHA results are produced for 2%, 10% and 50% hazard levels for all studied sites in the Marmara region.
MOOSE: A parallel computational framework for coupled systems of nonlinear equations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Derek Gaston; Chris Newman; Glen Hansen
Systems of coupled, nonlinear partial differential equations (PDEs) often arise in the simulation of nuclear processes. MOOSE, the Multiphysics Object Oriented Simulation Environment, a parallel computational framework targeted at the solution of such systems, is presented. As opposed to traditional data-flow oriented computational frameworks, MOOSE is founded on the mathematical principle of Jacobian-free Newton-Krylov (JFNK) solution methods. Utilizing the mathematical structure present in JFNK, physics expressions are modularized into 'Kernels,' allowing rapid production of new simulation tools. In addition, systems are solved implicitly and fully coupled, employing physics-based preconditioning, which provides great flexibility even with large variance in time scales. A summary of the mathematics, an overview of the structure of MOOSE, and several representative solutions from applications built on the framework are presented.
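The JFNK principle the abstract rests on has one core primitive: the Jacobian-vector product can be approximated by a single extra residual evaluation, so the Jacobian matrix is never assembled. A minimal Python sketch of that primitive (MOOSE itself is C++ and delegates the Krylov solve and preconditioning to PETSc; this only illustrates the matrix-free product):

```python
def jfnk_matvec(F, u, v, eps=1e-7):
    """Approximate the Jacobian-vector product J(u) v via a finite
    difference of the residual: J v ~ (F(u + eps*v) - F(u)) / eps.
    This is the 'Jacobian-free' trick at the heart of JFNK."""
    Fu = F(u)
    Fp = F([ui + eps * vi for ui, vi in zip(u, v)])
    return [(a - b) / eps for a, b in zip(Fp, Fu)]

# Example residual for a small coupled nonlinear system:
# F(u) = (u0^2 + u1 - 3, u0 + u1^2 - 5), whose Jacobian is
# [[2*u0, 1], [1, 2*u1]].
def F(u):
    return [u[0] ** 2 + u[1] - 3.0, u[0] + u[1] ** 2 - 5.0]

u = [1.0, 2.0]
v = [1.0, 0.0]
Jv = jfnk_matvec(F, u, v)  # analytic value: J(u) @ v = (2*u0, 1) = (2, 1)
```

Inside a Newton iteration, a Krylov method (e.g. GMRES) only ever needs such products, which is what lets MOOSE Kernels contribute residual terms without ever forming coupled Jacobian blocks.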
NASA Astrophysics Data System (ADS)
Bosch, R.; Ward, D.
2017-12-01
Investigation of erosion rates and processes at knickpoints in surface bedrock streams is an active area of research, involving complex feedbacks in the coupled relationships between dissolution, abrasion, and plucking that have not been sufficiently addressed. Even less research has addressed how these processes operate to propagate knickpoints through cave passages in layered sedimentary rocks, despite these features being common along subsurface streams. In both settings, there is evidence for mechanical and chemical erosion, but in cave passages the different hydrologic and hydraulic regimes, combined with an important role for the dissolution process, affect the relative roles and coupled interactions between these processes, and distinguish them from surface stream knickpoints. Using a novel approach of imaging cave passages using Structure from Motion (SFM), we create 3D geometry meshes to explore these systems using multiphysics simulation, and compare the processes as they occur in caves with those in surface streams. Here we focus on four field sites with actively eroding streambeds that include knickpoints: Upper River Acheron and Devil's Cooling Tub in Mammoth Cave, Kentucky; and two surface streams in Clermont County, Ohio, Avey's Run and Fox Run. SFM 3D reconstructions are built using images exported from 4K video shot at each field location. We demonstrate that SFM is a viable imaging approach for reconstructing cave passages with complex morphologies. We then use these reconstructions to create meshes upon which to run multiphysics simulations using STAR-CCM+. Our approach incorporates multiphase free-surface computational fluid dynamics simulations with sediment transport modeled using discrete element method grains. Physical and chemical properties of the water, bedrock, and sediment enable computation of shear stress, sediment impact forces, and chemical kinetic conditions at the bed surface. 
Preliminary results demonstrate the efficacy of commercially available multiphysics simulation software for modeling various flow conditions, erosional processes, and their complex coupled interactions in cave passages and in surface stream channels, expanding knowledge and understanding of overall cave system development and river profile erosion.
NASA Astrophysics Data System (ADS)
Yang, J.; Zammit, C.; McMillan, H. K.
2016-12-01
As in most countries worldwide, water management in lowland areas is a major concern for New Zealand due to the economic importance of water-related human activities. As a result, the estimation of available water resources in these areas (e.g., for irrigation and water supply purposes) is crucial and often requires an understanding of complex hydrological processes, which are often characterized by strong interactions between surface water and groundwater (usually expressed as losing and gaining rivers). These processes are often represented and simulated using integrated, physically based hydrological models. However, models with physically based groundwater modules typically require large amounts of geologic and aquifer information that is not readily available, and are computationally intensive. Instead, this paper presents a conceptual groundwater model that is fully integrated into New Zealand's national hydrological model TopNet, which is based on TopModel concepts (Beven, 1992). Within this conceptual framework, the integrated model can simulate not only surface processes, but also groundwater processes and surface water-groundwater interaction processes (including groundwater flow, river-groundwater interaction, and groundwater interaction with external watersheds). The developed model was applied to two New Zealand catchments with different hydro-geological and climate characteristics (Pareora catchment in the Canterbury Plains and Grey catchment on the West Coast). Previous studies have documented strong interactions between the river and groundwater in these catchments, based on the analysis of a large number of concurrent flow measurements and associated information along the river main stem. Application of the integrated hydrological model indicates that flow simulations during low-flow conditions are significantly improved (compared to the original hydrological model conceptualisation) and that further insights into local river dynamics are gained. 
Due to its conceptual characteristics and low level of data requirement, the integrated model could be used at local and national scales to improve the simulation of hydrological processes in non-topographically driven areas (where groundwater processes are important), and to assess impact of climate change on the integrated hydrological cycle in these areas.
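The kind of conceptual groundwater store described above can be sketched as a linear reservoir exchanging water with a river reach according to a head difference, capturing the losing/gaining behaviour with very few parameters. The equations below are hypothetical illustrations, not the actual TopNet formulation.

```python
def simulate_store(recharge, k_gw, k_riv, s0, h_riv):
    """Conceptual groundwater store: linear-reservoir baseflow (k_gw * S)
    plus a river exchange flux proportional to the storage-head difference.
    exchange > 0: river gains from groundwater; exchange < 0: river loses.
    All parameters and units are illustrative."""
    s = s0
    series = []
    for r in recharge:
        baseflow = k_gw * s
        exchange = k_riv * (s - h_riv)
        s = s + r - baseflow - exchange
        series.append((s, baseflow, exchange))
    return series

# Constant recharge into an initially empty store next to a river stage of 2:
# the reach starts out losing, then turns gaining as storage builds up.
out = simulate_store([1.0] * 10, k_gw=0.1, k_riv=0.05, s0=0.0, h_riv=2.0)
```

With constant forcing the store relaxes toward a steady state S* = (r + k_riv * h_riv) / (k_gw + k_riv), which is the low-data behaviour that makes such conceptual modules attractive at national scale.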
Expected Performance of the ATLAS Experiment - Detector, Trigger and Physics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aad, G.; Abat, E.; Abbott, B.
2011-11-28
The Large Hadron Collider (LHC) at CERN promises a major step forward in the understanding of the fundamental nature of matter. The ATLAS experiment is a general-purpose detector for the LHC, whose design was guided by the need to accommodate the wide spectrum of possible physics signatures. The major remit of the ATLAS experiment is the exploration of the TeV mass scale, where groundbreaking discoveries are expected. The focus is the investigation of electroweak symmetry breaking and, linked to this, the search for the Higgs boson, as well as the search for physics beyond the Standard Model. In this report a detailed examination of the expected performance of the ATLAS detector is provided, with a major aim being to investigate the experimental sensitivity to a wide range of measurements and potential observations of new physical processes. An earlier summary of the expected capabilities of ATLAS was compiled in 1999 [1]. A survey of physics capabilities of the CMS detector was published in [2]. The design of the ATLAS detector has now been finalised, and its construction and installation have been completed [3]. An extensive test-beam programme was undertaken. Furthermore, the simulation and reconstruction software code and frameworks have been completely rewritten. Revisions incorporated reflect improved detector modelling as well as major technical changes to the software technology. Greatly improved understanding of calibration and alignment techniques, and their practical impact on performance, is now in place. The studies reported here are based on full simulations of the ATLAS detector response. A variety of event generators were employed. The simulation and reconstruction of these large event samples thus provided an important operational test of the new ATLAS software system. 
In addition, the processing was distributed world-wide over the ATLAS Grid facilities and hence provided an important test of the ATLAS computing system - this is the origin of the expression 'CSC studies' ('computing system commissioning'), which is occasionally referred to in these volumes. The work reported does generally assume that the detector is fully operational, and in this sense represents an idealised detector: establishing the best performance of the ATLAS detector with LHC proton-proton collisions is a challenging task for the future. The results summarised here therefore represent the best estimate of ATLAS capabilities before real operational experience of the full detector with beam. Unless otherwise stated, simulations also do not include the effect of additional interactions in the same or other bunch-crossings, and the effect of neutron background is neglected. Thus simulations correspond to the low-luminosity performance of the ATLAS detector. This report is broadly divided into two parts: firstly the performance for identification of physics objects is examined in detail, followed by a detailed assessment of the performance of the trigger system. This part is subdivided into chapters surveying the capabilities for charged particle tracking, each of electron/photon, muon and tau identification, jet and missing transverse energy reconstruction, b-tagging algorithms and performance, and finally the trigger system performance. In each chapter of the report, there is a further subdivision into shorter notes describing different aspects studied. The second major subdivision of the report addresses physics measurement capabilities, and new physics search sensitivities. 
Individual chapters in this part discuss ATLAS physics capabilities in Standard Model QCD and electroweak processes, in the top quark sector, in b-physics, in searches for Higgs bosons, supersymmetry searches, and finally searches for other new particles predicted in more exotic models.
ATLAS Simulation using Real Data: Embedding and Overlay
NASA Astrophysics Data System (ADS)
Haas, Andrew; ATLAS Collaboration
2017-10-01
For some physics processes studied with the ATLAS detector, a more accurate simulation in some respects can be achieved by including real data into simulated events, with substantial potential improvements in the CPU, disk space, and memory usage of the standard simulation configuration, at the cost of significant database and networking challenges. Real proton-proton background events can be overlaid (at the detector digitization output stage) on a simulated hard-scatter process, to account for pileup background (from nearby bunch crossings), cavern background, and detector noise. A similar method is used to account for the large underlying event from heavy ion collisions, rather than directly simulating the full collision. Embedding replaces the muons found in Z→μμ decays in data with simulated taus at the same 4-momenta, thus preserving the underlying event and pileup from the original data event. In all these cases, care must be taken to exactly match detector conditions (beamspot, magnetic fields, alignments, dead sensors, etc.) between the real data event and the simulation. We will discuss the status of these overlay and embedding techniques within ATLAS software and computing.
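At its core, the overlay step described above combines a simulated hard-scatter event with a real background event at the digitization level, channel by channel. A toy sketch of that combination (the real ATLAS overlay also handles thresholds, noise modelling, and conditions matching; channel names here are invented):

```python
def overlay(sim_digits, data_digits):
    """Toy detector-level overlay: add the digitized counts of a real
    background event (pileup, cavern background, noise) to a simulated
    hard-scatter event, channel by channel. Channels present in only
    one event pass through unchanged."""
    out = dict(sim_digits)
    for channel, counts in data_digits.items():
        out[channel] = out.get(channel, 0) + counts
    return out

# Hypothetical channel maps: simulated hard scatter + recorded background.
sim = {"calo_001": 120, "calo_002": 5}
data = {"calo_002": 30, "calo_003": 7}
merged = overlay(sim, data)
```

The matching of beamspot, magnetic field, alignment, and dead-sensor conditions mentioned in the abstract is precisely what guarantees that summing the two channel maps is physically meaningful.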
Simulating Technology Processes to Foster Learning.
ERIC Educational Resources Information Center
Krumholtz, Nira
1998-01-01
Based on a spiral model of technology evolution, elementary students used LOGO computer software to become both developers and users of technology. The computerized environment enabled 87% to reach intuitive understanding of physical concepts; 24% expressed more formal scientific understanding. (SK)
perf-dump
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gamblin, T.
perf-dump is a library for dumping performance data in much the same way physics simulations dump checkpoints. It records per-process, per-timestep, per-phase, and per-thread performance counter data and dumps this large data periodically into an HDF5 data file.
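The checkpoint-style dump layout described above can be sketched as follows. This is a minimal illustration, not the perf-dump API: the class name, method names, and the JSON backend are assumptions (the real library writes HDF5 and reads hardware performance counters), but the per-process/per-timestep/per-phase/per-thread nesting and the periodic dump mirror the description.

```python
import json
import os
import tempfile

class PerfDump:
    """Sketch of checkpoint-style performance dumps.

    Mirrors the per-process/per-timestep/per-phase/per-thread layout
    described for perf-dump, but writes JSON instead of HDF5 so the
    example stays dependency-free.
    """
    def __init__(self, path, dump_every=2):
        self.path = path
        self.dump_every = dump_every
        self.data = {}       # rank -> timestep -> phase -> thread -> counters
        self.steps_seen = 0

    def record(self, rank, timestep, phase, thread, counters):
        node = (self.data.setdefault(str(rank), {})
                         .setdefault(str(timestep), {})
                         .setdefault(phase, {}))
        node[str(thread)] = counters

    def end_timestep(self):
        self.steps_seen += 1
        if self.steps_seen % self.dump_every == 0:
            with open(self.path, "w") as f:
                json.dump(self.data, f)  # periodic dump, like a checkpoint

# usage: record fake counters over four timesteps
path = os.path.join(tempfile.gettempdir(), "perf_dump_demo.json")
pd = PerfDump(path)
for step in range(4):
    pd.record(rank=0, timestep=step, phase="solve", thread=0,
              counters={"cycles": 1000 + step, "cache_misses": 10 * step})
    pd.end_timestep()
```

The nested-dict layout maps naturally onto HDF5 groups (one group level per index, one dataset per counter), which is what makes the dumped files convenient for post-hoc analysis tools.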
Thermo-Chemical Phenomena Simulation for Ablation
2011-02-21
Report period: 1 January 2008 to 30 November 2010. From the abstract (fragmentary): a physics-based chemical kinetic model for high-temperature gas is developed and verified by comparison with data from the RAM-C II probe; [certain effects are] found to be negligible, and the energy exchange for conductive-convective heat transfer is dominated by the chemical process.
Engine management during NTRE start up
NASA Technical Reports Server (NTRS)
Bulman, Mel; Saltzman, Dave
1993-01-01
The topics are presented in viewgraph form and include the following: total engine system management critical to successful nuclear thermal rocket engine (NTRE) start up; NERVA type engine start windows; reactor power control; heterogeneous reactor cooling; propellant feed system dynamics; integrated NTRE start sequence; moderator cooling loop and efficient NTRE starting; analytical simulation and low risk engine development; accurate simulation through dynamic coupling of physical processes; and integrated NTRE and mission performance.
2015-09-01
A high-resolution numerical simulation of jet breakup and spray formation from a complex diesel fuel injector at diesel engine type conditions has been performed. From the abstract (fragmentary): a full understanding of the primary atomization process in diesel fuel [...]; for diesel liquid sprays the complexity is further compounded by the physical attributes present, including nozzle turbulence and large density ratios.
Status of the Electroforming Shield Design (ESD) project
NASA Technical Reports Server (NTRS)
Fletcher, R. E.
1977-01-01
The utilization of a digital computer to augment electrodeposition/electroforming processes in which nonconducting shielding controls local cathodic current distribution is reported. The primary underlying philosophy of the physics of electrodeposition was presented. The technical approach taken to analytically simulate electrolytic tank variables was also included. A FORTRAN computer program has been developed and implemented. The program utilized finite element techniques and electrostatic theory to simulate electropotential fields and ionic transport.
NASA Astrophysics Data System (ADS)
Hristova-Veleva, S.; Chao, Y.; Vane, D.; Lambrigtsen, B.; Li, P. P.; Knosp, B.; Vu, Q. A.; Su, H.; Dang, V.; Fovell, R.; Tanelli, S.; Garay, M.; Willis, J.; Poulsen, W.; Fishbein, E.; Ao, C. O.; Vazquez, J.; Park, K. J.; Callahan, P.; Marcus, S.; Haddad, Z.; Fetzer, E.; Kahn, R.
2007-12-01
In spite of recent improvements in hurricane track forecast accuracy, there are still many unanswered questions about the physical processes that determine hurricane genesis, intensity, track, and impact on the large-scale environment. Furthermore, a significant amount of work remains to be done in validating hurricane forecast models, understanding their sensitivities, and improving their parameterizations. None of this can be accomplished without a comprehensive set of multiparameter observations that are relevant to both the large-scale and the storm-scale processes in the atmosphere and in the ocean. To address this need, we have developed a prototype of a comprehensive hurricane information system of high-resolution satellite, airborne, and in-situ observations and model outputs pertaining to: i) the thermodynamic and microphysical structure of the storms; ii) the air-sea interaction processes; iii) the larger-scale environment as depicted by the SST, ocean heat content, and the aerosol loading of the environment. Our goal was to create a one-stop place that provides researchers with an extensive set of observed hurricane data, and their graphical representation, together with large-scale and convection-resolving model output, all organized so that it is easy to determine when coincident observations from multiple instruments are available. Analysis tools will be developed in the next step. The analysis tools will be used to determine spatial, temporal, and multiparameter covariances that are needed to evaluate model performance, provide information for data assimilation, and characterize and compare observations from different platforms. We envision that the developed hurricane information system will help in the validation of hurricane models, in the systematic understanding of their sensitivities, and in the improvement of the physical parameterizations employed by the models.
Furthermore, it will help in studying the physical processes that affect hurricane development and its impact on the large-scale environment. This talk will describe the developed prototype of the hurricane information system. We will also use a set of WRF hurricane simulations and compare simulated to observed structures to illustrate how the information system can be used to discriminate between simulations that employ different physical parameterizations. The work described here was performed at the Jet Propulsion Laboratory, California Institute of Technology, under contract with the National Aeronautics and Space Administration.
Feng, Zhihong; Zhao, Jinlong; Zhou, Libin; Dong, Yan; Zhao, Yimin
2009-10-01
The purpose of this report is to describe the establishment of an animal model with a unilateral maxillary defect and the application of virtual reality and rapid prototyping in surgical planning for dentoalveolar distraction osteogenesis (DO). Two adult dogs were used to develop an animal model with a unilateral maxillary defect. The 3-dimensional model of the canine craniofacial skeleton was reconstructed from computed tomography data using the software Mimics, version 12.0 (Materialise Group, Leuven, Belgium). A virtual individual distractor was designed and transferred onto the model with the defect, and the osteotomies and distraction processes were simulated. A precise casting technique and numeric control technology were applied to produce the titanium distraction device, which was installed on the physical model of the defect generated using Selective Laser Sintering technology, and the in vitro simulation of osteotomies and DO was performed. The 2 dogs survived the operation and were lively. The osteotomies and distraction process were simulated successfully on both the virtual and the physical models. The bone transport could be distracted to the desired position both in the virtual environment and on the physical model. The novel method of developing an animal model with a unilateral maxillary defect was feasible, and the animal model was suitable for developing the reconstruction method for unilateral maxillary defect cases with dentoalveolar DO. Computer-assisted surgical planning and simulation improved the reliability of maxillofacial surgery, especially for complex cases. The novel idea of reconstructing the unilateral maxillary defect with dentoalveolar DO was validated through the model experiment.
Constraining Stochastic Parametrisation Schemes Using High-Resolution Model Simulations
NASA Astrophysics Data System (ADS)
Christensen, H. M.; Dawson, A.; Palmer, T.
2017-12-01
Stochastic parametrisations are used in weather and climate models as a physically motivated way to represent model error due to unresolved processes. Designing new stochastic schemes has been the target of much innovative research over the last decade. While a focus has been on developing physically motivated approaches, many successful stochastic parametrisation schemes are very simple, such as the European Centre for Medium-Range Weather Forecasts (ECMWF) multiplicative scheme `Stochastically Perturbed Parametrisation Tendencies' (SPPT). The SPPT scheme improves the skill of probabilistic weather and seasonal forecasts, and so is widely used. However, little work has focused on assessing the physical basis of the SPPT scheme. We address this matter by using high-resolution model simulations to explicitly measure the `error' in the parametrised tendency that SPPT seeks to represent. The high resolution simulations are first coarse-grained to the desired forecast model resolution before they are used to produce initial conditions and forcing data needed to drive the ECMWF Single Column Model (SCM). By comparing SCM forecast tendencies with the evolution of the high resolution model, we can measure the `error' in the forecast tendencies. In this way, we provide justification for the multiplicative nature of SPPT, and for the temporal and spatial scales of the stochastic perturbations. However, we also identify issues with the SPPT scheme. It is therefore hoped these measurements will improve both holistic and process based approaches to stochastic parametrisation. Figure caption: Instantaneous snapshot of the optimal SPPT stochastic perturbation, derived by comparing high-resolution simulations with a low resolution forecast model.
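The multiplicative structure of SPPT can be illustrated in a few lines of code. This is a schematic sketch, not the ECMWF implementation: the real scheme uses a spectral pattern correlated in both space and time, while here a single AR(1) process stands in for the pattern, and the decorrelation time, standard deviation, and clipping value are illustrative choices.

```python
import math
import random

def ar1_noise(n_steps, tau, sigma, seed=0):
    """AR(1) noise with decorrelation time tau (in steps) and standard
    deviation sigma -- the kind of temporally correlated pattern SPPT
    multiplies against the parametrised tendencies."""
    rng = random.Random(seed)
    phi = math.exp(-1.0 / tau)
    eps = math.sqrt(1.0 - phi * phi) * sigma
    e, out = 0.0, []
    for _ in range(n_steps):
        e = phi * e + eps * rng.gauss(0.0, 1.0)
        out.append(e)
    return out

def sppt_tendency(parametrised_tendency, e, clip=1.0):
    """Multiplicative SPPT-style perturbation: T' = (1 + e) * T,
    with e clipped to [-clip, clip] so the tendency keeps its sign."""
    e = max(-clip, min(clip, e))
    return (1.0 + e) * parametrised_tendency

noise = ar1_noise(n_steps=100, tau=10.0, sigma=0.3)
perturbed = [sppt_tendency(2.5, e) for e in noise]
```

The coarse-grained high-resolution runs described above are precisely what allows parameters like tau and sigma to be constrained by measurement rather than tuned by hand.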
A density-adaptive SPH method with kernel gradient correction for modeling explosive welding
NASA Astrophysics Data System (ADS)
Liu, M. B.; Zhang, Z. L.; Feng, D. L.
2017-09-01
Explosive welding involves processes such as the detonation of explosives, the impact of metal structures, and strong fluid-structure interaction, yet the whole process of explosive welding has not previously been well modeled. In this paper, a novel smoothed particle hydrodynamics (SPH) model is developed to simulate explosive welding. In the SPH model, a kernel gradient correction algorithm is used to achieve better computational accuracy. A density-adapting technique which can effectively treat large density ratios is also proposed. The developed SPH model is first validated by simulating a benchmark problem of one-dimensional TNT detonation and an impact welding problem. The SPH model is then successfully applied to simulate the whole process of explosive welding. It is demonstrated that the presented SPH method can capture the typical physics in explosive welding, including the explosion wave, welding surface morphology, jet flow, and acceleration of the flyer plate. The welding angle obtained from the SPH simulation agrees well with that from a kinematic analysis.
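The effect of a kernel gradient correction can be demonstrated in one dimension. The sketch below is not the authors' model: the kernel, particle layout, and volume estimate are illustrative assumptions. The corrected estimate divides the standard SPH gradient by a normalization factor built from the local particle distribution, which restores exact gradients for linear fields even on irregularly spaced particles.

```python
import math

def dW(r, h):
    """Derivative of a 1D cubic-spline kernel (illustrative choice)."""
    q = abs(r) / h
    if q >= 2.0:
        return 0.0
    sigma = 2.0 / (3.0 * h)
    if q < 1.0:
        dwdq = -3.0 * q + 2.25 * q * q
    else:
        dwdq = -0.75 * (2.0 - q) ** 2
    return sigma * dwdq / h * (1.0 if r > 0 else -1.0)

# irregularly spaced particles carrying a linear field f(x) = 2x + 1
positions = [0.0, 0.11, 0.19, 0.32, 0.40, 0.52, 0.61, 0.73, 0.80, 0.94]
h = 0.15
f = [2.0 * x + 1.0 for x in positions]

def volumes(xs):
    """Particle 'volumes' from half-distances to neighbours (sketch)."""
    V = []
    for k in range(len(xs)):
        left = xs[k] - xs[k - 1] if k > 0 else xs[1] - xs[0]
        right = xs[k + 1] - xs[k] if k < len(xs) - 1 else xs[-1] - xs[-2]
        V.append(0.5 * (left + right))
    return V

def corrected_gradient(i, xs, fvals, V, h):
    """SPH gradient at particle i, normalized by the kernel-gradient
    correction factor (the 1D analogue of the correction matrix)."""
    num = den = 0.0
    for j, xj in enumerate(xs):
        if j == i:
            continue
        g = dW(xs[i] - xj, h)       # kernel gradient w.r.t. x_i
        num += V[j] * (fvals[j] - fvals[i]) * g
        den += V[j] * (xj - xs[i]) * g
    return num / den

V = volumes(positions)
grad = corrected_gradient(5, positions, f, V, h)  # exact slope 2 expected
```

Without the division by `den`, the raw sum `num` deviates from the true slope wherever spacing is irregular; the correction removes that discretization error exactly for linear fields, which is the consistency property exploited in the paper.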
Bayramzadeh, Sara; Joseph, Anjali; Allison, David; Shultz, Jonas; Abernathy, James
2018-07-01
This paper describes the process and tools developed as part of a multidisciplinary collaborative simulation-based approach for iterative design and evaluation of operating room (OR) prototypes. Full-scale physical mock-ups of healthcare spaces offer an opportunity to actively communicate with and engage multidisciplinary stakeholders in the design process. While mock-ups are increasingly being used in healthcare facility design projects, they are rarely evaluated in a manner that supports active user feedback and engagement. Researchers and architecture students worked closely with clinicians and architects to develop OR design prototypes and engaged clinical end-users in simulated scenarios. An evaluation toolkit was developed to compare design prototypes. The mock-up evaluation helped the team make key decisions about room size, location of the OR table, intra-room zoning, and door locations. Structured simulation-based mock-up evaluations conducted during the design process can help stakeholders visualize their future workspace and provide active feedback. Copyright © 2018 Elsevier Ltd. All rights reserved.
Acoustic response of cemented granular sedimentary rocks: molecular dynamics modeling.
García, Xavier; Medina, Ernesto
2007-06-01
The effect of cementation processes on the acoustical properties of sands is studied via molecular dynamics simulation methods. We propose numerical methods where the initial uncemented sand is built by simulating the settling process of sediments. Uncemented samples of different porosity are considered by emulating natural mechanical compaction of sediments due to overburden. Cementation is considered through a particle-based model that captures the underlying physics behind the process. In our simulations, we consider samples with different degrees of compaction and cementing materials with distinct elastic properties. The microstructure of cemented sands is taken into account while adding cement at specific locations within the pores, such as grain-to-grain contacts. Results show that the acoustical properties of cemented sands are strongly dependent on the amount of cement, its stiffness relative to the hosting medium, and its location within the pores. Simulation results are in good correspondence with available experimental data and compare favorably with some theoretical predictions for the sound velocity within a range of cement saturation, porosity, and confining pressure.
Shibuta, Yasushi; Sakane, Shinji; Miyoshi, Eisuke; Okita, Shin; Takaki, Tomohiro; Ohno, Munekazu
2017-04-05
Can completely homogeneous nucleation occur? Large-scale molecular dynamics simulations performed on a graphics-processing-unit-rich supercomputer can shed light on this long-standing issue. Here, a billion-atom molecular dynamics simulation of homogeneous nucleation from an undercooled iron melt reveals that some satellite-like small grains surrounding previously formed large grains exist in the middle of the nucleation process, and these are not distributed uniformly. At the same time, grains with a twin boundary are formed by heterogeneous nucleation from the surface of the previously formed grains. The local heterogeneity in the distribution of grains is caused by the local accumulation of the icosahedral structure in the undercooled melt near the previously formed grains. This insight is mainly attributable to multi-graphics-processing-unit parallel computation combined with the rapid progress in high-performance computational environments. Nucleation is a fundamental physical process; however, it is a long-standing question whether completely homogeneous nucleation can occur. Here the authors reveal, via a billion-atom molecular dynamics simulation, that local heterogeneity exists during homogeneous nucleation in an undercooled iron melt.
Shraiki, Mario; Arba-Mosquera, Samuel
2011-06-01
To evaluate ablation algorithms and temperature changes in laser refractive surgery. The model (virtual laser system [VLS]) simulates the different physical effects of an entire surgical process, including the shot-by-shot ablation based on a modeled beam profile. The model is comprehensive and directly considers the applied correction; corneal geometry, including astigmatism; laser beam characteristics; and ablative spot properties. Pulse lists collected from actual treatments were used to simulate the temperature increase during the ablation process. Reduced ablation efficiency in the periphery resulted in a lower peripheral temperature increase. Steep corneas had smaller temperature increases than flat ones. The maximum rise in temperature depends on the spatial density of the ablation pulses. For the same number of ablative pulses, myopic corrections showed the highest temperature increase, followed by myopic astigmatism, mixed astigmatism, phototherapeutic keratectomy (PTK), hyperopic astigmatism, and hyperopic treatments. The proposed model can be used, at relatively low cost, for calibration, verification, and validation of the laser systems used for ablation processes and would directly improve the quality of the results.
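The dependence of peak temperature on pulse density noted above can be reproduced with a toy thermal model. This is a sketch under stated assumptions, not the VLS model: a single lumped temperature, an assumed per-pulse rise `dT_per_pulse`, and an assumed exponential cooling time constant `tau`; it only illustrates why denser pulse trains accumulate more heat.

```python
import math

def peak_temperature_rise(pulse_times, dT_per_pulse=0.12, tau=0.35):
    """Shot-by-shot temperature sketch: each pulse adds dT_per_pulse,
    and the accumulated heat decays exponentially with time constant
    tau between pulses. Both parameters are illustrative, not measured."""
    T, peak = 0.0, 0.0
    t_prev = pulse_times[0]
    for t in pulse_times:
        T *= math.exp(-(t - t_prev) / tau)  # cooling since last pulse
        T += dT_per_pulse                    # heating by this pulse
        peak = max(peak, T)
        t_prev = t
    return peak

# same number of pulses, different temporal/spatial density
dense = [i * 0.002 for i in range(500)]
sparse = [i * 0.010 for i in range(500)]
```

With the same pulse count, the densely packed train reaches a much higher quasi-steady temperature because less heat diffuses away between shots, consistent with the pulse-density dependence reported above.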
Exploring the physical layer frontiers of cellular uplink: The Vienna LTE-A Uplink Simulator.
Zöchmann, Erich; Schwarz, Stefan; Pratschner, Stefan; Nagel, Lukas; Lerch, Martin; Rupp, Markus
Communication systems in practice are subject to many technical/technological constraints and restrictions. Multiple input, multiple output (MIMO) processing in current wireless communications, as an example, mostly employs codebook-based pre-coding to save computational complexity at the transmitters and receivers. In such cases, closed form expressions for capacity or bit-error probability are often unattainable; effects of realistic signal processing algorithms on the performance of practical communication systems rather have to be studied in simulation environments. The Vienna LTE-A Uplink Simulator is a 3GPP LTE-A standard compliant MATLAB-based link level simulator that is publicly available under an academic use license, facilitating reproducible evaluations of signal processing algorithms and transceiver designs in wireless communications. This paper reviews research results that have been obtained by means of the Vienna LTE-A Uplink Simulator, highlights the effects of single-carrier frequency-division multiplexing (as the distinguishing feature to LTE-A downlink), extends known link adaptation concepts to uplink transmission, shows the implications of the uplink pilot pattern for gathering channel state information at the receiver and completes with possible future research directions.
NASA Astrophysics Data System (ADS)
Yiran, P.; Li, J.; von Salzen, K.; Dai, T.; Liu, D.
2014-12-01
Mineral dust is a significant contributor to the global and Asian aerosol burden. Currently, large uncertainties still exist in simulated aerosol processes in global climate models (GCMs), which lead to a diversity in the dust mass loading and spatial distribution of GCM projections. In this study, satellite measurements from CALIOP (Cloud-Aerosol Lidar with Orthogonal Polarization) and observed aerosol data from Asian stations are compared with modelled aerosol in the Canadian Atmospheric Global Climate Model (CanAM4.2). Both seasonal and annual variations in the Asian dust distribution are investigated. Vertical profiles of simulated aerosol in the troposphere are evaluated with CALIOP Level 3 products and locally observed extinction for dust and total aerosols. Physical processes in the GCM such as horizontal advection, vertical mixing, and dry and wet removal are analyzed using the model simulation and available aerosol measurements. This work aims to improve the current understanding of Asian dust transport and vertical exchange on a large scale, which may help to increase the accuracy of GCM simulations of aerosols.
Planar Multipol-Resonance-Probe: A Spectral Kinetic Approach
NASA Astrophysics Data System (ADS)
Friedrichs, Michael; Gong, Junbo; Brinkmann, Ralf Peter; Oberrath, Jens; Wilczek, Sebastian
2016-09-01
Measuring plasma parameters, e.g. electron density and electron temperature, is an important procedure to verify the stability and behavior of a plasma process. For this purpose the multipole resonance probe (MRP) represents a satisfactory solution for measuring the electron density. However, the influence of the probe on the plasma through its physical presence makes it unattractive for some processes in industrial application. A solution that combines the benefits of the spherical MRP with the ability to integrate the probe into the plasma reactor is introduced by the planar model of the MRP (pMRP). Introducing the spectral kinetic formalism leads to a reduced simulation cycle compared to particle-in-cell simulations. The model of the pMRP is implemented and first simulation results are presented.
Simulation of the microwave heating of a thin multilayered composite material: A parameter analysis
NASA Astrophysics Data System (ADS)
Tertrais, Hermine; Barasinski, Anaïs; Chinesta, Francisco
2018-05-01
Microwave (MW) technology relies on volumetric heating: thermal energy is transferred to a material that can absorb it at specific frequencies. The complex physics involved in this process is far from being fully understood, which is why a simulation tool has been developed to solve the electromagnetic and thermal equations in a material as complex as a multilayered composite part. The code is based on the in-plane-out-of-plane separated representation within the Proper Generalized Decomposition framework. To improve knowledge of the process, a parameter study is carried out in this paper.
NASA Technical Reports Server (NTRS)
Vogel, Bernhard; Vogel, Heike; Fiedler, Franz
1994-01-01
A model system is presented that takes into account the main physical and chemical processes on the regional scale here in an area of 100x100 sq km. The horizontal gridsize used is 2x2 sq km. For a case study, it is demonstrated how the model system can be used to separate the contributions of the processes advection, turbulent diffusion, and chemical reactions to the diurnal cycle of ozone. In this way, typical features which are visible in observations and are reproduced by the numerical simulations can be interpreted.
Elementary process and meteor train spectra
NASA Technical Reports Server (NTRS)
Ovezgeldyev, O. G.
1987-01-01
Mechanisms of excitation of individual spectral line radiation were studied experimentally and theoretically, and it was demonstrated that processes such as oxidation, resonant charge exchange, dissociative recombination, and others play an important part in the chemistry of excited particles. The foundation was laid toward simulating the elementary processes of meteor physics. Having a number of advantages and possibilities, this method is sure to find wide use in the future.
Parameter extraction with neural networks
NASA Astrophysics Data System (ADS)
Cazzanti, Luca; Khan, Mumit; Cerrina, Franco
1998-06-01
In semiconductor processing, the modeling of the process is becoming more and more important. While the ultimate goal is that of developing a set of tools for designing a complete process (Technology CAD), it is also necessary to have modules to simulate the various technologies and, in particular, to optimize specific steps. This need is particularly acute in lithography, where the continuous decrease in critical dimension (CD) forces the technologies to operate near their limits. In the development of a 'model' for a physical process, we face several levels of challenges. First, it is necessary to develop a 'physical model,' i.e. a rational description of the process itself on the basis of known physical laws. Second, we need an 'algorithmic model' to represent in a virtual environment the behavior of the 'physical model.' After a 'complete' model has been developed and verified, it becomes possible to do performance analysis. In many cases the input parameters are poorly known or not accessible directly to experiment. It would be extremely useful to obtain the values of these 'hidden' parameters from experimental results by comparing the model to data. This problem is particularly severe because the complexity and costs associated with semiconductor processing make a simple 'trial-and-error' approach infeasible and cost-inefficient. Even when computer models of the process already exist, obtaining data through simulations may be time-consuming. Neural networks (NN) are powerful computational tools for predicting the behavior of a system from an existing data set. They are able to adaptively 'learn' input/output mappings and to act as universal function approximators. In this paper we use artificial neural networks to build a mapping from the input parameters of the process to output parameters which are indicative of the performance of the process.
Once the NN has been 'trained,' it is also possible to observe the process 'in reverse,' and to extract the values of the inputs which yield outputs with desired characteristics. Using this method, we can extract optimum values for the parameters and determine the process latitude very quickly.
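The forward-then-reverse workflow described above can be sketched end to end. Everything here is illustrative: a toy 'process' (the square function) stands in for a lithography simulator, a one-hidden-layer network with hand-written backpropagation stands in for the authors' NN, and the 'reverse' step is a brute-force scan of candidate inputs through the trained forward map.

```python
import math
import random

# tiny 1-input, 1-output network with H tanh hidden units
random.seed(1)
H = 4
W1 = [random.uniform(-0.5, 0.5) for _ in range(H)]
b1 = [0.0] * H
W2 = [random.uniform(-0.5, 0.5) for _ in range(H)]
b2 = 0.0

def forward(x):
    h = [math.tanh(W1[k] * x + b1[k]) for k in range(H)]
    return sum(W2[k] * h[k] for k in range(H)) + b2, h

# training data from a stand-in "process simulator" (x -> x^2);
# a real application would use lithography simulator output instead
xs = [i / 20 for i in range(21)]
ts = [x * x for x in xs]

lr = 0.3
for _ in range(3000):  # full-batch gradient descent on mean squared error
    gW1 = [0.0] * H; gb1 = [0.0] * H; gW2 = [0.0] * H; gb2 = 0.0
    for x, t in zip(xs, ts):
        y, h = forward(x)
        dy = 2.0 * (y - t) / len(xs)
        for k in range(H):
            gW2[k] += dy * h[k]
            dz = dy * W2[k] * (1.0 - h[k] * h[k])
            gW1[k] += dz * x
            gb1[k] += dz
        gb2 += dy
    for k in range(H):
        W1[k] -= lr * gW1[k]; b1[k] -= lr * gb1[k]; W2[k] -= lr * gW2[k]
    b2 -= lr * gb2

# "reverse" use: scan candidate inputs for the one whose predicted
# output matches a desired target -- parameter extraction by inversion
target = 0.25
candidates = [i / 1000 for i in range(1001)]
x_hat = min(candidates, key=lambda x: abs(forward(x)[0] - target))
```

The scan plays the role of observing the trained network 'in reverse'; for higher-dimensional parameter spaces one would replace it with gradient-based search through the differentiable forward map.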
NASA Astrophysics Data System (ADS)
Corti, Giacomo; Zeoli, Antonio; Belmaggio, Pietro; Folco, Luigi
2008-03-01
Three-dimensional laboratory physical experiments have been used to investigate the influence of bedrock topography and ablation on ice flow. Different models were tested in a Plexiglas box, where a transparent silicone simulating ice in nature was allowed to flow. Experimental results show how the flow field (in terms of both flow lines and velocity) and variations in the topography of the free surface and internal layers of the ice are strongly influenced by the presence and height of bedrock obstacles. In particular, the buttressing effect forces the ice to slow down, rise up, and avoid the obstacle; the higher the bedrock barrier, the more pronounced the process. Only limited uplift of internal layers is observed in these experiments. In order to exhume deep material embedded in the ice, ablation (simulated by physically removing portions of silicone from the model surface to maintain a constant topographic depression) must be included in the physical models. In this case, the analogue ice replenishes the area of material removal, thereby allowing deep layers to move vertically to the surface and severely altering the local ice flow pattern. This process is analogous to the ice flow model proposed in the literature for the origin of meteorite concentrations in blue ice areas of the Antarctic plateau.
Using quantum theory to simplify input-output processes
NASA Astrophysics Data System (ADS)
Thompson, Jayne; Garner, Andrew J. P.; Vedral, Vlatko; Gu, Mile
2017-02-01
All natural things process and transform information. They receive environmental information as input, and transform it into appropriate output responses. Much of science is dedicated to building models of such systems-algorithmic abstractions of their input-output behavior that allow us to simulate how such systems can behave in the future, conditioned on what has transpired in the past. Here, we show that classical models cannot avoid inefficiency-storing past information that is unnecessary for correct future simulation. We construct quantum models that mitigate this waste, whenever it is physically possible to do so. This suggests that the complexity of general input-output processes depends fundamentally on what sort of information theory we use to describe them.
A multi-scale ''soil water structure'' model based on the pedostructure concept
NASA Astrophysics Data System (ADS)
Braudeau, E.; Mohtar, R. H.; El Ghezal, N.; Crayol, M.; Salahat, M.; Martin, P.
2009-02-01
Current soil water models do not take into account the internal organization of the soil medium nor, a fortiori, the physical interaction between the water film surrounding the solid particles of the soil structure and the surface charges of this structure. In that sense they deal only empirically with the physical soil properties that are all generated from this soil water-structure interaction. As a result, the thermodynamic state of the soil water medium, which constitutes the local physical conditions (the pedo-climate) for biological and geo-chemical processes in soil, is not defined in these models. The omission of soil structure from soil characterization and modeling does not allow for coupling disciplinary models for these processes with soil water models. This article presents a soil water structure model, Kamel®, which was developed based on a new paradigm in soil physics in which the hierarchical soil structure is taken into account, allowing its thermodynamic properties to be defined. After a review of the soil physics principles which form the basis of the paradigm, we describe the basic relationships and functionality of the model. Kamel® runs with a set of 15 soil input parameters, the pedohydral parameters, which are parameters of the physically based equations of four soil characteristic curves that can be measured in the laboratory. For cases where some of these parameters are not available, we show how to estimate them from commonly available soil information using published pedotransfer functions. A published field experimental study on the dynamics of the soil moisture profile following a ponded infiltration event was used as an example to demonstrate soil characterization and Kamel® simulations. The simulated soil moisture profile for a period of 60 days showed very good agreement with experimental field data.
Simulations using input data calculated from soil texture and pedotransfer functions were also generated and compared to simulations based on the more complete characterization. The latter comparison illustrates how Kamel® can be used and adapted to any case of soil data availability. As a physically based model of soil structure, it may also serve as a standard reference for evaluating other soil-water models, as well as pedotransfer functions, for a given location or agronomic situation.
NASA Astrophysics Data System (ADS)
Davini, Paolo; von Hardenberg, Jost; Corti, Susanna; Christensen, Hannah M.; Juricke, Stephan; Subramanian, Aneesh; Watson, Peter A. G.; Weisheimer, Antje; Palmer, Tim N.
2017-03-01
The Climate SPHINX (Stochastic Physics HIgh resolutioN eXperiments) project is a comprehensive set of ensemble simulations aimed at evaluating the sensitivity of present and future climate to model resolution and stochastic parameterisation. The EC-Earth Earth system model is used to explore the impact of stochastic physics in a large ensemble of 30-year climate integrations at five different atmospheric horizontal resolutions (from 125 up to 16 km). The project includes more than 120 simulations in both a historical scenario (1979-2008) and a climate change projection (2039-2068), together with coupled transient runs (1850-2100). A total of 20.4 million core hours have been used, made available from a single year grant from PRACE (the Partnership for Advanced Computing in Europe), and close to 1.5 PB of output data have been produced on SuperMUC IBM Petascale System at the Leibniz Supercomputing Centre (LRZ) in Garching, Germany. About 140 TB of post-processed data are stored on the CINECA supercomputing centre archives and are freely accessible to the community thanks to an EUDAT data pilot project. This paper presents the technical and scientific set-up of the experiments, including the details on the forcing used for the simulations performed, defining the SPHINX v1.0 protocol. In addition, an overview of preliminary results is given. An improvement in the simulation of Euro-Atlantic atmospheric blocking following resolution increase is observed. It is also shown that including stochastic parameterisation in the low-resolution runs helps to improve some aspects of the tropical climate - specifically the Madden-Julian Oscillation and the tropical rainfall variability. These findings show the importance of representing the impact of small-scale processes on the large-scale climate variability either explicitly (with high-resolution simulations) or stochastically (in low-resolution simulations).
Cosmological Simulations of Galaxy Clusters
NASA Astrophysics Data System (ADS)
Borgani, Stefano; Kravtsov, Andrey
2011-02-01
We review recent progress in the description of the formation and evolution of galaxy clusters in a cosmological context using state-of-the-art numerical simulations. We focus our presentation on the comparison between simulated and observed X-ray properties, and we also discuss numerical predictions for the properties of the galaxy population in clusters, as observed in the optical band. Many of the salient observed properties of clusters, such as scaling relations between X-ray observables and total mass, radial profiles of entropy and density of the intracluster gas, and the radial distribution of galaxies, are reproduced quite well. In particular, the outer regions of clusters, at radii beyond about 10 per cent of the virial radius, are quite regular and exhibit scaling with mass remarkably close to that expected in the simplest case, in which only the action of gravity determines the evolution of the intracluster gas. However, simulations generally fail to reproduce the observed "cool core" structure of clusters: simulated clusters generally exhibit a significant excess of gas cooling in their central regions, which causes both an overestimate of the star formation in the cluster centers and incorrect temperature and entropy profiles. The total baryon fraction in clusters is below the mean universal value, by an amount which depends on the cluster-centric distance and the physics included in the simulations, with interesting tensions between the observed stellar and gas fractions in clusters and the predictions of simulations. Besides their important implications for the cosmological application of clusters, these puzzles also point towards the important role played by additional physical processes, beyond those already included in the simulations. We review the role played by these processes, along with the difficulties in their implementation, and discuss the outlook for future progress in the numerical modeling of clusters.
NASA Astrophysics Data System (ADS)
Markelov, A. Y.; Shiryaevskii, V. L.; Kudrinskiy, A. A.; Anpilov, S. V.; Bobrakov, A. N.
2017-11-01
A computational method is introduced for analyzing the physical and chemical processes of high-temperature mineralization of low-level radioactive waste in a gas stream during plasma treatment of radioactive waste in shaft furnaces. It is shown that the thermodynamic simulation method adequately describes the changes in the composition of the pyrogas withdrawn from the shaft furnace under different waste treatment regimes. This offers the possibility of developing environmentally and economically viable technologies and small, low-cost facilities for plasma treatment of radioactive waste at currently operating nuclear power plants.
Random bits, true and unbiased, from atmospheric turbulence
Marangon, Davide G.; Vallone, Giuseppe; Villoresi, Paolo
2014-01-01
Random numbers are a fundamental ingredient for secure communications and numerical simulation, as well as for games and information science in general. Physical processes with intrinsic unpredictability may be exploited to generate genuine random numbers. Here, optical propagation through strong atmospheric turbulence is exploited for this purpose by observing a laser beam after a 143 km free-space path. In addition, we developed an algorithm to extract the randomness of the beam images at the receiver without post-processing. The numbers passed very selective randomness tests, qualifying them as genuine random numbers. The extraction algorithm can be easily generalized to random images generated by different physical processes. PMID:24976499
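The unbiasing idea behind such extractors can be illustrated with the classical von Neumann procedure. This is a generic sketch, not the authors' published algorithm, and the pixel-thresholding helper is a hypothetical digitization step:

```python
def von_neumann_extract(raw_bits):
    """Unbias a bit stream: examine non-overlapping pairs, map 01 -> 0
    and 10 -> 1, and discard 00 and 11. The output is unbiased provided
    the input bits are independent with a fixed (possibly unknown) bias."""
    out = []
    for i in range(0, len(raw_bits) - 1, 2):
        a, b = raw_bits[i], raw_bits[i + 1]
        if a != b:
            out.append(a)
    return out


def bits_from_pixels(pixels, threshold=128):
    """Hypothetical digitization: threshold pixel intensities to raw bits."""
    return [1 if p >= threshold else 0 for p in pixels]
```

For example, `von_neumann_extract(bits_from_pixels(image_row))` would yield a shorter but unbiased bit sequence from one row of a beam image.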
Advanced physical-chemical life support systems research
NASA Technical Reports Server (NTRS)
Evanich, Peggy L.
1988-01-01
A proposed NASA space research and technology development program will provide adequate data for designing closed-loop life support systems for long-duration manned space missions. This program, referred to as the Pathfinder Physical-Chemical Closed Loop Life Support Program, is to identify and develop critical chemical engineering technologies for closing the air and water loops within spacecraft, surface habitats, and mobility devices. Computerized simulation can be used both as a research tool and as a management tool. Validated models will guide the selection of the best known applicable processes and the development of new processes. For the integration of the habitat system, a biological subsystem would be introduced to provide food production and to enhance the physical-chemical life support functions on an ever-increasing basis.
NASA Astrophysics Data System (ADS)
Lamraoui, F.; Booth, J. F.; Naud, C. M.
2017-12-01
The representation of subgrid-scale processes in low-level marine clouds located in the post-cold-frontal region poses a serious challenge for climate models. More precisely, boundary layer parameterizations are predominantly designed for individual regimes that evolve gradually over time and do not accommodate a cold front passage that can rapidly modify the boundary layer. Also, microphysics schemes respond differently to the quick evolution of the boundary layer, especially under unstable conditions. To improve the understanding of cloud physics in the post-cold-frontal region, the present study explores the relationship between cloud properties, local processes, and large-scale conditions. To address these questions, we explore the sensitivity of WRF to the interaction between various combinations of boundary layer and microphysics parameterizations, including the Community Atmosphere Model version 5 (CAM5) physics package, in a perturbed-physics ensemble. We then evaluate these simulations against ground-based ARM observations over the Azores. The WRF-based simulations demonstrate particular sensitivities of the marine cold front passage and the associated post-cold-frontal clouds to the domain size, the resolution, and the physical parameterizations. First, it is found that in multiple case studies the model cannot generate the cold front passage when the domain size is larger than 3000 km2. Instead, the modeled cold front stalls, which shows the importance of properly capturing the synoptic-scale conditions. The simulations reveal a persistent delay in capturing the cold front passage and an underestimated duration of the post-cold-frontal conditions. Analysis of the perturbed-physics ensemble shows that changing the microphysics scheme leads to larger differences in the modeled clouds than changing the boundary layer scheme.
The in-cloud heating tendencies are analyzed to explain this sensitivity.
NASA Astrophysics Data System (ADS)
Raj, Rahul; van der Tol, Christiaan; Hamm, Nicholas Alexander Samuel; Stein, Alfred
2018-01-01
Parameters of a process-based forest growth simulator are difficult or impossible to obtain from field observations. Reliable estimates can be obtained by calibration against observations of output and state variables. In this study, we present a Bayesian framework to calibrate the widely used process-based simulator Biome-BGC against estimates of gross primary production (GPP). We used GPP partitioned from flux tower measurements of net ecosystem exchange over a 55-year-old Douglas fir stand as an example. The uncertainties of both the Biome-BGC parameters and the simulated GPP values were estimated. The calibrated parameters leaf and fine root turnover (LFRT), ratio of fine root carbon to leaf carbon (FRC : LC), ratio of carbon to nitrogen in leaf (C : Nleaf), canopy water interception coefficient (Wint), fraction of leaf nitrogen in RuBisCO (FLNR), and effective soil rooting depth (SD) characterize the photosynthesis and the carbon and nitrogen allocation in the forest. The calibration improved the root mean square error and the Nash-Sutcliffe efficiency between simulated and flux tower daily GPP, compared to the uncalibrated Biome-BGC. Nevertheless, the seasonal cycle of flux tower GPP was not reproduced exactly, and some overestimation in spring and underestimation in summer remained after calibration. We hypothesized that the phenology exhibited a seasonal cycle that was not accurately reproduced by the simulator. We investigated this by calibrating Biome-BGC to each month's flux tower GPP separately. As expected, the simulated GPP improved, but the calibrated parameter values suggested that the seasonal cycle of state variables in the simulator could be improved. It was concluded that the Bayesian framework for calibration can reveal features of the modelled physical processes and identify aspects of the process simulator that are too rigid.
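Bayesian calibration of this kind can be sketched with a Metropolis sampler applied to a deliberately simplified one-parameter "simulator". The toy light-use-efficiency model, the prior bounds, and all names below are illustrative assumptions, not part of Biome-BGC or the authors' framework:

```python
import math
import random


def toy_gpp(lue, par):
    """Hypothetical one-parameter simulator: GPP = light-use efficiency * PAR."""
    return lue * par


def log_posterior(lue, par_series, obs, sigma=1.0, prior=(0.0, 5.0)):
    """Uniform prior on lue plus a Gaussian likelihood of the observations."""
    lo, hi = prior
    if not (lo < lue < hi):
        return -math.inf
    return -sum((o - toy_gpp(lue, p)) ** 2
                for p, o in zip(par_series, obs)) / (2 * sigma ** 2)


def metropolis(par_series, obs, n_iter=5000, step=0.1, seed=1):
    """Random-walk Metropolis: propose, then accept with prob min(1, exp(dlp))."""
    rng = random.Random(seed)
    lue = 1.0
    lp = log_posterior(lue, par_series, obs)
    chain = []
    for _ in range(n_iter):
        prop = lue + rng.gauss(0.0, step)
        lp_prop = log_posterior(prop, par_series, obs)
        if rng.random() < math.exp(min(0.0, lp_prop - lp)):
            lue, lp = prop, lp_prop
        chain.append(lue)
    return chain
```

Run against synthetic observations generated with a known efficiency, the posterior samples concentrate around the true value, which is the mechanism by which the calibration recovers parameters and their uncertainties.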
Phantom-based interactive simulation system for dental treatment training.
Sae-Kee, Bundit; Riener, Robert; Frey, Martin; Pröll, Thomas; Burgkart, Rainer
2004-01-01
In this paper, we propose a new interactive simulation system for dental treatment training. The system comprises a virtual reality environment and a force-torque measuring device to enhance the capabilities of a passive phantom of tooth anatomy in dental treatment training processes. The measuring device is connected to the phantom, and provides essential input data for generating the graphic animations of physical behaviors such as drilling and bleeding. The animation methods of those physical behaviors are also presented. This system is not only able to enhance interactivity and accessibility of the training system compared to conventional methods but it also provides possibilities of recording, evaluating, and verifying the training results.
Efficient Numerical Simulation of Aerothermoelastic Hypersonic Vehicles
NASA Astrophysics Data System (ADS)
Klock, Ryan J.
Hypersonic vehicles operate in a high-energy flight environment characterized by high dynamic pressures, high thermal loads, and non-equilibrium flow dynamics. This environment induces strong fluid, thermal, and structural dynamics interactions that are unique to this flight regime. If these vehicles are to be effectively designed and controlled, then a robust and intuitive understanding of each of these disciplines must be developed not only in isolation, but also when coupled. Limitations on scaling and the availability of adequate test facilities mean that physical investigation is infeasible. Ever growing computational power offers the ability to perform elaborate numerical simulations, but also has its own limitations. The state of the art in numerical simulation is either to create ever more high-fidelity physics models that do not couple well and require too much processing power to consider more than a few seconds of flight, or to use low-fidelity analytical models that can be tightly coupled and processed quickly, but do not represent realistic systems due to their simplifying assumptions. Reduced-order models offer a middle ground by distilling the dominant trends of high-fidelity training solutions into a form that can be quickly processed and more tightly coupled. This thesis presents a variably coupled, variable-fidelity, aerothermoelastic framework for the simulation and analysis of high-speed vehicle systems using analytical, reduced-order, and surrogate modeling techniques. Full launch-to-landing flights of complete vehicles are considered and used to define flight envelopes with aeroelastic, aerothermal, and thermoelastic limits, tune in-the-loop flight controllers, and inform future design considerations. 
A partitioned approach to vehicle simulation is considered in which regions dominated by particular combinations of processes are separated from the overall solution and simulated by a specialized set of models to improve overall processing speed and solution fidelity. A number of enhancements are made to this framework through (1) the implementation of a publish-subscribe code architecture for rapid prototyping of physics and process models, (2) the implementation of a selection of linearization and model identification methods, including high-order pseudo-time forward difference, complex-step, and direct identification from ordinary differential equation inspection, and (3) improvements to the aeroheating and thermal models with non-equilibrium gas dynamics and generalized temperature-dependent material thermal properties. A variety of model reduction and surrogate modeling techniques are applied to a representative hypersonic vehicle on a terminal trajectory to enable complete aerothermoelastic flight simulations. Multiple terminal trajectories of various starting altitudes and Mach numbers are optimized to maximize the final kinetic energy of the vehicle upon reaching the surface. Surrogate models are compared for representing the variation of material thermal properties with temperature. A new method is developed and shown to be both accurate and computationally efficient. While the numerically efficient simulation of high-speed vehicles is developed within the presented framework, the goal of real-time simulation is hampered by the necessity of multiple nested convergence loops. An alternative all-in-one surrogate model method, based on singular-value decomposition and regression, is developed that runs in near real time. Finally, the aeroelastic stability of pressurized cylindrical shells is investigated in the context of a maneuvering axisymmetric high-speed vehicle.
Moderate internal pressurization is numerically shown to decrease stability, as has been shown experimentally in the literature but not well reproduced analytically. Insights are drawn from time-simulation results and used to inform approaches for future vehicle model development.
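The singular-value-decomposition-plus-regression surrogate idea can be sketched in a few lines. This toy proper-orthogonal-decomposition example with a linear regression map from a single parameter to modal coefficients is an illustrative assumption, not the method developed in the thesis:

```python
import numpy as np


def build_pod_surrogate(params, snapshots, rank=2):
    """Toy POD + regression surrogate: the SVD compresses the snapshot
    matrix (each column is one full-field training solution), then a
    linear least-squares map predicts modal coefficients from the
    scalar parameter, so new fields can be evaluated cheaply."""
    U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
    basis = U[:, :rank]               # dominant spatial modes
    coeffs = basis.T @ snapshots      # modal coefficients per snapshot
    # Linear regression: [1, p] -> modal coefficients
    X = np.column_stack([np.ones(len(params)), params])
    W, *_ = np.linalg.lstsq(X, coeffs.T, rcond=None)

    def predict(p):
        return basis @ (np.array([1.0, p]) @ W)

    return predict
```

Because evaluation is just two small matrix products, such a surrogate can replace a nested convergence loop at a fraction of the cost, which is the motivation for near-real-time variants.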
Integration Process for Payloads in the Fluids and Combustion Facility
NASA Technical Reports Server (NTRS)
Free, James M.; Nall, Marsha M.
2001-01-01
The Fluids and Combustion Facility (FCF) is an ISS research facility located in the United States Laboratory (US Lab), Destiny. The FCF is a multi-discipline facility that performs microgravity research, primarily in fluid physics and combustion science. The facility remains on orbit and provides accommodations for multi-user and Principal Investigator (PI)-unique hardware. The FCF is designed to accommodate 15 PIs per year. In order to support this number of payloads per year, the FCF has developed an end-to-end analytical and physical integration process. The process includes the provision of integration tools, products, and interface management throughout the life of the payload. The payload is provided with a single point of contact from the facility and works with that interface from PI selection through post-flight processing. The process utilizes electronic tools for the creation of interface documents/agreements, storage of payload data, and rollup for facility submittals to ISS. Additionally, the process provides integration with and testing against flight-like simulators prior to payload delivery to KSC. These simulators allow the payload to be tested in the flight configuration and final facility interface and science verifications to be performed. The process also provides support to the payload from the FCF through the Payload Safety Review Panel (PSRP). Finally, the process includes support in the development of operational products and the operation of the payload on-orbit.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Badal, Andreu; Badano, Aldo
Purpose: Monte Carlo simulations of radiation transport are computationally intensive and may require long computing times. The authors introduce a new paradigm for the acceleration of Monte Carlo simulations: the use of a graphics processing unit (GPU) as the main computing device instead of a central processing unit (CPU). Methods: A GPU-based Monte Carlo code that simulates photon transport in a voxelized geometry with the accurate physics models from PENELOPE has been developed using the CUDA programming model (NVIDIA Corporation, Santa Clara, CA). Results: An outline of the new code and a sample x-ray imaging simulation with an anthropomorphic phantom are presented. A remarkable 27-fold speedup was obtained using a GPU compared to a single-core CPU. Conclusions: The reported results show that GPUs are currently a good alternative to CPUs for the simulation of radiation transport. Since the performance of GPUs is currently increasing at a faster pace than that of CPUs, the advantages of GPU-based software are likely to be more pronounced in the future.
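The analog Monte Carlo transport principle underlying such codes can be sketched in a few lines. This slab-attenuation toy (absorption only, no scattering, no PENELOPE physics, CPU-only Python) is a minimal illustration of the method, not the authors' GPU code:

```python
import math
import random


def transmitted_fraction(mu, thickness, n_photons=100_000, seed=42):
    """Toy analog Monte Carlo: photons cross a slab with attenuation
    coefficient mu (1/cm) and are absorbed at their first interaction.
    Free path lengths are sampled from the exponential distribution,
    so the transmitted fraction should approach exp(-mu * thickness)."""
    rng = random.Random(seed)
    transmitted = 0
    for _ in range(n_photons):
        path = -math.log(rng.random()) / mu   # inverse-CDF sampling
        if path > thickness:
            transmitted += 1
    return transmitted / n_photons
```

Because each photon history is independent, this loop parallelizes almost perfectly, which is exactly why GPUs, with thousands of concurrent threads, suit radiation transport so well.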
Order out of Randomness: Self-Organization Processes in Astrophysics
NASA Astrophysics Data System (ADS)
Aschwanden, Markus J.; Scholkmann, Felix; Béthune, William; Schmutz, Werner; Abramenko, Valentina; Cheung, Mark C. M.; Müller, Daniel; Benz, Arnold; Chernov, Guennadi; Kritsuk, Alexei G.; Scargle, Jeffrey D.; Melatos, Andrew; Wagoner, Robert V.; Trimble, Virginia; Green, William H.
2018-03-01
Self-organization is a property of dissipative nonlinear processes that are governed by a global driving force and a local positive feedback mechanism, which creates regular geometric and/or temporal patterns and decreases the entropy locally, in contrast to random processes. Here we investigate for the first time a comprehensive set of 17 self-organization processes that operate in planetary physics, solar physics, stellar physics, galactic physics, and cosmology. Self-organizing systems create spontaneous "order out of randomness" during the evolution from an initially disordered system to an ordered quasi-stationary system, mostly by quasi-periodic limit-cycle dynamics, but also by harmonic (mechanical or gyromagnetic) resonances. The global driving force can be due to gravity, electromagnetic forces, mechanical forces (e.g., rotation or differential rotation), thermal pressure, or acceleration of nonthermal particles, while the positive feedback mechanism is often an instability, such as the magneto-rotational (Balbus-Hawley) instability, the convective (Rayleigh-Bénard) instability, turbulence, vortex attraction, magnetic reconnection, plasma condensation, or a loss-cone instability. Physical models of astrophysical self-organization processes require hydrodynamic, magneto-hydrodynamic (MHD), plasma, or N-body simulations. Analytical formulations of self-organizing systems generally involve coupled differential equations with limit-cycle solutions of the Lotka-Volterra or Hopf-bifurcation type.
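The Lotka-Volterra-type limit-cycle dynamics mentioned above can be illustrated with a minimal integration sketch; the coefficients and initial conditions below are arbitrary illustrative values:

```python
def lotka_volterra(x0, y0, alpha=1.0, beta=0.5, delta=0.2, gamma=0.8,
                   dt=0.01, steps=5000):
    """Integrate the coupled Lotka-Volterra equations
        dx/dt = alpha*x - beta*x*y
        dy/dt = delta*x*y - gamma*y
    with classical fourth-order Runge-Kutta; returns the (x, y) trajectory,
    which orbits the equilibrium (gamma/delta, alpha/beta) periodically."""
    def f(x, y):
        return alpha * x - beta * x * y, delta * x * y - gamma * y

    traj = [(x0, y0)]
    x, y = x0, y0
    for _ in range(steps):
        k1x, k1y = f(x, y)
        k2x, k2y = f(x + 0.5 * dt * k1x, y + 0.5 * dt * k1y)
        k3x, k3y = f(x + 0.5 * dt * k2x, y + 0.5 * dt * k2y)
        k4x, k4y = f(x + dt * k3x, y + dt * k3y)
        x += dt * (k1x + 2 * k2x + 2 * k3x + k4x) / 6
        y += dt * (k1y + 2 * k2y + 2 * k3y + k4y) / 6
        traj.append((x, y))
    return traj
```

Starting away from the equilibrium, the two populations trade energy back and forth in a closed cycle, a simple analogue of the quasi-periodic limit-cycle behavior that characterizes many self-organizing systems.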
ME(SSY)**2: Monte Carlo Code for Star Cluster Simulations
NASA Astrophysics Data System (ADS)
Freitag, Marc Dewi
2013-02-01
ME(SSY)**2 stands for “Monte-carlo Experiments with Spherically SYmmetric Stellar SYstems." This code simulates the long-term evolution of spherical clusters of stars; it was devised specifically to treat dense galactic nuclei. It is based on the pioneering Monte Carlo scheme proposed by Hénon in the 1970s and includes all relevant physical ingredients (two-body relaxation, stellar mass spectrum, collisions, tidal disruption, etc.). It is basically a Monte Carlo resolution of the Fokker-Planck equation. It can cope with any stellar mass spectrum or velocity distribution. Being a particle-based method, it also allows one to take stellar collisions into account in a very realistic way. This unique code, featuring the most important physical processes, allows million-particle simulations spanning a Hubble time in a few CPU days on standard personal computers and provides a wealth of data rivaled only by N-body simulations. The current version of the software requires the use of routines from the "Numerical Recipes in Fortran 77" (http://www.nrbook.com/a/bookfpdf.php).
Realistic Modeling of Multi-Scale MHD Dynamics of the Solar Atmosphere
NASA Technical Reports Server (NTRS)
Kitiashvili, Irina; Mansour, Nagi N.; Wray, Alan; Couvidat, Sebastian; Yoon, Seokkwan; Kosovichev, Alexander
2014-01-01
Realistic 3D radiative MHD simulations open new perspectives for understanding the turbulent dynamics of the solar surface, its coupling to the atmosphere, and the physical mechanisms of generation and transport of non-thermal energy. Traditionally, plasma eruptions and wave phenomena in the solar atmosphere are modeled by prescribing artificial driving mechanisms using magnetic or gas pressure forces that might arise from magnetic field emergence or reconnection instabilities. In contrast, our 'ab initio' simulations provide a realistic description of solar dynamics naturally driven by solar energy flow. By simulating the upper convection zone and the solar atmosphere, we can investigate in detail the physical processes of turbulent magnetoconvection, generation and amplification of magnetic fields, excitation of MHD waves, and plasma eruptions. We present recent simulation results of the multi-scale dynamics of quiet-Sun regions, and energetic effects in the atmosphere and compare with observations. For the comparisons we calculate synthetic spectro-polarimetric data to model observational data of SDO, Hinode, and New Solar Telescope.
Physically-based in silico light sheet microscopy for visualizing fluorescent brain models
2015-01-01
Background We present a physically-based computational model of the light sheet fluorescence microscope (LSFM). Based on Monte Carlo ray tracing and geometric optics, our method simulates the operational aspects and image formation process of the LSFM. This simulated, in silico LSFM creates synthetic images of digital fluorescent specimens that can resemble those generated by a real LSFM, as opposed to established visualization methods that produce merely visually-plausible images. We also propose an accurate fluorescence rendering model which takes into account the intrinsic characteristics of fluorescent dyes to simulate light interaction with fluorescent biological specimens. Results We demonstrate first results of our visualization pipeline applied to a simplified brain tissue model reconstructed from the somatosensory cortex of a young rat. The modeling aspects of the LSFM units are qualitatively analysed, and the results of the fluorescence model were quantitatively validated against the fluorescence brightness equation and the characteristic emission spectra of different fluorescent dyes. AMS subject classification Modelling and simulation PMID:26329404
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shin, J; Park, S; Jeong, J
Purpose: In particle therapy and radiobiology, the investigation of the mechanisms leading to the death of target cancer cells induced by ionising radiation is an active field of research. Recently, several studies based on Monte Carlo simulation codes have been initiated in order to simulate the physical interactions of ionising particles at the cellular scale and in DNA. Geant4-DNA is one of them; it is an extension of the general-purpose Geant4 Monte Carlo simulation toolkit for the simulation of physical interactions at the sub-micrometre scale. In this study, we present Geant4-DNA Monte Carlo simulations for the prediction of DNA strand breakage using a geometrical model of the DNA structure. Methods: For the simulation of DNA strand breakage, we developed a specific DNA geometrical structure. This structure consists of DNA components such as the deoxynucleotide pairs, the DNA double helix, the nucleosomes, and the chromatin fibre. Each component is made of water because the cross-section models currently available in Geant4-DNA for protons apply to liquid water only. Also, at the macroscopic scale, protons were generated with various energies available for proton therapy at the National Cancer Center, obtained using validated proton beam simulations developed in previous studies. These multi-scale simulations were combined for the validation of Geant4-DNA in radiobiology. Results: In the double helix structure, the energy deposited in a strand allowed us to determine direct DNA damage from physical interactions. In other words, the amount of dose and the frequency of damage in microscopic geometries were related to the direct radiobiological effect. Conclusion: In this report, we calculated the frequency of DNA strand breakage using Geant4-DNA physics processes for liquid water. This study is ongoing, with the aim of developing geometries that use realistic DNA material instead of liquid water. This will be tested as soon as cross sections for DNA material become available in Geant4-DNA.
WE-D-204-00: Session in Memory of Franca Kuchnir: Excellence in Medical Physics Residency Education
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
Speakers in this session will present an overview and details of a specific rotation or feature of their Medical Physics Residency Program that is particularly exceptional and noteworthy. The featured rotations include foundational topics executed with exceptional acumen and innovative educational rotations perhaps not commonly found in Medical Physics Residency Programs. A site-specific clinical rotation will be described, where the medical physics resident follows the physician and medical resident for two weeks through patient consultations, simulation sessions, target contouring sessions, planning meetings with dosimetry, patient follow-up visits, and tumor boards, to gain insight into the thought processes of the radiation oncologist. An incident learning rotation will be described, where the resident learns about and practices evaluating clinical errors and investigates process improvements for the clinic. The residency environment at a Canadian medical physics residency program will be described, where training and interactions with radiation oncology residents are integrated. And the first-month rotation will be described, where the medical physics resident rotates through the clinical areas, including simulation, dosimetry, and treatment units, gaining an overview of the clinical flow and meeting all the clinical staff to begin the residency program. This session will be of particular interest to residency programs that are interested in adopting or adapting these curricular ideas into their programs and to residency candidates who want to learn about programs already employing innovative practices. Learning Objectives: To learn about exceptional and innovative clinical rotations or program features within existing Medical Physics Residency Programs. To understand how to adopt or adapt innovative curricular designs into your own Medical Physics Residency Program, if appropriate.
NASA Astrophysics Data System (ADS)
Esposti Ongaro, T.; Barsotti, S.; de'Michieli Vitturi, M.; Favalli, M.; Longo, A.; Nannipieri, L.; Neri, A.; Papale, P.; Saccorotti, G.
2009-12-01
Physical and numerical modelling is becoming increasingly important in volcanology and volcanic hazard assessment. However, new interdisciplinary problems arise when dealing with complex mathematical formulations, numerical algorithms, and their implementations on modern computer architectures. Therefore, new frameworks are needed for sharing knowledge, software codes, and datasets among scientists. Here we present the Volcano Modelling and Simulation gateway (VMSg, accessible at http://vmsg.pi.ingv.it), a new electronic infrastructure for promoting knowledge growth and transfer in the field of volcanological modelling and numerical simulation. The new web portal, developed in the framework of past and ongoing national and European projects, is based on a dynamic Content Management System (CMS) and was developed to host and present numerical models of the main volcanic processes and relationships, including magma properties, magma chamber dynamics, conduit flow, plume dynamics, pyroclastic flows, and lava flows. Model applications, numerical code documentation, and simulation datasets, as well as model validation and calibration test cases, are also part of the gateway material.
CMS Physics Technical Design Report, Volume II: Physics Performance
NASA Astrophysics Data System (ADS)
CMS Collaboration
2007-06-01
CMS is a general-purpose experiment designed to study the physics of pp collisions at 14 TeV at the Large Hadron Collider (LHC). It currently involves more than 2000 physicists from more than 150 institutes and 37 countries. The LHC will provide extraordinary opportunities for particle physics based on its unprecedented collision energy and luminosity when it begins operation in 2007. The principal aim of this report is to present the strategy of CMS to explore the rich physics programme offered by the LHC. This volume demonstrates the physics capability of the CMS experiment. The prime goals of CMS are to explore physics at the TeV scale and to study the mechanism of electroweak symmetry breaking, through the discovery of the Higgs particle or otherwise. To carry out this task, CMS must be prepared to search for new particles, such as the Higgs boson or supersymmetric partners of the Standard Model particles, from the start-up of the LHC, since new physics at the TeV scale may manifest itself with modest data samples of the order of a few fb^-1 or less. The analysis tools that have been developed are applied, in great detail and with all the methodology of performing an analysis on CMS data, to specific benchmark processes upon which to gauge the performance of CMS. These processes cover several Higgs boson decay channels, the production and decay of new particles such as Z' and supersymmetric particles, B_s production, and processes in heavy-ion collisions. The simulation of these benchmark processes includes subtle effects such as possible detector miscalibration and misalignment. Besides these benchmark processes, the physics reach of CMS is studied for a large number of signatures arising in the Standard Model and in theories beyond the Standard Model, for integrated luminosities ranging from 1 fb^-1 to 30 fb^-1.
The Standard Model processes include QCD, B-physics, diffraction, detailed studies of the top quark properties, and electroweak physics topics such as the W and Z^0 boson properties. The production and decay of the Higgs particle is studied for many observable decays, and the precision with which the Higgs boson properties can be derived is determined. About ten different supersymmetry benchmark points are analysed using full simulation. The CMS discovery reach is evaluated in the SUSY parameter space covering a large variety of decay signatures. Furthermore, the discovery reach for a plethora of alternative models for new physics is explored, notably extra dimensions, new high-mass vector boson states, little Higgs models, technicolour, and others. Methods to discriminate between models have been investigated. This report is organized as follows. Chapter 1, the Introduction, describes the context of this document. Chapters 2-6 describe examples of full analyses, with photons, electrons, muons, jets, missing E_T, B-mesons, and τ's, and for quarkonia in heavy-ion collisions. Chapters 7-15 describe the physics reach for Standard Model processes, Higgs discovery, and searches for new physics beyond the Standard Model.
NASA Astrophysics Data System (ADS)
Willgoose, G. R.; Cohen, S.; Svoray, T.; Sela, S.; Hancock, G. R.
2010-12-01
Numerical models are an important tool for studying landscape processes, as they allow us to isolate specific processes and drivers and to test various physics and spatio-temporal scenarios. Here we use a distributed, physically-based soil evolution model (mARM4D) to describe the drivers and processes controlling soil-landscape evolution at a field site on the fringe between the Mediterranean and desert regions of Israel. This study is an initial effort in a larger project aimed at improving our understanding of the mechanisms and drivers that led to the extensive removal of soils from the loess-covered hillslopes of this region. This specific region is interesting as it is located between the Mediterranean climate region, in which widespread erosion from hillslopes was attributed to human activity during the Holocene, and the arid region, in which extensive removal of loess from hillslopes was shown to have been driven by climatic changes during the late Pleistocene. First, we study the sediment transport mechanism of the soil-landscape evolution processes at our study site. We simulate soil-landscape evolution with only one sediment transport process (fluvial or diffusive) at a time. We find that diffusive sediment transport is likely the dominant process at this site, as it results in soil distributions that better correspond to current observations. We then simulate several realistic climatic/anthropogenic scenarios (based on the literature) in order to quantify the sensitivity of the soil-landscape evolution process to temporal fluctuations. We find that this site is relatively insensitive to sharp, short-term (several thousand years) changes. This suggests that climate, rather than human activity, was the main driver of the extensive removal of loess from the hillslopes.
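The diffusive sediment transport process contrasted above can be sketched as linear hillslope diffusion. This one-dimensional explicit finite-difference toy is an illustration of the general mechanism, not the mARM4D model; the grid spacing, diffusivity, and time step are arbitrary assumed values:

```python
def hillslope_diffusion(z, dx=1.0, kappa=0.01, dt=10.0, steps=1000):
    """Explicit finite-difference sketch of linear hillslope diffusion,
    dz/dt = kappa * d2z/dx2, with fixed-elevation boundaries.
    Stable when kappa * dt / dx**2 <= 0.5 (here it is 0.1)."""
    z = list(z)
    for _ in range(steps):
        new = z[:]
        for i in range(1, len(z) - 1):
            new[i] = z[i] + kappa * dt * (z[i + 1] - 2 * z[i] + z[i - 1]) / dx ** 2
        z = new
    return z
```

Run on an initial bump of soil, the profile smooths and relief decays toward the boundary elevations, the signature behavior that distinguishes diffusive transport from the channelized incision of fluvial transport.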
The GBS code for tokamak scrape-off layer simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Halpern, F.D., E-mail: federico.halpern@epfl.ch; Ricci, P.; Jolliet, S.
2016-06-15
We describe a new version of GBS, a 3D global, flux-driven plasma turbulence code used to simulate turbulent dynamics in the tokamak scrape-off layer (SOL), superseding the code presented by Ricci et al. (2012) [14]. The present work is driven by the objective of studying SOL turbulent dynamics in medium-size tokamaks and beyond with a high-fidelity physics model. We emphasize an intertwined framework of improved physics models and the computational improvements that enable them. The model extensions include neutral atom physics, finite ion temperature, the addition of a closed-field-line region, and a non-Boussinesq treatment of the polarization drift. GBS has been completely refactored with the introduction of a 3-D Cartesian communicator and a scalable parallel multigrid solver. We report dramatically enhanced parallel scalability, with the possibility of treating electromagnetic fluctuations very efficiently. The method of manufactured solutions has been carried out as a verification process for this new code version, demonstrating the correct implementation of the physical model.
Nuclear spectroscopy with Geant4. The superheavy challenge
NASA Astrophysics Data System (ADS)
Sarmiento, Luis G.
2016-12-01
The simulation toolkit Geant4 was originally developed at CERN for high-energy physics. Over the years it has established itself as a Swiss Army knife not only in particle physics but has also seen an accelerated expansion towards nuclear physics and, more recently, medical imaging and γ- and ion-therapy, to mention but a handful of new applications. Geant4 has been validated across a vast range of particles, ions, materials, and physical processes, typically with several different models to choose from. Unfortunately, atomic nuclei with atomic number Z > 100 are not properly supported. This is likely due to the relative novelty of the field, its comparably small user base, and scarce evaluated experimental data. To circumvent this situation, different workarounds have been used over the years. In this work the simulation toolkit Geant4 will be introduced with its different components, and the effort to bring the software to the heavy and superheavy region will be described.
Physical mechanisms of solar activity effects in the middle atmosphere
NASA Technical Reports Server (NTRS)
Ebel, A.
1989-01-01
A great variety of physical mechanisms for possibly solar-induced variations in the middle atmosphere has been discussed in the literature during the last decades. The views that have been put forward are often controversial in their physical consequences. The reason may be the complexity and non-linearity of the atmospheric response to the comparatively weak forcing resulting from solar activity. This review therefore focuses on aspects that seem to indicate nonlinear processes in the development of solar-induced variations. Results from observations and numerical simulations are discussed.
An Introduction to Simulated Annealing
ERIC Educational Resources Information Center
Albright, Brian
2007-01-01
An attempt to model the physical process of annealing led to the development of a type of combinatorial optimization algorithm that addresses the problem of getting trapped in a local minimum. The author presents a Microsoft Excel spreadsheet that illustrates how this works.
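The annealing analogy can be sketched in a few lines of code. The following is a minimal, generic simulated-annealing loop (not the author's spreadsheet implementation); the cost function, neighbor move, and cooling schedule are all illustrative choices:

```python
import math, random

def simulated_annealing(cost, neighbor, x0, t0=1.0, cooling=0.995, steps=5000):
    """Minimize `cost` starting from x0, occasionally accepting uphill moves."""
    x, fx, t = x0, cost(x0), t0
    best, fbest = x, fx
    for _ in range(steps):
        y = neighbor(x)
        fy = cost(y)
        # Accept downhill moves always; accept uphill moves with Boltzmann
        # probability exp(-dE/t), which lets the search escape local minima
        # while the temperature t is still high.
        if fy < fx or random.random() < math.exp((fx - fy) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling  # geometric cooling schedule
    return best, fbest

# Toy usage: minimize a 1-D function with several local minima.
random.seed(0)
f = lambda x: x * x + 10 * math.sin(x)
x, fx = simulated_annealing(f, lambda x: x + random.uniform(-0.5, 0.5), 5.0)
```

The key line is the Metropolis acceptance test: as the temperature cools, the algorithm gradually turns into a greedy local search, which is exactly the mechanism that lets it avoid getting permanently trapped in a poor local minimum early on.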
NASA Astrophysics Data System (ADS)
Siahaan, P.; Suryani, A.; Kaniawati, I.; Suhendi, E.; Samsudin, A.
2017-02-01
The purpose of this research is to identify the development of students' science process skills (SPS) on the linear motion concept by utilizing simple computer simulations. In order to simplify the learning process, the concept is divided into three sub-concepts: 1) the definition of motion, 2) uniform linear motion and 3) uniformly accelerated motion. This research was administered via a pre-experimental method with a one-group pretest-posttest design. The respondents involved in this research were 23 seventh-grade students in a junior high school in Bandung City. The improvement of students' science process skills is examined based on normalized gain analysis of pretest and posttest scores for all sub-concepts. The results show that students' science process skills improved by 47% (moderate) on observation skill, 43% (moderate) on summarizing skill, 70% (high) on prediction skill, 44% (moderate) on communication skill and 49% (moderate) on classification skill. These results indicate that utilizing simple computer simulations in physics learning can improve overall science process skills at a moderate level.
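The normalized (Hake) gain behind these percentages is simple to compute. A small sketch, using hypothetical class-average scores rather than the study's data, with the conventional 0.3 and 0.7 cut-offs for the low/moderate/high categories:

```python
def normalized_gain(pre, post, max_score=100.0):
    """Hake's normalized gain <g> = (post - pre) / (max - pre)."""
    return (post - pre) / (max_score - pre)

def gain_category(g):
    # Conventional Hake thresholds: low < 0.3 <= moderate < 0.7 <= high
    if g < 0.3:
        return "low"
    return "moderate" if g < 0.7 else "high"

# Hypothetical class-average scores (not the study's raw data):
g = normalized_gain(pre=40.0, post=68.0)
category = gain_category(g)
```

Expressing improvement as a fraction of the achievable headroom (rather than as a raw score difference) is what allows fair comparison across sub-concepts with different pretest baselines.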
Modeling radionuclide migration from underground nuclear explosions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harp, Dylan Robert; Stauffer, Philip H.; Viswanathan, Hari S.
2017-03-06
The travel time of radionuclide gases to the ground surface in fractured rock depends on many complex factors. Numerical simulators are the most complete repositories of knowledge of the complex processes governing radionuclide gas migration to the ground surface, allowing us to verify conceptualizations of physical processes against observations and to forecast radionuclide gas travel times to the ground surface and isotopic ratios.
ERIC Educational Resources Information Center
Pretelín-Ricárdez, Angel; Sacristán, Ana Isabel
2015-01-01
We present some results of an ongoing research project where university engineering students were asked to construct videogames involving the use of physical systems models. The objective is to help them identify and understand the elements and concepts involved in the modelling process. That is, we use game design as a constructionist approach…
Mathur, Rohit; Xing, Jia; Gilliam, Robert; Sarwar, Golam; Hogrefe, Christian; Pleim, Jonathan; Pouliot, George; Roselle, Shawn; Spero, Tanya L.; Wong, David C.; Young, Jeffrey
2018-01-01
The Community Multiscale Air Quality (CMAQ) modeling system is extended to simulate ozone, particulate matter, and related precursor distributions throughout the Northern Hemisphere. Modelled processes were examined and enhanced to suitably represent the extended space and time scales for such applications. Hemispheric scale simulations with CMAQ and the Weather Research and Forecasting (WRF) model are performed for multiple years. Model capabilities for a range of applications including episodic long-range pollutant transport, long-term trends in air pollution across the Northern Hemisphere, and air pollution-climate interactions are evaluated through detailed comparison with available surface, aloft, and remotely sensed observations. The expansion of CMAQ to simulate the hemispheric scales provides a framework to examine interactions between atmospheric processes occurring at various spatial and temporal scales with physical, chemical, and dynamical consistency. PMID:29681922
Atmospheric Modeling And Sensor Simulation (AMASS) study
NASA Technical Reports Server (NTRS)
Parker, K. G.
1984-01-01
The capabilities of the atmospheric modeling and sensor simulation (AMASS) system were studied in order to enhance them. This system is used to process atmospheric measurements for evaluating sensor performance, conducting design-concept simulation studies, and modeling the physical and dynamical nature of atmospheric processes. Tasks proposed to enhance AMASS system utilization and to integrate the AMASS system with other existing equipment, facilitating the analysis of data for modeling and image processing, are enumerated. The following array processors were evaluated for anticipated effectiveness and/or improvements in throughput by attachment of the device to the P-e: (1) Floating Point Systems AP-120B; (2) Floating Point Systems 5000; (3) CSP, Inc. MAP-400; (4) Analogic AP500; (5) Numerix MARS-432; and (6) Star Technologies, Inc. ST-100.
Predictive Models for Semiconductor Device Design and Processing
NASA Technical Reports Server (NTRS)
Meyyappan, Meyya; Arnold, James O. (Technical Monitor)
1998-01-01
The device feature size continues on a downward trend, with a simultaneous upward trend in wafer size to 300 mm. For this reason, predictive models are needed more than ever before. At NASA Ames, a Device and Process Modeling effort has recently been initiated with a view to addressing these issues. Our activities cover sub-micron device physics, process and equipment modeling, computational chemistry and materials science. This talk outlines these efforts and emphasizes the interaction among the various components. The device physics component is largely based on integrating quantum effects into device simulators. We have two parallel efforts: one based on a quantum mechanics approach and the second on a semiclassical hydrodynamics approach with quantum correction terms. Under the first approach, three different quantum simulators are being developed and compared: a nonequilibrium Green's function (NEGF) approach, a Wigner function approach, and a density matrix approach. In this talk, results using the various codes will be presented. Our process modeling work focuses primarily on epitaxy and etching, using first-principles models coupling reactor-level and wafer-level features. For the latter, we are using a novel approach based on Level Set theory. Sample results from this effort will also be presented.
NASA Astrophysics Data System (ADS)
Inam, Azhar; Adamowski, Jan; Prasher, Shiv; Halbe, Johannes; Malard, Julien; Albano, Raffaele
2017-08-01
Many simulation models focus on simulating a single physical process and do not constitute balanced representations of the physical, social and economic components of a system. The present study addresses this challenge by integrating a physical (P) model (SAHYSMOD) with a group (stakeholder) built system dynamics model (GBSDM) through a component modeling approach based on widely applied tools such as MS Excel, Python and Visual Basic for Applications (VBA). The coupled model (P-GBSDM) was applied to test soil salinity management scenarios (proposed by stakeholders) for the Haveli region of the Rechna Doab Basin in Pakistan. Scenarios such as water banking, vertical drainage, canal lining, and irrigation water reallocation were simulated with the integrated model. Spatiotemporal maps and economic and environmental trade-off criteria were used to examine the effectiveness of the selected management scenarios. After 20 years of simulation, canal lining reduced soil salinity by 22% but caused an initial reduction of 18% in farm income, which requires an initial investment from the government. The government-sponsored Salinity Control and Reclamation Project (SCARP) is a short-term policy that resulted in a 37% increase in water availability with a 12% increase in farmer income. However, it showed detrimental effects on soil salinity in the long term, with a 21% increase in soil salinity due to secondary salinization. The new P-GBSDM was shown to be an effective platform for engaging stakeholders and simulating their proposed management policies while taking into account socioeconomic considerations. This was not possible using the physically based SAHYSMOD model alone.
NASA Technical Reports Server (NTRS)
Strahan, Susan E.; Douglass, Anne R.
2003-01-01
The Global Modeling Initiative has integrated two 35-year simulations of an ozone recovery scenario with an offline chemistry and transport model using two different meteorological inputs. Physically based diagnostics, derived from satellite and aircraft data sets, are described and then used to evaluate the realism of temperature and transport processes in the simulations. Processes evaluated include barrier formation in the subtropics and polar regions, and extratropical wave-driven transport. Some diagnostics are especially relevant to simulation of lower stratospheric ozone, but most are applicable to any stratospheric simulation. The temperature evaluation, which is relevant to gas phase chemical reactions, showed that both sets of meteorological fields have near climatological values at all latitudes and seasons at 30 hPa and below. Both simulations showed weakness in upper stratospheric wave driving. The simulation using input from a general circulation model (GMI(sub GCM)) showed a very good residual circulation in the tropics and northern hemisphere. The simulation with input from a data assimilation system (GMI(sub DAS)) performed better in the midlatitudes than at high latitudes. Neither simulation forms a realistic barrier at the vortex edge, leading to uncertainty in the fate of ozone-depleted vortex air. Overall, tracer transport in the offline GMI(sub GCM) has greater fidelity throughout the stratosphere than the GMI(sub DAS).
NASA Astrophysics Data System (ADS)
Nowak, W.; Koch, J.
2014-12-01
Predicting DNAPL fate and transport in heterogeneous aquifers is challenging and subject to an uncertainty that needs to be quantified. Models for this task need to be equipped with an accurate source zone description, i.e., the distribution of mass of all partitioning phases (DNAPL, water, and soil) in all possible states ((im)mobile, dissolved, and sorbed), mass-transfer algorithms, and the simulation of transport processes in the groundwater. Such detailed models tend to be computationally cumbersome when used for uncertainty quantification. Therefore, a selective choice of the relevant model states, processes, and scales is both delicate and indispensable. We investigate the questions: what is a meaningful level of model complexity, and how can one obtain an efficient model framework that is still physically and statistically consistent? In our proposed model, aquifer parameters and the contaminant source architecture are conceptualized jointly as random space functions. The governing processes are simulated in a three-dimensional, highly resolved, stochastic, and coupled model that can predict probability density functions of mass discharge and source depletion times. We apply a stochastic percolation approach as an emulator to simulate contaminant source formation, a random-walk particle-tracking method to simulate DNAPL dissolution and solute transport within the aqueous phase, and a quasi-steady-state approach to solve for DNAPL depletion times. Using this novel model framework, we test whether, and to what degree, the desired model predictions are sensitive to simplifications often found in the literature. We find that aquifer heterogeneity, groundwater flow irregularity, uncertain and physically based contaminant source zones, and their mutual interlinkages are indispensable components of a sound model framework.
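As an illustration of the random-walk particle-tracking ingredient, here is a minimal 1-D sketch (parameters are invented, not the study's model): each particle is advected by the mean velocity and perturbed by a Gaussian step whose variance matches the dispersion coefficient, so the ensemble approximates the advection-dispersion equation.

```python
import random

def particle_tracking(n_particles=10000, steps=200, v=1.0, D=0.1, dt=0.05, seed=1):
    """1-D random walk: x += v*dt + sqrt(2*D*dt) * N(0, 1) per particle per step.
    The particle ensemble approximates advection-dispersion with velocity v
    and dispersion coefficient D."""
    rng = random.Random(seed)
    sigma = (2.0 * D * dt) ** 0.5
    xs = [0.0] * n_particles
    for _ in range(steps):
        xs = [x + v * dt + rng.gauss(0.0, 1.0) * sigma for x in xs]
    return xs

xs = particle_tracking()
mean = sum(xs) / len(xs)                            # ~ v * steps * dt
var = sum((x - mean) ** 2 for x in xs) / len(xs)    # ~ 2 * D * steps * dt
```

The plume centroid should sit near v·t = 10 and the plume variance near 2·D·t = 2, which is the standard consistency check for this class of methods.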
Correlation models for waste tank sludges and slurries
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mahoney, L.A.; Trent, D.S.
This report presents the results of work conducted to support the TEMPEST computer modeling under the Flammable Gas Program (FGP) and to further the comprehension of the physical processes occurring in the Hanford waste tanks. The end products of this task are correlation models (sets of algorithms) that can be added to the TEMPEST computer code to improve the reliability of its simulation of the physical processes that occur in Hanford tanks. The correlation models can be used to augment not only the TEMPEST code but also other computer codes that can simulate sludge motion and flammable gas retention. This report presents the correlation models, also termed submodels, that have been developed to date. The submodel-development process is an ongoing effort designed to increase our understanding of sludge behavior and improve our ability to realistically simulate the sludge fluid characteristics that have an impact on safety analysis. The effort has employed both literature searches and data correlation to provide an encyclopedia of tank waste properties in forms that are relatively easy to use in modeling waste behavior. These property submodels will be used in other tasks to simulate waste behavior in the tanks. Density, viscosity, yield strength, surface tension, heat capacity, thermal conductivity, salt solubility, and ammonia and water vapor pressures were compiled for solutions and suspensions of sodium nitrate and other salts (where data were available), and the data were correlated by linear regression. In addition, data for simulated Hanford waste tank supernatant were correlated to provide density, solubility, surface tension, and vapor pressure submodels for multi-component solutions containing sodium hydroxide, sodium nitrate, sodium nitrite, and sodium aluminate.
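The property correlation described above amounts to ordinary least-squares fitting of measured properties against composition. A minimal sketch with hypothetical density data (the numbers are invented, not values from the report):

```python
def linear_fit(xs, ys):
    """Ordinary least-squares fit y = a + b*x, the same form used when
    property data are correlated by linear regression."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

# Hypothetical data: solution density (g/mL) vs. salt mass fraction.
w = [0.00, 0.05, 0.10, 0.15, 0.20]
rho = [0.998, 1.032, 1.067, 1.103, 1.139]
a, b = linear_fit(w, rho)  # density submodel: rho ~ a + b*w
```

A fitted pair (a, b) is exactly the kind of "submodel" that a flow code such as TEMPEST can evaluate cheaply at run time in place of a property lookup table.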
Massively Parallel Processing for Fast and Accurate Stamping Simulations
NASA Astrophysics Data System (ADS)
Gress, Jeffrey J.; Xu, Siguang; Joshi, Ramesh; Wang, Chuan-tao; Paul, Sabu
2005-08-01
The competitive automotive market drives automotive manufacturers to speed up vehicle development cycles and reduce lead time. Fast tooling development is one of the key areas supporting fast and short vehicle development programs (VDPs). In the past ten years, stamping simulation has become the most effective validation tool for predicting and resolving potential formability and quality problems before the dies are physically made. Stamping simulation and formability analysis have become a critical business segment in the GM math-based die engineering process. As simulation becomes one of the major production tools in the engineering factory, simulation speed and accuracy are two of the most important measures of stamping simulation technology. The speed and time-in-system of forming analysis become even more critical to support fast VDPs and tooling readiness. Since 1997, the General Motors Die Center has been working jointly with our software vendor to develop and implement a parallel version of the simulation software for mass-production analysis applications. By 2001, this technology had matured in the form of distributed memory processing (DMP) of draw die simulations in a networked distributed-memory computing environment. In 2004, this technology was refined to massively parallel processing (MPP) and extended to line die forming analysis (draw, trim, flange, and associated spring-back) running on a dedicated computing environment. The evolution of this technology and the insight gained through the implementation of DMP/MPP technology, as well as performance benchmarks, are discussed in this publication.
Physically representative atomistic modeling of atomic-scale friction
NASA Astrophysics Data System (ADS)
Dong, Yalin
Nanotribology is a research field studying friction, adhesion, wear and lubrication occurring between two sliding interfaces at the nanoscale. This study is motivated by the demanding need to miniaturize mechanical components in Micro Electro Mechanical Systems (MEMS), to improve durability in magnetic storage systems, and by other industrial applications. Overcoming tribological failure and finding ways to control friction at small scales have become keys to commercializing MEMS with sliding components, as well as to stimulating the technological innovation associated with the development of MEMS. In addition to the industrial applications, such research is also scientifically fascinating because it opens a door to understanding macroscopic friction from the atomic level up, and therefore serves as a bridge between science and engineering. This thesis focuses on solid/solid atomic friction and its associated energy dissipation through theoretical analysis, atomistic simulation, transition state theory, and close collaboration with experimentalists. Reduced-order models have many advantages owing to their simplicity and their capacity to simulate long-time events. We apply Prandtl-Tomlinson models and their extensions to interpret dry atomic-scale friction. We begin with the fundamental equations and build on them step by step, from the simple quasistatic one-spring, one-mass model for predicting transitions between friction regimes to the two-dimensional and multi-atom models for describing the effect of contact area. Theoretical analysis, numerical implementation, and predicted physical phenomena are all discussed. In the process, we demonstrate the significant potential for this approach to yield new fundamental understanding of atomic-scale friction. The importance of atomistic modeling in the investigation of atomic friction can hardly be overemphasized, since each single atom can play a significant role yet is hard to capture experimentally.
In atomic friction, the interesting physical process is buried between the two contact interfaces, which makes direct measurement difficult. Atomistic simulation is able to reproduce the process with the dynamic information of each single atom, and therefore provides valuable interpretations for experiments. Here, we systematically apply Molecular Dynamics (MD) simulation to optimally model the Atomic Force Microscopy (AFM) measurement of atomic friction. Furthermore, we employ MD simulation to correlate the atomic dynamics with the friction behavior observed in experiments. For instance, ParRep dynamics (an accelerated molecular dynamics technique) is introduced to investigate the velocity dependence of atomic friction; we also employ MD simulation to "see" how the reconstruction of the gold surface modulates friction, and to examine the friction enhancement mechanism at a graphite step edge. Atomic stick-slip friction can be treated as a rate process. Instead of running a direct simulation of the process, we can apply transition state theory to predict its properties. We present a rigorous derivation of the velocity and temperature dependence of friction based on the Prandtl-Tomlinson model as well as transition state theory, obtaining a more accurate relation for predicting velocity and temperature dependence. Furthermore, we include the instrumental noise inherent in AFM measurements to interpret two experimental discoveries: the suppression of friction at low temperature, and the attempt-frequency discrepancy between AFM measurement and theoretical prediction. We also discuss the possibility of treating wear as a rate process.
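The quasistatic one-spring, one-mass Prandtl-Tomlinson model can be sketched compactly. The parameters below are illustrative, not from the thesis; with the corrugation parameter 2π²V0/(ka²) > 1, dragging the support produces the characteristic sawtooth stick-slip force trace.

```python
import math

def pt_friction_trace(V0=0.5, k=1.0, a=1.0, n_steps=400, span=3.0):
    """Quasistatic Prandtl-Tomlinson model. Total energy:
        E(x) = -(V0/2) * cos(2*pi*x/a) + (k/2) * (x - x_s)**2,
    i.e. a sinusoidal surface potential plus the support spring. The tip
    follows a local minimum as the support x_s advances; the lateral
    (friction) force on the spring is k * (x_s - x)."""
    x = 0.0
    forces = []
    for i in range(n_steps):
        xs = span * i / n_steps
        # Relax the tip to the nearest local energy minimum (damped descent).
        for _ in range(2000):
            grad = (math.pi * V0 / a) * math.sin(2 * math.pi * x / a) + k * (x - xs)
            x -= 0.01 * grad
        forces.append(k * (xs - x))
    return forces

f = pt_friction_trace()
mean_friction = sum(f) / len(f)  # positive mean lateral force = kinetic friction
```

The force climbs while the tip is stuck, then drops abruptly when the local minimum it occupies becomes unstable and the tip slips forward by one lattice spacing; that instability is precisely the barrier-crossing event treated as a rate process in the transition-state analysis above.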
Modeling physical vapor deposition of energetic materials
Shirvan, Koroush; Forrest, Eric C.
2018-03-28
Morphology and microstructure of organic explosive films formed using physical vapor deposition (PVD) processes depend strongly on the local surface temperature during deposition. Currently, there is no accurate means of quantifying the local surface temperature during PVD processes in the deposition chambers. This study focuses on using a multiphysics computational fluid dynamics tool, STARCCM+, to simulate pentaerythritol tetranitrate (PETN) deposition. The PETN vapor and solid phases were simulated using the volume-of-fluid method, and deposition in the vacuum chamber on spinning silicon wafers was modeled. The model also included the spinning copper cooling block where the wafers are placed, along with the chiller operating with forced-convection refrigerant. Implicit time-dependent simulations in two and three dimensions were performed to derive insights into the governing physics of PETN thin-film formation. PETN is deposited at a rate of 14 nm/s at 142.9 °C on a wafer with an initial temperature of 22 °C. The deposition of PETN on the wafers was calculated at an assumed heat transfer coefficient (HTC) of 400 W/(m²K). This HTC proved to be the most sensitive parameter in determining the local surface temperature during deposition. Previous experimental work found noticeable microstructural changes with 0.5 mm fused silica wafers in place of silicon during PETN deposition. This work showed that fused silica slows the initial wafer cool-down and results in a ~10 °C difference in surface temperature at 500 μm PETN film thickness. It was also found that the deposition surface temperature is insensitive to the cooling power of the copper block, due to the copper block's very large heat capacity and thermal conductivity relative to the heat input from the PVD process. Future work should incorporate the addition of local stress during PETN deposition.
Lastly, based on the simulation results, it is also recommended to investigate the impact of wafer surface energy on the PETN microstructure and morphology formation.
Li, Junli; Li, Chunyan; Qiu, Rui; Yan, Congchong; Xie, Wenzhang; Wu, Zhen; Zeng, Zhi; Tung, Chuanjong
2015-09-01
The method of Monte Carlo simulation is a powerful tool to investigate the details of radiation-induced biological damage at the molecular level. In this paper, a Monte Carlo code called NASIC (Nanodosimetry Monte Carlo Simulation Code) was developed. It includes a physical module, a pre-chemical module, a chemical module, a geometric module and a DNA damage module. The physical module can simulate the physical tracks of low-energy electrons in liquid water event by event. More than one set of inelastic cross sections was calculated by applying the dielectric function method of Emfietzoglou's optical-data treatments, with different optical data sets and dispersion models. In the pre-chemical module, the ionised and excited water molecules undergo dissociation processes. In the chemical module, the produced radiolytic chemical species diffuse and react. In the geometric module, an atomic model of 46 chromatin fibres in a spherical nucleus of a human lymphocyte was established. In the DNA damage module, the direct damage induced by the energy depositions of the electrons and the indirect damage induced by the radiolytic chemical species were calculated. The parameters were adjusted so that the simulation results agree with experimental results. In this paper, the influence of the inelastic cross sections and the vibrational excitation reaction on these parameters and on the DNA strand break yields was studied. Further work on NASIC is underway.
Simulation of spatial and temporal properties of aftershocks by means of the fiber bundle model
NASA Astrophysics Data System (ADS)
Monterrubio-Velasco, Marisol; Zúñiga, F. R.; Márquez-Ramírez, Victor Hugo; Figueroa-Soto, Angel
2017-11-01
The rupture processes of any heterogeneous material constitute a complex physical problem. Earthquake aftershocks show temporal and spatial behaviors which are a consequence of the heterogeneous stress distribution and multiple rupturing following the main shock. This process is difficult to model deterministically due to the number of parameters and physical conditions, which are largely unknown. In order to shed light on the minimum requirements for the generation of aftershock clusters, in this study we simulate the main features of such a complex process by means of a fiber bundle (FB) type model. The FB model has been widely used to analyze the fracture process in heterogeneous materials. It is a simple but powerful tool that allows modeling the main characteristics of a medium such as the brittle shallow crust of the Earth. In this work, we incorporate spatial properties, such as the Coulomb stress change pattern, which help simulate observed characteristics of aftershock sequences. In particular, we introduce a parameter (P) that controls the probability of the spatial distribution of initial loads. We also use a "conservation" parameter (π), which accounts for the load dissipation of the system, and demonstrate its influence on the simulated spatio-temporal patterns. Based on numerical results, we find that P has to be in the range 0.06 < P < 0.30, whilst π needs to be limited to a very narrow range (0.60 < π < 0.66), in order to reproduce aftershock pattern characteristics that resemble those of observed sequences. This means that the system requires a small difference in the spatial distribution of initial stress, and a very particular fraction of load transfer, in order to generate realistic aftershocks.
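A minimal equal-load-sharing fiber bundle sketch conveys the basic mechanics (quasi-static driving, partial load redistribution, avalanches). The parameters, and the crude stand-ins for P (a fraction of fibers with extra initial load) and π (the redistributed fraction of a broken fiber's load), are purely illustrative, not the authors' implementation:

```python
import random

def fiber_bundle(n=2000, pi_cons=0.63, p_init=0.15, seed=2):
    """Equal-load-sharing fiber bundle sketch. Each fiber gets a random
    strength; a fraction p_init starts with extra load (a crude stand-in for
    the spatial-probability parameter P). When fibers break, a fraction
    pi_cons of their load is shared equally among survivors and 1 - pi_cons
    is dissipated (mimicking the conservation parameter pi). The sizes of
    the resulting breaking avalanches are recorded."""
    rng = random.Random(seed)
    strength = [rng.random() for _ in range(n)]
    load = [0.30 + (0.25 if rng.random() < p_init else 0.0) for _ in range(n)]
    alive = set(range(n))
    avalanches = []
    while alive:
        broken = [i for i in alive if load[i] >= strength[i]]
        if broken:
            avalanches.append(len(broken))
            shed = pi_cons * sum(load[i] for i in broken)
            alive.difference_update(broken)
            if alive:
                share = shed / len(alive)
                for i in alive:
                    load[i] += share
        else:
            # Quasi-static external driving: raise the load slightly until
            # the weakest surviving fiber fails, triggering the next event.
            for i in alive:
                load[i] += 0.01
    return avalanches

avs = fiber_bundle()
```

The distribution of avalanche sizes, and how it changes with the dissipation fraction, is the kind of statistic the study compares against observed aftershock sequences.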
NASA Astrophysics Data System (ADS)
Shi, Y.; Davis, K. J.; Eissenstat, D. M.; Kaye, J. P.; Duffy, C.; Yu, X.; He, Y.
2014-12-01
Belowground carbon processes are affected by soil moisture and soil temperature, but current biogeochemical models are 1-D: they cannot resolve topographically driven hill-slope soil moisture patterns, and cannot simulate the nonlinear effects of soil moisture on carbon processes. Coupling spatially distributed, physically-based hydrologic models with biogeochemical models may yield significant improvements in the representation of topographic influence on belowground C processes. We will couple the Flux-PIHM model to the Biome-BGC (BBGC) model. Flux-PIHM is a coupled physically-based land surface hydrologic model, which incorporates a land-surface scheme into the Penn State Integrated Hydrologic Model (PIHM). The land surface scheme is adapted from the Noah land surface model. Because PIHM is capable of simulating lateral water flow and deep groundwater, Flux-PIHM is able to represent the link between groundwater and the surface energy balance, as well as the land surface heterogeneities caused by topography. The coupled Flux-PIHM-BBGC model will be tested at the Susquehanna/Shale Hills critical zone observatory (SSHCZO). The abundant observations, including eddy covariance fluxes, soil moisture, groundwater level, sap flux, stream discharge, litterfall, leaf area index, above-ground carbon stock, and soil carbon efflux, make SSHCZO an ideal test bed for the coupled model. In the coupled model, each Flux-PIHM model grid cell will be coupled to a BBGC cell. Flux-PIHM will provide BBGC with soil moisture and soil temperature information, while BBGC provides Flux-PIHM with leaf area index. Preliminary results show that when Biome-BGC is driven by the PIHM-simulated soil moisture pattern, the simulated soil carbon is clearly impacted by topography.
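The exchange described above (soil moisture and temperature flowing one way, leaf area index flowing back) is a classic two-way coupling loop. A toy sketch with made-up stand-in functions, purely to illustrate the data flow, not the Flux-PIHM or Biome-BGC physics:

```python
def hydro_step(lai, forcing):
    """Stand-in hydrologic step: more leaf area -> more transpiration -> drier soil."""
    soil_moisture = max(0.05, 0.8 * forcing["rain"] - 0.02 * lai)
    soil_temp = forcing["air_temp"] - 2.0
    return soil_moisture, soil_temp

def carbon_step(soil_moisture, soil_temp, lai):
    """Stand-in carbon step: growth (hence LAI) responds to moisture and warmth."""
    growth = 0.1 * soil_moisture * max(0.0, soil_temp)
    return max(0.1, lai + growth - 0.01 * lai)

# Coupling loop: at each step, each component consumes the other's latest output.
lai = 1.0
for day in range(30):
    soil_moisture, soil_temp = hydro_step(lai, {"rain": 0.5, "air_temp": 15.0})
    lai = carbon_step(soil_moisture, soil_temp, lai)
```

The essential point is the feedback: because drier soil slows growth and denser canopy dries the soil, the coupled system settles toward a state that neither model would find in isolation.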
A Distributed Snow Evolution Modeling System (SnowModel)
NASA Astrophysics Data System (ADS)
Liston, G. E.; Elder, K.
2004-12-01
A spatially distributed snow-evolution modeling system (SnowModel) has been specifically designed to be applicable over a wide range of snow landscapes, climates, and conditions. To reach this goal, SnowModel is composed of four sub-models: MicroMet defines the meteorological forcing conditions, EnBal calculates surface energy exchanges, SnowMass simulates snow depth and water-equivalent evolution, and SnowTran-3D accounts for snow redistribution by wind. While other distributed snow models exist, SnowModel is unique in that it includes a well-tested blowing-snow sub-model (SnowTran-3D) for application in windy arctic, alpine, and prairie environments where snowdrifts are common. These environments comprise 68% of the seasonally snow-covered Northern Hemisphere land surface. SnowModel also accounts for snow processes occurring in forested environments (e.g., canopy interception related processes). SnowModel is designed to simulate snow-related physical processes occurring at spatial scales of 5-m and greater, and temporal scales of 1-hour and greater. These include: accumulation from precipitation; wind redistribution and sublimation; loading, unloading, and sublimation within forest canopies; snow-density evolution; and snowpack ripening and melt. To enhance its wide applicability, SnowModel includes the physical calculations required to simulate snow evolution within each of the global snow classes defined by Sturm et al. (1995), e.g., tundra, taiga, alpine, prairie, maritime, and ephemeral snow covers. The three, 25-km by 25-km, Cold Land Processes Experiment (CLPX) mesoscale study areas (MSAs: Fraser, North Park, and Rabbit Ears) are used as SnowModel simulation examples to highlight model strengths, weaknesses, and features in forested, semi-forested, alpine, and shrubland environments.
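As a schematic of the accumulation/melt bookkeeping such sub-models perform, here is a deliberately crude degree-day sketch. SnowModel itself resolves melt with a full surface energy balance (EnBal); the rain/snow threshold, melt factor, and forcing below are invented for illustration only.

```python
def snowpack_swe(precip_mm, temp_c, ddf=3.0, t_rain=1.0):
    """Toy snow water-equivalent (SWE) bookkeeping: precipitation falls as
    snow below t_rain (deg C); melt is a degree-day term ddf * max(T, 0),
    a crude stand-in for an energy-balance melt calculation."""
    swe, series = 0.0, []
    for p, t in zip(precip_mm, temp_c):
        if t < t_rain:
            swe += p                                  # accumulation from precipitation
        swe = max(0.0, swe - ddf * max(t, 0.0))       # degree-day melt, SWE >= 0
        series.append(swe)
    return series

# Hypothetical daily forcing: a cold snowy week followed by a warm melt week.
s = snowpack_swe([5.0] * 7 + [0.0] * 7, [-5.0] * 7 + [4.0] * 7)
```

In a distributed model this bookkeeping runs in every grid cell, with the forcing fields (and, in SnowModel's case, wind redistribution and canopy terms) varying across the domain.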
Air-cooling characteristics of simulated grape packages
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frederick, R.L.; Comunian, F.
Experimental simulation of the external forced convection on the outside of grape packages was performed. Average heat transfer coefficients for air flow around such containers were found to range from 8 to 13.4 W/(m²·K). A physical description of the convective process was formulated on the basis of data obtained in three types of experiments. Expressions for the average heat transfer coefficient from single packages in air flow were proposed.
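As a worked example of using the reported coefficient range, Newton's law of cooling gives the convective heat rate q = h·A·(T_surface - T_air). The package area and temperatures below are invented for illustration; only the h range comes from the abstract.

```python
# Convective heat rate from the measured coefficient range (illustrative).
def convective_heat_rate(h, area_m2, t_surface, t_air):
    """Newton's law of cooling: heat rate in watts."""
    return h * area_m2 * (t_surface - t_air)

# cooling a warm package (20 C surface) with 0 C air, for the reported h range
low = convective_heat_rate(8.0, 0.2, 20.0, 0.0)    # lower bound of h
high = convective_heat_rate(13.4, 0.2, 20.0, 0.0)  # upper bound of h
```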
Use of Monte-Carlo Simulations in Polyurethane Polymerization Processes
1995-11-01
…situations, the mechanisms of molecular species diffusion must be considered. Gupta et al. (Ref. 10) have demonstrated the use of Monte-Carlo simulations in…
Evaluation of Resuspension from Propeller Wash in Pearl Harbor and San Diego Bay
2014-07-01
…water cleanup plans, or total maximum daily loads (TMDLs), must be developed to bring the water body back into compliance. Under the Comprehensive… Waterways Experiment Station (Johnson et al., 1991) to simulate physical processes in bays, rivers, lakes and estuaries (Wang and Martin, 1991; … Wang, 1992; Wang and McCutcheon, 1993; Wang et al., 1997, 1998; Johnson et al., 1995). The model simulates hydrodynamic currents in four dimensions (x…
Modelling and Simulation in the Design Process of Armored Vehicles
2003-03-01
…trackway conditions is a demanding optimization task. Basically, a high level of ride comfort requires soft suspension tuning, whereas driving safety relies… The maximum off-road speed is generally limited by traction, input torque, driving safety and ride comfort. When obstacles are to be negotiated, the… wheel travel was defined during the mobility simulation runs. Figure 14: Ramp 1.5 m at 40 kph; virtual and physical prototype. Driving safety and ride…
Video Guidance, Landing, and Imaging system (VGLIS) for space missions
NASA Technical Reports Server (NTRS)
Schappell, R. T.; Knickerbocker, R. L.; Tietz, J. C.; Grant, C.; Flemming, J. C.
1975-01-01
The feasibility of an autonomous video guidance system that is capable of observing a planetary surface during terminal descent and selecting the most acceptable landing site was demonstrated. The system was breadboarded and "flown" on a physical simulator consisting of a control panel and monitor, a dynamic simulator, and a PDP-9 computer. The breadboard VGLIS consisted of an image dissector camera and the appropriate processing logic. Results are reported.
Bernal, M A; Bordage, M C; Brown, J M C; Davídková, M; Delage, E; El Bitar, Z; Enger, S A; Francis, Z; Guatelli, S; Ivanchenko, V N; Karamitros, M; Kyriakou, I; Maigne, L; Meylan, S; Murakami, K; Okada, S; Payno, H; Perrot, Y; Petrovic, I; Pham, Q T; Ristic-Fira, A; Sasaki, T; Štěpán, V; Tran, H N; Villagrasa, C; Incerti, S
2015-12-01
Understanding the fundamental mechanisms involved in the induction of biological damage by ionizing radiation remains a major challenge of today's radiobiology research. The Monte Carlo simulation of physical, physicochemical and chemical processes involved may provide a powerful tool for the simulation of early damage induction. The Geant4-DNA extension of the general purpose Monte Carlo Geant4 simulation toolkit aims to provide the scientific community with an open source access platform for the mechanistic simulation of such early damage. This paper presents the most recent review of the Geant4-DNA extension, as available to Geant4 users since June 2015 (release 10.2 Beta). In particular, the review includes new physical models describing electron elastic and inelastic interactions in liquid water, as well as new examples dedicated to the simulation of the physicochemical and chemical stages of water radiolysis. Several implementations of geometrical models of biological targets are presented as well, and the full list of Geant4-DNA examples is given. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
Finite element simulation of crack depth measurements in concrete using diffuse ultrasound
NASA Astrophysics Data System (ADS)
Seher, Matthias; Kim, Jin-Yeon; Jacobs, Laurence J.
2012-05-01
This research simulates the measurements of crack depth in concrete using diffuse ultrasound. The finite element method is employed to simulate the ultrasonic diffusion process around cracks with different geometrical shapes, with the goal of gaining physical insight into the data obtained from experimental measurements. The commercial finite element software Ansys is used to implement the two-dimensional concrete model. The model is validated with an analytical solution and experimental results. It is found from the simulation results that preliminary knowledge of the crack geometry is required to interpret the energy evolution curves from measurements and to correctly determine the crack depth.
Effects of forming history on crash simulation of a vehicle
NASA Astrophysics Data System (ADS)
Gökler, M. İ.; Doğan, U. Ç.; Darendeliler, H.
2016-08-01
The effects of forming history on the crash simulation of a vehicle have been investigated by considering the load paths produced by the sheet metal forming process. The frontal crash analysis was first performed by the finite element method without considering the forming history, to identify the load paths that absorb the most energy. Sheet metal forming simulations were then carried out for each structural component of these load paths, and the frontal crash analysis was repeated including the forming history. The results of the simulations with and without forming effects have been compared with the physical crash test results available in the literature.
Strategy and gaps for modeling, simulation, and control of hybrid systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rabiti, Cristian; Garcia, Humberto E.; Hovsapian, Rob
2015-04-01
The purpose of this report is to establish a strategy for modeling and simulation of candidate hybrid energy systems. Modeling and simulation are necessary to design, evaluate, and optimize the system's technical and economic performance. Accordingly, this report first establishes the simulation requirements for analyzing candidate hybrid systems. Simulation fidelity levels are established based on the temporal scale, real and synthetic data availability or needs, solution accuracy, and output parameters needed to evaluate case-specific figures of merit. The associated computational and co-simulation resources needed are then established, including physical models when needed, code assembly and integrated solution platforms, mathematical solvers, and data processing. The report describes the figures of merit, system requirements, and constraints that are necessary and sufficient to characterize the behavior of grid and hybrid systems and their market interactions. Loss of Load Probability (LOLP) and the Effective Cost of Energy (ECE), as opposed to the standard Levelized Cost of Electricity (LCOE), are introduced as technical and economic indices for integrated energy system evaluations. Financial assessment methods are subsequently introduced for evaluation of non-traditional, hybrid energy systems. Algorithms for coupled and iterative evaluation of the technical and economic performance are then discussed. This report further defines modeling objectives, computational tools, solution approaches, and real-time data collection and processing (in some cases using real test units) that will be required to model, co-simulate, and optimize: (a) energy system components (e.g., power generation unit, chemical process, electricity management unit), (b) system domains (e.g., thermal, electrical or chemical energy generation, conversion, and transport), and (c) systems control modules.
Co-simulation of complex, tightly coupled, dynamic energy systems requires multiple simulation tools, potentially developed in several programming languages and resolved on separate time scales. Whereas further investigation and development of hybrid concepts will provide a more complete understanding of the joint computational and physical modeling needs, this report highlights areas in which co-simulation capabilities are warranted. The current development status, quality assurance, availability and maintainability of simulation tools that are currently available for hybrid systems modeling is presented. Existing gaps in the modeling and simulation toolsets and development needs are subsequently discussed. This effort will feed into a broader Roadmap activity for designing, developing, and demonstrating hybrid energy systems.
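The two evaluation indices named in this report, LOLP and a levelized cost, have standard textbook forms that can be sketched directly. The numbers below are illustrative, and the report's ECE metric itself is not reproduced here; the sketch shows the conventional LCOE discounting formula and an empirical LOLP count.

```python
# Hedged sketch of two hybrid-system evaluation indices (illustrative data).

def lcoe(capital, annual_opex, annual_energy_mwh, years, rate):
    """Levelized cost of energy: discounted lifetime costs over
    discounted lifetime energy."""
    disc = [(1.0 + rate) ** -t for t in range(1, years + 1)]
    costs = capital + annual_opex * sum(disc)
    energy = annual_energy_mwh * sum(disc)
    return costs / energy  # $/MWh

def lolp(demand_mw, supply_mw):
    """Loss of load probability: fraction of periods where demand
    exceeds available supply."""
    shortfalls = sum(1 for d, s in zip(demand_mw, supply_mw) if d > s)
    return shortfalls / len(demand_mw)

price = lcoe(capital=1.2e6, annual_opex=5.0e4,
             annual_energy_mwh=4000.0, years=20, rate=0.07)
risk = lolp([90, 110, 95, 120], [100, 100, 100, 100])
```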
Simulation of Soil Frost and Thaw Fronts Dynamics with Community Land Model 4.5
NASA Astrophysics Data System (ADS)
Gao, J.; Xie, Z.
2016-12-01
Freeze-thaw processes in soils, including changes in frost and thaw fronts (FTFs), are important physical processes. The movement of FTFs affects soil water and thermal characteristics, as well as energy and water exchanges between the land surface and the atmosphere, and thus land surface hydrothermal processes. In this study, a two-directional freeze and thaw algorithm for simulating FTFs is incorporated into the community land surface model CLM4.5; the resulting model is called CLM4.5-FTF. The FTF depths and soil temperatures simulated by CLM4.5-FTF compare well with observed data at both the D66 station (permafrost) and the Hulugou station (seasonally frozen soil). Because the soil temperature profile within a soil layer can be estimated from the position of the FTFs, CLM4.5-FTF performs better in soil temperature simulation. Permafrost and seasonally frozen ground conditions in China from 1980 to 2010 were simulated using CLM4.5-FTF. Numerical experiments show that the spatial distribution of the simulated maximum frost depth has obvious seasonal variation. Significant positive active-layer depth trends in permafrost regions and negative maximum freezing depth trends in seasonally frozen soil regions are simulated in response to positive air temperature trends, except west of the Black Sea.
Simulation of the West African monsoon onset using the HadGEM3-RA regional climate model
NASA Astrophysics Data System (ADS)
Diallo, Ismaïla; Bain, Caroline L.; Gaye, Amadou T.; Moufouma-Okia, Wilfran; Niang, Coumba; Dieng, Mame D. B.; Graham, Richard
2014-08-01
The performance of the Hadley Centre Global Environmental Model version 3 regional climate model (HadGEM3-RA) in simulating the West African monsoon (WAM) is investigated. We focus on performance for monsoon onset timing and for rainfall totals over the June-July-August (JJA) season and on the model's representation of the underlying dynamical processes. Experiments are driven by the ERA-Interim reanalysis and follow the CORDEX experimental protocol. Simulations with the HadGEM3 global model, which shares a common physical formulation with HadGEM3-RA, are used to gain insight into the causes of HadGEM3-RA simulation errors. It is found that HadGEM3-RA simulations of monsoon onset timing are realistic, with an error in mean onset date of two pentads. However, the model has a dry bias over the Sahel during JJA of 15-20%. Analysis suggests that this is related to errors in the positioning of the Saharan heat low, which is too far south in HadGEM3-RA and associated with an insufficient northward reach of the south-westerly low-level monsoon flow and weaker moisture convergence over the Sahel. Despite these biases HadGEM3-RA's representation of the general rainfall distribution during the WAM appears superior to that of ERA-Interim when using Global Precipitation Climatology Project or Tropical Rainfall Measuring Mission (TRMM) data as reference. This suggests that the associated dynamical features seen in HadGEM3-RA can complement the physical picture available from ERA-Interim. This approach is supported by the fact that the global HadGEM3 model generates realistic simulations of the WAM without the benefit of pseudo-observational forcing at the lateral boundaries, suggesting that the physical formulation shared with HadGEM3-RA is able to represent the driving processes.
HadGEM3-RA simulations confirm previous findings that the main rainfall peak near 10°N during June-August is maintained by a region of mid-tropospheric ascent located, latitudinally, between the cores of the African Easterly Jet and Tropical Easterly Jet that intensifies around the time of onset. This region of ascent is weaker and located further south near 5°N in the driving ERA-Interim reanalysis, for reasons that may be related to the coarser resolution or the physics of the underlying model, and this is consistent with a less realistic latitudinal rainfall profile than found in the HadGEM3-RA simulations.
Weinreich, André; Funcke, Jakob Maria
2014-01-01
Drawing on recent findings, this study examines whether valence concordant electromyography (EMG) responses can be explained as an unconditional effect of mere stimulus processing or as somatosensory simulation driven by task-dependent processing strategies. While facial EMG over the Corrugator supercilii and the Zygomaticus major was measured, each participant performed two tasks with pictures of album covers. One task was an affective evaluation task and the other was to attribute the album covers to one of five decades. The Embodied Emotion Account predicts that valence concordant EMG is more likely to occur if the task necessitates a somatosensory simulation of the evaluative meaning of stimuli. Results support this prediction with regard to Corrugator supercilii in that valence concordant EMG activity was only present in the affective evaluation task but not in the non-evaluative task. Results for the Zygomaticus major were ambiguous. Our findings are in line with the view that EMG activity is an embodied part of the evaluation process and not a mere physical outcome.
NASA Astrophysics Data System (ADS)
Sun, Guodong; Mu, Mu
2017-05-01
An important source of uncertainty in numerical simulations resides in the parameters used to describe physical processes in numerical models. Therefore, identifying a subset of the relatively more sensitive and important parameters among the numerous physical parameters in atmospheric and oceanic models, and reducing the errors in that subset, would be a far more efficient way to reduce the uncertainties involved in simulations. In this context, we present a new approach based on the conditional nonlinear optimal perturbation related to parameter (CNOP-P) method. The approach provides a framework for ascertaining the subset of relatively more sensitive and important physical parameters. The Lund-Potsdam-Jena (LPJ) dynamic global vegetation model was used to test the validity of the new approach over China. The results imply that nonlinear interactions among parameters play a key role in the identification of sensitive parameters in arid and semi-arid regions of China, compared to northern, northeastern, and southern China. The uncertainties in the numerical simulations were reduced considerably by reducing the errors in this subset of relatively more sensitive and important parameters. The results demonstrate that our approach not only offers a new route to identifying relatively more sensitive and important physical parameters but also makes it viable to apply "target observations" to reduce the uncertainties in model parameters.
New Features in the Computational Infrastructure for Nuclear Astrophysics
NASA Astrophysics Data System (ADS)
Smith, M. S.; Lingerfelt, E. J.; Scott, J. P.; Hix, W. R.; Nesaraja, C. D.; Koura, H.; Roberts, L. F.
2006-04-01
The Computational Infrastructure for Nuclear Astrophysics is a suite of computer codes online at nucastrodata.org that streamlines the incorporation of recent nuclear physics results into astrophysical simulations. The freely-available, cross-platform suite enables users to upload cross sections and S-factors, convert them into reaction rates, parameterize the rates, store the rates in customizable libraries, set up and run custom post-processing element synthesis calculations, and visualize the results. New features include the ability for users to comment on rates or libraries using an email-type interface, a nuclear mass model evaluator, enhanced techniques for rate parameterization, better treatment of rate inverses, and creation and exporting of custom animations of simulation results. We also have online animations of r-process, rp-process, and neutrino-p process element synthesis occurring in stellar explosions.
Investigation of the DSMC Approach for Ion/neutral Species in Modeling Low Pressure Plasma Reactor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deng Hao; Li, Z.; Levin, D.
2011-05-20
Low pressure plasma reactors are important tools for ionized metal physical vapor deposition (IMPVD), a semiconductor plasma processing technology that is increasingly being applied to deposit Cu seed layers on semiconductor surfaces of trenches and vias with high aspect ratios (e.g., >5:1). A large fraction of ionized atoms produced by the IMPVD process leads to an anisotropic deposition flux towards the substrate, a feature which is critical for attaining a void-free and uniform fill. Modeling such devices is challenging due to their high plasma density and reactive environment but low gas pressure. A modular code developed by the Computational Optical and Discharge Physics Group, the Hybrid Plasma Equipment Model (HPEM), has been successfully applied to the numerical investigation of IMPVD by modeling a hollow cathode magnetron (HCM) device. However, as the development of semiconductor devices progresses towards the lower pressure regime (e.g., <5 mTorr), the breakdown of the continuum assumption limits the application of the fluid model in HPEM and suggests the incorporation of a kinetic method, such as direct simulation Monte Carlo (DSMC), in the plasma simulation. The DSMC method, which solves the Boltzmann transport equation, has been successfully applied in modeling micro-fluidic flows in MEMS devices with low Reynolds numbers, a feature shared with the HCM. Models of the basic physical and chemical processes for ion/neutral species in plasma have been developed and implemented in DSMC, including ion motion due to the Lorentz force, electron impact reactions, charge exchange reactions, and charge recombination at the surface. The heating of neutrals due to collisions with ions and the heating of ions due to the electrostatic field will be shown to be captured by the DSMC simulations. In this work, DSMC calculations were coupled with the modules from HPEM so that the plasma can be solved self-consistently.
Differences in the Ar results, the dominant species in the reactor, produced by the DSMC-HPEM coupled simulation will be shown in comparison with the original HPEM results. The effects of the DSMC calculations for ion/neutral species on HPEM plasma simulation will be further analyzed.
Hands-on-Entropy, Energy Balance with Biological Relevance
NASA Astrophysics Data System (ADS)
Reeves, Mark
2015-03-01
Entropy changes underlie the physics that dominates biological interactions. Indeed, introductory biology courses often begin with an exploration of the qualities of water that are important to living systems. However, one idea that is not explicitly addressed in most introductory physics or biology textbooks is the important contribution of entropy in driving fundamental biological processes towards equilibrium. From diffusion to cell-membrane formation, to electrostatic binding in protein folding, to the functioning of nerve cells, entropic effects often act to counterbalance deterministic forces such as electrostatic attraction and, in so doing, allow for effective molecular signaling. A small group of biology, biophysics and computer science faculty have worked together for the past five years to develop curricular modules (based on SCALE-UP pedagogy). This has enabled students to create models of stochastic and deterministic processes. Our students are first-year engineering and science students in the calculus-based physics course, and they are not expected to know biology beyond the high-school level. In our class, they learn to reduce complex biological processes and structures in order to model them mathematically, accounting for both deterministic and probabilistic processes. The students test these models in simulations and in laboratory experiments that are biologically relevant, such as diffusion, ionic transport, and ligand-receptor binding. Moreover, the students confront random forces and traditional forces in problems, simulations, and laboratory exploration throughout the year-long course as they move from traditional kinematics through thermodynamics to electrostatic interactions.
This talk will present a number of these exercises, with particular focus on the hands-on experiments done by the students, and will give examples of the tangible material that our students work with throughout the two-semester sequence of their course on introductory physics with a bio focus. Supported by NSF DUE.
Ocean Carbon States: Data Mining in Observations and Numerical Simulations Results
NASA Astrophysics Data System (ADS)
Latto, R.; Romanou, A.
2017-12-01
Advanced data mining techniques are rapidly becoming widely used in Climate and Earth Sciences with the purpose of extracting new meaningful information from increasingly larger and more complex datasets. This is particularly important in studies of the global carbon cycle, where any lack of understanding of its combined physical and biogeochemical drivers is detrimental to our ability to accurately describe, understand, and predict CO2 concentrations and their changes in the major carbon reservoirs. The analysis presented here evaluates the use of cluster analysis as a means of identifying and comparing spatial and temporal patterns extracted from observational and model datasets. As the observational data are organized into various regimes, which we will call "ocean carbon states", we gain insight into the physical and/or biogeochemical processes controlling the ocean carbon cycle, as well as how well these processes are simulated by a state-of-the-art climate model. We find that cluster analysis effectively produces realistic, dynamic regimes that can be associated with specific processes at different temporal scales for both observations and the model. In addition, we show how these regimes can be used to illustrate and characterize model biases in the simulated air-sea flux of CO2. These biases are attributed to biases in salinity, sea surface temperature, wind speed, and nitrate, which are then used to identify the physical processes that are inaccurately reproduced by the model. In this presentation, we provide a proof-of-concept application using simple datasets, and we expand to more complex ones, using several physical and biogeochemical variable pairs, thus providing considerable insight into the mechanisms and phases of the ocean carbon cycle over different temporal and spatial scales.
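The regime-extraction step can be illustrated with a minimal k-means clustering of one paired physical/biogeochemical variable. The variable pair (SST versus surface pCO2), the synthetic data, and the deterministic seeding below are assumptions for the example, not the study's actual method or data.

```python
# Minimal k-means sketch of the "ocean carbon states" idea (toy data).

def kmeans(points, k, iters=20):
    # deterministic seeding: initial centers evenly spaced through the data
    centers = [points[i * (len(points) - 1) // (k - 1)] for i in range(k)]
    groups = [[] for _ in range(k)]
    for _ in range(iters):
        # assignment step: each point joins its nearest center
        groups = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda j: sum((a - b) ** 2 for a, b in zip(p, centers[j])))
            groups[j].append(p)
        # update step: centers move to the mean of their group
        centers = [tuple(sum(v) / len(g) for v in zip(*g)) if g else centers[j]
                   for j, g in enumerate(groups)]
    return centers, groups

# two synthetic regimes: cold/low-pCO2 vs. warm/high-pCO2 grid cells
cold = [(2.0 + 0.1 * i, 320.0 + i) for i in range(10)]   # (SST C, pCO2 uatm)
warm = [(25.0 + 0.1 * i, 400.0 + i) for i in range(10)]
centers, groups = kmeans(cold + warm, k=2)
```

Comparing the cluster memberships obtained from observations against those from a model, over the same grid cells, is one simple way to localize the kind of flux biases the abstract describes.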
NASA Astrophysics Data System (ADS)
Grenier, P.
2017-12-01
Statistical post-processing techniques aim at generating plausible climate scenarios from climate simulations and observation-based reference products. These techniques are generally not physically-based, and consequently they remedy the problem of simulation biases at the risk of generating physical inconsistency (PI). Although this concern is often emphasized, it is rarely addressed quantitatively. Here, PI generated by quantile mapping (QM), a technique widely used in climatological and hydrological applications, is investigated using relative humidity (RH) and its parent variables, namely specific humidity (SH), temperature and pressure. PI is classified into two types: 1) inadequate value for an individual variable (e.g. RH > 100 %), and 2) breaking of an inter-variable relationship. Scenarios built for this study correspond to twelve sites representing a variety of climate types over North America. Data used are an ensemble of ten 3-hourly global (CMIP5) and regional (CORDEX-NAM) simulations, as well as the CFSR reanalysis. PI of type 1 is discussed in terms of frequency of occurrence and amplitude of unphysical cases for RH and SH variables. PI of type 2 is investigated with heuristic proxies designed to directly compare the physical inconsistency problem with the initial bias problem. Finally, recommendations are provided for an appropriate use of QM given the potential to generate physical inconsistency of types 1 and 2.
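Empirical quantile mapping itself has a compact form: each simulated value is replaced by the observed value at the same empirical quantile. The sketch below (invented data, a crude index-based quantile lookup rather than the fitted or interpolated quantile functions used in practice) applies QM to one variable in isolation, which is exactly the per-variable treatment that can remove a bias while creating the type-2 inter-variable inconsistencies discussed above.

```python
# Minimal empirical quantile-mapping sketch (toy data).
from bisect import bisect_right

def quantile_map(value, sim_sorted, obs_sorted):
    # empirical quantile of `value` within the simulated distribution
    q = bisect_right(sim_sorted, value) / len(sim_sorted)
    # value at the corresponding quantile of the observed distribution
    idx = min(len(obs_sorted) - 1, max(0, int(q * len(obs_sorted)) - 1))
    return obs_sorted[idx]

sim = sorted([10.0, 12.0, 14.0, 16.0, 18.0])  # biased-warm model values
obs = sorted([8.0, 10.0, 12.0, 14.0, 16.0])   # reference values
corrected = [quantile_map(v, sim, obs) for v in sim]
```

Here the simulated values carry a uniform +2 bias, and mapping them through their quantiles recovers the reference distribution; applied independently to RH, SH, temperature and pressure, nothing constrains the corrected values to remain mutually consistent.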
Pressure profiles of the BRing based on the simulation used in the CSRm
NASA Astrophysics Data System (ADS)
Wang, J. C.; Li, P.; Yang, J. C.; Yuan, Y. J.; Wu, B.; Chai, Z.; Luo, C.; Dong, Z. Q.; Zheng, W. H.; Zhao, H.; Ruan, S.; Wang, G.; Liu, J.; Chen, X.; Wang, K. D.; Qin, Z. M.; Yin, B.
2017-07-01
HIAF-BRing, a new multipurpose accelerator facility of the High Intensity heavy-ion Accelerator Facility project, requires an extremely high vacuum, lower than 10⁻¹¹ mbar, to fulfill the requirements of radioactive beam physics and high energy density physics. To achieve the required process pressure, the benchmarked codes VAKTRAK and Molflow+ are used to simulate the pressure profiles of the BRing system. In order to ensure the accuracy of the implementation of VAKTRAK, the computational results are verified by measured pressure data and compared with a new simulation code, BOLIDE, on the current synchrotron CSRm. With the verification of VAKTRAK done, the pressure profiles of the BRing are calculated with different parameters such as conductance, outgassing rates and pumping speeds. According to the computational results, the optimal parameters are selected to achieve the required pressure for the BRing.
NASA Astrophysics Data System (ADS)
Yudin, M. S.
2017-11-01
In the present paper, stratification effects on surface pressure in the propagation of an atmospheric gravity current (cold front) over flat terrain are estimated with a non-hydrostatic finite-difference model of atmospheric dynamics. Artificial compressibility is introduced into the model in order to make its equations hyperbolic. For comparison with available simulation data, the physical processes under study are assumed to be adiabatic. The influence of orography is also eliminated. The front surface is explicitly described by a special equation. A time filter is used to suppress the non-physical oscillations. The results of simulations of surface pressure under neutral and stable stratification are presented. Under stable stratification the front moves faster and shows an abrupt pressure jump at the point of observation. This fact is in accordance with observations and the present-day theory of atmospheric fronts.
GBS: Global 3D simulation of tokamak edge region
NASA Astrophysics Data System (ADS)
Zhu, Ben; Fisher, Dustin; Rogers, Barrett; Ricci, Paolo
2012-10-01
A 3D two-fluid global code, the Global Braginskii Solver (GBS), is being developed to explore the physics of turbulent transport, confinement, self-consistent profile formation, pedestal scaling and related phenomena in the edge region of tokamaks. Aimed at solving drift-reduced Braginskii equations [1] in complex magnetic geometry, GBS is used for turbulence simulation in the SOL region. In the recent upgrade, the simulation domain was expanded into the closed-flux region with twist-shift boundary conditions. Hence, the new GBS code is able to explore global transport physics in an annular full-torus domain from the top of the pedestal into the far SOL. We are in the process of identifying and analyzing the linear and nonlinear instabilities in the system using the new GBS code. Preliminary results will be presented and compared with other codes where possible.[4pt] [1] A. Zeiler, J. F. Drake and B. Rogers, Phys. Plasmas 4, 2134 (1997)
Internet Based Simulations of Debris Dispersion of Shuttle Launch
NASA Technical Reports Server (NTRS)
Bardina, Jorge; Thirumalainambi, Rajkumar
2004-01-01
Because the debris dispersion model is heterogeneous and interrelated with various factors, 3D graphics combined with physical models are useful in understanding the complexity of launch and range operations. Modeling and simulation in this area mainly focuses on orbital dynamics and range safety concepts, including destruct limits, telemetry and tracking, and population risk. Particle explosion modeling is the process of simulating an explosion by breaking the rocket into many pieces. The particles are tracked through their motion using the laws of physics until they eventually come to rest. The size of the footprint indicates the type of explosion and the distribution of the particles. The shuttle launch and range operations in this paper are discussed based on the operations of the Kennedy Space Center, Florida, USA. Java 3D graphics provides geometric and visual content with suitable modeling behaviors of Shuttle launches.
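The particle-footprint idea can be illustrated with a toy drag-free ballistic model: fragments receive random velocities, follow projectile arcs, and the footprint is summarized by their landing radii. All fragment counts, speeds, and distributions below are invented, and a real dispersion model includes effects (drag, wind, staged breakup) not shown here.

```python
# Toy debris-footprint simulation (drag-free ballistics, invented parameters).
import math
import random

G = 9.81  # gravitational acceleration, m/s^2

def landing_distance(speed, elevation_rad):
    """Range of a drag-free projectile launched from ground level."""
    return speed ** 2 * math.sin(2.0 * elevation_rad) / G

def footprint(n_fragments, max_speed, seed=42):
    rng = random.Random(seed)
    radii = []
    for _ in range(n_fragments):
        speed = rng.uniform(0.0, max_speed)            # fragment speed, m/s
        elev = rng.uniform(0.0, math.pi / 2.0)         # launch elevation
        radii.append(landing_distance(speed, elev))
    return max(radii), sum(radii) / len(radii)

max_r, mean_r = footprint(n_fragments=1000, max_speed=100.0)
```

Even this sketch reproduces the qualitative point in the abstract: the footprint size (max_r versus mean_r) reflects the energy and spread of the breakup.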
gpuSPHASE: A shared memory caching implementation for 2D SPH using CUDA
NASA Astrophysics Data System (ADS)
Winkler, Daniel; Meister, Michael; Rezavand, Massoud; Rauch, Wolfgang
2017-04-01
Smoothed particle hydrodynamics (SPH) is a meshless Lagrangian method that has been successfully applied to computational fluid dynamics (CFD), solid mechanics and many other multi-physics problems. Using the method to solve transport phenomena in process engineering requires the simulation of several days to weeks of physical time. Given the high computational demand of CFD, such simulations in 3D would require computation times of years, so a reduction to a 2D domain is inevitable. In this paper gpuSPHASE, a new open-source 2D SPH solver implementation for graphics devices, is developed. It is optimized for simulations that must be executed with thousands of frames per second to be computed in reasonable time. A novel caching algorithm for Compute Unified Device Architecture (CUDA) shared memory is proposed and implemented. The software is validated and the performance is evaluated for the well-established dam-break test case.
Employment of adaptive learning techniques for the discrimination of acoustic emissions
NASA Astrophysics Data System (ADS)
Erkes, J. W.; McDonald, J. F.; Scarton, H. A.; Tam, K. C.; Kraft, R. P.
1983-11-01
The following aspects of this study on the discrimination of acoustic emissions (AE) were examined: (1) The analytical development and assessment of digital signal processing techniques for AE signal dereverberation, noise reduction, and source characterization; (2) The modeling and verification of some aspects of key selected techniques through a computer-based simulation; and (3) The study of signal propagation physics and their effect on received signal characteristics for relevant physical situations.
Increasing complexity with quantum physics.
Anders, Janet; Wiesner, Karoline
2011-09-01
We argue that complex systems science and the rules of quantum physics are intricately related. We discuss a range of quantum phenomena, such as cryptography, computation and quantum phases, and the rules responsible for their complexity. We identify correlations as a central concept connecting quantum information and complex systems science. We present two examples for the power of correlations: using quantum resources to simulate the correlations of a stochastic process and to implement a classically impossible computational task.
[Several indicators of tissue oxygen during modeling of extravehicular activity of man].
Lan'shina, O E; Loginov, V A; Akinfiev, A V; Kovalenko, E A
1995-01-01
Investigations of tissue oxygen indices during simulated extravehicular activity (EVA) of cosmonauts demonstrated that breathing pure oxygen at approximately 280 mmHg elevates oxygen tension in capillary blood and increases the capillary-tissue gradient during physical work. Physical work alone stimulates tissue oxygenation, apparently through intensification of oxidative phosphorylation. The observed shifts in oxygen status reverse significantly within the first 5 min after completion of the experiment.
WE-H-BRA-04: Biological Geometries for the Monte Carlo Simulation Toolkit TOPAS-nBio
DOE Office of Scientific and Technical Information (OSTI.GOV)
McNamara, A; Held, K; Paganetti, H
2016-06-15
Purpose: New advances in radiation therapy are most likely to come from the complex interface of physics, chemistry and biology. Computational simulations offer a powerful tool for quantitatively investigating radiation interactions with biological tissue and can thus help bridge the gap between physics and biology. The aim of TOPAS-nBio is to provide a comprehensive tool to generate advanced radiobiology simulations. Methods: TOPAS wraps and extends the Geant4 Monte Carlo (MC) simulation toolkit. TOPAS-nBio is an extension to TOPAS which utilizes the physics processes in Geant4-DNA to model biological damage from very low energy secondary electrons. Specialized cell, organelle and molecular geometries were designed for the toolkit. Results: TOPAS-nBio gives the user the capability of simulating biological geometries, ranging from the micron scale (e.g. cells and organelles) to complex nano-scale geometries (e.g. DNA and proteins). The user interacts with TOPAS-nBio through easy-to-use input parameter files. For example, in a simple cell simulation the user can specify the cell type and size as well as the type, number and size of included organelles. For more detailed nuclear simulations, the user can specify chromosome territories containing chromatin fiber loops, the latter comprising nucleosomes on a double helix. The chromatin fibers can be arranged in simple rigid geometries or within fractal globules, mimicking realistic chromosome territories. TOPAS-nBio also provides users with the capability of reading Protein Data Bank 3D structural files to simulate radiation damage to proteins or nucleic acids, e.g. histones or RNA. TOPAS-nBio has been validated by comparing results to other track-structure simulation software and published experimental measurements.
Conclusion: TOPAS-nBio provides users with a comprehensive MC simulation tool for radiobiological simulations, giving users without advanced programming skills the ability to design and run complex simulations.
Physical Model for the Evolution of the Genetic Code
NASA Astrophysics Data System (ADS)
Yamashita, Tatsuro; Narikiyo, Osamu
2011-12-01
Using the shape space of codons and tRNAs, we give a consistent physical description of genetic code evolution on the basis of the codon capture and ambiguous intermediate scenarios. In the lowest-dimensional version of our description, a physical quantity, the codon level, is introduced. In terms of codon levels, the two scenarios are classified into two different routes of the evolutionary process. For the ambiguous intermediate scenario, we perform an evolutionary simulation that implements cost-based selection of amino acids and confirm a rapid transition of the code change. Such rapidity mitigates the non-unique translation of the code at the intermediate state, which is the weakness of the scenario. For the codon capture scenario, survival against mutations under mutational pressure that minimizes the GC content of genomes is simulated, and it is demonstrated that cells which experience only neutral mutations survive.
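The mutational pressure in the codon capture scenario can be caricatured by a toy model in which G/C bases mutate to A/T more readily than the reverse, driving genomic GC content down toward an equilibrium set by the two rates. The rates and genome representation below are illustrative assumptions, not the paper's actual model:

```python
import random

AT, GC = "AT", "GC"

def mutate(genome, p_gc_to_at=0.1, p_at_to_gc=0.01, rng=None):
    """One generation of biased point mutation: GC -> AT at a higher
    rate than AT -> GC, modeling AT-ward mutational pressure."""
    rng = rng or random
    out = []
    for base in genome:
        if base in GC and rng.random() < p_gc_to_at:
            out.append(rng.choice(AT))
        elif base in AT and rng.random() < p_at_to_gc:
            out.append(rng.choice(GC))
        else:
            out.append(base)
    return "".join(out)

def gc_content(genome):
    """Fraction of G or C bases in the genome string."""
    return sum(b in GC for b in genome) / len(genome)
```

With these rates the expected equilibrium GC content is p_at_to_gc / (p_at_to_gc + p_gc_to_at), about 9%, so a genome starting near 50% GC drops well below 30% within roughly a hundred generations.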
A new physics-based modeling approach for tsunami-ionosphere coupling
NASA Astrophysics Data System (ADS)
Meng, X.; Komjathy, A.; Verkhoglyadova, O. P.; Yang, Y.-M.; Deng, Y.; Mannucci, A. J.
2015-06-01
Tsunamis can generate gravity waves propagating upward through the atmosphere, inducing total electron content (TEC) disturbances in the ionosphere. To capture this process, we have implemented tsunami-generated gravity waves into the Global Ionosphere-Thermosphere Model (GITM) to construct a three-dimensional physics-based model, WP (Wave Perturbation)-GITM. WP-GITM takes tsunami wave properties, including the wave height, wave period, wavelength, and propagation direction, as inputs and characterizes the time-dependent response of the upper atmosphere between 100 km and 600 km altitude. We apply WP-GITM to simulate the ionosphere above the West Coast of the United States around the time when the tsunami associated with the March 2011 Tohoku-Oki earthquake arrived. The simulated TEC perturbations agree with Global Positioning System observations reasonably well. For the first time, a fully self-consistent and physics-based model has reproduced the GPS-observed traveling ionospheric signatures of an actual tsunami event.
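The TEC quantity against which such a model is compared is the integral of electron density along a path, conventionally reported in TEC units (1 TECU = 10^16 electrons/m^2). A minimal vertical-TEC sketch using trapezoidal integration over an assumed density profile (not part of WP-GITM itself) looks like this:

```python
import numpy as np

def tec_from_profile(ne, z):
    """Vertical total electron content in TECU from an electron-density
    profile ne(z) [electrons/m^3] sampled at increasing altitudes z [m]."""
    # Trapezoidal rule: sum of mean density per layer times layer thickness.
    integral = np.sum(0.5 * (ne[1:] + ne[:-1]) * np.diff(z))
    return integral / 1e16  # 1 TECU = 1e16 electrons/m^2
```

For example, a uniform density of 10^12 electrons/m^3 over a 100 km column integrates to 10 TECU; tsunami-driven perturbations observed by GPS are typically a small fraction of the background TEC.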
NASA Astrophysics Data System (ADS)
Dugger, A. L.; Rafieeinasab, A.; Gochis, D.; Yu, W.; McCreight, J. L.; Karsten, L. R.; Pan, L.; Zhang, Y.; Sampson, K. M.; Cosgrove, B.
2016-12-01
Evaluation of physically based hydrologic models applied across large regions can provide insight into dominant controls on runoff generation and how these controls vary with climatic, biological, and geophysical setting. To make this leap, however, we need to combine knowledge of regional forcing skill, model parameter and physics assumptions, and hydrologic theory. If we can successfully do this, we also gain information on how well our current approximations of these dominant physical processes are represented in continental-scale models. In this study, we apply this diagnostic approach to a 5-year retrospective implementation of the WRF-Hydro community model configured for the U.S. National Weather Service's National Water Model (NWM). The NWM is a water prediction model in operation over the contiguous U.S. as of summer 2016, providing real-time streamflow estimates and forecasts out to 30 days across 2.7 million stream reaches, as well as distributed snowpack, soil moisture, and evapotranspiration at 1-km resolution. The WRF-Hydro system not only performs the standard simulation of vertical energy and water fluxes common in continental-scale models, but also augments these processes with lateral redistribution of surface and subsurface water, simple groundwater dynamics, and channel routing. We evaluate 5 years of NLDAS-2 precipitation forcing and WRF-Hydro simulated streamflow and evapotranspiration across the contiguous U.S. at a range of spatial (gage, basin, ecoregion) and temporal (hourly, daily, monthly) scales and look for consistencies and inconsistencies in performance in terms of bias, timing, and extremes. Leveraging results from other CONUS-scale hydrologic evaluation studies, we translate our performance metrics into a matrix of likely dominant process controls and error sources (forcings, parameter estimates, and model physics).
We test our hypotheses in a series of controlled model experiments on a subset of representative basins from distinct "problem" environments (Southeast U.S. Coastal Plain, Central and Coastal Texas, Northern Plains, and Arid Southwest). The results from these longer-term model diagnostics will inform future improvements in forcing bias correction, parameter calibration, and physics developments in the National Water Model.
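Performance metrics of the kind discussed above are conventionally computed per gage or basin. The two formulas below, percent bias and Nash-Sutcliffe efficiency, are standard hydrologic skill measures offered as an illustration, not the NWM team's exact evaluation code:

```python
import numpy as np

def percent_bias(sim, obs):
    """Percent bias: positive means the model overestimates total volume."""
    return 100.0 * np.sum(sim - obs) / np.sum(obs)

def nse(sim, obs):
    """Nash-Sutcliffe efficiency: 1 is a perfect match, 0 means the model
    is no better than predicting the observed mean, negative is worse."""
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - np.mean(obs)) ** 2)
```

In a streamflow evaluation, sim and obs would be paired hourly or daily discharge series at one gage; aggregating the scores by basin or ecoregion gives the spatial performance patterns described above.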