Enabling UAS Research at the NASA EAV Laboratory
NASA Technical Reports Server (NTRS)
Ippolito, Corey A.
2015-01-01
The Exploration Aerial Vehicles (EAV) Laboratory at NASA Ames Research Center leads research into intelligent autonomy and advanced control systems, bridging the gap between simulation and full-scale technology through flight test experimentation on unmanned sub-scale test vehicles.
The Analysis, Numerical Simulation, and Diagnosis of Extratropical Weather Systems
1999-09-30
The Analysis, Numerical Simulation, and Diagnosis of Extratropical Weather Systems Dr. Melvyn A. Shapiro NOAA/Environmental Technology Laboratory...formulation, and numerical prediction of the life cycles of synoptic-scale and mesoscale extratropical weather systems, including the influence of planetary...scale inter-annual and intra-seasonal variability on their evolution. These weather systems include: extratropical oceanic and land-falling cyclones
Note: Measurement system for the radiative forcing of greenhouse gases in a laboratory scale.
Kawamura, Yoshiyuki
2016-01-01
The radiative forcing of greenhouse gases has been studied on the basis of computational simulations or meteorological observation of the real atmosphere. To understand the greenhouse effect more deeply and to study it from various viewpoints, laboratory-scale study is important. We have developed a direct measurement system for the infrared back radiation from carbon dioxide (CO2) gas. The system configuration is similar to that of the practical earth-atmosphere-space system. Using this system, the back radiation from the CO2 gas was directly measured at laboratory scale; the measured value roughly coincides with the meteorologically predicted one.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Zhijie; Lai, Canhai; Marcy, Peter William
2017-05-01
A challenging problem in designing pilot-scale carbon capture systems is to predict, with uncertainty, the adsorber performance and capture efficiency under various operating conditions where no direct experimental data exist. Motivated by this challenge, we previously proposed a hierarchical framework in which relevant parameters of physical models were sequentially calibrated from different laboratory-scale carbon capture unit (C2U) experiments. Specifically, three models of increasing complexity were identified based on the fundamental physical and chemical processes of the sorbent-based carbon capture technology. Results from the corresponding laboratory experiments were used to statistically calibrate the physical model parameters while quantifying some of their inherent uncertainty. The parameter distributions obtained from laboratory-scale C2U calibration runs are used in this study to facilitate prediction at a larger scale where no corresponding experimental results are available. In this paper, we first describe the multiphase reactive flow model for a sorbent-based 1-MW carbon capture system and then analyze results from an ensemble of simulations with the upscaled model. The simulation results are used to quantify uncertainty regarding the design's predicted efficiency in carbon capture. In particular, we determine the minimum gas flow rate necessary to achieve 90% capture efficiency with 95% confidence.
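The closing confidence statement can be illustrated with a minimal Monte Carlo sketch. The efficiency model, the lognormal parameter spread, and all numbers below are illustrative assumptions standing in for the calibrated C2U posteriors, not values from the paper:

```python
import math
import random

def capture_efficiency(sorbent_flow, k):
    # Toy saturating model: efficiency rises with sorbent flow and a
    # calibrated rate parameter k (illustrative assumption only).
    return 1.0 - math.exp(-k * sorbent_flow)

def min_flow_with_confidence(k_samples, target=0.90, confidence=0.95, flows=None):
    # Smallest candidate flow at which at least `confidence` of the
    # sampled parameter sets still reach `target` capture efficiency.
    flows = flows or [f / 10 for f in range(1, 101)]  # 0.1 .. 10.0
    for f in flows:
        ok = sum(1 for k in k_samples if capture_efficiency(f, k) >= target)
        if ok / len(k_samples) >= confidence:
            return f
    return None

random.seed(0)
# Stand-in for a calibrated parameter posterior: lognormal spread around k = 1.
ks = [random.lognormvariate(0.0, 0.3) for _ in range(2000)]
f_min = min_flow_with_confidence(ks)
print(f_min)
```

The same pattern, with the real upscaled simulator in place of `capture_efficiency`, yields the "90% capture with 95% confidence" design statement.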
If You've Got It, Use It (Simulation, That Is...)
NASA Technical Reports Server (NTRS)
Frost, Chad; Tucker, George
2006-01-01
This viewgraph presentation reviews the Rotorcraft Aircrew Systems Concept Airborne Laboratory (RASCAL) UH-60 in-flight simulator, the use of simulation in support of safety monitor design specification development, the development of a failure/recovery (F/R) rating scale, the use of F/R Rating Scale as a common element between simulation and flight evaluation, and the expansion of the flight envelope without benefit of simulation.
(U) Status of Trinity and Crossroads Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Archer, Billy Joe; Lujan, James Westley; Hemmert, K. S.
2017-01-10
(U) This paper provides a general overview of current and future plans for the Advanced Simulation and Computing (ASC) Advanced Technology (AT) systems fielded by the New Mexico Alliance for Computing at Extreme Scale (ACES), a collaboration between Los Alamos National Laboratory and Sandia National Laboratories. Additionally, this paper touches on research into technology beyond traditional CMOS. The status of Trinity, ASC's first AT system, and of Crossroads, anticipated to succeed Trinity as the third AT system in 2020, will be presented, along with initial performance studies of the Intel Knights Landing Xeon Phi processors introduced on Trinity. The challenges and opportunities for our production simulation codes on AT systems will also be discussed. Trinity and Crossroads are a joint procurement by ACES and Lawrence Berkeley National Laboratory as part of the Alliance for application Performance at EXtreme scale (APEX), http://apex.lanl.gov.
Construction of the Propulsion Systems Laboratory No. 1 and 2
1951-01-21
Construction of the Propulsion Systems Laboratory No. 1 and 2 at the National Advisory Committee for Aeronautics (NACA) Lewis Flight Propulsion Laboratory. When it began operation in late 1952, the Propulsion Systems Laboratory was the NACA's most powerful facility for testing full-scale engines at simulated flight altitudes. The facility contained two altitude-simulating test chambers, a technological combination of the static sea-level test stands and the complex Altitude Wind Tunnel, which recreated actual flight conditions on a larger scale. NACA Lewis began designing the new facility in 1947 as part of a comprehensive plan to improve the altitude testing capabilities across the lab. The exhaust, refrigeration, and combustion air systems from all the major test facilities were linked so that different facilities could complement one another's capabilities. Propulsion Systems Laboratory construction began in late summer 1949 with the installation of an overhead exhaust pipe connecting the facility to the Altitude Wind Tunnel and Engine Research Building. The large test section pieces arrived in early 1951, when this photograph was taken. The two primary coolers for the altitude exhaust are in place within the framework near the center of the photograph.
Hydrodynamic Scalings: from Astrophysics to Laboratory
NASA Astrophysics Data System (ADS)
Ryutov, D. D.; Remington, B. A.
2000-05-01
A surprisingly general hydrodynamic similarity has recently been described in Refs. [1,2]. One can call it the Euler similarity because it works for the Euler equations (with MHD effects included). Although the dissipation processes are assumed to be negligible, the presence of shocks is allowed. For a polytropic medium (i.e., a medium where the energy density is proportional to the pressure), the evolution of an arbitrarily chosen 3D initial state can be scaled to another system if a single dimensionless parameter (the Euler number) is the same for both initial states. The Euler similarity allows one to properly design laboratory experiments modeling astrophysical phenomena. We discuss several examples of such experiments related to the physics of supernovae [3]. For problems with a single spatial scale, the condition of the smallness of dissipative processes can be adequately described in terms of the Reynolds, Peclet, and magnetic Reynolds numbers related to this scale (all three numbers must be large). However, if the system develops small-scale turbulence, dissipation may become important at these smaller scales, thereby affecting the gross behavior of the system. We analyze the corresponding constraints. We also discuss constraints imposed by the presence of interfaces between substances with different polytropic indices. Another set of similarities governs the evolution of photoevaporation fronts in astrophysics. Convenient scaling laws exist in situations where the density of the ablated material is very low compared to the bulk density. We conclude that a number of hydrodynamical problems related to such objects as the Eagle Nebula can be adequately simulated in the laboratory. We also discuss possible scalings for radiative astrophysical jets (see Ref. [3] and references therein). This work was performed under the auspices of the U.S. 
Department of Energy by University of California Lawrence Livermore National Laboratory under contract W-7405-Eng-48. 1. D.D. Ryutov, R.P. Drake, J. Kane, E. Liang, B. A. Remington, and W.M. Wood-Vasey. "Similarity criteria for the laboratory simulation of supernova hydrodynamics." Astrophysical Journal, v. 518, p. 821 (1999). 2. D.D. Ryutov, R.P. Drake, B.A. Remington. "Criteria for scaled laboratory simulations of astrophysical MHD phenomena." To appear in Astrophysical Journal - Supplement, April 2000. 3. Remington, B.A., Phys. Plasmas, 7, # 5 (2000).
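For orientation, the Euler similarity the abstract invokes can be summarized as follows (a paraphrase of Ref. [1], not a quotation): two ideal polytropic flows evolve identically under a rescaling of length, density, and pressure, provided time and velocity are rescaled accordingly, which leaves the Euler number invariant:

```latex
% Scaling transformation preserving the Euler similarity (after Ref. [1]);
% a, b, c are arbitrary positive scale factors.
r \rightarrow a\,r, \qquad \rho \rightarrow b\,\rho, \qquad p \rightarrow c\,p,
\qquad t \rightarrow a\sqrt{b/c}\;t, \qquad v \rightarrow \sqrt{c/b}\;v,
\qquad \mathrm{Eu} \equiv v\sqrt{\rho/p} = \text{invariant}.
```

Matching Eu between a supernova remnant and a centimeter-scale laser target is what licenses the laboratory experiments discussed above.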
Computational analysis of fluid dynamics in pharmaceutical freeze-drying.
Alexeenko, Alina A; Ganguly, Arnab; Nail, Steven L
2009-09-01
Analysis of water vapor flows encountered in pharmaceutical freeze-drying systems, laboratory-scale and industrial, is presented based on computational fluid dynamics (CFD) techniques. The flows under continuum gas conditions are analyzed using the solution of the Navier-Stokes equations, whereas the rarefied flow solutions are obtained by the direct simulation Monte Carlo (DSMC) method for the Boltzmann equation. Examples of the application of CFD techniques to laboratory-scale and industrial-scale freeze-drying processes are discussed with an emphasis on the utility of CFD for improving the design and experimental characterization of pharmaceutical freeze-drying hardware and processes. The current article presents a two-dimensional simulation of a laboratory-scale dryer, with an emphasis on the influence of drying conditions and hardware design on process control, and a three-dimensional simulation of an industrial dryer, including a comparison of the obtained results with analytical viscous flow solutions. It was found that the presence of clean-in-place (CIP)/sterilize-in-place (SIP) piping in the duct led to significant changes in the flow field characteristics. The simulation results for vapor flow rates in an industrial freeze-dryer have been compared to tunable diode laser absorption spectroscopy (TDLAS) and gravimetric measurements.
ERIC Educational Resources Information Center
Duarte, B. P. M.; Coelho Pinheiro, M. N.; Silva, D. C. M.; Moura, M. J.
2006-01-01
The experiment described is an excellent opportunity to apply theoretical concepts of distillation, thermodynamics of mixtures and process simulation at laboratory scale, and simultaneously enhance the ability of students to operate, control and monitor complex units.
EPOS-WP16: A Platform for European Multi-scale Laboratories
NASA Astrophysics Data System (ADS)
Spiers, Chris; Drury, Martyn; Kan-Parker, Mirjam; Lange, Otto; Willingshofer, Ernst; Funiciello, Francesca; Rosenau, Matthias; Scarlato, Piergiorgio; Sagnotti, Leonardo; W16 Participants
2016-04-01
The participant countries in EPOS embody a wide range of world-class laboratory infrastructures ranging from high temperature and pressure experimental facilities, to electron microscopy, micro-beam analysis, analogue modeling and paleomagnetic laboratories. Most data produced by the various laboratory centres and networks are presently available only in limited "final form" in publications. As such many data remain inaccessible and/or poorly preserved. However, the data produced at the participating laboratories are crucial to serving society's need for geo-resources exploration and for protection against geo-hazards. Indeed, to model resource formation and system behaviour during exploitation, we need an understanding from the molecular to the continental scale, based on experimental data. This contribution will describe the work plans that the laboratories community in Europe is making, in the context of EPOS. The main objectives are: - To collect and harmonize available and emerging laboratory data on the properties and processes controlling rock system behaviour at multiple scales, in order to generate products accessible and interoperable through services for supporting research activities. - To co-ordinate the development, integration and trans-national usage of the major solid Earth Science laboratory centres and specialist networks. The length scales encompassed by the infrastructures included range from the nano- and micrometer levels (electron microscopy and micro-beam analysis) to the scale of experiments on centimetre sized samples, and to analogue model experiments simulating the reservoir scale, the basin scale and the plate scale. - To provide products and services supporting research into Geo-resources and Geo-storage, Geo-hazards and Earth System Evolution.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palmintier, Bryan; Hale, Elaine; Hodge, Bri-Mathias
2016-08-11
This paper discusses the development of, approaches for, experiences with, and some results from a large-scale, high-performance-computer-based (HPC-based) co-simulation of electric power transmission and distribution systems using the Integrated Grid Modeling System (IGMS). IGMS was developed at the National Renewable Energy Laboratory (NREL) as a novel Independent System Operator (ISO)-to-appliance scale electric power system modeling platform that combines off-the-shelf tools to simultaneously model 100s to 1000s of distribution systems in co-simulation with detailed ISO markets, transmission power flows, and AGC-level reserve deployment. Lessons learned from the co-simulation architecture development are shared, along with a case study that explores the reactive power impacts of PV inverter voltage support on the bulk power system.
EFFECT OF VARYING FLOW REGIMES ON BIOFILM DENSITIES IN A DISTRIBUTION SYSTEM SIMULATOR
Maintenance of a free chlorine residual within water distribution systems is used to reduce the possibility of microbial contamination. However, it has been demonstrated that biofilms within water distribution systems can harbor coliforms. In laboratory scale studies, others have...
Multiscale Laboratory Infrastructure and Services to users: Plans within EPOS
NASA Astrophysics Data System (ADS)
Spiers, Chris; Willingshofer, Ernst; Drury, Martyn; Funiciello, Francesca; Rosenau, Matthias; Scarlato, Piergiorgio; Sagnotti, Leonardo; EPOS WG6, Corrado Cimarelli
2015-04-01
The participant countries in EPOS embody a wide range of world-class laboratory infrastructures ranging from high temperature and pressure experimental facilities, to electron microscopy, micro-beam analysis, analogue modeling and paleomagnetic laboratories. Most data produced by the various laboratory centres and networks are presently available only in limited "final form" in publications. Many data remain inaccessible and/or poorly preserved. However, the data produced at the participating laboratories are crucial to serving society's need for geo-resources exploration and for protection against geo-hazards. Indeed, to model resource formation and system behaviour during exploitation, we need an understanding from the molecular to the continental scale, based on experimental data. This contribution will describe the plans that the laboratories community in Europe is making, in the context of EPOS. The main objectives are: • To collect and harmonize available and emerging laboratory data on the properties and processes controlling rock system behaviour at multiple scales, in order to generate products accessible and interoperable through services for supporting research activities. • To co-ordinate the development, integration and trans-national usage of the major solid Earth Science laboratory centres and specialist networks. The length scales encompassed by the infrastructures included range from the nano- and micrometer levels (electron microscopy and micro-beam analysis) to the scale of experiments on centimetre sized samples, and to analogue model experiments simulating the reservoir scale, the basin scale and the plate scale. • To provide products and services supporting research into Geo-resources and Geo-storage, Geo-hazards and Earth System Evolution. If the EPOS Implementation Phase proposal presently under construction is successful, then a range of services and transnational activities will be put in place to realize these objectives.
Rathfelder, K M; Abriola, L M; Taylor, T P; Pennell, K D
2001-04-01
A numerical model of surfactant enhanced solubilization was developed and applied to the simulation of nonaqueous phase liquid recovery in two-dimensional heterogeneous laboratory sand tank systems. Model parameters were derived from independent, small-scale, batch and column experiments. These parameters included viscosity, density, solubilization capacity, surfactant sorption, interfacial tension, permeability, capillary retention functions, and interphase mass transfer correlations. Model predictive capability was assessed for the evaluation of the micellar solubilization of tetrachloroethylene (PCE) in the two-dimensional systems. Predicted effluent concentrations and mass recovery agreed reasonably well with measured values. Accurate prediction of enhanced solubilization behavior in the sand tanks was found to require the incorporation of pore-scale, system-dependent, interphase mass transfer limitations, including an explicit representation of specific interfacial contact area. Predicted effluent concentrations and mass recovery were also found to depend strongly upon the initial NAPL entrapment configuration. Numerical results collectively indicate that enhanced solubilization processes in heterogeneous, laboratory sand tank systems can be successfully simulated using independently measured soil parameters and column-measured mass transfer coefficients, provided that permeability and NAPL distributions are accurately known. This implies that the accuracy of model predictions at the field scale will be constrained by our ability to quantify soil heterogeneity and NAPL distribution.
DOE Office of Scientific and Technical Information (OSTI.GOV)
2018-01-23
Deploying an ADMS or looking to optimize its value? NREL offers a low-cost, low-risk evaluation platform for assessing ADMS performance. The National Renewable Energy Laboratory (NREL) has developed a vendor-neutral advanced distribution management system (ADMS) evaluation platform and is expanding its capabilities. The platform uses actual grid-scale hardware, large-scale distribution system models, and advanced visualization to simulate real-world conditions for the most accurate ADMS evaluation and experimentation.
Barth, Gilbert R.; Illangasekare, T.H.; Rajaram, H.
2003-01-01
This work considers the applicability of conservative tracers for detecting high-saturation nonaqueous-phase liquid (NAPL) entrapment in heterogeneous systems. For this purpose, a series of experiments and simulations was performed using a two-dimensional heterogeneous system (10??1.2 m), which represents an intermediate scale between laboratory and field scales. Tracer tests performed prior to injecting the NAPL provide the baseline response of the heterogeneous porous medium. Two NAPL spill experiments were performed and the entrapped-NAPL saturation distribution measured in detail using a gamma-ray attenuation system. Tracer tests following each of the NAPL spills produced breakthrough curves (BTCs) reflecting the impact of entrapped NAPL on conservative transport. To evaluate significance, the impact of NAPL entrapment on the conservative-tracer breakthrough curves was compared to simulated breakthrough curve variability for different realizations of the heterogeneous distribution. Analysis of the results reveals that the NAPL entrapment has a significant impact on the temporal moments of conservative-tracer breakthrough curves. ?? 2003 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Schirmer, Mario; Molson, John W.; Frind, Emil O.; Barker, James F.
2000-12-01
Biodegradation of organic contaminants in groundwater is a microscale process which is often observed on scales of 100s of metres or larger. Unfortunately, there are no known equivalent parameters for characterizing the biodegradation process at the macroscale as there are, for example, in the case of hydrodynamic dispersion. Zero- and first-order degradation rates estimated at the laboratory scale by model fitting generally overpredict the rate of biodegradation when applied to the field scale because limited electron acceptor availability and microbial growth are not considered. On the other hand, field-estimated zero- and first-order rates are often not suitable for predicting plume development because they may oversimplify or neglect several key field scale processes, phenomena and characteristics. This study uses the numerical model BIO3D to link the laboratory and field scales by applying laboratory-derived Monod kinetic degradation parameters to simulate a dissolved gasoline field experiment at the Canadian Forces Base (CFB) Borden. All input parameters were derived from independent laboratory and field measurements or taken from the literature a priori to the simulations. The simulated results match the experimental results reasonably well without model calibration. A sensitivity analysis on the most uncertain input parameters showed only a minor influence on the simulation results. Furthermore, it is shown that the flow field, the amount of electron acceptor (oxygen) available, and the Monod kinetic parameters have a significant influence on the simulated results. It is concluded that laboratory-derived Monod kinetic parameters can adequately describe field scale degradation, provided all controlling factors are incorporated in the field scale model. These factors include advective-dispersive transport of multiple contaminants and electron acceptors and large-scale spatial heterogeneities.
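The laboratory-to-field argument above hinges on Monod kinetics with electron-acceptor (oxygen) limitation. A minimal dual-Monod sketch follows; the rate law form is standard, but every parameter value is an illustrative assumption, not a BIO3D or Borden-site value:

```python
def dual_monod_step(c, o2, dt, k_max=1.0, ks=0.5, ko=0.1, y=3.0):
    """One explicit Euler step of dual-Monod contaminant degradation.

    c  : contaminant concentration (mg/L)
    o2 : dissolved-oxygen concentration (mg/L)
    y  : mass of O2 consumed per mass of contaminant degraded
    All parameter values are illustrative, not calibrated.
    """
    rate = k_max * (c / (ks + c)) * (o2 / (ko + o2))
    rate = min(rate, c / dt, o2 / (y * dt))  # never overshoot either pool
    return c - rate * dt, o2 - y * rate * dt

c, o2 = 10.0, 8.0          # initial contaminant and oxygen (mg/L)
for _ in range(1000):       # integrate to t = 10 with dt = 0.01
    c, o2 = dual_monod_step(c, o2, dt=0.01)
print(c, o2)
```

The run shows the behavior the abstract stresses: once oxygen is exhausted, degradation stalls at the stoichiometric limit (here c cannot drop below 10 - 8/y), which is exactly why zero- or first-order rates fitted in the laboratory overpredict field-scale biodegradation.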
Simulating Extraterrestrial Ices in the Laboratory
NASA Astrophysics Data System (ADS)
Berisford, D. F.; Carey, E. M.; Hand, K. P.; Choukroun, M.
2017-12-01
Several ongoing experiments at JPL attempt to simulate the ice environment for various regimes associated with icy moons. The Europa Penitent Ice Experiment (EPIX) simulates the surface environment of an icy moon, to investigate the physics of ice surface morphology growth. This experiment features half-meter-scale cryogenic ice samples, a cryogenic radiative sink environment, vacuum conditions, and diurnally cycled solar simulation. The experiment also includes several smaller fixed-geometry vacuum chambers for ice simulation at Earth-like and intermediate temperature and vacuum conditions, for development of surface morphology growth scaling relations. Additionally, an ice cutting facility built on a similar platform provides qualitative data on the mechanical behavior of cryogenic ice with impurities under vacuum, and allows testing of ice cutting/sampling tools relevant for landing spacecraft. A larger cutting facility is under construction at JPL, which will provide more quantitative data and allow full-scale sampling tool tests. Another facility, the JPL Ice Physics Laboratory, features icy-analog simulant preparation capabilities spanning icy solar system objects such as Mars, Ceres, and the icy satellites of Saturn and Jupiter. In addition, the Ice Physics Lab has unique facilities for Icy Analog Tidal Simulation and Rheological Studies of Cryogenic Icy Slurries, as well as equipment to perform thermal and mechanical properties testing on icy analog materials and their response to sinusoidal tidal stresses.
Innovative mathematical modeling in environmental remediation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yeh, Gour T.; National Central Univ.; Univ. of Central Florida
2013-05-01
There are two different ways to model reactive transport: ad hoc and innovative reaction-based approaches. The former, such as the Kd simplification of adsorption, has been widely employed by practitioners, while the latter has been used mainly in scientific communities for elucidating mechanisms of biogeochemical transport processes. It is believed that innovative mechanistic-based models could serve as protocols for environmental remediation as well. This paper reviews the development of a mechanistically coupled fluid flow, thermal transport, hydrologic transport, and reactive biogeochemical model and example applications to environmental remediation problems. Theoretical bases are sufficiently described. Four example problems previously carried out are used to demonstrate how numerical experimentation can be used to evaluate the feasibility of different remediation approaches. The first involved the application of a 56-species uranium tailing problem to the Melton Branch Subwatershed at Oak Ridge National Laboratory (ORNL) using the parallel version of the model. Simulations were made to demonstrate the potential mobilization of uranium and other chelating agents in the proposed waste disposal site. The second problem simulated a laboratory-scale system to investigate the role of natural attenuation in potential off-site migration of uranium from uranium mill tailings after restoration; it showed the inadequacy of using a single Kd even for a homogeneous medium. The third example simulated laboratory experiments involving extremely high concentrations of uranium, technetium, aluminum, nitrate, and toxic metals (e.g., Ni, Cr, Co). The fourth example modeled microbially-mediated immobilization of uranium in an unconfined aquifer using acetate amendment in a field-scale experiment.
The purposes of these modeling studies were to simulate various mechanisms of mobilization and immobilization of radioactive wastes and to illustrate how to apply reactive transport models for environmental remediation.
McDonough, J M
2009-06-01
An outline of the derivation, together with mathematical and physical interpretations, is presented for a discrete dynamical system known as the "poor man's Navier-Stokes equation." Numerical studies demonstrate that velocity fields produced by this dynamical system are similar to those seen in laboratory experiments and in detailed simulations, and they lead to scaling for the turbulence kinetic energy spectrum in accord with Kolmogorov K41 theory.
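For readers unfamiliar with the construction: a severe Galerkin truncation of the momentum equations reduces to coupled logistic-type maps. The sketch below uses a generic coupled form with illustrative coefficients as a stand-in; it is not McDonough's exact derivation or his coefficient values:

```python
def pmns_step(a, b, beta1=2.5, beta2=2.7, gamma1=0.1, gamma2=0.1):
    """One iterate of a coupled logistic-type map standing in for a
    'poor man's Navier-Stokes' system (coefficients are assumptions)."""
    a_next = beta1 * a * (1.0 - a) - gamma1 * a * b
    b_next = beta2 * b * (1.0 - b) - gamma2 * a * b
    return a_next, b_next

# Iterate from an arbitrary initial state; the betas play the role of a
# bifurcation parameter (Reynolds-number-like), the gammas couple the modes.
a, b = 0.3, 0.4
traj = []
for _ in range(500):
    a, b = pmns_step(a, b)
    traj.append((a, b))
print(traj[-1])
```

Sweeping the beta coefficients upward drives the map through the period-doubling route to chaotic iterates, which is the qualitative sense in which such maps mimic turbulent velocity signals.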
NASA Astrophysics Data System (ADS)
Li, Gen; Tang, Chun-An; Liang, Zheng-Zhao
2017-01-01
Multi-scale high-resolution modeling of rock failure process is a powerful means in modern rock mechanics studies to reveal the complex failure mechanism and to evaluate engineering risks. However, multi-scale continuous modeling of rock, from deformation, damage to failure, has raised high requirements on the design, implementation scheme and computation capacity of the numerical software system. This study is aimed at developing the parallel finite element procedure, a parallel rock failure process analysis (RFPA) simulator that is capable of modeling the whole trans-scale failure process of rock. Based on the statistical meso-damage mechanical method, the RFPA simulator is able to construct heterogeneous rock models with multiple mechanical properties, deal with and represent the trans-scale propagation of cracks, in which the stress and strain fields are solved for the damage evolution analysis of representative volume element by the parallel finite element method (FEM) solver. This paper describes the theoretical basis of the approach and provides the details of the parallel implementation on a Windows - Linux interactive platform. A numerical model is built to test the parallel performance of FEM solver. Numerical simulations are then carried out on a laboratory-scale uniaxial compression test, and field-scale net fracture spacing and engineering-scale rock slope examples, respectively. The simulation results indicate that relatively high speedup and computation efficiency can be achieved by the parallel FEM solver with a reasonable boot process. In laboratory-scale simulation, the well-known physical phenomena, such as the macroscopic fracture pattern and stress-strain responses, can be reproduced. In field-scale simulation, the formation process of net fracture spacing from initiation, propagation to saturation can be revealed completely. In engineering-scale simulation, the whole progressive failure process of the rock slope can be well modeled. 
It is shown that the parallel FE simulator developed in this study is an efficient tool for modeling the whole trans-scale failure process of rock from meso- to engineering-scale.
Validation of mathematical model for CZ process using small-scale laboratory crystal growth furnace
NASA Astrophysics Data System (ADS)
Bergfelds, Kristaps; Sabanskis, Andrejs; Virbulis, Janis
2018-05-01
The present material is focused on the modelling of a small-scale laboratory NaCl-RbCl crystal growth furnace. First steps towards fully transient simulations are taken in the form of stationary simulations that deal with the optimization of material properties to match the model to experimental conditions. For this purpose, simulation software primarily used for the modelling of the industrial-scale silicon crystal growth process was successfully applied. Finally, transient simulations of the crystal growth are presented, showing sufficient agreement with experimental results.
A tide prediction and tide height control system for laboratory mesocosms
Long, Jeremy D.
2015-01-01
Experimental mesocosm studies of rocky shore and estuarine intertidal systems may benefit from the application of natural tide cycles to better replicate variation in immersion time, water depth, and attendant fluctuations in abiotic and edaphic conditions. Here we describe a stand-alone, open-source microcontroller tide-prediction program, coupled with a mechanical tidal elevation control system, which allows continuous adjustment of aquarium water depths in synchrony with local tide cycles. We used this system to monitor the growth of Spartina foliosa marsh cordgrass and scale insect herbivores at three simulated shore elevations in laboratory mesocosms. Plant growth decreased with increasing shore elevation, while scale insect population growth on the plants was not strongly affected by immersion time. This system shows promise for a range of laboratory mesocosm studies where natural tide cycling could impact organism performance or behavior, while the tide prediction system could additionally be utilized in field experiments where treatments need to be applied at certain stages of the tide cycle. PMID:26623195
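Microcontroller tide predictors of this kind evaluate a harmonic sum of tidal constituents. A minimal sketch follows; the constituent speeds are the standard astronomical values, but the amplitudes, phases, and mean level are made-up placeholders for a hypothetical station, not values from the paper's software:

```python
import math

# (name, speed in degrees/hour, amplitude in m, phase lag in degrees).
# Speeds are the standard constituent speeds; amplitudes and phases are
# illustrative placeholders for a hypothetical station.
CONSTITUENTS = [
    ("M2", 28.9841042, 0.50, 0.0),
    ("S2", 30.0000000, 0.20, 30.0),
    ("K1", 15.0410686, 0.30, 60.0),
    ("O1", 13.9430356, 0.25, 90.0),
]
MEAN_LEVEL = 1.2  # m above datum (placeholder)

def tide_height(t_hours):
    """Predicted water level (m) at t hours after the reference epoch."""
    h = MEAN_LEVEL
    for _name, speed, amp, phase in CONSTITUENTS:
        h += amp * math.cos(math.radians(speed * t_hours - phase))
    return h

# One day of predictions at 15-minute steps, as a mesocosm controller
# might compute before driving a drain valve or pump to the target depth.
heights = [tide_height(t / 4) for t in range(0, 24 * 4)]
print(round(min(heights), 2), round(max(heights), 2))
```

In the hardware loop, `tide_height(now)` becomes the setpoint and the mechanical elevation system servos the aquarium water level toward it.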
Turbulence-enhanced bottom melting of a horizontal glacier-lake interface
NASA Astrophysics Data System (ADS)
Keitzl, T.; Mellado, J. P.; Notz, D.
2014-12-01
We use laboratory tank experiments and direct numerical simulations to investigate the melt rates of a horizontal bottom glacier-lake interface as a function of lake temperature. Existing parameterisations of such melt rates are usually based on empirical fits to field observations. To understand the melt rates of an ice-water interface more systematically, we study an idealised system in terms of its temperature-driven buoyancy forcing. In such systems, the melt rate can be expressed analytically for a stable stratification. Here we investigate the unstable case and show how the melt rate depends on the lake temperature when the water beneath the ice is overturning and turbulent. The laboratory tank experiments provide robust observation-based mean-temperature profiles. The numerical simulations provide the full three-dimensional structure of the turbulent flow down to scales not accessible in the laboratory, with a minimum grid spacing of 0.2 mm. Our laboratory mean-temperature profiles agree well with the numerical simulations and lend credibility to our numerical setup. The structure of the turbulent flow in our simulations is well described by two self-similar subregions: a diffusion-dominated inner layer close to the ice and a turbulence-dominated outer layer far from the ice. We provide an explicit expression for the parameterisation of the melt rate of a horizontal glacier-lake interface as a function of lake temperature.
Study and Development of an Air Conditioning System Operating on a Magnetic Heat Pump Cycle
NASA Technical Reports Server (NTRS)
Wang, Pao-Lien
1991-01-01
This report describes the design of a laboratory scale demonstration prototype of an air conditioning system operating on a magnetic heat pump cycle. Design parameters were selected through studies performed by a Kennedy Space Center (KSC) System Simulation Computer Model. The heat pump consists of a rotor turning through four magnetic fields that are created by permanent magnets. Gadolinium was selected as the working material for this demonstration prototype. The rotor was designed to be constructed of flat parallel disks of gadolinium with very little space in between. The rotor rotates in an aluminum housing. The laboratory scale demonstration prototype is designed to provide a theoretical Carnot Cycle efficiency of 62 percent and a Coefficient of Performance of 16.55.
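As a rough sanity check on the quoted figures (not a calculation from the report), the ideal cooling COP follows from the Carnot relation COP = Tc / (Th - Tc); the temperatures below are assumed for illustration only.

```python
# Carnot COP for a cooling cycle; a COP of 16.55 at 62% of Carnot
# implies a small temperature lift. Temperatures are assumed, not
# taken from the KSC report.
def carnot_cop_cooling(t_cold_k, t_hot_k):
    """Ideal (Carnot) coefficient of performance for cooling."""
    return t_cold_k / (t_hot_k - t_cold_k)

t_cold, t_hot = 278.0, 288.0                 # assumed 5 C to 15 C lift
ideal = carnot_cop_cooling(t_cold, t_hot)    # 27.8
actual = 0.62 * ideal                        # ~17.2, same order as the quoted 16.55
print(round(ideal, 1), round(actual, 1))
```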
NASA Astrophysics Data System (ADS)
Lin, Shian-Jiann; Harris, Lucas; Chen, Jan-Huey; Zhao, Ming
2014-05-01
A multi-scale High-Resolution Atmosphere Model (HiRAM) is being developed at NOAA's Geophysical Fluid Dynamics Laboratory. The model's dynamical framework is the non-hydrostatic extension of the vertically Lagrangian finite-volume dynamical core (Lin 2004, Monthly Wea. Rev.) constructed on a stretchable (via Schmidt transformation) cubed-sphere grid. Physical parametrizations originally designed for IPCC-type climate predictions are being modified and made more "scale-aware", in an effort to make the model suitable for multi-scale weather-climate applications, with horizontal resolution ranging from 1 km (near the target high-resolution region) to as coarse as 400 km (near the antipodal point). One of the main goals of this development is to enable simulation of high-impact weather phenomena (such as tornadoes, thunderstorms, and category-5 hurricanes) within an IPCC-class climate modeling system, something previously thought impossible. We will present preliminary results covering a very wide spectrum of temporal-spatial scales, ranging from simulation of tornado genesis (hours), Madden-Julian Oscillations (intra-seasonal), tropical cyclones (seasonal), to Quasi-Biennial Oscillations (intra-decadal), using the same global multi-scale modeling system.
Development of an autonomous video rendezvous and docking system, phase 2
NASA Technical Reports Server (NTRS)
Tietz, J. C.; Richardson, T. E.
1983-01-01
The critical elements of an autonomous video rendezvous and docking system were built and used successfully in a physical laboratory simulation. The laboratory system demonstrated that a small, inexpensive electronic package and a flight computer of modest size can analyze television images to derive guidance information for spacecraft. In the ultimate application, the system would use a docking aid consisting of three flashing lights mounted on a passive target spacecraft. Television imagery of the docking aid would be processed aboard an active chase vehicle to derive relative positions and attitudes of the two spacecraft. The demonstration system used scale models of the target spacecraft with working docking aids. A television camera mounted on a 6 degree of freedom (DOF) simulator provided imagery of the target to simulate observations from the chase vehicle. A hardware video processor extracted statistics from the imagery, from which a computer quickly computed position and attitude. Computer software known as a Kalman filter derived velocity information from position measurements.
Thibault, J. C.; Roe, D. R.; Eilbeck, K.; Cheatham, T. E.; Facelli, J. C.
2015-01-01
Biomolecular simulations aim to simulate structure, dynamics, interactions, and energetics of complex biomolecular systems. With recent advances in hardware, it is now possible to use more complex and accurate models and to reach time scales that are biologically significant. Molecular simulations have become a standard tool for toxicology and pharmacology research, but organizing and sharing data – both within the same organization and among different ones – remains a substantial challenge. In this paper we review our recent work leading to the development of a comprehensive informatics infrastructure to facilitate the organization and exchange of biomolecular simulation data. Our efforts include the design of data models and dictionary tools that allow the standardization of the metadata used to describe the simulations, the development of a thesaurus and ontology for computational reasoning when searching for biomolecular simulations in distributed environments, and the development of systems based on these models to manage and share the data at large scale (iBIOMES) and within smaller groups of researchers at the laboratory scale (iBIOMES Lite), both of which take advantage of the standardized metadata used to describe biomolecular simulations. PMID:26387907
Workplace Exposure to Titanium Dioxide Nanopowder Released from a Bag Filter System
Ji, Jun Ho; Kim, Jong Bum; Lee, Gwangjae; Noh, Jung-Hun; Yook, Se-Jin; Cho, So-Hye; Bae, Gwi-Nam
2015-01-01
Many researchers who use laboratory-scale synthesis systems to manufacture nanomaterials could be easily exposed to airborne nanomaterials during the research and development stage. This study used various real-time aerosol detectors to investigate the presence of nanoaerosols in a laboratory used to manufacture titanium dioxide (TiO2). The TiO2 nanopowders were produced via flame synthesis and collected by a bag filter system for subsequent harvesting. Highly concentrated nanopowders were released from the outlet of the bag filter system into the laboratory. The fractional particle collection efficiency of the bag filter system was only 20% at particle diameter of 100 nm, which is much lower than the performance of a high-efficiency particulate air (HEPA) filter. Furthermore, the laboratory hood system was inadequate to fully exhaust the air discharged from the bag filter system. Unbalanced air flow rates between bag filter and laboratory hood systems could result in high exposure to nanopowder in laboratory settings. Finally, we simulated behavior of nanopowders released in the laboratory using computational fluid dynamics (CFD). PMID:26125024
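The fractional collection efficiency reported above is simply one minus the ratio of downstream to upstream particle concentration in each size bin; a minimal sketch with invented counts:

```python
# Fractional particle collection efficiency per size bin. The
# concentrations below are made-up illustrations, not the study's data.
def collection_efficiency(c_upstream, c_downstream):
    """Efficiency = 1 - downstream/upstream concentration."""
    return 1.0 - c_downstream / c_upstream

# A 20%-efficient bag filter at 100 nm passes 80% of particles,
# whereas a HEPA filter (>= 99.97% at 0.3 um) passes almost none.
print(round(collection_efficiency(1000.0, 800.0), 3))  # 0.2
```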
LABORATORY-SCALE SIMULATION OF RUNOFF RESPONSE FROM PERVIOUS-IMPERVIOUS SYSTEMS
Urban development yields landscapes that are composites of impervious and pervious areas, with a consequent reduction in infiltration and increase in stormwater runoff. Although basic rainfall-runoff models are used in the vast majority of runoff prediction in urban landscapes, t...
NASA Astrophysics Data System (ADS)
van den Ende, M. P. A.; Chen, J.; Ampuero, J.-P.; Niemeijer, A. R.
2018-05-01
Rate-and-state friction (RSF) is commonly used for the characterisation of laboratory friction experiments, such as velocity-step tests. However, the RSF framework provides little physical basis for the extrapolation of these results to the scales and conditions of natural fault systems, and so open questions remain regarding the applicability of the experimentally obtained RSF parameters for predicting seismic cycle transients. As an alternative to classical RSF, microphysics-based models offer means for interpreting laboratory and field observations, but are generally over-simplified with respect to heterogeneous natural systems. In order to bridge the temporal and spatial gap between the laboratory and nature, we have implemented existing microphysical model formulations into an earthquake cycle simulator. Through this numerical framework, we make a direct comparison between simulations exhibiting RSF-controlled fault rheology, and simulations in which the fault rheology is dictated by the microphysical model. Even though the input parameters for the RSF simulation are directly derived from the microphysical model, the microphysics-based simulations produce significantly smaller seismic event sizes than the RSF-based simulation, and suggest a more stable fault slip behaviour. Our results reveal fundamental limitations in using classical rate-and-state friction for the extrapolation of laboratory results. The microphysics-based approach offers a more complete framework in this respect, and may be used for a more detailed study of the seismic cycle in relation to material properties and fault zone pressure-temperature conditions.
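For readers unfamiliar with the classical RSF framework the microphysics-based simulations are compared against, a minimal sketch of its governing relations, using typical assumed laboratory parameter values rather than those of the study:

```python
import math

# Classical rate-and-state friction with the aging law.
# Parameter values are typical lab-derived numbers, assumed here.
a, b = 0.010, 0.015             # direct and evolution effects (b > a: weakening)
mu0, v0, dc = 0.6, 1e-6, 1e-5   # ref. friction, ref. velocity (m/s), slip distance (m)

def friction(v, theta):
    """mu = mu0 + a*ln(v/v0) + b*ln(v0*theta/dc)."""
    return mu0 + a * math.log(v / v0) + b * math.log(v0 * theta / dc)

def theta_dot(v, theta):
    """Aging law: dtheta/dt = 1 - v*theta/dc."""
    return 1.0 - v * theta / dc

# At steady state theta_ss = dc/v, so mu_ss = mu0 + (a - b)*ln(v/v0):
v = 1e-5
mu_ss = friction(v, dc / v)
print(round(mu_ss, 4))  # below mu0 for v > v0: velocity-weakening
```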
Comparing field investigations with laboratory models to predict landfill leachate emissions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fellner, Johann; Doeberl, Gernot; Allgaier, Gerhard
2009-06-15
Investigations into laboratory reactors and landfills are used for simulating and predicting emissions from municipal solid waste landfills. We examined water flow and solute transport through the same waste body at different volumetric scales (laboratory experiment: 0.08 m³; landfill: 80,000 m³), and assessed the differences in water flow and leachate emissions of chloride, total organic carbon and Kjeldahl nitrogen. The results indicate that, due to preferential pathways, the flow of water in field-scale landfills is less uniform than in laboratory reactors. Based on tracer experiments, it can be discerned that in laboratory-scale experiments around 40% of pore water participates in advective solute transport, whereas this fraction amounts to less than 0.2% in the investigated full-scale landfill. Consequences of the difference in water flow and moisture distribution are: (1) leachate emissions from full-scale landfills decrease faster than predicted by laboratory experiments, and (2) the stock of materials remaining in the landfill body, and thus the long-term emission potential, is likely to be underestimated by laboratory landfill simulations.
Application of lab derived kinetic biodegradation parameters at the field scale
NASA Astrophysics Data System (ADS)
Schirmer, M.; Barker, J. F.; Butler, B. J.; Frind, E. O.
2003-04-01
Estimating the intrinsic remediation potential of an aquifer typically requires the accurate assessment of the biodegradation kinetics, the level of available electron acceptors and the flow field. Zero- and first-order degradation rates derived at the laboratory scale generally overpredict the rate of biodegradation when applied to the field scale, because limited electron acceptor availability and microbial growth are typically not considered. On the other hand, field estimated zero- and first-order rates are often not suitable to forecast plume development because they may be an oversimplification of the processes at the field scale and ignore several key processes, phenomena and characteristics of the aquifer. This study uses the numerical model BIO3D to link the laboratory and field scale by applying laboratory derived Monod kinetic degradation parameters to simulate a dissolved gasoline field experiment at Canadian Forces Base (CFB) Borden. All additional input parameters were derived from laboratory and field measurements or taken from the literature. The simulated results match the experimental results reasonably well without having to calibrate the model. An extensive sensitivity analysis was performed to estimate the influence of the most uncertain input parameters and to define the key controlling factors at the field scale. It is shown that the most uncertain input parameters have only a minor influence on the simulation results. Furthermore it is shown that the flow field, the amount of electron acceptor (oxygen) available and the Monod kinetic parameters have a significant influence on the simulated results. Under the field conditions modelled and the assumptions made for the simulations, it can be concluded that laboratory derived Monod kinetic parameters can adequately describe field scale degradation processes, if all controlling factors are incorporated in the field scale modelling that are not necessarily observed at the lab scale. 
In this way, no explicit scale relationships linking the laboratory and the field scale need to be found; rather, accurately incorporating the additional processes, phenomena and characteristics present at the larger scale, such as (a) advective and dispersive transport of one or more contaminants, (b) advective and dispersive transport and availability of electron acceptors, (c) mass transfer limitations and (d) spatial heterogeneities, and applying well-defined lab-scale parameters should accurately describe field-scale processes.
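The dual-limitation character of Monod kinetics described above (degradation throttled by both substrate and electron-acceptor availability) can be illustrated with a toy Euler integration. All rate constants and the stoichiometry are assumed for illustration, not the BIO3D/Borden calibration:

```python
# Dual-Monod degradation: rate limited by substrate c and oxygen o.
# Once oxygen is consumed, degradation stalls regardless of substrate,
# which is why electron-acceptor availability controls the field scale.
k_max = 0.5            # maximum utilisation rate (1/day), assumed
K_s, K_o = 1.0, 0.1    # half-saturation constants (mg/L), assumed

def monod_rate(c, o):
    """Dual-Monod degradation rate (mg/L/day)."""
    return k_max * c / (K_s + c) * o / (K_o + o)

c, o, dt = 10.0, 3.0, 0.01   # initial mg/L; time step in days
for _ in range(1000):        # integrate 10 days with forward Euler
    r = monod_rate(c, o)
    c -= r * dt
    o -= 3.0 * r * dt        # assumed stoichiometry: 3 mg O2 per mg substrate
    o = max(o, 0.0)
print(round(c, 2), round(o, 2))  # substrate remains once oxygen is exhausted
```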
Reagan, Matthew T.; Moridis, George J.; Seim, Katie S.
2017-03-27
A recent Department of Energy field test on the Alaska North Slope has increased interest in the ability to simulate systems of mixed CO2-CH4 hydrates. However, the physically realistic simulation of mixed-hydrate systems is not yet a fully solved problem. Limited quantitative laboratory data lead to the use of various ab initio, statistical mechanical, or other mathematical representations of mixed-hydrate phase behavior. Few of these methods are suitable for inclusion in reservoir simulations, particularly for systems with large numbers of grid elements, 3D systems, or systems with complex geometric configurations. In this paper, we present a set of fast parametric relationships describing the thermodynamic properties and phase behavior of a mixed methane-carbon dioxide hydrate system. We use well-known, off-the-shelf hydrate physical properties packages to generate a sufficiently large dataset, select the most convenient and efficient mathematical forms, and fit the data to those forms to create a physical properties package suitable for inclusion in the TOUGH+ family of codes. Finally, the mapping of the phase and thermodynamic space reveals the complexity of the mixed-hydrate system and allows understanding of the thermodynamics at a level beyond what much of the existing laboratory data and literature currently offer.
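The general workflow (generate data from a heavy properties package, pick a convenient mathematical form, fit it for fast in-simulator evaluation) can be illustrated with a toy example. The data and the ln P = A + B/T form below are synthetic stand-ins, not the paper's actual dataset or fitted relationships:

```python
import math

# Synthetic "dataset" (T in K, P in MPa) standing in for output of a
# hydrate physical-properties package; values are invented.
data = [(274.0, 2.9), (278.0, 4.3), (282.0, 6.6), (286.0, 10.5)]

# Least-squares fit of ln(P) = A + B/T (a convenient, fast form).
xs = [1.0 / t for t, _ in data]
ys = [math.log(p) for _, p in data]
n = len(data)
xbar, ybar = sum(xs) / n, sum(ys) / n
B = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
     / sum((x - xbar) ** 2 for x in xs))
A = ybar - B * xbar

def p_eq(t_kelvin):
    """Fast parametric equilibrium pressure (MPa) from the fitted form."""
    return math.exp(A + B / t_kelvin)

# The reservoir simulator evaluates p_eq() at every grid element and
# time step instead of calling the heavy package.
```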
Laboratory formation of a scaled protostellar jet by coaligned poloidal magnetic field.
Albertazzi, B; Ciardi, A; Nakatsutsumi, M; Vinci, T; Béard, J; Bonito, R; Billette, J; Borghesi, M; Burkley, Z; Chen, S N; Cowan, T E; Herrmannsdörfer, T; Higginson, D P; Kroll, F; Pikuz, S A; Naughton, K; Romagnani, L; Riconda, C; Revet, G; Riquier, R; Schlenvoigt, H-P; Skobelev, I Yu; Faenov, A Ya; Soloviev, A; Huarte-Espinosa, M; Frank, A; Portugall, O; Pépin, H; Fuchs, J
2014-10-17
Although bipolar jets are seen emerging from a wide variety of astrophysical systems, the issue of their formation and morphology beyond their launching is still under study. Our scaled laboratory experiments, representative of young stellar object outflows, reveal that stable and narrow collimation of the entire flow can result from the presence of a poloidal magnetic field whose strength is consistent with observations. The laboratory plasma becomes focused with an interior cavity. This gives rise to a standing conical shock from which the jet emerges. Following simulations of the process at the full astrophysical scale, we conclude that it can also explain recently discovered x-ray emission features observed in low-density regions at the base of protostellar jets, such as the well-studied jet HH 154. Copyright © 2014, American Association for the Advancement of Science.
Kim, Kyung Hwan; Kim, Sun Hwa; Jung, Young Rim; Kim, Man Goo
2008-09-12
As one of the measures to improve the in-car environment, malodor caused by the automobile air-conditioning system evaporator was evaluated and analyzed using a laboratory-scale test cooling bench. The odor was simulated with an evaporator test cooling bench equipped with an airflow controller and air temperature and relative humidity controllers. To reproduce the odor characteristics that occur in automobiles, a previously used automobile air-conditioner evaporator associated with unpleasant odors was selected. The odor was evaluated by trained panels and collected in aluminum polyester bags. Collected samples were analyzed by thermal desorption into a cryotrap and subsequent gas chromatographic separation, followed by simultaneous olfactometry and flame ionization detection, and identified by atomic emission detection and mass spectrometry. Compounds such as alcohols, aldehydes, and organic acids were identified as the responsible odor-active compounds. Gas chromatography/flame ionization detection/olfactometry, a sensory method combined with instrumental analysis, was very effective as an odor evaluation method for an automobile air-conditioning system evaporator.
NASA Astrophysics Data System (ADS)
DePaolo, D. J.; Steefel, C. I.; Bourg, I. C.
2013-12-01
This talk will review recent research on pore-scale reactive transport effects, carried out in the context of the Department of Energy-sponsored Energy Frontier Research Center led by Lawrence Berkeley National Laboratory with several other laboratory and university partners. This Center, the Center for Nanoscale Controls on Geologic CO2 (NCGC), has focused on the behavior of supercritical CO2 being injected into, or residing as capillary-trapped bubbles in, sandstone and shale, with particular emphasis on the description of nanoscale to pore-scale processes that could provide the basis for advanced simulations. In general, simulation of reservoir-scale behavior of CO2 sequestration assumes a number of mostly qualitative relationships that are defensible as nominal first-order descriptions of single-fluid systems, but neglect the many complications that are associated with a two-phase or three-phase reactive system. The contrasts in properties and the mixing behavior of scCO2 and brine provide unusual conditions for water-rock interaction, and the NCGC has investigated the underlying issues by a combination of approaches, including theoretical and experimental studies of mineral nucleation and growth; experimental studies of brine films, mineral wetting properties, dissolution-precipitation rates and infiltration patterns; molecular dynamics simulations and neutron scattering experiments on fluids confined in nanopores; and various approaches to numerical simulation of reactive transport processes. The work to date has placed new constraints on the thickness of brine films, and also on the wetting properties of CO2 versus brine, a property that varies between minerals and with salinity, and may also change with time as a result of the reactivity of CO2-saturated brine.
Mineral dissolution is dependent on reactive surface area, which can be shown to vary by a large factor for various minerals, especially when correlated with interconnected pore space. High-resolution numerical simulations of reactive transport can ultimately lead to quantitative descriptions of pore-scale chemistry and flow, and examples of recent developments will be presented. However, only a limited description of the processes can realistically be treated in such simulations, and only for chemically simple systems. Whether and when more complete simulations will be achievable is yet to be determined.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gutierrez, Marte
2013-12-31
This research project aims to develop and validate an advanced computer model that can be used in the planning and design of stimulation techniques to create engineered reservoirs for Enhanced Geothermal Systems. The specific objectives of the project are to: 1) develop a true three-dimensional hydro-thermal fracturing simulator that is particularly suited for EGS reservoir creation; 2) perform laboratory-scale model tests of hydraulic fracturing and proppant flow/transport using a polyaxial loading device, and use the laboratory results to test and validate the 3D simulator; 3) perform discrete element/particulate modeling of proppant transport in hydraulic fractures, and use the results to improve understanding of proppant flow and transport; 4) test and validate the 3D hydro-thermal fracturing simulator against case histories of EGS energy production; and 5) develop a plan to commercialize the 3D fracturing and proppant flow/transport simulator. The project is expected to yield several specific results and benefits. Major technical products include: 1) a true-3D hydro-thermal fracturing computer code that is particularly suited to EGS; 2) documented results of scale model tests on hydro-thermal fracturing and fracture propping in an analogue crystalline rock; 3) documented procedures and results of discrete element/particulate modeling of flow and transport of proppants for EGS applications; and 4) a database of monitoring data, with a focus on acoustic emissions (AE) from lab-scale modeling and field case histories of EGS reservoir creation.
Scaling and pedotransfer in numerical simulations of flow and transport in soils
USDA-ARS?s Scientific Manuscript database
Flow and transport parameters of soils in numerical simulations need to be defined at the support scale of computational grid cells. Such support scale can substantially differ from the support scale in laboratory or field measurements of flow and transport parameters. The scale-dependence of flow a...
Numerical simulation of small-scale thermal convection in the atmosphere
NASA Technical Reports Server (NTRS)
Somerville, R. C. J.
1973-01-01
A Boussinesq system is integrated numerically in three dimensions and time in a study of nonhydrostatic convection in the atmosphere. Simulation of cloud convection is achieved by including parametrized effects of latent heat and small-scale turbulence. The results are compared with the cell structure observed in Rayleigh-Benard laboratory convection experiments in air. At a Rayleigh number of 4000, the numerical model adequately simulates the experimentally observed evolution, including some prominent transients, of a flow from a randomly perturbed initial conductive state into the final state of steady large-amplitude two-dimensional rolls. At a Rayleigh number of 9000, the model reproduces the experimentally observed unsteady equilibrium of vertically coherent oscillatory waves superimposed on rolls.
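The Rayleigh numbers quoted above can be connected to physical parameters through Ra = g * beta * dT * H^3 / (nu * kappa). The layer depth and temperature difference below are assumed for illustration, not taken from the paper:

```python
# Rayleigh number for a horizontal air layer heated from below.
# Fluid properties are standard values for air near 20 C; the layer
# depth and temperature difference are assumed for illustration.
g = 9.81             # gravitational acceleration (m/s^2)
beta = 1.0 / 293.0   # thermal expansion coefficient of air (1/K)
nu = 1.5e-5          # kinematic viscosity of air (m^2/s)
kappa = 2.1e-5       # thermal diffusivity of air (m^2/s)

def rayleigh(delta_t, depth):
    """Ra = g * beta * dT * H**3 / (nu * kappa)."""
    return g * beta * delta_t * depth ** 3 / (nu * kappa)

# A ~2 cm air layer with ~5 K across it sits near the regimes studied:
# convection onset is at Ra ~ 1708, steady rolls above that, and
# oscillatory convection appears by Ra ~ 9000.
print(round(rayleigh(5.0, 0.02)))  # ~4252, near the Ra = 4000 case
```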
EPOS-WP16: A coherent and collaborative network of Solid Earth Multi-scale laboratories
NASA Astrophysics Data System (ADS)
Calignano, Elisa; Rosenau, Matthias; Lange, Otto; Spiers, Chris; Willingshofer, Ernst; Drury, Martyn; van Kan-Parker, Mirjam; Elger, Kirsten; Ulbricht, Damian; Funiciello, Francesca; Trippanera, Daniele; Sagnotti, Leonardo; Scarlato, Piergiorgio; Tesei, Telemaco; Winkler, Aldo
2017-04-01
Laboratory facilities are an integral part of Earth Science research. The diversity of methods employed in such infrastructures reflects the multi-scale nature of the Earth system and is essential for the understanding of its evolution, for the assessment of geo-hazards and for the sustainable exploitation of geo-resources. In the frame of EPOS (European Plate Observing System), Work Package 16 (WP16) represents a developing community of European Geoscience Multi-scale laboratories. The participant and collaborating institutions (Utrecht University, GFZ, RomaTre University, INGV, NERC, CSIC-ICTJA, CNRS, LMU, C4G-UBI, ETH, CNR*) embody several types of laboratory infrastructures, engaged in different fields of interest of Earth Science: from high temperature and pressure experimental facilities, to electron microscopy, micro-beam analysis, analogue tectonic and geodynamic modelling and paleomagnetic laboratories. The length scales encompassed by these infrastructures range from the nano- and micrometre levels (electron microscopy and micro-beam analysis) to the scale of experiments on centimetre-sized samples, and to analogue model experiments simulating the reservoir scale, the basin scale and the plate scale. The aim of WP16 is to provide two services by the year 2019: first, providing virtual access to data from laboratories (data service) and, second, providing physical access to laboratories (transnational access, TNA). Regarding the development of a data service, the current status is that most data produced by the various laboratory centres and networks are available only in limited "final form" in publications, while many data remain inaccessible and/or poorly preserved.
Within EPOS, the TCS Multi-scale laboratories is collecting and harmonizing available and emerging laboratory data on the properties and processes controlling rock system behaviour at all relevant scales, in order to generate products accessible and interoperable through services supporting research into Geo-resources and Geo-storage, Geo-hazards and Earth System Evolution. Regarding the provision of physical access to laboratories, the current situation is that access to WP16's laboratories is often based on professional relations, available budgets, shared interests and other constraints. In WP16 we aim to reduce the present diversity and non-transparency of access rules and to replace ad-hoc procedures for access with streamlined mechanisms, objective rules and a transparent policy. We work on procedures and mechanisms regulating application, negotiation, evaluation, feedback, selection, admission, approval, feasibility checking, set-up, use, monitoring and dismantling. In the end, each laboratory should have a single point providing clear and transparent information on the facility itself, its services, access policy, data management policy and the legal terms and conditions for use of equipment. Through its role as an intermediary and information broker, EPOS will acquire a wealth of information from Research Infrastructures and users on the establishment of efficient collaboration agreements.
Multi-mode evaluation of power-maximizing cross-flow turbine controllers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Forbush, Dominic; Cavagnaro, Robert J.; Donegan, James; ...
2017-09-21
A general method for predicting and evaluating the performance of three candidate cross-flow turbine power-maximizing controllers is presented in this paper using low-order dynamic simulation, scaled laboratory experiments, and full-scale field testing. For each testing mode and candidate controller, performance metrics quantifying energy capture (ability of a controller to maximize power), variation in torque and rotation rate (related to drive train fatigue), and variation in thrust loads (related to structural fatigue) are quantified for two purposes. First, for metrics that could be evaluated across all testing modes, we considered the accuracy with which simulation or laboratory experiments could predict performance at full scale. Second, we explored the utility of these metrics to contrast candidate controller performance. For these turbines and set of candidate controllers, energy capture was found to only differentiate controller performance in simulation, while the other explored metrics were able to predict performance of the full-scale turbine in the field with various degrees of success. Finally, effects of scale between laboratory and full-scale testing are considered, along with recommendations for future improvements to dynamic simulations and controller evaluation.
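The abstract does not state the candidate controller laws. A common baseline for power-maximizing turbine control is the nonlinear torque law tau = K*omega^2, sketched below; all parameter values (density, area, radius, peak power coefficient, optimal tip-speed ratio) are hypothetical and are not taken from the paper.

```python
def k_omega_squared_gain(rho, area, radius, cp_max, tsr_opt):
    """Gain for the tau = K * omega^2 law, a classic turbine power-maximizing baseline."""
    return 0.5 * rho * area * radius**3 * cp_max / tsr_opt**3

def control_torque(K, omega):
    """Commanded generator torque from the measured rotor speed omega (rad/s)."""
    return K * omega**2

# Hypothetical cross-flow turbine parameters (illustrative only).
rho = 1025.0      # seawater density, kg/m^3
area = 1.0        # projected frontal area, m^2
radius = 0.5      # rotor radius, m
cp_max = 0.30     # assumed peak power coefficient
tsr_opt = 2.0     # assumed optimal tip-speed ratio

K = k_omega_squared_gain(rho, area, radius, cp_max, tsr_opt)
tau = control_torque(K, 10.0)  # commanded torque at 10 rad/s, N*m
```

At the optimal tip-speed ratio this law holds the rotor at peak power coefficient without needing a flow-speed measurement, which is why it is a common reference point for controller comparisons.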
Multimodel Simulation of Water Flow: Uncertainty Analysis
USDA-ARS?s Scientific Manuscript database
Simulations of soil water flow require measurements of soil hydraulic properties which are particularly difficult at the field scale. Laboratory measurements provide hydraulic properties at scales finer than the field scale, whereas pedotransfer functions (PTFs) integrate information on hydraulic pr...
Model Scaling of Hydrokinetic Ocean Renewable Energy Systems
NASA Astrophysics Data System (ADS)
von Ellenrieder, Karl; Valentine, William
2013-11-01
Numerical simulations are performed to validate a non-dimensional dynamic scaling procedure that can be applied to subsurface and deeply moored systems, such as hydrokinetic ocean renewable energy devices. The prototype systems are moored in water 400 m deep and include: subsurface spherical buoys moored in a shear current and excited by waves; an ocean current turbine excited by waves; and a deeply submerged spherical buoy in a shear current excited by strong current fluctuations. The corresponding model systems, which are scaled based on relative water depths of 10 m and 40 m, are also studied. For each case examined, the response of the model system closely matches the scaled response of the corresponding full-sized prototype system. The results suggest that laboratory-scale testing of complete ocean current renewable energy systems moored in a current is possible. This work was supported by the U.S. Southeast National Marine Renewable Energy Center (SNMREC).
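The abstract does not reproduce the scaling procedure itself; for wave- and current-driven moored systems, Froude similitude is the conventional starting point, and it yields the factors below for the stated 400 m to 10 m depth reduction. The mapping shown is standard similitude, not necessarily the paper's exact non-dimensional procedure.

```python
import math

def froude_scale_factors(length_ratio):
    """Froude similitude: velocity and time scale as sqrt(L), force as L^3."""
    return {
        "length": length_ratio,
        "velocity": math.sqrt(length_ratio),
        "time": math.sqrt(length_ratio),
        "force": length_ratio**3,
    }

# Depth ratio from the study: 400 m prototype, 10 m model basin.
s = froude_scale_factors(400.0 / 10.0)

# A hypothetical 2 m/s prototype current corresponds to this model-scale speed:
model_speed = 2.0 / s["velocity"]
```

The same factors applied in reverse let the measured model response be scaled up for comparison against the full-sized prototype simulations.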
Phase Transitions and Scaling in Systems Far from Equilibrium
NASA Astrophysics Data System (ADS)
Täuber, Uwe C.
2017-03-01
Scaling ideas and renormalization group approaches proved crucial for a deep understanding and classification of critical phenomena in thermal equilibrium. Over the past decades, these powerful conceptual and mathematical tools were extended to continuous phase transitions separating distinct nonequilibrium stationary states in driven classical and quantum systems. In concordance with detailed numerical simulations and laboratory experiments, several prominent dynamical universality classes have emerged that govern large-scale, long-time scaling properties both near and far from thermal equilibrium. These pertain to genuine specific critical points as well as entire parameter space regions for steady states that display generic scale invariance. The exploration of nonstationary relaxation properties and associated physical aging scaling constitutes a complementary potent means to characterize cooperative dynamics in complex out-of-equilibrium systems. This review describes dynamic scaling features through paradigmatic examples that include near-equilibrium critical dynamics, driven lattice gases and growing interfaces, correlation-dominated reaction-diffusion systems, and basic epidemic models.
PAM: Particle automata model in simulation of Fusarium graminearum pathogen expansion.
Wcisło, Rafał; Miller, S Shea; Dzwinel, Witold
2016-01-21
The multi-scale nature and inherent complexity of biological systems are a great challenge for computer modeling and classical modeling paradigms. We present a novel particle automata modeling metaphor in the context of developing a 3D model of Fusarium graminearum infection in wheat. The system consisting of the host plant and Fusarium pathogen cells can be represented by an ensemble of discrete particles defined by a set of attributes. These cell-particles can interact with each other, mimicking the mechanical resistance of the cell walls and cell coalescence. The particles can move, while some of their attributes can be changed according to prescribed rules. The rules can represent cellular scales of a complex system, while the integrated particle automata model (PAM) simulates its overall multi-scale behavior. We show that, due to its ability to mimic the mechanical interactions of Fusarium tip cells with the host tissue, the model is able to simulate realistic penetration properties of the colonization process, reproducing both vertical and lateral Fusarium invasion scenarios. The comparison of simulation results with micrographs from laboratory experiments shows encouraging qualitative agreement between the two.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zitney, S.E.
This paper highlights the use of the CAPE-OPEN (CO) standard interfaces in the Advanced Process Engineering Co-Simulator (APECS) developed at the National Energy Technology Laboratory (NETL). The APECS system uses the CO unit operation, thermodynamic, and reaction interfaces to provide its plug-and-play co-simulation capabilities, including the integration of process simulation with computational fluid dynamics (CFD) simulation. APECS also relies heavily on the use of a CO COM/CORBA bridge for running process/CFD co-simulations on multiple operating systems. For process optimization in the face of multiple and sometimes conflicting objectives, APECS offers stochastic modeling and multi-objective optimization capabilities developed to comply with the CO software standard. At NETL, system analysts are applying APECS to a wide variety of advanced power generation systems, ranging from small fuel cell systems to commercial-scale power plants including the coal-fired, gasification-based FutureGen power and hydrogen production plant.
Heinz, Hendrik; Ramezani-Dakhel, Hadi
2016-01-21
Natural and man-made materials often rely on functional interfaces between inorganic and organic compounds. Examples include skeletal tissues and biominerals, drug delivery systems, catalysts, sensors, separation media, energy conversion devices, and polymer nanocomposites. Current laboratory techniques for monitoring and manipulating assembly on the 1 to 100 nm scale are limited, time-consuming, and costly. Computational methods have become increasingly reliable for understanding materials assembly and performance. This review explores the merit of simulations in comparison to experiment at the 1 to 100 nm scale, including connections to smaller length scales of quantum mechanics and larger length scales of coarse-grain models. First, current simulation methods, advances in the understanding of chemical bonding, in the development of force fields, and in the development of chemically realistic models are described. Then, the recognition mechanisms of biomolecules on nanostructured metals, semimetals, oxides, phosphates, carbonates, sulfides, and other inorganic materials are explained, including extensive comparisons between modeling and laboratory measurements. Depending on the substrate, the role of soft epitaxial binding mechanisms, ion pairing, hydrogen bonds, hydrophobic interactions, and conformation effects is described. Applications of the knowledge from simulation to predict binding of ligands and drug molecules to the inorganic surfaces, crystal growth and shape development, catalyst performance, as well as electrical properties at interfaces are examined. The quality of estimates from molecular dynamics and Monte Carlo simulations is validated in comparison to measurements, and design rules are described where available. The review further describes applications of simulation methods to polymer composite materials, surface modification of nanofillers, and interfacial interactions in building materials.
The complexity of functional multiphase materials creates opportunities to further develop accurate force fields, including reactive force fields, and chemically realistic surface models, to enable materials discovery at a million times lower computational cost compared to quantum mechanical methods. The impact of modeling and simulation could further be increased by the advancement of a uniform simulation platform for organic and inorganic compounds across the periodic table and new simulation methods to evaluate system performance in silico.
Computational simulation of laboratory-scale volcanic jets
NASA Astrophysics Data System (ADS)
Solovitz, S.; Van Eaton, A. R.; Mastin, L. G.; Herzog, M.
2017-12-01
Volcanic eruptions produce ash clouds that may travel great distances, significantly impacting aviation and communities downwind. Atmospheric hazard forecasting relies partly on numerical models of the flow physics, which incorporate data from eruption observations and analogue laboratory tests. As numerical tools continue to increase in complexity, they must be validated to fine-tune their effectiveness. Since eruptions are relatively infrequent and challenging to observe in great detail, analogue experiments can provide important insights into expected behavior over a wide range of input conditions. Unfortunately, laboratory-scale jets cannot easily attain the high Reynolds numbers (~10⁹) of natural volcanic eruption columns. Comparisons between the computational models and analogue experiments can help bridge this gap. In this study, we investigate a 3-D volcanic plume model, the Active Tracer High-resolution Atmospheric Model (ATHAM), which has been used to simulate a variety of eruptions. However, it has not been previously validated using laboratory-scale data. We conducted numerical simulations of three flows that we have studied in the laboratory: a vertical jet in a quiescent environment, a vertical jet in horizontal cross flow, and a particle-laden jet. We considered Reynolds numbers from 10,000 to 50,000, jet-to-cross flow velocity ratios of 2 to 10, and particle mass loadings of up to 25% of the exit mass flow rate. Vertical jet simulations produce Gaussian velocity profiles in the near-exit region by 3 diameters downstream, matching the mean experimental profiles. Simulations of air entrainment are of the correct order of magnitude, but they show decreasing entrainment with vertical distance from the vent. Cross flow simulations reproduce experimental trajectories for the jet centerline initially, although confinement appears to impact the response later.
Particle-laden simulations display minimal variation in concentration profiles between cases with different mass loadings and size distributions, indicating that differences in particle behavior may not be evident at this laboratory scale.
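The Reynolds-number gap described above follows from the standard definition Re = UD/ν. A sketch with illustrative magnitudes: the laboratory values fall within the stated 10,000 to 50,000 range, while the eruption-column inputs are rough order-of-magnitude assumptions, not measurements from the study.

```python
def reynolds_number(velocity, diameter, kinematic_viscosity):
    """Re = U * D / nu for a jet issuing from a circular vent."""
    return velocity * diameter / kinematic_viscosity

# Hypothetical laboratory jet: water at room temperature (nu ~ 1e-6 m^2/s).
re_lab = reynolds_number(velocity=1.0, diameter=0.025, kinematic_viscosity=1.0e-6)

# Rough eruption-column magnitudes (illustrative only): ~150 m/s exit
# velocity, ~100 m vent diameter, hot gas-ash mixture nu ~ 1.5e-5 m^2/s.
re_volcano = reynolds_number(velocity=150.0, diameter=100.0,
                             kinematic_viscosity=1.5e-5)
```

The five-order-of-magnitude spread between `re_lab` and `re_volcano` is exactly the gap that validated numerical models are asked to bridge.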
WeaVR: a self-contained and wearable immersive virtual environment simulation system.
Hodgson, Eric; Bachmann, Eric R; Vincent, David; Zmuda, Michael; Waller, David; Calusdian, James
2015-03-01
We describe WeaVR, a computer simulation system that takes virtual reality technology beyond specialized laboratories and research sites and makes it available in any open space, such as a gymnasium or a public park. Novel hardware and software systems enable HMD-based immersive virtual reality simulations to be conducted in any arbitrary location, with no external infrastructure and little-to-no setup or site preparation. The ability of the WeaVR system to provide realistic motion-tracked navigation for users, to improve the study of large-scale navigation, and to generate usable behavioral data is shown in three demonstrations. First, participants navigated through a full-scale virtual grocery store while physically situated in an open grass field. Trajectory data are presented for both normal tracking and for tracking during the use of redirected walking that constrained users to a predefined area. Second, users followed a straight path within a virtual world for distances of up to 2 km while walking naturally and being redirected to stay within the field, demonstrating the ability of the system to study large-scale navigation by simulating virtual worlds that are potentially unlimited in extent. Finally, the portability and pedagogical implications of this system were demonstrated by taking it to a regional high school for live use by a computer science class on their own school campus.
Besmer, Michael D.; Sigrist, Jürg A.; Props, Ruben; Buysschaert, Benjamin; Mao, Guannan; Boon, Nico; Hammes, Frederik
2017-01-01
Rapid contamination of drinking water in distribution and storage systems can occur due to pressure drop, backflow, cross-connections, accidents, and bio-terrorism. Small volumes of a concentrated contaminant (e.g., wastewater) can contaminate large volumes of water in a very short time, with potentially severe negative health impacts. The technical limitations of conventional, cultivation-based microbial detection methods allow neither timely detection of such contaminations nor real-time monitoring of subsequent emergency remediation measures (e.g., shock-chlorination). Here we applied a newly developed continuous, ultra-high-frequency flow cytometry approach to track a rapid pollution event and subsequent disinfection of drinking water in an 80-min laboratory-scale simulation. We quantified total (TCC) and intact (ICC) cell concentrations as well as flow cytometric fingerprints in parallel in real time with two different staining methods. The ingress of wastewater was detectable almost immediately (i.e., after 0.6% volume change), significantly changing TCC, ICC, and the flow cytometric fingerprint. Shock chlorination was rapid and detected in real time, causing membrane damage in the vast majority of bacteria (i.e., a drop of ICC from more than 380 cells μl⁻¹ to less than 30 cells μl⁻¹ within 4 min). Both of these effects, as well as the final wash-in of fresh tap water, followed calculated predictions well. Detailed and highly quantitative tracking of microbial dynamics at very short time scales and for different characteristics (e.g., concentration, membrane integrity) is feasible. This opens up multiple possibilities for targeted investigation of a myriad of bacterial short-term dynamics (e.g., disinfection, growth, detachment, operational changes) both in laboratory-scale research and full-scale system investigations in practice. PMID:29085343
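The "calculated predictions" mentioned above are not given in the abstract; for a well-mixed vessel, the standard continuously stirred tank balance predicts exponential wash-in/wash-out, which is consistent with the behavior described. A sketch with hypothetical vessel parameters (only the initial concentration echoes the order of the reported 380 cells per microliter):

```python
import math

def washin_concentration(t, c0, c_in, flow_rate, volume):
    """Well-mixed tank: dC/dt = (Q/V) * (C_in - C), so C decays
    exponentially from c0 toward the feed concentration c_in."""
    tau = volume / flow_rate  # hydraulic residence time
    return c_in + (c0 - c_in) * math.exp(-t / tau)

# Hypothetical: a 10 L vessel flushed with clean tap water at 0.5 L/min.
c0 = 380.0     # initial intact cell concentration, cells/uL (order of the reported value)
c_in = 5.0     # assumed feed-water concentration, cells/uL
c_20min = washin_concentration(20.0, c0, c_in, flow_rate=0.5, volume=10.0)
```

With a 20 min residence time, one residence time of flushing leaves the vessel at `c_in` plus 1/e of the initial excess, and the approach to the feed concentration continues exponentially from there.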
RAM simulation model for SPH/RSV systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schryver, J.C.; Primm, A.H.; Nelson, S.C.
1995-12-31
The US Army's Project Manager, Crusader is sponsoring the development of technologies that apply to the Self-Propelled Howitzer (SPH), formerly the Advanced Field Artillery System (AFAS), and Resupply Vehicle (RSV), formerly the Future Armored Resupply Vehicle (FARV), weapon system. Oak Ridge National Laboratory (ORNL) is currently performing developmental work in support of the SPH/RSV Crusader system. Supportive analyses of reliability, availability, and maintainability (RAM) aspects were also performed for the SPH/RSV effort. During FY 1994 and FY 1995 ORNL conducted a feasibility study to demonstrate the application of simulation modeling for RAM analysis of the Crusader system. Following completion of the feasibility study, a full-scale RAM simulation model of the Crusader system was developed for both the SPH and RSV. This report provides documentation for the simulation model as well as instructions in the proper execution and utilization of the model for the conduct of RAM analyses.
Impact gages for detecting meteoroid and other orbital debris impacts on space vehicles.
NASA Technical Reports Server (NTRS)
Mastandrea, J. R.; Scherb, M. V.
1973-01-01
Impacts on space vehicles have been simulated using the McDonnell Douglas Aerophysics Laboratory (MDAL) Light-Gas Guns to launch particles at hypervelocity speeds into scaled space structures. Using impact gages and a triangulation technique, these impacts have been detected and accurately located. This paper describes in detail the various types of impact gages (piezoelectric PZT-5A, quartz, electret, and off-the-shelf plastics) used. This description includes gage design and experimental results for gages installed on single-walled scaled payload carriers, multiple-walled satellites and space stations, and single-walled full-scale Delta tank structures. Brief descriptions of the triangulation technique, the impact simulation, and the data acquisition system are also included.
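The triangulation technique is described only briefly; its core idea, locating an impact from differences in wave arrival times at several gages, can be illustrated with a brute-force search. The panel geometry, gage layout, and wave speed below are hypothetical, not MDAL values.

```python
import math

def locate_impact(gages, arrival_times, wave_speed, grid=200, size=1.0):
    """Grid-search a square plate for the point whose predicted arrival-time
    differences best match the measured ones (simple TDOA triangulation)."""
    t0 = arrival_times[0]
    best, best_err = None, float("inf")
    for i in range(grid + 1):
        for j in range(grid + 1):
            x, y = size * i / grid, size * j / grid
            # Predicted travel time to the reference gage 0.
            d0 = math.hypot(x - gages[0][0], y - gages[0][1]) / wave_speed
            err = 0.0
            for (gx, gy), t in zip(gages[1:], arrival_times[1:]):
                dt_pred = math.hypot(x - gx, y - gy) / wave_speed - d0
                err += (dt_pred - (t - t0)) ** 2
            if err < best_err:
                best, best_err = (x, y), err
    return best

# Hypothetical 1 m x 1 m panel, three gages, 1000 m/s plate-wave speed.
gages = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
true_impact = (0.30, 0.70)
c = 1000.0
times = [math.hypot(true_impact[0] - gx, true_impact[1] - gy) / c
         for gx, gy in gages]
est = locate_impact(gages, times, c)
```

Only arrival-time differences are used, so the gages need no knowledge of the impact time itself; three gages suffice on a plane because two independent time differences generically intersect at one point.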
BlazeDEM3D-GPU A Large Scale DEM simulation code for GPUs
NASA Astrophysics Data System (ADS)
Govender, Nicolin; Wilke, Daniel; Pizette, Patrick; Khinast, Johannes
2017-06-01
Accurately predicting the dynamics of particulate materials is of importance to numerous scientific and industrial areas, with applications ranging across particle scales from powder flow to ore crushing. Computational discrete element simulation is a viable option to aid in the understanding of particulate dynamics and the design of devices such as mixers, silos and ball mills, as laboratory-scale tests come at a significant cost. However, the computational time required for an industrial-scale simulation consisting of tens of millions of particles can be months on large CPU clusters, making the Discrete Element Method (DEM) unfeasible for industrial applications. Simulations are therefore typically restricted to tens of thousands of particles with highly detailed particle shapes, or a few million particles with often oversimplified particle shapes. However, a number of applications require accurate representation of the particle shape to capture the macroscopic behaviour of the particulate system. In this paper we give an overview of the recent extensions to the open-source GPU-based DEM code BlazeDEM3D-GPU, which can simulate millions of polyhedra and tens of millions of spheres on a desktop computer with a single or multiple GPUs.
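As an illustration of the kind of kernel a DEM code evaluates millions of times per step, below is the simplest sphere-sphere normal contact law: a linear spring acting on the overlap. This is a generic textbook model with hypothetical parameters, not BlazeDEM3D-GPU's actual contact model.

```python
import math

def normal_contact_force(p1, p2, r1, r2, k_n):
    """Linear-spring DEM normal force on sphere 1 from sphere 2
    (zero vector if the spheres do not overlap)."""
    dx = [a - b for a, b in zip(p1, p2)]
    dist = math.sqrt(sum(d * d for d in dx))
    overlap = (r1 + r2) - dist
    if overlap <= 0.0 or dist == 0.0:
        return [0.0, 0.0, 0.0]
    n = [d / dist for d in dx]  # unit normal from sphere 2 toward sphere 1
    return [k_n * overlap * ni for ni in n]

# Two 1 cm spheres overlapping by 1 mm along x; stiffness 1e4 N/m (hypothetical).
f = normal_contact_force([0.019, 0.0, 0.0], [0.0, 0.0, 0.0], 0.01, 0.01, 1.0e4)
```

Polyhedral particles replace the simple overlap test with far more expensive contact detection, which is one reason shape fidelity dominates the computational budget discussed above.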
3D printing application and numerical simulations in a fracture system
NASA Astrophysics Data System (ADS)
Yoon, H.; Martinez, M. J.
2017-12-01
The hydrogeological and mechanical properties in fractured and porous media are fundamental to predicting coupled multiphysics processes in the subsurface. Recent advances in experimental methods and multi-scale imaging capabilities have revolutionized our ability to quantitatively characterize geomaterials, and digital counterparts are now routinely used for numerical simulations to characterize petrophysical and mechanical properties across scales. 3D printing is a very effective and creative technique that reproduces digital images in a controlled way. For geoscience applications, 3D printing can be co-opted to print reproducible porous and fractured structures derived from CT imaging of actual rocks and theoretical algorithms for experimental testing. In this work we used a stereolithography (SLA) method to create a single fracture network. The fracture in shale was first scanned using a microCT system, and then the digital fracture network was printed in two parts and assembled. Apertures range from 0.3 to 1 mm. In particular, we discuss the design of the single fracture network and the progress of printing practices to reproduce the fracture network system. Printed samples at different scales are used to measure the permeability and surface roughness. Various numerical simulations including (non-)reactive transport and multiphase flow cases are performed to study fluid flow characterization. We will also discuss the innovative advancement of 3D printing techniques applicable for coupled processes in the subsurface. Sandia National Laboratories is a multimission laboratory managed and operated by National Technology & Engineering Solutions of Sandia, LLC., a wholly owned subsidiary of Honeywell International, Inc., for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-NA0003525.
Comparison of Refractory Performance in Black Liquor Gasifiers and a Smelt Test System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peascoe, RA
2001-09-25
Prior laboratory corrosion studies along with experience at the black liquor gasifier in New Bern, North Carolina, clearly demonstrate that serious material problems exist with the gasifier's refractory lining. Mullite-based and alumina-based refractories used at the New Bern facility suffered significant degradation even though they reportedly performed adequately in smaller scale systems. Oak Ridge National Laboratory's involvement in the failure analysis, and the initial exploration of suitable replacement materials, led to the realization that a simple and reliable, complementary method for refractory screening was needed. The development of a laboratory test system and its suitability for simulating the environment of black liquor gasifiers was undertaken. Identification and characterization of corrosion products were used to evaluate the test system as a rapid screening tool for refractory performance and as a predictor of refractory lifetime. Results from the test systems and plants were qualitatively similar.
Review of sonic-boom simulation devices and techniques.
NASA Technical Reports Server (NTRS)
Edge, P. M., Jr.; Hubbard, H. H.
1972-01-01
Research on aircraft-generated sonic booms has led to the development of special techniques to generate controlled sonic-boom-type disturbances without the complications and expense of supersonic flight operations. This paper contains brief descriptions of several of these techniques along with the significant hardware items involved and indicates the advantages and disadvantages of each in research applications. Included are wind tunnels, ballistic ranges, spark discharges, piston phones, shock tubes, high-speed valve systems, and shaped explosive charges. Specialized applications include sonic-boom generation and propagation studies and the responses of structures, terrain, people, and animals. Situations for which simulators are applicable are shown to include both small-scale and large-scale laboratory tests and full-scale field tests. Although no one approach to simulation is ideal, the various techniques available generally complement each other to provide desired capability for a broad range of sonic-boom studies.
High performance computing in biology: multimillion atom simulations of nanoscale systems
Sanbonmatsu, K. Y.; Tung, C.-S.
2007-01-01
Computational methods have been used in biology for sequence analysis (bioinformatics), all-atom simulation (molecular dynamics and quantum calculations), and more recently for modeling biological networks (systems biology). Of these three techniques, all-atom simulation is currently the most computationally demanding, in terms of compute load, communication speed, and memory load. Breakthroughs in electrostatic force calculation and dynamic load balancing have enabled molecular dynamics simulations of large biomolecular complexes. Here, we report simulation results for the ribosome, using approximately 2.64 million atoms, the largest all-atom biomolecular simulation published to date. Several other nanoscale systems with different numbers of atoms were studied to measure the performance of the NAMD molecular dynamics simulation program on the Los Alamos National Laboratory Q Machine. We demonstrate that multimillion atom systems represent a 'sweet spot' for the NAMD code on large supercomputers. NAMD displays an unprecedented 85% parallel scaling efficiency for the ribosome system on 1024 CPUs. We also review recent targeted molecular dynamics simulations of the ribosome that prove useful for studying conformational changes of this large biomolecular complex in atomic detail. PMID:17187988
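The quoted 85% figure is a parallel scaling efficiency; given wall-clock timings it is computed as below. The timings and CPU counts in this sketch are hypothetical, chosen only to illustrate the formula, and are not measurements from the Q Machine benchmarks.

```python
def parallel_efficiency(t_ref, n_ref, t_n, n):
    """Scaling efficiency of a run on n CPUs relative to a reference
    run on n_ref CPUs: measured speedup divided by ideal speedup."""
    speedup = t_ref / t_n
    ideal_speedup = n / n_ref
    return speedup / ideal_speedup

# Hypothetical timings for a fixed-size (strong-scaling) benchmark:
# 1000 s on 64 CPUs, 73.5 s on 1024 CPUs.
eff = parallel_efficiency(t_ref=1000.0, n_ref=64, t_n=73.5, n=1024)
```

Communication overhead normally drags the efficiency of fixed-size problems down as CPU count grows, which is why sustaining ~85% at 1024 CPUs for a multimillion-atom system is notable.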
MHD scaling: from astrophysics to the laboratory
NASA Astrophysics Data System (ADS)
Ryutov, Dmitri
2000-10-01
During the last few years, considerable progress has been made in simulating astrophysical phenomena in laboratory experiments with high power lasers [1]. Astrophysical phenomena that have drawn particular interest include supernovae explosions; young supernova remnants; galactic jets; the formation of fine structures in late supernova remnants by instabilities; and the ablation driven evolution of molecular clouds illuminated by nearby bright stars, which may affect star formation. A question may arise as to what extent the laser experiments, which deal with targets of a spatial scale of ~0.01 cm and occur at a time scale of a few nanoseconds, can reproduce phenomena occurring at spatial scales of a million or more kilometers and time scales from hours to many years. Quite remarkably, if dissipative processes (like, e.g., viscosity, Joule dissipation, etc.) are subdominant in both systems, and the matter behaves as a polytropic gas, there exists a broad hydrodynamic similarity (the "Euler similarity" of Ref. [2]) that allows a direct scaling of laboratory results to astrophysical phenomena. Following a review of relevant earlier work (in particular, [3]-[5]), discussion is presented of the details of the Euler similarity related to the presence of shocks and to a special case of a strong drive. After that, constraints stemming from possible development of small-scale turbulence are analyzed. Generalization of the Euler similarity to the case of a gas with spatially varying polytropic index is presented. A possibility of scaled simulations of ablation front dynamics is one more topic covered in this paper. It is shown that, with some additional constraints, a simple similarity exists. This, in particular, opens up the possibility of scaled laboratory simulation of the aforementioned ablation (photoevaporation) fronts. A nonlinear transformation [6] that establishes a duality between implosion and explosion processes is also discussed in the paper.
References: 1. B.A. Remington et al., Phys. Plasmas, v. 7, p. 1641 (2000); Science, v. 284, p. 1488 (1999). 2. D.D. Ryutov et al., Ap. J., v. 518, p. 821 (1999). 3. B.B. Kadomtsev, Sov. J. Plasma Phys., v. 1, p. 296 (1975). 4. J.W. Connor, J.B. Taylor, Nucl. Fus., v. 17, p. 377 (1977). 5. Q. Zhang, M.J. Graham, Phys. Rev. Lett., v. 79, p. 2674 (1997). 6. L. O'C. Drury, J.T. Mendonca, paper at 3rd Intern. Conf. on Laser Astrophys., Rice Univ., Houston, 2000.
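The Euler similarity invoked above can be stated compactly; the following is a sketch paraphrased from Ryutov et al. (Ref. 2), not a derivation given in this abstract. Ideal polytropic hydrodynamics is invariant under independent rescalings of length, density, and pressure, provided time and velocity are rescaled accordingly:

```latex
% Rescale  r \to a\,r, \quad \rho \to b\,\rho, \quad p \to c\,p .
% The Euler equations retain their form if, in addition,
%   t \to a\sqrt{b/c}\; t, \qquad v \to \sqrt{c/b}\; v .
% Two ideal flows with geometrically similar initial conditions
% therefore evolve identically if they share the Euler number
\[
  \mathrm{Eu} \;=\; v\,\sqrt{\frac{\rho}{p}} \,,
\]
% valid when dissipative processes (viscosity, Joule dissipation,
% heat conduction) are subdominant in both systems.
```

Matching Eu between a nanosecond-scale laser target and a years-scale supernova remnant is what licenses the direct scaling of laboratory results described in the abstract.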
NASA Astrophysics Data System (ADS)
Jackson, S. J.; Krevor, S. C.; Agada, S.
2017-12-01
A number of studies have demonstrated the prevalent impact that small-scale rock heterogeneity can have on larger scale flow in multiphase flow systems, including petroleum production and CO2 sequestration. Larger-scale modeling has shown that this has a significant impact on fluid flow and is possibly a major source of inaccuracy in reservoir simulation. Yet no core analysis protocol has been developed that faithfully represents the impact of these heterogeneities on the flow functions used in modeling. Relative permeability is conventionally derived from core floods performed at conditions with high flow potential, in which the impact of capillary heterogeneity is suppressed. A more accurate representation would be obtained if measurements were made at flow conditions where the impact of capillary heterogeneity on flow is scaled to be representative of the reservoir system. This, however, is generally impractical due to laboratory constraints and the role of the orientation of the rock heterogeneity. We demonstrate a workflow of combined observations and simulations, in which the impact of capillary heterogeneity may be faithfully represented in the derivation of upscaled flow properties. Laboratory measurements that are a variation of conventional protocols are used for the parameterization of an accurate digital rock model for simulation. The relative permeability at the range of capillary numbers relevant to flow in the reservoir is derived primarily from numerical simulations of core floods that include capillary pressure heterogeneity. This allows flexibility in the orientation of the heterogeneity and in the range of flow rates considered.
We demonstrate the approach in which digital rock models have been developed alongside core flood observations for three applications: (1) A Bentheimer sandstone with a simple axial heterogeneity to demonstrate the validity and limitations of the approach, (2) a set of reservoir rocks from the Captain sandstone in the UK North Sea targeted for CO2 storage, and for which the use of capillary pressure hysteresis is necessary, and (3) a secondary CO2-EOR production of residual oil from a Berea sandstone with layered heterogeneities. In all cases the incorporation of heterogeneity is shown to be key to the ultimate derivation of flow properties representative of the reservoir system.
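The "range of capillary numbers relevant to flow in the reservoir" follows the standard definition Ca = μv/σ, the ratio of viscous to capillary forces. A sketch with hypothetical brine/CO2 values (illustrative orders of magnitude, not data from the study):

```python
def capillary_number(viscosity, velocity, interfacial_tension):
    """Ca = mu * v / sigma: viscous over capillary forces at the pore scale."""
    return viscosity * velocity / interfacial_tension

# Hypothetical brine properties: mu ~ 5e-4 Pa*s, sigma ~ 0.03 N/m.
# Core floods are typically run at much higher Darcy velocities than
# prevail deep in a reservoir away from the wells.
ca_core_flood = capillary_number(viscosity=5.0e-4, velocity=1.0e-4,
                                 interfacial_tension=0.03)
ca_reservoir = capillary_number(viscosity=5.0e-4, velocity=1.0e-6,
                                interfacial_tension=0.03)
```

The two orders of magnitude between the laboratory and reservoir values is why, as the abstract argues, measuring relative permeability only at high flow potential can misrepresent the influence of capillary heterogeneity in the field.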
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zitney, S.E.
This presentation will examine process systems engineering R&D needs for application to advanced fossil energy (FE) systems and highlight ongoing research activities at the National Energy Technology Laboratory (NETL) under the auspices of a recently launched Collaboratory for Process & Dynamic Systems Research. The three current technology focus areas include: 1) High-fidelity systems with NETL's award-winning Advanced Process Engineering Co-Simulator (APECS) technology for integrating process simulation with computational fluid dynamics (CFD) and virtual engineering concepts, 2) Dynamic systems with R&D on plant-wide IGCC dynamic simulation, control, and real-time training applications, and 3) Systems optimization including large-scale process optimization, stochastic simulation for risk/uncertainty analysis, and cost estimation. Continued R&D aimed at these and other key process systems engineering models, methods, and tools will accelerate the development of advanced gasification-based FE systems and produce increasingly valuable outcomes for DOE and the Nation.
Laboratory simulation of rocket-borne D-region blunt probe flows
NASA Technical Reports Server (NTRS)
Kaplan, L. B.
1977-01-01
Flows of weakly ionized plasma, similar to those over rocket-borne blunt probes passing through the lower ionosphere, were simulated in a scaled laboratory environment, and D-region blunt-probe theories of electron collection were evaluated.
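Blunt-probe theory matters in the D region because the plasma Debye length there is comparable to the probe dimensions. A quick sketch of the electron Debye length, with illustrative D-region-like values (assumed, not from the paper):

```python
import math

def debye_length_m(electron_temp_ev: float, density_per_m3: float) -> float:
    """Electron Debye length: lambda_D = sqrt(eps0 * kB * Te / (n * e^2)).

    With Te expressed in eV, kB*Te equals e*Te joules, so one factor
    of e cancels and the expression simplifies as below.
    """
    eps0 = 8.854e-12        # vacuum permittivity, F/m
    e = 1.602e-19           # elementary charge, C
    return math.sqrt(eps0 * electron_temp_ev / (density_per_m3 * e))

# Assumed values: Te ~ 0.02 eV (~230 K), ne ~ 1e9 m^-3.
lam = debye_length_m(0.02, 1e9)
print(f"lambda_D ~ {lam * 100:.1f} cm")
```

A centimeter-scale Debye length means the sheath is not thin relative to the probe, which is why specialized collection theories are needed at these altitudes.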
Simulating Astrophysical Jets with Inertial Confinement Fusion Machines
NASA Astrophysics Data System (ADS)
Blue, Brent
2005-10-01
Large-scale directional outflows of supersonic plasma, also known as 'jets', are ubiquitous phenomena in astrophysics. The traditional approach to understanding such phenomena is through theoretical analysis and numerical simulations. However, theoretical analysis might not capture all the relevant physics, and numerical simulations have limited resolution and fail to scale correctly in Reynolds number and perhaps other key dimensionless parameters. Recent advances in high energy density physics using large inertial confinement fusion devices now allow controlled laboratory experiments on macroscopic volumes of plasma of direct relevance to astrophysics. This talk will present an overview of these facilities as well as results from current laboratory astrophysics experiments designed to study hydrodynamic jets and Rayleigh-Taylor mixing. This work is performed under the auspices of the U. S. DOE by Lawrence Livermore National Laboratory under Contract No. W-7405-ENG-48, Los Alamos National Laboratory under Contract No. W-7405-ENG-36, and the Laboratory for Laser Energetics under Contract No. DE-FC03-92SF19460.
SUSTAINED TURBULENCE IN DIFFERENTIALLY ROTATING MAGNETIZED FLUIDS AT A LOW MAGNETIC PRANDTL NUMBER
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nauman, Farrukh; Pessah, Martin E., E-mail: nauman@nbi.ku.dk
2016-12-20
We show for the first time that sustained turbulence is possible at a low magnetic Prandtl number in local simulations of Keplerian flows with no mean magnetic flux. Our results indicate that increasing the vertical domain size is equivalent to increasing the dynamical range between the energy injection scale and the dissipative scale. This has important implications for a large variety of differentially rotating systems with low magnetic Prandtl number such as protostellar disks and laboratory experiments.
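For context, the magnetic Prandtl number is simply the ratio of the two diffusivities, Pm = ν/η. A trivial sketch with illustrative liquid-metal-like values (assumed, not from the paper):

```python
def magnetic_prandtl(kinematic_viscosity: float,
                     magnetic_diffusivity: float) -> float:
    """Pm = nu / eta: ratio of viscous to magnetic diffusion rates."""
    return kinematic_viscosity / magnetic_diffusivity

# Assumed order-of-magnitude values for a liquid-metal lab fluid:
# nu ~ 1e-6 m^2/s, eta ~ 1e-1 m^2/s, giving Pm far below unity,
# the regime studied in the paper.
pm = magnetic_prandtl(1e-6, 1e-1)
print(f"Pm ~ {pm:.0e}")
```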
NASA Technical Reports Server (NTRS)
Kalinowski, Kevin F.; Tucker, George E.; Moralez, Ernesto, III
2006-01-01
Engineering development and qualification of a Research Flight Control System (RFCS) for the Rotorcraft Aircrew Systems Concepts Airborne Laboratory (RASCAL) JUH-60A has motivated the development of a pilot rating scale for evaluating failure transients in fly-by-wire flight control systems. The RASCAL RFCS includes a highly-reliable, dual-channel Servo Control Unit (SCU) to command and monitor the performance of the fly-by-wire actuators and protect against the effects of erroneous commands from the flexible, but single-thread Flight Control Computer. During the design phase of the RFCS, two piloted simulations were conducted on the Ames Research Center Vertical Motion Simulator (VMS) to help define the required performance characteristics of the safety monitoring algorithms in the SCU. Simulated failures, including hard-over and slow-over commands, were injected into the command path, and the aircraft response and safety monitor performance were evaluated. A subjective Failure/Recovery Rating (F/RR) scale was developed as a means of quantifying the effects of the injected failures on the aircraft state and the degree of pilot effort required to safely recover the aircraft. A brief evaluation of the rating scale was also conducted on the Army/NASA CH-47B variable stability helicopter to confirm that the rating scale was likely to be equally applicable to in-flight evaluations. Following the initial research flight qualification of the RFCS in 2002, a flight test effort was begun to validate the performance of the safety monitors and to validate their design for the safe conduct of research flight testing. Simulated failures were injected into the SCU, and the F/RR scale was applied to assess the results. The results validate the performance of the monitors, and indicate that the Failure/Recovery Rating scale is a very useful tool for evaluating failure transients in fly-by-wire flight control systems.
NASA Astrophysics Data System (ADS)
Huerta, N. J.; Fahrman, B.; Rod, K. A.; Fernandez, C. A.; Crandall, D.; Moore, J.
2017-12-01
Laboratory experiments provide a robust method to analyze well integrity. Experiments are relatively cheap, controlled, and repeatable. However, simplifying assumptions, apparatus limitations, and scaling are ubiquitous obstacles to translating results from the bench to the field. We focus on advancing the correlation between laboratory results and field conditions by characterizing how failure varies with specimen geometry using two experimental approaches. The first approach is designed to measure the shear bond strength between steel and cement in a down-scaled (< 3" diameter) well geometry. We use several cylindrical casing-cement-casing geometries that either mimic the scaling ratios found in the field or maximize the amount of metal and cement in the sample. We subject the samples to thermal shock cycles to simulate damage to the interfaces from operations. The bond was then measured via a push-out test. We found not only that expected parameters, e.g. curing time, played a role in shear bond strength, but also that the scaling of the geometry was important. The second approach is designed to observe failure of the well system due to pressure applied on the inside of a lab-scale (1.5" diameter) cylindrical casing-cement-rock geometry. The loading apparatus and sample are housed within an industrial X-ray CT scanner capable of imaging the system while under pressure. Radial tension cracks were observed in the cement after an applied internal pressure of 3000 psi and propagated through the cement and into the rock as pressure was increased. Based on our current suite of tests we find that the relationship between sample diameters and thicknesses is an important consideration when observing the strength and failure of well systems. The test results contribute to our knowledge of well system failure, evaluation and optimization of new cements, as well as the applicability of using scaled-down tests as a proxy for understanding field-scale conditions.
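Push-out tests like the one described are commonly reduced to an average shear bond strength over the bonded interface, τ = F / (π d L). A sketch with assumed example numbers (not the paper's measurements):

```python
import math

def pushout_shear_strength_mpa(force_kn: float, diameter_mm: float,
                               bonded_length_mm: float) -> float:
    """Average interfacial shear strength from a push-out test.

    tau = F / (pi * d * L), the standard thin-interface approximation;
    N/mm^2 is numerically equal to MPa.
    """
    area_mm2 = math.pi * diameter_mm * bonded_length_mm
    return force_kn * 1e3 / area_mm2   # kN -> N

# Assumed example: 15 kN peak load, 50 mm casing diameter, 75 mm bonded length.
tau = pushout_shear_strength_mpa(15.0, 50.0, 75.0)
print(f"shear bond strength ~ {tau:.2f} MPa")
```

Note this averages over the interface; the geometry-dependence the authors report is one reason the simple formula alone cannot be extrapolated between scales.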
DOE Office of Scientific and Technical Information (OSTI.GOV)
DWYER,BRIAN P.
2000-01-01
Three reactive materials were evaluated at laboratory scale to identify the optimum treatment reagent for use in a Permeable Reactive Barrier Treatment System at the Rocky Flats Environmental Technology Site (RFETS). The contaminants of concern (COCs) are uranium, TCE, PCE, carbon tetrachloride, americium, and vinyl chloride. The three reactive media evaluated included high carbon steel iron filings, an iron-silica alloy in the form of a foam aggregate, and a proprietary humic acid based sorbent (Humasorb from Arctech) mixed with sand. Each material was tested in the laboratory at column scale using simulated site water. All three materials showed promise for the 903 Mound Site; however, the iron filings were determined to be the least expensive media. In order to validate the laboratory results, the iron filings were further tested at a pilot scale (field columns) using actual site water. Pilot test results were similar to laboratory results; consequently, the iron filings were chosen for the full-scale demonstration of the reactive barrier technology. Additional design parameters including saturated hydraulic conductivity, treatment residence time, and head loss across the media were also determined and provided to the design team in support of the final design. The final design was completed by the Corps of Engineers in 1997 and the system was constructed in the summer of 1998. The treatment system began full operation in December 1998 and, despite a few problems, has been operational since. Results to date are consistent with the lab and pilot scale findings, i.e., complete removal of the COCs prior to discharge to meet RFETS cleanup requirements. Furthermore, it is fair to say at this point that laboratory-developed design parameters for the reactive barrier technology are sufficient for full-scale design; however, the treatment system longevity and the long-term fate of the contaminants are questions that remain unanswered.
This project along with others such as the Durango, CO and Monticello, UT reactive barriers will provide the data to determine the long-term effectiveness and return on investment (ROI) for this technology for comparison to the baseline pump and treat.« less
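Treatment residence time, one of the design parameters mentioned above, follows directly from Darcy's law: seepage velocity v = K·i/n, and residence time is barrier thickness over v. A sketch with assumed example values (hypothetical, not the RFETS design numbers):

```python
def barrier_residence_hours(k_m_per_s: float, gradient: float,
                            porosity: float, thickness_m: float) -> float:
    """Residence time of groundwater in a permeable reactive barrier.

    Seepage (pore) velocity v = K * i / n from Darcy's law;
    residence time = barrier thickness / v, converted to hours.
    """
    seepage_velocity = k_m_per_s * gradient / porosity
    return thickness_m / seepage_velocity / 3600.0

# Assumed values: K = 5e-4 m/s (iron-filings media), hydraulic gradient
# i = 0.005, porosity n = 0.5, barrier thickness 0.6 m.
t_res = barrier_residence_hours(5e-4, 0.005, 0.5, 0.6)
print(f"residence time ~ {t_res:.0f} h")
```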
Harnessing Big Data to Represent 30-meter Spatial Heterogeneity in Earth System Models
NASA Astrophysics Data System (ADS)
Chaney, N.; Shevliakova, E.; Malyshev, S.; Van Huijgevoort, M.; Milly, C.; Sulman, B. N.
2016-12-01
Terrestrial land surface processes play a critical role in the Earth system; they have a profound impact on the global climate, food and energy production, freshwater resources, and biodiversity. One of the most fascinating yet challenging aspects of characterizing terrestrial ecosystems is their field-scale (~30 m) spatial heterogeneity. It has been observed repeatedly that the water, energy, and biogeochemical cycles at multiple temporal and spatial scales have deep ties to an ecosystem's spatial structure. Current Earth system models largely disregard this important relationship, leading to an inadequate representation of ecosystem dynamics. In this presentation, we will show how existing global environmental datasets can be harnessed to explicitly represent field-scale spatial heterogeneity in Earth system models. For each macroscale grid cell, these environmental data are clustered according to their field-scale soil and topographic attributes to define unique sub-grid tiles. The state-of-the-art Geophysical Fluid Dynamics Laboratory (GFDL) land model is then used to simulate these tiles and their spatial interactions via the exchange of water, energy, and nutrients along explicit topographic gradients. Using historical simulations over the contiguous United States, we will show how a robust representation of field-scale spatial heterogeneity impacts modeled ecosystem dynamics including the water, energy, and biogeochemical cycles as well as vegetation composition and distribution.
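The clustering step described above — grouping 30-m pixels within a macroscale grid cell into sub-grid tiles by attribute similarity — can be sketched with a plain k-means loop. The data, attribute choice, and tile count are all assumptions for illustration, not the authors' configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Fake field-scale attributes for one macroscale grid cell:
# columns = [soil depth, slope, elevation index] for 1000 30-m pixels (assumed).
pixels = rng.normal(size=(1000, 3))

def kmeans_tiles(x, k=4, iters=20):
    """Cluster pixels into k sub-grid 'tiles' by attribute similarity."""
    centers = x[rng.choice(len(x), k, replace=False)]
    for _ in range(iters):
        # Assign each pixel to its nearest tile center.
        labels = np.argmin(((x[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        # Recompute centers, keeping the old center if a tile goes empty.
        new_centers = []
        for j in range(k):
            members = x[labels == j]
            new_centers.append(members.mean(axis=0) if len(members) else centers[j])
        centers = np.array(new_centers)
    return labels, centers

labels, centers = kmeans_tiles(pixels)
print(np.bincount(labels, minlength=4))  # pixel count per tile
```

Each resulting tile would then be simulated once by the land model, rather than simulating every 30-m pixel individually.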
Simulation of German PKL refill/reflood experiment K9A using RELAP4/MOD7. [PWR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hsu, M.T.; Davis, C.B.; Behling, S.R.
This paper describes a RELAP4/MOD7 simulation of West Germany's Kraftwerk Union (KWU) Primary Coolant Loop (PKL) refill/reflood experiment K9A. RELAP4/MOD7, a best-estimate computer program for the calculation of thermal and hydraulic phenomena in a nuclear reactor or related system, is the latest version in the RELAP4 code development series. This study was the first major simulation using RELAP4/MOD7 since its release by the Idaho National Engineering Laboratory (INEL). The PKL facility is a reduced scale (1:134) representation of a typical West German four-loop 1300 MW pressurized water reactor (PWR). A prototypical scale of the total volume to power ratio was maintained. The test facility was designed specifically for an experiment simulating the refill/reflood phase of a Loss-of-Coolant Accident (LOCA).
Human Factors and Technical Considerations for a Computerized Operator Support System Prototype
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ulrich, Thomas Anthony; Lew, Roger Thomas; Medema, Heather Dawne
2015-09-01
A prototype computerized operator support system (COSS) has been developed in order to demonstrate the concept and provide a test bed for further research. The prototype is based on four underlying elements consisting of a digital alarm system, computer-based procedures, P&ID system representations, and a recommender module for mitigation actions. At this point, the prototype simulates an interface to a sensor validation module and a fault diagnosis module. These two modules will be fully integrated in the next version of the prototype. The initial version of the prototype is now operational at the Idaho National Laboratory using the U.S. Department of Energy's Light Water Reactor Sustainability (LWRS) Human Systems Simulation Laboratory (HSSL). The HSSL is a full-scope, full-scale glass top simulator capable of simulating existing and future nuclear power plant main control rooms. The COSS is interfaced to the Generic Pressurized Water Reactor (gPWR) simulator with industry-typical control board layouts. The glass top panels display realistic images of the control boards that can be operated by touch gestures. A section of the simulated control board was dedicated to the COSS human-system interface (HSI), which resulted in a seamless integration of the COSS into the normal control room environment. A COSS demonstration scenario has been developed for the prototype involving the Chemical & Volume Control System (CVCS) of the PWR simulator. It involves a primary coolant leak outside of containment that would require tripping the reactor if not mitigated in a very short timeframe. The COSS prototype presents a series of operator screens that provide the needed information and soft controls to successfully mitigate the event.
Modeling Gas Slug Break-up in the Lava Lake at Mt. Erebus, Antarctica
NASA Astrophysics Data System (ADS)
Velazquez, L. C.; Qin, Z.; Suckale, J.; Soldati, A.; Rust, A.; Cashman, K. V.
2017-12-01
Lava lakes are perhaps the most direct look scientists can take inside a volcano. They have thus become a fundamental component in our understanding of the dynamics of magmatic systems. Mount Erebus, Ross Island, Antarctica contains one of the most persistent and long-lived lava lakes on Earth, creating a unique and complex area of study. Its persistent magma degassing, convective overturns, and Strombolian eruptions have been studied through extensive field campaigns and analog as well as computational models. These provide diverse insights into the plumbing system not only at Mt. Erebus, but at other volcanoes as well. Eruptions at Erebus are episodic. One of the leading hypotheses to explain this episodicity is the rise and burst of large conduit-filling bubbles, known as gas slugs, at the lava lake surface. These slugs are thought to form deep in the plumbing system, rise through the conduit, and exit through the lava lake. The goal of this study is to investigate the stability of the hypothesized slugs as they transition from the conduit into the lava lake. Analogue laboratory results suggest that the flaring geometry at the transition point may trigger slug breakup and formation of separate daughter bubbles that then burst through the surface separately. We test this hypothesis through numerical simulations. Our model solves the two-fluid Navier-Stokes equations by calculating the conservation of mass and momentum in the gas and liquid. The laboratory experiments use a Hele-Shaw cell, in which the flaring geometry of the lava lake walls can be adjusted. A gas slug of variable volume is then injected into a liquid at different viscosities. We first validate our numerical simulations against these laboratory experiments and then proceed to investigate the same dynamics at the volcanic scale. At the natural scale, we investigate the same system parameters as at the lab scale. First results indicate that simulations reproduce experiments well. 
The results obtained at the volcano scale will help to assess how slug break-up alters the episodicity of degassing at the lava lake surface. A thorough understanding of this model will help constrain the main processes controlling the episodic eruptions at Mt. Erebus and other, similar volcanoes.
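Whether a rising gas slug stays intact or breaks into daughter bubbles is often framed in terms of standard dimensionless groups comparing buoyancy, surface tension, and viscosity. A sketch of two of them, with magma-like property values that are assumptions for illustration (not the study's parameters):

```python
def eotvos(delta_rho: float, g: float, d: float, sigma: float) -> float:
    """Eotvos number: buoyancy vs. surface tension for a bubble of size d."""
    return delta_rho * g * d ** 2 / sigma

def morton(g: float, mu: float, delta_rho: float,
           rho: float, sigma: float) -> float:
    """Morton number: fluid-property group controlling bubble shape."""
    return g * mu ** 4 * delta_rho / (rho ** 2 * sigma ** 3)

# Assumed magma-like values: rho ~ 2600 kg/m^3, mu ~ 1e3 Pa.s,
# sigma ~ 0.4 N/m, slug diameter ~ 5 m, gas density neglected.
eo = eotvos(2600.0, 9.81, 5.0, 0.4)
mo = morton(9.81, 1e3, 2600.0, 2600.0, 0.4)
print(f"Eo ~ {eo:.2e}, Mo ~ {mo:.2e}")
```

Very large Eo and Mo values place conduit-filling slugs deep in the surface-tension-negligible, viscosity-dominated regime, which is why geometry changes at the conduit-lake transition can dominate the break-up dynamics.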
NASA Astrophysics Data System (ADS)
Slaughter, A. E.; Permann, C.; Peterson, J. W.; Gaston, D.; Andrs, D.; Miller, J.
2014-12-01
The Idaho National Laboratory (INL)-developed Multiphysics Object Oriented Simulation Environment (MOOSE; www.mooseframework.org), is an open-source, parallel computational framework for enabling the solution of complex, fully implicit multiphysics systems. MOOSE provides a set of computational tools that scientists and engineers can use to create sophisticated multiphysics simulations. Applications built using MOOSE have computed solutions for chemical reaction and transport equations, computational fluid dynamics, solid mechanics, heat conduction, mesoscale materials modeling, geomechanics, and others. To facilitate the coupling of diverse and highly-coupled physical systems, MOOSE employs the Jacobian-free Newton-Krylov (JFNK) method when solving the coupled nonlinear systems of equations arising in multiphysics applications. The MOOSE framework is written in C++, and leverages other high-quality, open-source scientific software packages such as LibMesh, Hypre, and PETSc. MOOSE uses a "hybrid parallel" model which combines both shared memory (thread-based) and distributed memory (MPI-based) parallelism to ensure efficient resource utilization on a wide range of computational hardware. MOOSE-based applications are inherently modular, which allows for simulation expansion (via coupling of additional physics modules) and the creation of multi-scale simulations. Any application developed with MOOSE supports running (in parallel) any other MOOSE-based application. Each application can be developed independently, yet easily communicate with other applications (e.g., conductivity in a slope-scale model could be a constant input, or a complete phase-field micro-structure simulation) without additional code being written. This method of development has proven effective at INL and expedites the development of sophisticated, sustainable, and collaborative simulation tools.
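The JFNK method mentioned above never forms the Jacobian explicitly: the Krylov solver only needs Jacobian-vector products, which can be approximated from residual evaluations alone. A toy sketch of the idea using SciPy's `newton_krylov` as a stand-in (the two-equation system is a made-up example, not a MOOSE input or API):

```python
import numpy as np
from scipy.optimize import newton_krylov

# Toy coupled nonlinear system with root (1, 2):
#   u0^2 + u1 = 3
#   u0 + u1^2 = 5
# Only this residual is supplied; J*v products are approximated internally
# by finite differences, so no Jacobian matrix is ever assembled.
def residual(u):
    return np.array([u[0] ** 2 + u[1] - 3.0,
                     u[0] + u[1] ** 2 - 5.0])

sol = newton_krylov(residual, np.array([0.9, 1.8]))
print(sol)
```

This matrix-free structure is what lets frameworks couple many physics modules through a single residual without hand-coding cross-derivative terms.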
Design of a laboratory scale fluidized bed reactor
NASA Astrophysics Data System (ADS)
Wikström, E.; Andersson, P.; Marklund, S.
1998-04-01
The aim of this project was to construct a laboratory scale fluidized bed reactor that simulates the behavior of full scale municipal solid waste combustors. The design of this reactor is thoroughly described. The size of the laboratory scale fluidized bed reactor is 5 kW, which corresponds to a fuel-feeding rate of approximately 1 kg/h. The reactor system consists of four parts: a bed section, a freeboard section, a convector (postcombustion zone), and an air pollution control (APC) device system. The inside diameter of the reactor is 100 mm at the bed section and it widens to 200 mm in diameter in the freeboard section; the total height of the reactor is 1760 mm. The convector part consists of five identical sections; each section is 2700 mm long and has an inside diameter of 44.3 mm. The reactor is flexible regarding the placement and number of sampling ports. At the beginning of the first convector unit and at the end of each unit there are sampling ports for organic micropollutants (OMP). This makes it possible to study the composition of the flue gases at various residence times. Sampling ports for inorganic compounds and particulate matter are also placed in the convector section. All operating parameters, reactor temperatures, concentrations of CO, CO2, O2, SO2, NO, and NO2 are continuously measured and stored at selected intervals for further evaluation. These unique features enable full control over the fuel feed, air flows, and air distribution as well as over the temperature profile. Elaborate details are provided regarding the configuration of the fuel-feeding systems, the fluidized bed, the convector section, and the APC device. This laboratory reactor enables detailed studies of the formation mechanisms of OMP, such as polychlorinated dibenzo-p-dioxins (PCDDs), polychlorinated dibenzofurans (PCDFs), poly-chlorinated biphenyls (PCBs), and polychlorinated benzenes (PCBzs). 
With this system, formation mechanisms of OMP occurring in both the combustion and postcombustion zones can be studied. Other advantages are memory effect minimization and the reduction of experimental costs compared to full scale combustors. Comparison of the combustion parameters and emission data from this 5 kW laboratory scale reactor with full scale combustors shows good agreement regarding emission levels and PCDD/PCDF congener patterns. This indicates that the important formation and degradation reactions of OMP in the reactor are the same as in full scale combustors.
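The value of sampling ports at successive convector sections is that each section adds a known increment of gas residence time. A sketch of that estimate from the stated geometry (section length 2700 mm, inside diameter 44.3 mm); the flow rate and temperature are assumed values for illustration:

```python
import math

# Convector section geometry from the paper.
length_m = 2.7
diameter_m = 0.0443

# Assumed operating point: 5 m^3/h flue gas at STP, section at ~600 C.
flow_m3_per_h_stp = 5.0
temperature_k = 873.0

area_m2 = math.pi * (diameter_m / 2) ** 2
# Ideal-gas correction from standard conditions to section temperature.
flow_actual_m3_s = flow_m3_per_h_stp / 3600 * (temperature_k / 273.15)
residence_s = area_m2 * length_m / flow_actual_m3_s
print(f"residence time per section ~ {residence_s:.2f} s")
```

Sub-second increments per section are the scale on which PCDD/PCDF formation kinetics are typically resolved in such reactors.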
LAMMPS strong scaling performance optimization on Blue Gene/Q
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coffman, Paul; Jiang, Wei; Romero, Nichols A.
2014-11-12
LAMMPS "Large-scale Atomic/Molecular Massively Parallel Simulator" is an open-source molecular dynamics package from Sandia National Laboratories. Significant performance improvements in strong-scaling and time-to-solution for this application on IBM's Blue Gene/Q have been achieved through computational optimizations of the OpenMP versions of the short-range Lennard-Jones term of the CHARMM force field and the long-range Coulombic interaction implemented with the PPPM (particle-particle-particle mesh) algorithm, enhanced by runtime parameter settings controlling thread utilization. Additionally, MPI communication performance improvements were made to the PPPM calculation by re-engineering the parallel 3D FFT to use MPICH collectives instead of point-to-point. Performance testing was done using an 8.4-million atom simulation scaling up to 16 racks on the Mira system at Argonne Leadership Computing Facility (ALCF). Speedups resulting from this effort were in some cases over 2x.
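Strong-scaling results like these are usually reported as speedup and parallel efficiency relative to a baseline node count. A minimal sketch of that bookkeeping, with made-up timings rather than the paper's measurements:

```python
def strong_scaling_efficiency(t_base: float, n_base: int,
                              t: float, n: int) -> float:
    """Parallel efficiency of a strong-scaling run.

    Speedup = t_base / t; ideal speedup is n / n_base, so
    efficiency = speedup / (n / n_base). 1.0 means perfect scaling.
    """
    speedup = t_base / t
    return speedup / (n / n_base)

# Assumed example: 100 s/step on 1 rack vs. 30 s/step on 4 racks.
eff = strong_scaling_efficiency(100.0, 1, 30.0, 4)
print(f"parallel efficiency ~ {eff:.2f}")
```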
Laboratory evaluation of the pointing stability of the ASPS Vernier System
NASA Technical Reports Server (NTRS)
1980-01-01
The annular suspension and pointing system (ASPS) is an end-mount experiment pointing system designed for use in the space shuttle. The results of the ASPS Vernier System (AVS) pointing stability tests conducted in a laboratory environment are documented. A simulated zero-G suspension was used to support the test payload in the laboratory. The AVS and the suspension were modelled and incorporated into a simulation of the laboratory test. Error sources were identified and pointing stability sensitivities were determined via simulation. Statistical predictions of laboratory test performance were derived and compared to actual laboratory test results. The predicted mean pointing stability during simulated shuttle disturbances was 1.22 arc seconds; the actual mean laboratory test pointing stability was 1.36 arc seconds. The successful prediction of laboratory test results provides increased confidence in the analytical understanding of the AVS magnetic bearing technology and allows confident prediction of in-flight performance. Computer simulations of ASPS, operating in the shuttle disturbance environment, predict in-flight pointing stability errors of less than 0.01 arc seconds.
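Pointing stability figures like the 1.22 vs. 1.36 arcsecond comparison above are typically RMS statistics over a disturbance time history. A toy sketch of that reduction; the Gaussian noise model and its amplitude are assumptions for illustration, not the AVS error budget:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated pointing-error time history (arcseconds) under a disturbance.
# A zero-mean Gaussian with ~1.3 arcsec scatter is an assumed stand-in.
errors = rng.normal(loc=0.0, scale=1.3, size=10_000)

# A common stability metric: RMS of the pointing error over the run.
rms = np.sqrt(np.mean(errors ** 2))
print(f"RMS pointing stability ~ {rms:.2f} arcsec")
```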
NASA Astrophysics Data System (ADS)
Trapani, Davide; Zonta, Daniele; Molinari, Marco; Amditis, Angelos; Bimpas, Matthaios; Bertsch, Nicolas; Spiering, Vincent; Santana, Juan; Sterken, Tom; Torfs, Tom; Bairaktaris, Dimitris; Bairaktaris, Manos; Camarinopulos, Stefanos; Frondistou-Yannas, Mata; Ulieru, Dumitru
2012-04-01
This paper illustrates an experimental campaign conducted under laboratory conditions on a full-scale reinforced concrete three-dimensional frame instrumented with wireless sensors developed within the Memscon project. In particular it describes the assumptions which the experimental campaign was based on, the design of the structure, the laboratory setup and the results of the tests. The aim of the campaign was to validate the performance of Memscon sensing systems, consisting of wireless accelerometers and strain sensors, on a real concrete structure during construction and under an actual earthquake. Another aspect of interest was to assess the effectiveness of the full damage recognition procedure based on the data recorded by the sensors and the reliability of the Decision Support System (DSS) developed in order to provide stakeholders with recommendations for building rehabilitation and estimates of its cost. To these ends, a Eurocode 8 spectrum-compatible accelerogram with increasing amplitude was applied at the top of an instrumented concrete frame built in the laboratory. MEMSCON sensors were directly compared with wired instruments, based on devices available on the market and taken as references, during both construction and seismic simulation.
COTHERM: Geophysical Modeling of High Enthalpy Geothermal Systems
NASA Astrophysics Data System (ADS)
Grab, Melchior; Maurer, Hansruedi; Greenhalgh, Stewart
2014-05-01
In recent years geothermal heating and electricity generation have become an attractive alternative energy resource, especially natural high enthalpy geothermal systems such as in Iceland. However, the financial risk of installing and operating geothermal power plants is still high and more needs to be known about the geothermal processes and state of the reservoir in the subsurface. A powerful tool for probing the underground system structure is provided by geophysical techniques, which are able to detect flow paths and fracture systems without drilling. It has been amply demonstrated that small-scale features can be well imaged at shallow depths, but only gross structures can be delineated for depths of several kilometers, where most high enthalpy systems are located. Therefore a major goal of our study is to improve geophysical mapping strategies by multi-method geophysical simulations and synthetic data inversions, to better resolve structures at greater depth, characterize the reservoir and monitor any changes within it. The investigation forms part of project COTHERM - COmbined hydrological, geochemical and geophysical modeling of geoTHERMal systems - in which a holistic and synergistic approach is being adopted to achieve multidisciplinary cooperation and mutual benefit. The geophysical simulations are being performed in combination with hydrothermal fluid flow modeling and chemical fluid rock interaction modeling, to provide realistic constraints on lithology, pressure, temperature and fluid conditions of the subsurface. Two sites in Iceland have been selected for the study, Krafla and Reykjanes. As a starting point for the geophysical modeling, we seek to establish petrophysical relations, connecting rock properties and reservoir conditions with geophysical parameters such as seismic wave speed, attenuation, electrical conductivity and magnetic susceptibility with a main focus on seismic properties. 
Therefore, we follow a comprehensive approach involving three components: (1) A literature study to find relevant, existing theoretical models, (2) laboratory determinations to confirm their validity for Icelandic rocks of interest and (3) a field campaign to obtain in-situ, shallow rock properties from seismic and resistivity tomography surveys over a fossilized and exhumed geothermal system. Theoretical models describing physical behavior for rocks with strong inhomogeneities, complex pore structure and complicated fluid-rock interaction mechanisms are often poorly constrained and require the knowledge about a wide range of parameters that are difficult to quantify. Therefore we calibrate the theoretical models by laboratory measurements on samples of rocks, forming magmatic geothermal reservoirs. Since the samples used in the laboratory are limited in size, and laboratory equipment operates at much higher frequency than the instruments used in the field, the results need to be up-scaled from the laboratory scale to field scale. This is not a simple process and entails many uncertainties.
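One of the simplest petrophysical relations of the kind described, linking a geophysical parameter to reservoir conditions, is Archie's law for bulk electrical resistivity. A sketch with illustrative values (the parameters and rock properties below are assumptions, not the project's calibrated results):

```python
def archie_resistivity(phi: float, sw: float, rho_w: float,
                       a: float = 1.0, m: float = 2.0, n: float = 2.0) -> float:
    """Archie's law: bulk resistivity from porosity and brine saturation.

    rho = a * rho_w * phi^-m * sw^-n, with empirical constants a
    (tortuosity), m (cementation exponent), and n (saturation exponent).
    """
    return a * rho_w * phi ** (-m) * sw ** (-n)

# Assumed geothermal-reservoir-like values: 10% porosity, fully brine
# saturated, saline pore fluid of 0.5 ohm.m resistivity.
rho = archie_resistivity(0.10, 1.0, 0.5)
print(f"bulk resistivity ~ {rho:.0f} ohm.m")
```

Relations like this are exactly what the laboratory measurements are used to calibrate before up-scaling, since the empirical exponents vary between rock types.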
Process and Learning Outcomes from Remotely-Operated, Simulated, and Hands-on Student Laboratories
ERIC Educational Resources Information Center
Corter, James E.; Esche, Sven K.; Chassapis, Constantin; Ma, Jing; Nickerson, Jeffrey V.
2011-01-01
A large-scale, multi-year, randomized study compared learning activities and outcomes for hands-on, remotely-operated, and simulation-based educational laboratories in an undergraduate engineering course. Students (N = 458) worked in small-group lab teams to perform two experiments involving stress on a cantilever beam. Each team conducted the…
Next Generation Extended Lagrangian Quantum-based Molecular Dynamics
NASA Astrophysics Data System (ADS)
Negre, Christian
2017-06-01
A new framework for extended Lagrangian first-principles molecular dynamics simulations is presented, which overcomes shortcomings of regular, direct Born-Oppenheimer molecular dynamics while maintaining important advantages of the unified extended Lagrangian formulation of density functional theory pioneered by Car and Parrinello three decades ago. The new framework allows, for the first time, energy-conserving, linear-scaling Born-Oppenheimer molecular dynamics simulations, which are necessary to study larger and more realistic systems over longer simulation times than previously possible. Expensive self-consistent-field optimizations are avoided and normal integration time steps of regular, direct Born-Oppenheimer molecular dynamics can be used. Linear-scaling electronic structure theory is presented using a graph-based approach that is ideal for parallel calculations on hybrid computer platforms. For the first time, quantum-based Born-Oppenheimer molecular dynamics simulation is becoming a practically feasible approach for simulations of 100,000+ atoms, representing a competitive alternative to classical polarizable force field methods. In collaboration with Anders Niklasson, Los Alamos National Laboratory.
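The graph-based linear-scaling machinery itself is beyond a short sketch, but one standard building block of linear-scaling electronic structure, density-matrix purification, is easy to show. This toy iterates McWeeny's polynomial P → 3P² − 2P³ to drive an approximate density matrix toward an idempotent projector onto the occupied states; the 6×6 Hamiltonian and initialization are made-up illustrations, not the paper's scheme:

```python
import numpy as np

rng = np.random.default_rng(2)
h = rng.normal(size=(6, 6))
h = (h + h.T) / 2                      # symmetric toy Hamiltonian
n_occ = 3                              # number of occupied states

# Initial guess: linear map of H into roughly [0, 1], shifted about the
# midgap chemical potential so occupied/empty eigenvalues straddle 0.5.
eigs = np.linalg.eigvalsh(h)           # ascending order
e_min, e_max = eigs[0], eigs[-1]
mu = eigs[n_occ - 1:n_occ + 1].mean()  # midgap estimate
p = (mu * np.eye(6) - h) / (e_max - e_min) + 0.5 * np.eye(6)

# McWeeny purification: eigenvalues above 0.5 flow to 1, below 0.5 to 0.
for _ in range(50):
    p = 3 * p @ p - 2 * p @ p @ p

print(np.trace(p), np.linalg.norm(p @ p - p))
```

Because each step uses only (sparse) matrix products, the cost can scale linearly with system size when the density matrix is truncated, which is one route to the 100,000+-atom simulations the abstract describes.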
A Virtual Laboratory for the 4 Bed Molecular Sieve of the Carbon Dioxide Removal Assembly
NASA Technical Reports Server (NTRS)
Coker, Robert; Knox, James; O'Connor, Brian
2016-01-01
Ongoing work to improve water and carbon dioxide separation systems to be used on crewed space vehicles combines sub-scale systems testing and multi-physics simulations. Thus, as part of NASA's Advanced Exploration Systems (AES) program and the Life Support Systems Project (LSSP), fully predictive COMSOL Multiphysics models of the Four Bed Molecular Sieve (4BMS) of the Carbon Dioxide Removal Assembly (CDRA) on the International Space Station (ISS) have been developed. This Virtual Laboratory is being used to help reduce mass, power, and volume requirements for exploration missions. In this paper we describe current and planned modeling developments in the area of carbon dioxide removal to support future missions as well as the resolution of anomalies observed in the ISS CDRA.
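Full multiphysics sorbent-bed models like the one described are often reduced, for quick estimates, to a linear-driving-force (LDF) uptake law, dq/dt = k(q* − q). A sketch with assumed rate constant and capacity (hypothetical numbers, not the 4BMS model's parameters):

```python
def ldf_uptake(q0: float, q_star: float, k: float,
               dt: float, steps: int) -> list[float]:
    """Integrate the linear-driving-force law dq/dt = k (q* - q)
    with explicit Euler; returns the loading history."""
    q = q0
    history = []
    for _ in range(steps):
        q += k * (q_star - q) * dt
        history.append(q)
    return history

# Assumed values: equilibrium loading q* = 2.0 mol/kg, k = 0.01 1/s.
profile = ldf_uptake(q0=0.0, q_star=2.0, k=0.01, dt=1.0, steps=600)
print(f"loading after 600 s ~ {profile[-1]:.3f} mol/kg (of 2.0 equilibrium)")
```

The exponential approach to equilibrium this produces is the qualitative behavior a full bed model must reproduce at each axial location.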
Meteorological Predictions in Support of the Mars Science Laboratory Entry, Descent and Landing
NASA Astrophysics Data System (ADS)
Rothchild, A.; Rafkin, S. C.; Pielke, R. A., Sr.
2010-12-01
The Mars Science Laboratory (MSL) entry, descent, and landing (EDL) system employs a standard parachute strategy followed by a new sky crane concept where the rover is lowered to the ground via a tether from a hovering entry vehicle. As with previous missions, EDL system performance is sensitive to atmospheric conditions. While some observations characterizing the mean, large-scale atmospheric temperature and density data are available, there is effectively no information on the atmospheric conditions and variability at the scale that directly affects the spacecraft. In order to evaluate EDL system performance and to assess landing hazards and risk, it is necessary to simulate the atmosphere with a model that provides data at the appropriate spatial and temporal scales. Models also permit the study of the impact of the highly variable atmospheric dust loading on temperature, density and winds. There are four potential MSL landing sites: Mawrth Vallis (22.3N, 16.5W), Gale Crater (5.4S, 137.7E), Holden Crater (26.1S, 34W), and Eberswalde Crater (24S, 33W). The final selection of the landing site will balance potential science return against landing and operational risk. Atmospheric modeling studies conducted with the Mars Regional Atmospheric Modeling System (MRAMS) are an integral part of the selection process. At each of the landing sites, a variety of simulations are conducted. The first type of simulation provides baseline predictions under nominal atmospheric dust loading conditions within the landing site window of ~Ls 150-170. The second type of simulation explores situations with moderate and high global atmospheric dust loading. The final type of simulation investigates the impact of local dust disturbances at the landing site. Mean and perturbation fields from each type of simulation at each of the potential landing sites are presented in comparison with the engineering performance limitations for the MSL EDL system.
Within the lowest scale height, winds are strongly influenced by the local and regional topography and are highly variable in both space and time. Convective activity in the afternoon produces deep vertical circulations anchored primarily to topography. Aloft, winds become increasingly dominated by the large-scale circulation, but with gravity wave perturbations forced by both topography and boundary layer convective activity. The mean density field is tied directly to the level of dust loading; higher dust results in decreased densities and overall warming of the atmospheric column. In local and regional dust storm scenarios, winds are found to be enhanced, particularly in regions of active dust lifting. Local reductions in density are also pronounced. At present, the predicted mean and perturbation fields from all the simulations appear to fall within the engineering requirements, but not always comfortably so. This is in contrast to proposed landing sites for the Mars Exploration Rover mission, where the atmospheric environment presented unacceptable risk. Work is ongoing to confirm that atmospheric conditions will permit safe EDL operations with a tolerable level of risk.
Novel laboratory simulations of astrophysical jets
NASA Astrophysics Data System (ADS)
Brady, Parrish Clawson
This thesis was motivated by the promise that some physical aspects of astrophysical jets and collimation processes can be scaled to laboratory parameters through hydrodynamic scaling laws. The simulation of astrophysical jet phenomena with laser-produced plasmas was attractive because the laser-target interaction can inject energetic, repeatable plasma into an external environment. Novel laboratory simulations of astrophysical jets involved constructing and using the YOGA laser, a 1064 nm, 8 ns pulsed laser with energies up to 3.7 ± 0.2 J. Laser-produced plasmas were characterized using Schlieren imaging, interferometry, and ICCD photography for their use in simulating jet and magnetosphere physics. The evolution of the laser-produced plasma in various conditions was compared with self-similar solutions and HYADES computer simulations. Millimeter-scale magnetized collimated outflows were produced by a centimeter-scale cylindrically symmetric electrode configuration triggered by a laser-produced plasma. A cavity with a flared nozzle surrounded the center electrode, and electrode ablation created supersonic uncollimated flows. This flow became collimated when the center electrode changed from an anode to a cathode. The plasma jets were in axially directed permanent magnetic fields with strengths up to 5000 Gauss. The collimated magnetized jets were 0.1-0.3 cm wide, up to 2.0 cm long, and had velocities of ~4.0 × 10^6 cm/s. The dynamics of the evolution of the jet were compared qualitatively and quantitatively with fluxtube simulations from Bellan's formulation [6], giving a calculated estimate of ~2.6 × 10^6 cm/s for the jet evolution velocity and evidence for jet rotation. The density measured with interferometry was (1.9 ± 0.2) × 10^17 cm^-3, compared with 2.1 × 10^16 cm^-3 calculated with Bellan's pressure balance formulation.
Kinks in the jet column were produced consistent with the Kruskal-Shafranov condition, which allowed stable and symmetric jets to form in the background magnetic fields. The Euler number for the laboratory jet was 9, compared with an estimate of 40 for young stellar object jets [135], which demonstrated adequate scaling between the two frames. A second experiment was performed concerning laboratory simulations of magnetospheres, with plasma winds impinging on permanent magnetic dipoles. The ratio of the magnetopause standoff distance measured with ICCD photography to the calculated value was ~2.
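The frame-to-frame scaling argument above reduces to comparing dimensionless numbers between the laboratory and astrophysical regimes. A minimal sketch in Python; the one-decade tolerance is an assumed similarity criterion for illustration, not a threshold from the thesis, while the Euler numbers are the ones quoted above:

```python
import math

def euler_number(pressure, density, velocity):
    """Euler number Eu = p / (rho * v^2): ratio of thermal pressure
    to ram pressure, one of the numbers matched in hydrodynamic scaling."""
    return pressure / (density * velocity ** 2)

def hydrodynamically_similar(eu_a, eu_b, tol_decades=1.0):
    """Treat two flows as adequately scaled when their Euler numbers
    agree to within `tol_decades` orders of magnitude (assumed criterion)."""
    return abs(math.log10(eu_a / eu_b)) <= tol_decades

# Values quoted in the text: Eu ~ 9 (laboratory jet) vs ~ 40 (YSO jets).
print(hydrodynamically_similar(9.0, 40.0))  # same order of magnitude
```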
NASA Astrophysics Data System (ADS)
Robbins, Joshua; Voth, Thomas
2011-06-01
Material response to dynamic loading is often dominated by microstructure such as grain topology, porosity, inclusions, and defects; however, many models rely on assumptions of homogeneity. We use the probabilistic finite element method (WK Liu, IJNME, 1986) to introduce local uncertainty to account for material heterogeneity. The PFEM uses statistical information about the local material response (i.e., its expectation, coefficient of variation, and autocorrelation) drawn from knowledge of the microstructure, single crystal behavior, and direct numerical simulation (DNS) to determine the expectation and covariance of the system response (velocity, strain, stress, etc). This approach is compared to resolved grain-scale simulations of the equivalent system. The microstructures used for the DNS are produced using Monte Carlo simulations of grain growth, and a sufficient number of realizations are computed to ensure a meaningful comparison. Finally, comments are made regarding the suitability of one-dimensional PFEM for modeling material heterogeneity. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
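As a toy illustration of propagating local material uncertainty to a system response, the following Monte Carlo reference calculation is the kind of sampling-based check PFEM results can be compared against: a 1D bar whose elastic modulus is lognormal with a 10% coefficient of variation. All values are illustrative; the actual PFEM propagates expectation and covariance directly rather than by sampling.

```python
import math
import random
import statistics

random.seed(0)

# 1D bar under axial load F: tip displacement u = F * L / (E * A).
F, L, A = 1.0e3, 1.0, 1.0e-2          # N, m, m^2 (illustrative)
E_mean, cov_E = 200.0e9, 0.10         # Pa, 10% coefficient of variation

def sample_displacement():
    # Lognormal modulus so E stays strictly positive.
    sigma = math.sqrt(math.log(1.0 + cov_E ** 2))
    mu = math.log(E_mean) - 0.5 * sigma ** 2
    E = math.exp(random.gauss(mu, sigma))
    return F * L / (E * A)

samples = [sample_displacement() for _ in range(20000)]
mean_u = statistics.fmean(samples)
cov_u = statistics.stdev(samples) / mean_u
print(mean_u, cov_u)   # expectation and coefficient of variation of response
```

For a lognormal modulus the response coefficient of variation equals that of the input, which makes the sampling result easy to sanity-check.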
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rearden, Bradley T.; Jessee, Matthew Anderson
The SCALE Code System is a widely used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor and lattice physics, radiation shielding, spent fuel and radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules, including three deterministic and three Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE's graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results.
Trinity Phase 2 Open Science: CTH
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruggirello, Kevin Patrick; Vogler, Tracy
CTH is an Eulerian hydrocode developed by Sandia National Laboratories (SNL) to solve a wide range of shock wave propagation and material deformation problems. Adaptive mesh refinement is also used to improve efficiency for problems with a wide range of spatial scales. The code has a history of running on a variety of computing platforms ranging from desktops to massively parallel distributed-data systems. For the Trinity Phase 2 Open Science campaign, CTH was used to study mesoscale simulations of the hypervelocity penetration of granular SiC powders. The simulations were compared to experimental data. A scaling study of CTH up to 8192 KNL nodes was also performed, and several improvements were made to the code to improve the scalability.
Additional confirmation of the validity of laboratory simulation of cloud radiances
NASA Technical Reports Server (NTRS)
Davis, J. M.; Cox, S. K.
1986-01-01
The results of a laboratory experiment are presented that provide additional verification of the methodology adopted for simulation of the radiances reflected from fields of optically thick clouds using the Cloud Field Optical Simulator (CFOS) at Colorado State University. The comparison of these data with their theoretically derived counterparts indicates that the crucial mechanism of cloud-to-cloud radiance field interaction is accurately simulated in the CFOS experiments and adds confidence to the manner in which the optical depth is scaled.
Wang, Guan; Zhao, Junfei; Haringa, Cees; Tang, Wenjun; Xia, Jianye; Chu, Ju; Zhuang, Yingping; Zhang, Siliang; Deshmukh, Amit T; van Gulik, Walter; Heijnen, Joseph J; Noorman, Henk J
2018-05-01
In a 54 m^3 large-scale penicillin fermentor, the cells experience substrate gradient cycles at the timescale of the global mixing time, about 20-40 s. Here, we used an intermittent feeding regime (IFR) and a two-compartment reactor (TCR) to mimic these substrate gradients in laboratory-scale continuous cultures. The IFR was applied to simulate substrate dynamics experienced by the cells at full scale at timescales of tens of seconds to minutes (30 s, 3 min and 6 min), while the TCR was designed to simulate substrate gradients at an applied mean residence time (τc) of 6 min. A biological systems analysis of the response of an industrial high-yielding P. chrysogenum strain has been performed in these continuous cultures. Compared to an undisturbed continuous feeding regime in a single reactor, the penicillin productivity (q_PenG) was reduced in all scale-down simulators. The dynamic metabolomics data indicated that in the IFRs, the cells accumulated high levels of the central metabolites during the feast phase to actively cope with external substrate deprivation during the famine phase. In contrast, in the TCR system, the storage pool (e.g. mannitol and arabitol) constituted a large part of the carbon supply in the non-feed compartment. Further, transcript analysis revealed that all scale-down simulators gave different expression levels of the glucose/hexose transporter genes and the penicillin gene clusters. The results showed that q_PenG did not correlate well with exposure to the substrate regimes (excess, limitation and starvation), but there was a clear inverse relation between q_PenG and the intracellular glucose level. © 2018 The Authors. Microbial Biotechnology published by John Wiley & Sons Ltd and Society for Applied Microbiology.
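The feast/famine exposure created by an intermittent feeding regime can be illustrated with a toy substrate balance (Monod uptake, explicit Euler stepping). All rate constants below are illustrative round numbers, not the study's calibrated values:

```python
# dS/dt = feed(t) - q_max * X * S / (K + S); intermittent vs continuous feed.
q_max, K, X = 0.05, 0.01, 10.0        # illustrative: g/g/h, g/L, g/L
mean_feed = 0.4                       # g/L/h, same average in both regimes
cycle = 360.0 / 3600.0                # 6 min feeding cycle, in hours
dt = 0.5 / 3600.0                     # 0.5 s time step

def substrate_extremes(feed_fraction, hours=1.0):
    """Deliver the same average feed, but only during the first
    `feed_fraction` of each cycle; return (S_min, S_max) over the last cycle."""
    S, t = 0.01, 0.0
    s_min, s_max = float("inf"), 0.0
    while t < hours:
        feeding = (t % cycle) < feed_fraction * cycle
        feed = mean_feed / feed_fraction if feeding else 0.0
        S = max(S + dt * (feed - q_max * X * S / (K + S)), 0.0)
        if t >= hours - cycle:        # record only the final, quasi-steady cycle
            s_min, s_max = min(s_min, S), max(s_max, S)
        t += dt
    return s_min, s_max

lo_i, hi_i = substrate_extremes(0.1)  # intermittent: short feast, long famine
lo_c, hi_c = substrate_extremes(1.0)  # undisturbed continuous reference
```

The intermittent run swings over a several-fold substrate range each cycle, while the continuous reference stays essentially flat, which is the exposure contrast the scale-down simulators are built to create.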
NASA Astrophysics Data System (ADS)
Rapaka, Narsimha R.; Sarkar, Sutanu
2016-10-01
A sharp-interface Immersed Boundary Method (IBM) is developed to simulate density-stratified turbulent flows in complex geometry using a Cartesian grid. The basic numerical scheme corresponds to a central second-order finite difference method, third-order Runge-Kutta integration in time for the advective terms and an alternating direction implicit (ADI) scheme for the viscous and diffusive terms. The solver developed here allows for both direct numerical simulation (DNS) and large eddy simulation (LES) approaches. Methods to enhance the mass conservation and numerical stability of the solver to simulate high Reynolds number flows are discussed. Convergence with second-order accuracy is demonstrated in flow past a cylinder. The solver is validated against past laboratory and numerical results in flow past a sphere, and in channel flow with and without stratification. Since topographically generated internal waves are believed to result in a substantial fraction of turbulent mixing in the ocean, we are motivated to examine oscillating tidal flow over a triangular obstacle to assess the ability of this computational model to represent nonlinear internal waves and turbulence. Results of laboratory-scale (on the order of a few meters) simulations show that the wave energy flux, mean flow properties and turbulent kinetic energy agree well with our previous results obtained using a body-fitted grid (BFG). The deviation of IBM results from BFG results is found to increase with increasing nonlinearity in the wave field that is associated with either increasing steepness of the topography relative to the internal wave propagation angle or with the amplitude of the oscillatory forcing. LES is performed on a large-scale ridge, on the order of a few kilometers in length, that has the same geometrical shape and same non-dimensional values for the governing flow and environmental parameters as the laboratory-scale topography, but significantly larger Reynolds number.
A non-linear drag law is utilized in the large-scale application to parameterize turbulent losses due to bottom friction at high Reynolds number. The large scale problem exhibits qualitatively similar behavior to the laboratory scale problem with some differences: slightly larger intensification of the boundary flow and somewhat higher non-dimensional values for the energy fluxed away by the internal wave field. The phasing of wave breaking and turbulence exhibits little difference between small-scale and large-scale obstacles as long as the important non-dimensional parameters are kept the same. We conclude that IBM is a viable approach to the simulation of internal waves and turbulence in high Reynolds number stratified flows over topography.
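The non-linear drag law used for the large-scale runs is, in its usual quadratic form, a one-liner; the drag coefficient below is a typical assumed value, not the paper's:

```python
def bottom_stress(u, rho=1025.0, c_d=2.5e-3):
    """Quadratic bottom-friction law: tau = rho * C_d * |u| * u.
    Signed with the velocity; it enters the momentum equation with a
    minus sign so the stress opposes the flow. Units: N/m^2 for u in
    m/s and rho in kg/m^3 (c_d = 2.5e-3 is an assumed typical value)."""
    return rho * c_d * abs(u) * u

print(bottom_stress(0.5))    # stress for a modest tidal flow
print(bottom_stress(-0.5))   # equal magnitude, opposite sign on the ebb
```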
Prediction of Gas Injection Performance for Heterogeneous Reservoirs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blunt, Martin J.; Orr, Franklin M.
This report describes research carried out in the Department of Petroleum Engineering at Stanford University from September 1997 to September 1998 under the second year of a three-year grant from the Department of Energy on the "Prediction of Gas Injection Performance for Heterogeneous Reservoirs." The research effort is an integrated study of the factors affecting gas injection, from the pore scale to the field scale, and involves theoretical analysis, laboratory experiments, and numerical simulation. The original proposal described research in four areas: (1) pore-scale modeling of three-phase flow in porous media; (2) laboratory experiments and analysis of factors influencing gas injection performance at the core scale, with an emphasis on the fundamentals of three-phase flow; (3) benchmark simulations of gas injection at the field scale; and (4) development of a streamline-based reservoir simulator. Each stage of the research is planned to provide input and insight into the next, such that at the end we should have an integrated understanding of the key factors affecting field-scale displacements.
Evolution of the Tropical Cyclone Integrated Data Exchange And Analysis System (TC-IDEAS)
NASA Technical Reports Server (NTRS)
Turk, J.; Chao, Y.; Haddad, Z.; Hristova-Veleva, S.; Knosp, B.; Lambrigtsen, B.; Li, P.; Licata, S.; Poulsen, W.; Su, H.;
2010-01-01
The Tropical Cyclone Integrated Data Exchange and Analysis System (TC-IDEAS) is being jointly developed by the Jet Propulsion Laboratory (JPL) and the Marshall Space Flight Center (MSFC) as part of NASA's Hurricane Science Research Program. The long-term goal is to create a comprehensive tropical cyclone database of satellite and airborne observations, in-situ measurements and model simulations containing parameters that pertain to the thermodynamic and microphysical structure of the storms; the air-sea interaction processes; and the large-scale environment.
Nishizawa, Hiroaki; Nishimura, Yoshifumi; Kobayashi, Masato; Irle, Stephan; Nakai, Hiromi
2016-08-05
The linear-scaling divide-and-conquer (DC) quantum chemical methodology is applied to the density-functional tight-binding (DFTB) theory to develop a massively parallel program that achieves on-the-fly molecular reaction dynamics simulations of huge systems from scratch. Functions to perform large-scale geometry optimization and molecular dynamics on the DC-DFTB potential energy surface are implemented in the program, called DC-DFTB-K. A novel interpolation-based algorithm is developed for parallelizing the determination of the Fermi level in the DC method. The performance of the DC-DFTB-K program is assessed using a laboratory computer and the K computer. Numerical tests show the high efficiency of the DC-DFTB-K program: a single-point energy gradient calculation of a one-million-atom system is completed within 60 s using 7290 nodes of the K computer. © 2016 Wiley Periodicals, Inc.
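The Fermi-level determination that the paper parallelizes with interpolation is, at bottom, a one-dimensional root-finding problem: choose the chemical potential mu so that the summed Fermi-Dirac occupations equal the electron count. A minimal serial sketch with bisection; the orbital energies are illustrative, and the actual DC method assembles occupations from subsystem eigenvalues:

```python
import math

def fermi_occ(e, mu, beta):
    """Overflow-safe, spin-paired Fermi-Dirac occupation (0 to 2)."""
    x = beta * (e - mu)
    if x > 50.0:
        return 0.0
    if x < -50.0:
        return 2.0
    return 2.0 / (1.0 + math.exp(x))

def find_fermi(levels, n_elec, beta=1000.0, tol=1e-10):
    """Bisect for mu such that the total occupation equals n_elec.
    (The DC-DFTB-K interpolation algorithm solves the same equation,
    but in parallel across subsystems; this is the serial essence.)"""
    lo, hi = min(levels) - 1.0, max(levels) + 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if sum(fermi_occ(e, mid, beta) for e in levels) < n_elec:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

levels = [-0.5, -0.3, -0.1, 0.2, 0.4]   # illustrative orbital energies
mu = find_fermi(levels, n_elec=6)       # 6 electrons fill the lowest three
```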
Identification of Curie temperature distributions in magnetic particulate systems
NASA Astrophysics Data System (ADS)
Waters, J.; Berger, A.; Kramer, D.; Fangohr, H.; Hovorka, O.
2017-09-01
This paper develops a methodology for extracting the Curie temperature distribution from magnetisation versus temperature measurements that are realizable by standard laboratory magnetometry. The method is integral in nature, robust against various sources of measurement noise, and can be adapted to a wide range of granular magnetic materials and magnetic particle systems. The validity and practicality of the method is demonstrated using large-scale Monte-Carlo simulations of an Ising-like model as a proof of concept, and general conclusions are drawn about its applicability to different classes of systems and experimental conditions.
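For grains with sharp individual transitions, M(T) is essentially the survival function of the Curie-temperature distribution, so -dM/dT recovers that distribution. A toy version of the inversion, with a Gaussian Tc distribution assumed purely for illustration (the paper's integral method is designed to be more robust to noise than this direct differentiation):

```python
import bisect
import random

random.seed(1)

# Toy granular magnet: each grain demagnetizes sharply at its own Tc,
# so M(T) ~ fraction of grains with Tc > T (saturation moment set to 1).
tcs = sorted(random.gauss(500.0, 20.0) for _ in range(100000))
Ts = [400.0 + i for i in range(201)]                       # 400-600 K grid
M = [1.0 - bisect.bisect_right(tcs, T) / len(tcs) for T in Ts]

# Recover the Tc density as f(T) ~ -dM/dT (central differences), then
# estimate the mean Curie temperature from the recovered density.
f = [-(M[i + 1] - M[i - 1]) / 2.0 for i in range(1, len(Ts) - 1)]
mean_tc = sum(T * fi for T, fi in zip(Ts[1:-1], f)) / sum(f)
print(mean_tc)
```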
Voice control of the space shuttle video system
NASA Technical Reports Server (NTRS)
Bejczy, A. K.; Dotson, R. S.; Brown, J. W.; Lewis, J. L.
1981-01-01
A pilot voice control system was developed at the Jet Propulsion Laboratory (JPL) to test and evaluate the feasibility of controlling the shuttle TV cameras and monitors by voice commands. The system utilizes a commercially available discrete-word speech recognizer which can be trained to the individual utterances of each operator. Successful ground tests were conducted using a simulated full-scale space shuttle manipulator. The test configuration involved berthing, maneuvering, and deploying a simulated science payload in the shuttle bay. The handling task typically required 15 to 20 minutes and 60 to 80 commands to 4 TV cameras and 2 TV monitors. The best test runs show 96 to 100 percent voice recognition accuracy.
The NASA Langley building solar project and the supporting Lewis solar technology program
NASA Technical Reports Server (NTRS)
Ragsdale, R. G.; Namkoong, D.
1974-01-01
A solar energy technology program is described that includes solar collector testing in an indoor solar simulator facility and in an outdoor test facility, property measurements of solar panel coatings, and operation of a laboratory-scale solar model system test facility. Early results from simulator tests indicate that non-selective coatings behave more nearly in accord with predicted performance than do selective coatings. Initial experiments on the decay rate of thermally stratified hot water in a storage tank have been run. Results suggest that where high temperature water is required, excess solar energy collected by a building solar system should be stored overnight in the form of chilled water rather than hot water.
Use of a PhET Interactive Simulation in General Chemistry Laboratory: Models of the Hydrogen Atom
ERIC Educational Resources Information Center
Clark, Ted M.; Chamberlain, Julia M.
2014-01-01
An activity supporting the PhET interactive simulation, Models of the Hydrogen Atom, has been designed and used in the laboratory portion of a general chemistry course. This article describes the framework used to successfully accomplish implementation on a large scale. The activity guides students through a comparison and analysis of the six…
An Implicit Solver on A Parallel Block-Structured Adaptive Mesh Grid for FLASH
NASA Astrophysics Data System (ADS)
Lee, D.; Gopal, S.; Mohapatra, P.
2012-07-01
We introduce a fully implicit solver for FLASH based on a Jacobian-Free Newton-Krylov (JFNK) approach with an appropriate preconditioner. The main goal of developing this JFNK-type implicit solver is to provide efficient high-order numerical algorithms and methodology for simulating stiff systems of differential equations on large-scale parallel computer architectures. A large number of natural problems in nonlinear physics involve a wide range of spatial and time scales of interest. A system that encompasses such a wide magnitude of scales is described as "stiff." A stiff system can arise in many different fields of physics, including fluid dynamics/aerodynamics, laboratory/space plasma physics, low Mach number flows, reactive flows, radiation hydrodynamics, and geophysical flows. One of the big challenges in solving such a stiff system using current-day computational resources lies in resolving time and length scales varying by several orders of magnitude. We introduce FLASH's preliminary implementation of a time-accurate JFNK-based implicit solver in the framework of FLASH's unsplit hydro solver.
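The "Jacobian-free" part of JFNK means the Krylov solver never forms the Jacobian; it only needs matrix-vector products, approximated by a directional finite difference of the residual. A minimal sketch on a 2-equation nonlinear system (the system is illustrative; FLASH's residual comes from its unsplit hydro discretization):

```python
def residual(u):
    # Small illustrative nonlinear system F(u) = 0.
    x, y = u
    return [x * x + y - 3.0, x + y * y - 5.0]

def jfnk_matvec(F, u, v, eps=1e-7):
    """Jacobian-free product: J(u) @ v ~ (F(u + eps*v) - F(u)) / eps.
    A Krylov method such as GMRES built on this product never needs
    J itself, which is what makes JFNK attractive for stiff systems."""
    Fu = F(u)
    Fp = F([ui + eps * vi for ui, vi in zip(u, v)])
    return [(a - b) / eps for a, b in zip(Fp, Fu)]

u = [1.0, 2.0]
jv = jfnk_matvec(residual, u, [1.0, 0.0])
# Analytic Jacobian at u is [[2x, 1], [1, 2y]], so J @ [1, 0] = [2, 1].
print(jv)
```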
NASA Technical Reports Server (NTRS)
1993-01-01
A description is given of each of the following Langley research and test facilities: 0.3-Meter Transonic Cryogenic Tunnel, 7-by 10-Foot High Speed Tunnel, 8-Foot Transonic Pressure Tunnel, 13-Inch Magnetic Suspension & Balance System, 14-by 22-Foot Subsonic Tunnel, 16-Foot Transonic Tunnel, 16-by 24-Inch Water Tunnel, 20-Foot Vertical Spin Tunnel, 30-by 60-Foot Wind Tunnel, Advanced Civil Transport Simulator (ACTS), Advanced Technology Research Laboratory, Aerospace Controls Research Laboratory (ACRL), Aerothermal Loads Complex, Aircraft Landing Dynamics Facility (ALDF), Avionics Integration Research Laboratory, Basic Aerodynamics Research Tunnel (BART), Compact Range Test Facility, Differential Maneuvering Simulator (DMS), Enhanced/Synthetic Vision & Spatial Displays Laboratory, Experimental Test Range (ETR) Flight Research Facility, General Aviation Simulator (GAS), High Intensity Radiated Fields Facility, Human Engineering Methods Laboratory, Hypersonic Facilities Complex, Impact Dynamics Research Facility, Jet Noise Laboratory & Anechoic Jet Facility, Light Alloy Laboratory, Low Frequency Antenna Test Facility, Low Turbulence Pressure Tunnel, Mechanics of Metals Laboratory, National Transonic Facility (NTF), NDE Research Laboratory, Polymers & Composites Laboratory, Pyrotechnic Test Facility, Quiet Flow Facility, Robotics Facilities, Scientific Visualization System, Scramjet Test Complex, Space Materials Research Laboratory, Space Simulation & Environmental Test Complex, Structural Dynamics Research Laboratory, Structural Dynamics Test Beds, Structures & Materials Research Laboratory, Supersonic Low Disturbance Pilot Tunnel, Thermal Acoustic Fatigue Apparatus (TAFA), Transonic Dynamics Tunnel (TDT), Transport Systems Research Vehicle, Unitary Plan Wind Tunnel, and the Visual Motion Simulator (VMS).
NASA Astrophysics Data System (ADS)
Cassiani, G.; dalla, E.; Brovelli, A.; Pitea, D.; Binley, A. M.
2003-04-01
The development of reliable constitutive laws to translate geophysical properties into hydrological ones is the fundamental step for successful applications of hydrogeophysical techniques. Many such laws have been proposed and applied, particularly with regard to two types of relationships: (a) between moisture content and dielectric properties, and (b) between electrical resistivity, rock structure and water saturation. The classical Archie's law belongs to this latter category. Archie's relationship has been widely used, starting from borehole log applications, to translate geoelectrical measurements into estimates of saturation. However, in spite of its popularity, it remains an empirical relationship, the parameters of which must be calibrated case by case, e.g. on laboratory data. Pore-scale models have recently been recognized and used as powerful tools to investigate the constitutive relations of multiphase soils from a pore-scale point of view, because they bridge the microscopic and macroscopic scales. In this project, we develop and validate a three-dimensional pore-scale method to compute electrical properties of unsaturated and saturated porous media. First we simulate a random packing of spheres [1] that obeys the grain-size distribution and porosity of an experimental porous medium system; then we simulate primary drainage with a morphological approach [2]; finally, for each state of saturation during the drainage process, we solve the electrical conduction equation within the grain structure with a new numerical model and compute the apparent electrical resistivity of the porous medium. We apply the new method to a semi-consolidated Permo-Triassic Sandstone from the UK (Sherwood Sandstone) for which both pressure-saturation (Van Genuchten) and Archie's law parameters have been measured on laboratory samples. A comparison between simulated and measured relationships has been performed.
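Archie's empirical relation referenced above, with the parameters (a, m, n) that pore-scale simulation aims to predict rather than calibrate; the values below are common textbook defaults, assumed here for illustration:

```python
def archie_resistivity(rho_w, porosity, s_w, a=1.0, m=2.0, n=2.0):
    """Archie's law: rho_t = a * rho_w * porosity**(-m) * s_w**(-n),
    with brine resistivity rho_w (ohm·m), fractional porosity, and
    water saturation s_w; a, m, n are the empirical parameters."""
    return a * rho_w * porosity ** (-m) * s_w ** (-n)

# With n = 2, halving the water saturation quadruples the resistivity:
r_sat = archie_resistivity(rho_w=10.0, porosity=0.25, s_w=1.0)   # saturated
r_half = archie_resistivity(rho_w=10.0, porosity=0.25, s_w=0.5)  # drained
print(r_half / r_sat)   # -> 4.0
```

This saturation sensitivity is exactly why geoelectrical measurements can be inverted for saturation once (a, m, n) are known for the rock.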
Laboratory study of sonic booms and their scaling laws. [ballistic range simulation
NASA Technical Reports Server (NTRS)
Toong, T. Y.
1974-01-01
This program sought a basic understanding of non-linear effects associated with caustics, through laboratory simulation experiments of sonic booms in a ballistic range and a coordinated theoretical study of scaling laws. Two cases of superbooms, or enhanced sonic booms at caustics, have been studied. The first case, referred to as acceleration superbooms, is related to the enhanced sonic booms generated during the acceleration maneuvers of supersonic aircraft. The second case, referred to as refraction superbooms, involves the superbooms that are generated as a result of atmospheric refraction. Important theoretical and experimental results are briefly reported.
Information driving force and its application in agent-based modeling
NASA Astrophysics Data System (ADS)
Chen, Ting-Ting; Zheng, Bo; Li, Yan; Jiang, Xiong-Fei
2018-04-01
Exploring the scientific impact of online big data has attracted much attention from researchers in different fields in recent years. Complex financial systems are typical open systems profoundly influenced by external information. Based on large-scale data from the public media and stock markets, we first define an information driving force and analyze how it affects the complex financial system. The information driving force is observed to be asymmetric in the bull and bear market states. As an application, we then propose an agent-based model driven by the information driving force. In particular, all the key parameters are determined from the empirical analysis rather than from statistical fitting of the simulation results. With our model, both the stationary properties and non-stationary dynamic behaviors are simulated. Considering the mean-field effect of the external information, we also propose a few-body model to simulate the financial market in the laboratory.
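A few-body flavor of such an information-driven agent model can be sketched as follows. The coupling constants and the sign-dependent asymmetry are invented for illustration only; the paper determines its parameters from empirical analysis:

```python
import random

random.seed(2)

def step(info, n_agents=1000, k_pos=0.3, k_neg=0.5):
    """One time step: each agent buys or sells with a bias set by the
    external information signal `info` in [-1, 1]; the bias strength is
    asymmetric (k_neg > k_pos, assumed), echoing the asymmetry reported
    between bull and bear states. Returns normalized excess demand."""
    k = k_pos if info >= 0 else k_neg
    p_buy = 0.5 + 0.5 * k * max(-1.0, min(1.0, info))
    buys = sum(1 for _ in range(n_agents) if random.random() < p_buy)
    return (2 * buys - n_agents) / n_agents

r_up = step(+1.0)   # strongly positive news: net buying
r_dn = step(-1.0)   # strongly negative news: larger net selling
```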
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vroomen, G.L.M.; Lievens, S.S.; Maes, J.P.
1999-08-01
EPDM (ethylene-propylene rubber) has been used for more than 25 years as the main elastomer in radiator hoses because it offers a well-balanced price/performance ratio in this field of application. Some years ago, the automotive and rubber industries became aware of a problem called electrochemical degradation and cracking: cooling systems broke down due to a typical cracking failure of some radiator hoses. Different test methods were developed to simulate and solve the problem on a laboratory scale. The influence of different variables on the electrochemical degradation process has been investigated, but until recently the influence of the engine coolant was ignored. Using a test method developed by DSM Elastomers, the influence of the composition of the engine coolant as well as of the EPDM composition has now been evaluated. This paper gives an overview of test results with different coolant technologies and offers a plausible explanation of the degradation mechanisms as a function of the elastomer composition.
NASA Astrophysics Data System (ADS)
Nole, M.; Daigle, H.; Cook, A.; Malinverno, A.; Hillman, J. I. T.
2016-12-01
We explore the gas hydrate-generating capacity of diffusive methane transport induced by solubility gradients due to pore size contrasts in lithologically heterogeneous marine sediments. Through the use of 1D, 2D, and 3D reactive transport simulations, we investigate scale-dependent processes in diffusion-dominated gas hydrate systems. These simulations all track a sand body, or series of sands, surrounded by clays as they are buried through the gas hydrate stability zone. Methane is sourced by microbial methanogenesis in the clays surrounding the sand layers. In 1D, simulations performed in a Lagrangian reference frame demonstrate that gas hydrate in thin sands (3.6 m thick) can occur in high saturations (upward of 70%) at the edges of sand bodies within the upper 400 meters below the seafloor. Diffusion of methane toward the center of the sand layer depends on the concentration gradient within the sand: broader sand pore size distributions with smaller median pore sizes enhance diffusive action toward the sand's center. Incorporating downhole log- and laboratory-derived sand pore size distributions, gas hydrate saturations in the center of the sand can reach 20% of the hydrate saturations at the sand's edges. Furthermore, we show that hydrate-free zones exist immediately above and below the sand and are approximately 5 m thick, depending on the sand-clay solubility contrast. A moving reference frame is also adopted in 2D, and the angle of gravity is rotated relative to the grid system to simulate a dipping sand layer. This is important to minimize diffusive edge effects or numerical diffusion that might be associated with a dipping sand in an Eulerian grid system oriented orthogonal to gravity. Two-dimensional simulations demonstrate the tendency for gas hydrate to accumulate downdip in a sand body because of greater methane transport at depth due to larger sand-clay solubility contrasts. 
In 3D, basin-scale simulations illuminate how convergent sand layers in a multilayered system can compete for diffusion from clays between them, resulting in relatively low hydrate saturations. All simulations suggest that when hydrate present in clays dissociates with burial, the additional dissolved methane is soaked up by nearby sands preserving high hydrate saturations.
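The edge-loading mechanism described for the 1D case can be reproduced with a few lines of explicit finite differences: clays generate methane, the sand's lower solubility caps the dissolved concentration, and hydrate piles up where the clay flux first meets the sand. Grid spacing, rates, and solubilities below are illustrative round numbers, not the study's calibrated values:

```python
# 1D explicit diffusion of dissolved methane toward a sand layer whose
# equilibrium solubility is lower than the surrounding clays'; methane in
# excess of the local solubility precipitates as hydrate.
n = 101
sand = range(45, 56)                    # sand layer embedded in clays
c_eq = [1.0 if i in sand else 2.0 for i in range(n)]   # solubility contrast
c = [1.0] * n                           # start at the sand solubility
hydrate = [0.0] * n
D, dt, source = 0.2, 1.0, 0.002         # stable: D*dt <= 0.5 for dx = 1

for _ in range(2000):
    lap = [0.0] * n
    for i in range(1, n - 1):
        lap[i] = c[i - 1] - 2.0 * c[i] + c[i + 1]
    for i in range(n):
        # Methanogenesis acts only in the clays, not in the sand.
        c[i] += D * dt * lap[i] + (0.0 if i in sand else source)
        if c[i] > c_eq[i]:              # excess methane precipitates
            hydrate[i] += c[i] - c_eq[i]
            c[i] = c_eq[i]

print(hydrate[45], hydrate[50])         # sand edge vs sand center
```

Because the edge cells clamp back to the sand solubility every step, the interior of the sand sees no concentration gradient at all in this toy setup; hydrate grows only at the edges, a caricature of the high edge saturations reported above.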
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nole, Michael; Daigle, Hugh; Cook, Ann
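The burial-driven mechanism described above can be caricatured in a few lines: a toy 1-D finite-difference model (all parameter values illustrative, not the study's calibrated inputs) in which clays are held at methane saturation by methanogenesis while the sand's lower equilibrium solubility pins its interior concentration, so the diffusive flux converges on the sand edges.

```python
import numpy as np

# Toy 1-D finite-difference model of diffusive methane transport toward a
# sand layer embedded in clay. All parameter values are illustrative and
# not the study's calibrated inputs.

nz, dz = 100, 0.5                  # 50 m sediment column, 0.5 m cells
D = 1e-9 * 3.15e7                  # methane diffusivity, m^2/yr
dt = 0.25 * dz**2 / D              # within the explicit stability limit
sand = slice(40, 48)               # a 4 m thick sand layer

c = np.ones(nz)                    # methane, normalized to clay solubility
c_eq_sand = 0.8                    # lower equilibrium solubility in sand
                                   # (pore-size contrast with the clay)

for _ in range(20000):
    lap = np.zeros(nz)
    lap[1:-1] = (c[2:] - 2.0 * c[1:-1] + c[:-2]) / dz**2
    c += dt * D * lap
    # Methane above local solubility precipitates as hydrate in the sand.
    c[sand] = np.minimum(c[sand], c_eq_sand)
    # Clays are held at saturation by ongoing microbial methanogenesis.
    c[:40] = 1.0
    c[48:] = 1.0

# The sand interior stays pinned at its lower solubility, so the
# concentration gradient (and hydrate growth) is sharpest at the edges.
edge_gradient = (c[39] - c[40]) / dz
```

The steady edge gradient sustains the flux that accumulates hydrate at the sand boundaries, consistent with the high edge saturations reported above.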
Development of fire test methods for airplane interior materials
NASA Technical Reports Server (NTRS)
Tustin, E. A.
1978-01-01
Fire tests were conducted in a 737 airplane fuselage at NASA-JSC to characterize jet fuel fires in open steel pans (simulating post-crash fire sources and a ruptured airplane fuselage) and to characterize fires in some common combustibles (simulating in-flight fire sources). Design post-crash and in-flight fire source selections were based on these data. Large panels of airplane interior materials were exposed to closely controlled large-scale heating simulations of the two design fire sources in a Boeing fire test facility utilizing a surplus 707 fuselage section. Small samples of the same airplane materials were tested by several laboratory fire test methods. Large-scale and laboratory-scale data were examined for correlative factors. Published data for dangerous hazard levels in a fire environment were used as the basis for developing a method to select the most desirable material where trade-offs in heat, smoke and gaseous toxicant evolution must be considered.
Bertucco, Alberto; Beraldi, Mariaelena; Sforza, Eleonora
2014-08-01
In this work, the production of Scenedesmus obliquus in a continuous flat-plate laboratory-scale photobioreactor (PBR) under alternated day-night cycles was tested both experimentally and theoretically. Variations of light intensity across the four seasons of the year were simulated experimentally by a tunable LED lamp, and the effects on microalgal growth and productivity were measured to evaluate the conversion efficiency of light energy into biomass during the different seasons. These results were used to validate a mathematical model for algae growth that can be applied to simulate a large-scale production unit, carried out in a flat-plate PBR of similar geometry. The cellular concentration in the PBR was calculated in both steady-state and transient conditions, and the value of the maintenance kinetic term was correlated to experimental profiles. The relevance of this parameter was finally outlined.
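The maintenance kinetic term highlighted above enters the biomass balance as a constant specific loss rate. A minimal sketch of such a continuous-PBR model, with a hypothetical square-wave day/night cycle and illustrative parameters (not the paper's calibrated values):

```python
# Sketch of a continuous flat-plate PBR biomass balance with a maintenance
# term, integrated by forward Euler. All parameter values are illustrative.

def mu_light(I, mu_max=1.2, K_I=120.0):
    """Light-limited specific growth rate (1/day), Monod-type in irradiance."""
    return mu_max * I / (K_I + I)

def simulate(I_day, dilution=0.3, maint=0.1, X0=0.5, days=40.0, dt=0.01):
    """Integrate dX/dt = (mu(I) - maint - dilution) * X over day/night cycles."""
    X, t = X0, 0.0
    while t < days:
        # Square-wave cycle: 12 h of light, 12 h of darkness per day.
        I = I_day if (t % 1.0) < 0.5 else 0.0
        X += dt * (mu_light(I) - maint - dilution) * X
        t += dt
    return X

# Higher irradiance ("summer") sustains growth; low irradiance ("winter")
# cannot offset maintenance plus dilution losses, so biomass washes out.
X_summer = simulate(I_day=400.0)
X_winter = simulate(I_day=60.0)
```

At steady state the net growth rate must balance dilution plus maintenance, which is why the maintenance term controls whether a given seasonal light regime can sustain the culture at all.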
Technology demonstration of space intravehicular automation and robotics
NASA Technical Reports Server (NTRS)
Morris, A. Terry; Barker, L. Keith
1994-01-01
Automation and robotic technologies are being developed and capabilities demonstrated which would increase the productivity of microgravity science and materials processing in the space station laboratory module, especially when the crew is not present. The Automation Technology Branch at NASA Langley has been working in the area of intravehicular automation and robotics (IVAR) to provide a user-friendly development facility, to determine customer requirements for automated laboratory systems, and to improve the quality and efficiency of commercial production and scientific experimentation in space. This paper will describe the IVAR facility and present the results of a demonstration using a simulated protein crystal growth experiment inside a full-scale mockup of the space station laboratory module using a unique seven-degree-of-freedom robot.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wei-Ping Pan; Andy Wu; John T. Riley
This report presents the progress made on the project "Establishment of an Environmental Control Technology Laboratory (ECTL) with a Circulating Fluidized-Bed Combustion (CFBC) System" during the period October 1, 2004 through December 31, 2004. The following tasks have been completed. First, the renovation of the new Combustion Laboratory and the construction of the Circulating Fluidized-Bed (CFB) Combustor Building have proceeded well. Second, the detailed design of supporting and hanging structures for the CFBC was completed. Third, the laboratory-scale simulated fluidized-bed facility was modified after completing a series of pretests. The two problems identified during the pretests were solved. Fourth, the carbonization of chicken waste and coal was investigated in a tube furnace and a Thermogravimetric Analyzer (TGA). The experimental results from this study are presented in this report. Finally, the proposed work for the next quarter is outlined in this report.
PhysiCell: An open source physics-based cell simulator for 3-D multicellular systems.
Ghaffarizadeh, Ahmadreza; Heiland, Randy; Friedman, Samuel H; Mumenthaler, Shannon M; Macklin, Paul
2018-02-01
Many multicellular systems problems can only be understood by studying how cells move, grow, divide, interact, and die. Tissue-scale dynamics emerge from systems of many interacting cells as they respond to and influence their microenvironment. The ideal "virtual laboratory" for such multicellular systems simulates both the biochemical microenvironment (the "stage") and many mechanically and biochemically interacting cells (the "players" upon the stage). PhysiCell, a physics-based multicellular simulator, is an open source agent-based simulator that provides both the stage and the players for studying many interacting cells in dynamic tissue microenvironments. It builds upon a multi-substrate biotransport solver to link cell phenotype to multiple diffusing substrates and signaling factors. It includes biologically driven sub-models for cell cycling, apoptosis, necrosis, solid and fluid volume changes, mechanics, and motility "out of the box." The C++ code has minimal dependencies, making it simple to maintain and deploy across platforms. PhysiCell has been parallelized with OpenMP, and its performance scales linearly with the number of cells. Simulations up to 10^5-10^6 cells are feasible on quad-core desktop workstations; larger simulations are attainable on single HPC compute nodes. We demonstrate PhysiCell by simulating the impact of necrotic core biomechanics, 3-D geometry, and stochasticity on the dynamics of hanging drop tumor spheroids and ductal carcinoma in situ (DCIS) of the breast. We demonstrate stochastic motility, chemical and contact-based interaction of multiple cell types, and the extensibility of PhysiCell with examples in synthetic multicellular systems (a "cellular cargo delivery" system, with application to anti-cancer treatments), cancer heterogeneity, and cancer immunology. PhysiCell is a powerful multicellular systems simulator that will be continually improved with new capabilities and performance improvements.
It also represents a significant independent code base for replicating results from other simulation platforms. The PhysiCell source code, examples, documentation, and support are available under the BSD license at http://PhysiCell.MathCancer.org and http://PhysiCell.sf.net.
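The "stage and players" coupling the abstract describes can be illustrated with a toy agent-based sketch (this is not the PhysiCell API; grid size, rates, and thresholds are all invented for illustration): a diffusing substrate field plus discrete cells that consume it and divide.

```python
import numpy as np

# Toy illustration (NOT the PhysiCell API) of the two coupled pieces an
# agent-based tissue simulator needs: a diffusing substrate field (the
# "stage") and discrete cells that consume it and divide (the "players").

rng = np.random.default_rng(0)
n = 50
oxygen = np.ones((n, n))            # normalized substrate field
cells = [(25, 25)]                  # cell positions on the grid

D, dt, uptake, div_threshold = 0.2, 1.0, 0.05, 0.5

for step in range(200):
    # Substrate diffusion: 5-point stencil via roll (periodic wrap),
    # then the outer rim is reset to the far-field value.
    lap = (np.roll(oxygen, 1, 0) + np.roll(oxygen, -1, 0)
           + np.roll(oxygen, 1, 1) + np.roll(oxygen, -1, 1) - 4 * oxygen)
    oxygen += dt * D * lap
    oxygen[0, :], oxygen[-1, :], oxygen[:, 0], oxygen[:, -1] = 1, 1, 1, 1

    new_cells = []
    for (i, j) in cells:
        oxygen[i, j] = max(0.0, oxygen[i, j] - uptake)   # consumption
        if oxygen[i, j] > div_threshold and len(cells) + len(new_cells) < 500:
            di, dj = rng.integers(-1, 2, size=2)          # daughter nearby
            new_cells.append((int(np.clip(i + di, 1, n - 2)),
                              int(np.clip(j + dj, 1, n - 2))))
    cells.extend(new_cells)

# Proliferation continues where substrate is replenished while the crowded
# interior is depleted -- the qualitative origin of a necrotic core.
```

Real simulators like PhysiCell add cell mechanics, volume changes, and phenotype sub-models on top of this substrate-agent loop; the sketch only shows why the biotransport solver and the agents must be co-evolved.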
NASA Technical Reports Server (NTRS)
Aguilar, R.
2006-01-01
Pratt & Whitney Rocketdyne has developed a real-time engine/vehicle system integrated health management laboratory, or testbed, for developing and testing health management system concepts. This laboratory simulates components of an integrated system such as the rocket engine, rocket engine controller, vehicle or test controller, as well as a health management computer on separate general purpose computers. These general purpose computers can be replaced with more realistic components such as actual electronic controllers and valve actuators for hardware-in-the-loop simulation. Various engine configurations and propellant combinations are available. On-the-fly fault and failure insertion, using direct memory insertion from a user console, is used to test system detection and response. The laboratory is currently capable of simulating the flow path of a single rocket engine, but work is underway to include structural and multiengine simulation capability as well as a dedicated data acquisition system. The ultimate goal is to simulate as accurately and realistically as possible the environment in which the health management system will operate, including noise, dynamic response of the engine/engine controller, sensor time delays, and asynchronous operation of the various components. The rationale for the laboratory is also discussed, including the limited alternatives for demonstrating the effectiveness and safety of a flight system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lundstrom, Blake R.; Palmintier, Bryan S.; Rowe, Daniel
Electric system operators are increasingly concerned with the potential system-wide impacts of the large-scale integration of distributed energy resources (DERs) including voltage control, protection coordination, and equipment wear. This prompts a need for new simulation techniques that can simultaneously capture all the components of these large integrated smart grid systems. This paper describes a novel platform that combines three emerging research areas: power systems co-simulation, power hardware in the loop (PHIL) simulation, and lab-lab links. The platform is distributed, real-time capable, allows for easy internet-based connection from geographically-dispersed participants, and is software platform agnostic. We demonstrate its utility by studying real-time PHIL co-simulation of coordinated solar PV firming control of two inverters connected in multiple electric distribution network models, prototypical of U.S. and Australian systems. Here, the novel trans-pacific closed-loop system simulation was conducted in real-time using a power network simulator and physical PV/battery inverter at power at the National Renewable Energy Laboratory in Golden, CO, USA and a physical PV inverter at power at the Commonwealth Scientific and Industrial Research Organisation's Energy Centre in Newcastle, NSW, Australia. This capability enables smart grid researchers throughout the world to leverage their unique simulation capabilities for multi-site collaborations that can effectively simulate and validate emerging smart grid technology solutions.
2017-07-24
NASA Astrophysics Data System (ADS)
Bader, D. C.
2015-12-01
The Accelerated Climate Modeling for Energy (ACME) Project is concluding its first year. Supported by the Office of Science in the U.S. Department of Energy (DOE), its vision is to be "an ongoing, state-of-the-science Earth system modeling, simulation and prediction project that optimizes the use of DOE laboratory resources to meet the science needs of the nation and the mission needs of DOE." Included in the "laboratory resources" is a large investment in computational, network and information technologies that will be utilized both to build better and more accurate climate models and to broadly disseminate the data they generate. Current model diagnostic analysis and data dissemination technologies will not scale to the size of the simulations and the complexity of the models envisioned by ACME and other top-tier international modeling centers. In this talk, the ACME Workflow component's plans to meet these future needs will be described and early implementation examples will be highlighted.
Development of a Scale-up Tool for Pervaporation Processes
Thiess, Holger; Strube, Jochen
2018-01-01
In this study, an engineering tool for the design and optimization of pervaporation processes is developed based on physico-chemical modelling coupled with laboratory/mini-plant experiments. The model incorporates the solution-diffusion-mechanism, polarization effects (concentration and temperature), axial dispersion, pressure drop and the temperature drop in the feed channel due to vaporization of the permeating components. The permeance, being the key model parameter, was determined via dehydration experiments on a mini-plant scale for the binary mixtures ethanol/water and ethyl acetate/water. A second set of experimental data was utilized for the validation of the model for two chemical systems. The industrially relevant ternary mixture, ethanol/ethyl acetate/water, was investigated close to its azeotropic point and compared to a simulation conducted with the determined binary permeance data. Experimental and simulation data proved to agree very well for the investigated process conditions. In order to test the scalability of the developed engineering tool, large-scale data from an industrial pervaporation plant used for the dehydration of ethanol was compared to a process simulation conducted with the validated physico-chemical model. Since the membranes employed in both mini-plant and industrial scale were of the same type, the permeance data could be transferred. The comparison of the measured and simulated data proved the scalability of the derived model. PMID:29342956
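The solution-diffusion mechanism at the core of such a model expresses each component's partial flux as a permeance times the driving force between feed-side fugacity and permeate-side partial pressure. A minimal sketch, using standard Antoine constants for water and a hypothetical permeance value (not the paper's fitted data):

```python
# Minimal solution-diffusion flux sketch for pervaporation dehydration.
# The permeance value and process conditions are illustrative assumptions,
# not the fitted data from the study.

def antoine_p_sat(A, B, C, T):
    """Saturation pressure from the Antoine equation (here mmHg, T in K)."""
    return 10.0 ** (A - B / (T + C))

def water_flux(x_w, gamma_w, T, p_perm, permeance):
    """Partial flux = permeance * (feed-side fugacity - permeate partial pressure).

    x_w:      water mole fraction in the feed
    gamma_w:  activity coefficient of water in the feed mixture
    p_perm:   water partial pressure on the permeate side, mmHg
    permeance: illustrative units of kg m^-2 h^-1 mmHg^-1
    """
    # Antoine constants for water, shifted so T is in kelvin, output in mmHg.
    p_sat = antoine_p_sat(8.07131, 1730.63, -39.724, T)
    driving_force = x_w * gamma_w * p_sat - p_perm
    return permeance * driving_force

# Dehydration of an organic-rich feed at 80 C with a low permeate pressure:
J = water_flux(x_w=0.1, gamma_w=2.3, T=353.15, p_perm=20.0, permeance=2e-3)
```

The same flux expression also shows why concentration and temperature polarization matter at scale: both reduce the feed-side fugacity term, so the model in the paper corrects the bulk conditions before evaluating it.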
LASSIE: simulating large-scale models of biochemical systems on GPUs.
Tangherloni, Andrea; Nobile, Marco S; Besozzi, Daniela; Mauri, Giancarlo; Cazzaniga, Paolo
2017-05-10
Mathematical modeling and in silico analysis are widely acknowledged as complementary tools to biological laboratory methods, to achieve a thorough understanding of emergent behaviors of cellular processes in both physiological and perturbed conditions. However, the simulation of large-scale models, consisting of hundreds or thousands of reactions and molecular species, can rapidly overtake the capabilities of Central Processing Units (CPUs). The purpose of this work is to exploit alternative high-performance computing solutions, such as Graphics Processing Units (GPUs), to allow the investigation of these models at reduced computational costs. LASSIE is a "black-box" GPU-accelerated deterministic simulator, specifically designed for large-scale models and not requiring any expertise in mathematical modeling, simulation algorithms or GPU programming. Given a reaction-based model of a cellular process, LASSIE automatically generates the corresponding system of Ordinary Differential Equations (ODEs), assuming mass-action kinetics. The numerical solution of the ODEs is obtained by automatically switching between the Runge-Kutta-Fehlberg method in the absence of stiffness, and the Backward Differentiation Formulae of first order in the presence of stiffness. The computational performance of LASSIE is assessed using a set of randomly generated synthetic reaction-based models of increasing size, ranging from 64 to 8192 reactions and species, and compared to a CPU implementation of the LSODA numerical integration algorithm. LASSIE adopts a novel fine-grained parallelization strategy to distribute across the GPU cores all the calculations required to solve the system of ODEs. By virtue of this implementation, LASSIE achieves up to 92× speed-up with respect to LSODA, reducing the running time from approximately 1 month down to 8 h to simulate models consisting of, for instance, four thousand reactions and species.
Notably, thanks to its smaller memory footprint, LASSIE is able to perform fast simulations of even larger models, for which the tested CPU implementation of LSODA failed to reach termination. LASSIE is therefore expected to represent an important breakthrough for Systems Biology applications, enabling faster and more in-depth computational analyses of large-scale models of complex biological systems.
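The reaction-to-ODE translation that LASSIE automates can be sketched for a toy network (species names, rate constants, and stoichiometry are all illustrative), here solved with a plain fixed-step Runge-Kutta 4 integrator in place of the adaptive RKF/BDF pair:

```python
import numpy as np

# Sketch of how a reaction-based model becomes a mass-action ODE system,
# the transformation LASSIE performs automatically. Toy network:
#   R1: A + B -> C  (k1)    R2: C -> A + B  (k2)    R3: C -> D  (k3)

# Stoichiometric matrix: rows = species (A, B, C, D), columns = reactions.
S = np.array([[-1,  1,  0],
              [-1,  1,  0],
              [ 1, -1, -1],
              [ 0,  0,  1]], dtype=float)
k = np.array([0.5, 0.1, 0.2])

def rates(x):
    """Mass-action rates of the three reactions at state x = (A, B, C, D)."""
    a, b, c, d = x
    return k * np.array([a * b, c, c])

def rhs(x):
    """dx/dt = S @ v(x), the generic mass-action ODE form."""
    return S @ rates(x)

def rk4(x, dt, steps):
    """Classic fixed-step fourth-order Runge-Kutta integrator."""
    for _ in range(steps):
        k1 = rhs(x)
        k2 = rhs(x + 0.5 * dt * k1)
        k3 = rhs(x + 0.5 * dt * k2)
        k4 = rhs(x + dt * k3)
        x = x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
    return x

x_final = rk4(np.array([1.0, 1.0, 0.0, 0.0]), dt=0.01, steps=5000)
# By the stoichiometry above, the combination A + C + D is conserved.
```

For thousands of species this right-hand side becomes the bottleneck, which is exactly the part LASSIE distributes across GPU cores.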
NASA Astrophysics Data System (ADS)
Lin, S. J.
2015-12-01
The NOAA/Geophysical Fluid Dynamics Laboratory has been developing a unified regional-global modeling system with variable-resolution capabilities that can be used for severe weather predictions (e.g., tornado outbreak events and category-5 hurricanes) and ultra-high-resolution (1-km) regional climate simulations within a consistent global modeling framework. The foundation of this flexible regional-global modeling system is the non-hydrostatic extension of the vertically Lagrangian dynamical core (Lin 2004, Monthly Weather Review), known in the community as FV3 (finite-volume on the cubed-sphere). Because of its flexibility and computational efficiency, FV3 is one of the final candidates for NOAA's Next Generation Global Prediction System (NGGPS). We have built into the modeling system a stretched (single) grid capability, a two-way (regional-global) multiple nested grid capability, and the combination of the stretched and two-way nests, so as to make convection-resolving regional climate simulation within a consistent global modeling system feasible on today's high-performance computing systems. One of our main scientific goals is to enable simulations of high-impact weather phenomena (such as tornadoes, thunderstorms, and category-5 hurricanes) within an IPCC-class climate modeling system, something previously regarded as impossible. In this presentation I will demonstrate that it is computationally feasible to simulate not only supercell thunderstorms but also the subsequent genesis of tornadoes using a global model that was originally designed for century-long climate simulations. As a unified weather-climate modeling system, we evaluated the performance of the model with horizontal resolution ranging from 1 km to as low as 200 km.
In particular, for downscaling studies, we have developed various tests to ensure that the large-scale circulation within the global variable-resolution system is well simulated while at the same time the small scales are accurately captured within the targeted high-resolution region.
NASA Astrophysics Data System (ADS)
Wosnik, Martin; Bachant, Peter
2016-11-01
Cross-flow turbines show potential in marine hydrokinetic (MHK) applications. A research focus is on accurately predicting device performance and wake evolution to improve turbine array layouts for maximizing overall power output, i.e., minimizing wake interference or taking advantage of constructive wake interaction. Experiments were carried out with large laboratory-scale cross-flow turbines with diameters of order 1 m, using a turbine test bed in a large cross-section tow tank designed to achieve sufficiently high Reynolds numbers for the results to be Reynolds number independent with respect to turbine performance and wake statistics, such that they can be reliably extrapolated to full scale and used for model validation. Several turbines of varying solidity were employed, including the UNH Reference Vertical Axis Turbine (RVAT) and a 1:6 scale model of the DOE-Sandia Reference Model 2 (RM2) turbine. To improve parameterization in array simulations, an actuator line model (ALM) was developed to provide a computationally feasible method for simulating full turbine arrays inside Navier-Stokes models. Results are presented for the simulation of performance and wake dynamics of cross-flow turbines and compared with experiments and body-fitted mesh, blade-resolving CFD. Supported by NSF-CBET Grant 1150797, Sandia National Laboratories.
Modeling Supernova Shocks with Intense Lasers.
NASA Astrophysics Data System (ADS)
Blue, Brent
2006-04-01
Large-scale directional outflows of supersonic plasma are ubiquitous phenomena in astrophysics, with specific application to supernovae. The traditional approach to understanding such phenomena is through theoretical analysis and numerical simulations. However, theoretical analysis might not capture all the relevant physics and numerical simulations have limited resolution and fail to scale correctly in Reynolds number and perhaps other key dimensionless parameters. Recent advances in high energy density physics using large inertial confinement fusion devices now allow controlled laboratory experiments on macroscopic volumes of plasma of direct relevance to astrophysics. This talk will present an overview of these facilities as well as results from current laboratory astrophysics experiments designed to study hydrodynamic jets and Rayleigh-Taylor mixing. This work is performed under the auspices of the U. S. DOE by Lawrence Livermore National Laboratory under Contract No. W-7405-ENG-48, Los Alamos National Laboratory under Contract No. W-7405-ENG-36, and the Laboratory for Laser Energetics under Contract No. DE-FC03-92SF19460.
Innovative mathematical modeling in environmental remediation.
Yeh, Gour-Tsyh; Gwo, Jin-Ping; Siegel, Malcolm D; Li, Ming-Hsu; Fang, Yilin; Zhang, Fan; Luo, Wensui; Yabusaki, Steve B
2013-05-01
There are two different ways to model reactive transport: ad hoc and innovative reaction-based approaches. The former, such as the Kd simplification of adsorption, has been widely employed by practitioners, while the latter has been used mainly in scientific communities for elucidating mechanisms of biogeochemical transport processes. It is believed that innovative mechanistic-based models could serve as protocols for environmental remediation as well. This paper reviews the development of a mechanistically coupled fluid flow, thermal transport, hydrologic transport, and reactive biogeochemical model, and example applications to environmental remediation problems. Theoretical bases are sufficiently described. Four example problems previously carried out are used to demonstrate how numerical experimentation can be used to evaluate the feasibility of different remediation approaches. The first one involved the application of a 56-species uranium tailing problem to the Melton Branch Subwatershed at Oak Ridge National Laboratory (ORNL) using the parallel version of the model. Simulations were made to demonstrate the potential mobilization of uranium and other chelating agents in the proposed waste disposal site. The second problem simulated a laboratory-scale system to investigate the role of natural attenuation in potential off-site migration of uranium from uranium mill tailings after restoration. It showed the inadequacy of using a single Kd even for a homogeneous medium. The third example simulated laboratory experiments involving extremely high concentrations of uranium, technetium, aluminum, nitrate, and toxic metals (e.g., Ni, Cr, Co). The fourth example modeled microbially mediated immobilization of uranium in an unconfined aquifer using acetate amendment in a field-scale experiment.
The purposes of these modeling studies were to simulate various mechanisms of mobilization and immobilization of radioactive wastes and to illustrate how to apply reactive transport models for environmental remediation.
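The Kd simplification criticized in the second example folds adsorption into a single linear retardation factor, R = 1 + rho_b * Kd / theta. A few lines with illustrative soil parameters show how strongly the predicted transport velocity depends on the assumed Kd, which is why one lumped value fails when geochemistry varies along a flow path:

```python
# The "Kd approach" reduces adsorption to one linear coefficient, giving a
# constant retardation factor for solute transport. Parameter values below
# (bulk density, porosity, Kd range) are illustrative, not site data.

def retardation(kd_ml_g, bulk_density=1.6, porosity=0.35):
    """Retardation factor R = 1 + rho_b * Kd / theta.

    kd_ml_g:      distribution coefficient in mL/g (= cm^3/g)
    bulk_density: dry bulk density in g/cm^3
    porosity:     volumetric water content (saturated)
    """
    return 1.0 + bulk_density * kd_ml_g / porosity

# Uranium Kd can vary by orders of magnitude with pH and carbonate
# chemistry; the relative plume velocity is v_solute / v_water = 1 / R.
velocities = {kd: 1.0 / retardation(kd) for kd in (0.1, 1.0, 10.0, 100.0)}
```

A mechanistic reaction-based model instead computes speciation and surface reactions locally, so the effective retardation emerges from the chemistry rather than being imposed as one constant.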
MODELING SUPERSONIC-JET DEFLECTION IN THE HERBIG–HARO 110-270 SYSTEM WITH HIGH-POWER LASERS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yuan, Dawei; Li, Yutong; Lu, Xin
Herbig–Haro (HH) objects associated with newly born stars are typically characterized by two high Mach number jets ejected in opposite directions. However, HH 110 appears to have only a single jet instead of two. Recently, Kajdi et al. measured the proper motions of knots in the whole system and noted that HH 110 is a continuation of the nearby HH 270. It has been shown that HH 270 collides with the surrounding medium and is deflected by 58°, reshaping itself as HH 110. Although the scales of the astrophysical objects are very different from the plasmas created in the laboratory, similarity criteria of physical processes allow us to simulate the jet deflection in the HH 110/270 system in the laboratory with high-power lasers. A controllable and repeatable laboratory experiment can give us insight into the deflection behavior. Here we show a well-downscaled experiment in which a laser-produced supersonic jet is deflected by 55° when colliding with a nearby orthogonal side-flow. We also present a two-dimensional hydrodynamic simulation with the Euler program LARED-S to reproduce the deflection. Both are in good agreement. Our results show that the large deflection angle formed in the HH 110/270 system is probably due to ram pressure, consistent with a flow–flow collision model.
Small-scale impacts as potential trigger for landslides on small Solar system bodies
NASA Astrophysics Data System (ADS)
Hofmann, Marc; Sierks, Holger; Blum, Jürgen
2017-07-01
We conducted a set of experiments to investigate whether millimetre-sized impactors impinging on a granular material at several m s^-1 are able to trigger avalanches on small, atmosphereless planetary bodies. These experiments were carried out at the Zentrum für angewandte Raumfahrttechnologie und Mikrogravitation (ZARM) drop tower facility in Bremen, Germany to provide a reduced-gravity environment. Additional data were gathered at Earth gravity levels in the laboratory. As sample materials we used a ground howardite-eucrite-diogenite (HED) meteorite and the Johnson Space Center (JSC) Mars-1 Martian soil simulant. We found that this type of small-scale impact can trigger avalanches with a moderate probability if the target material is tilted to an angle close to the angle of repose. We additionally simulated a small-scale impact using the discrete element method code esys-particle. These simulations show that energy transfer from the impactor to the target material is most efficient at low and moderate impactor inclinations, and that the transferred energy is retained in particles close to the surface due to a rapid dissipation of energy in lower material layers driven by inelastic collisions. Through Monte Carlo simulations we estimate the time-scale on which small-scale impacts with the observed characteristics will trigger avalanches covering all steep slopes on the surface of a small planetary body to be of the order of 10^5 yr.
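The final Monte Carlo estimate is essentially a coupon-collector problem: impacts land at random, and the clock stops when every steep-slope patch has been triggered at least once. A sketch under stated toy assumptions (patch count, impact rate, and trigger probability are all invented for illustration, not the paper's values):

```python
import numpy as np

# Back-of-the-envelope Monte Carlo in the spirit of the resurfacing
# timescale estimate. All numbers below are illustrative assumptions.

rng = np.random.default_rng(42)

n_patches = 200       # steep-slope patches near the angle of repose
rate_per_yr = 1e-3    # candidate impacts per year over the whole surface
p_trigger = 0.3       # probability a hit actually triggers an avalanche

def years_until_all_hit():
    """Draw Poisson-process impacts until every patch has avalanched once."""
    hit = np.zeros(n_patches, dtype=bool)
    t = 0.0
    while not hit.all():
        t += rng.exponential(1.0 / rate_per_yr)  # waiting time to next impact
        patch = rng.integers(n_patches)          # impacts land uniformly
        if rng.random() < p_trigger:
            hit[patch] = True
    return t

samples = [years_until_all_hit() for _ in range(20)]
t_mean = float(np.mean(samples))
# Coupon-collector scaling: roughly n * ln(n) / (rate * p_trigger) years,
# a few 10^6 yr with these toy numbers; the paper's inputs give ~10^5 yr.
```

The scaling line shows why the answer is dominated by the rarest event, the last unhit patch, rather than by the mean impact interval.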
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nilsson, Mikael
This 3-year project was a collaboration between the University of California, Irvine (UC Irvine), Pacific Northwest National Laboratory (PNNL), Idaho National Laboratory (INL), Argonne National Laboratory (ANL), and an international collaborator at Forschungszentrum Jülich (FZJ). The project was led from UC Irvine under the direction of Profs. Mikael Nilsson and Hung Nguyen. The leads at PNNL, INL, ANL and FZJ were Dr. Liem Dang, Dr. Peter Zalupski, Dr. Nathaniel Hoyt and Dr. Giuseppe Modolo, respectively. Involved in this project at UC Irvine were three full-time PhD graduate students, Tro Babikian, Ted Yoo, and Quynh Vo, and one MS student, Alba Font Bosch. The overall objective of this project was to study how the kinetics and thermodynamics of metal ion extraction can be described by molecular dynamics (MD) simulations and how the simulations can be validated by experimental data. Furthermore, the project included applied separation work, testing the extraction systems in a single-stage annular centrifugal contactor and coupling the experimental data with computational fluid dynamics (CFD) simulations. Specific objectives of the proposed research were to: (1) study and establish a rigorous connection between MD simulations based on polarizable force fields and extraction thermodynamic and kinetic data; (2) compare and validate CFD simulations of extraction processes for An/Ln separation using different sizes (and types) of annular centrifugal contactors; and (3) provide a theoretical/simulation and experimental base for scale-up of batch-wise extraction to continuous contactors. We approached objectives 1 and 2 in parallel. For objective 1 we started by studying a well-established extraction system with a relatively simple extraction mechanism, namely tributyl phosphate (TBP). We found that well-optimized simulations can inform experiments, and new information on TBP behavior emerged from this project, as will be discussed below.
The second objective proved a larger challenge, and most of the efforts were devoted to experimental studies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ebrahimi, Fatima
2014-07-31
Large-scale magnetic fields have been observed in widely different types of astrophysical objects. These magnetic fields are believed to be caused by the so-called dynamo effect. Could a large-scale magnetic field grow out of turbulence (i.e., the alpha dynamo effect)? How could the topological properties and complexity of the magnetic field as a global quantity, the so-called magnetic helicity, be important in the dynamo effect? In addition to understanding the dynamo mechanism in astrophysical accretion disks, anomalous angular momentum transport has also been a longstanding problem in accretion disks and laboratory plasmas. To investigate both dynamo and momentum transport, we have performed both numerical modeling of laboratory experiments that are intended to simulate nature and modeling of configurations with direct relevance to astrophysical disks. Our simulations use fluid approximations (the magnetohydrodynamics, or MHD, model), where plasma is treated as a single fluid, or two fluids, in the presence of electromagnetic forces. Our major physics objective is to study the possibility of magnetic field generation (so-called MRI small-scale and large-scale dynamos) and its role in magneto-rotational instability (MRI) saturation through nonlinear simulations in both MHD and Hall regimes.
Virus elimination in activated sludge systems: from batch tests to mathematical modeling.
Haun, Emma; Ulbricht, Katharina; Nogueira, Regina; Rosenwinkel, Karl-Heinz
2014-01-01
A virus tool based on Activated Sludge Model No. 3 for modeling virus elimination in activated sludge systems was developed and calibrated with the results from laboratory-scale batch tests and from measurements in a municipal wastewater treatment plant (WWTP). Somatic coliphages were used as an indicator for human pathogenic enteric viruses. The extended model was used to simulate the virus concentration in batch tests and in a municipal full-scale WWTP under steady-state and dynamic conditions. The experimental and modeling results suggest that both adsorption and inactivation processes, modeled as reversible first-order reactions, contribute to virus elimination in activated sludge systems. The model should be a useful tool to estimate the number of viruses entering water bodies from the discharge of treated effluents.
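The adsorption/inactivation picture described in this abstract can be sketched as a small ODE system. This is a hypothetical toy, not the calibrated ASM3 extension from the paper; the rate constants and initial counts are illustrative only.

```python
# Hypothetical sketch: virus elimination as reversible first-order
# adsorption of free phages onto sludge flocs plus first-order
# inactivation of the adsorbed phages. Explicit Euler integration;
# rates in 1/h, time in hours. All parameter values are illustrative.
def simulate_virus(c_free=1e6, c_ads=0.0, k_ads=0.5, k_des=0.1,
                   k_inact=0.2, dt=0.01, t_end=24.0):
    steps = int(t_end / dt)
    for _ in range(steps):
        adsorb = k_ads * c_free   # free phages attaching to flocs
        desorb = k_des * c_ads    # reversible release back to bulk
        inact = k_inact * c_ads   # decay of adsorbed phages
        c_free += dt * (desorb - adsorb)
        c_ads += dt * (adsorb - desorb - inact)
    return c_free, c_ads
```

Only the inactivation term removes viruses from the system, so the total count declines monotonically while adsorption and desorption redistribute phages between the bulk liquid and the flocs.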
Astromaterials Research Office (KR) Overview
NASA Technical Reports Server (NTRS)
Draper, David S.
2014-01-01
The fundamental goal of our research is to understand the origin and evolution of the solar system, particularly the terrestrial, "rocky" bodies. Our research involves analysis of, and experiments on, astromaterials in order to understand their nature, sources, and processes of formation. Our state-of-the-art analytical laboratories include four electron microbeam laboratories for mineral analysis, four spectroscopy laboratories for chemical and mineralogical analysis, and four mass spectrometry laboratories for isotopic analysis. Other facilities include the experimental impact laboratory and both 1-atm gas mixing and high-pressure experimental petrology laboratories. Recent research has emphasized a diverse range of topics, including: Study of the solar system's primitive materials, such as carbonaceous chondrites and interplanetary dust; Study of early solar system chronology using short-lived radioisotopes and early nebular processes through detailed geochemical and isotopic characterizations; Study of large-scale planetary differentiation and evolution via siderophile and incompatible trace element partitioning, magma ocean crystallization simulations, and isotopic systematics; Study of the petrogenesis of Martian meteorites through petrographic, isotopic, chemical, and experimental melting and crystallization studies; Interpretation of remote sensing data, especially from current robotic lunar and Mars missions, and study of terrestrial analog materials; Study of the role of organic geochemical processes in the evolution of astromaterials and the extent to which they constrain the potential for habitability and the origin of life.
NASA Astrophysics Data System (ADS)
Bershadskii, A.
1994-10-01
The quantitative (scaling) results of a recent lattice-gas simulation of granular flows [1] are interpreted in terms of the Kolmogorov-Obukhov approach revised for strongly space-intermittent systems. The renormalised power spectrum with exponent '-4/3' appears to be a universal spectrum of scalar fluctuations convected by stochastic velocity fields in dissipative systems with inverse energy transfer (some other laboratory and geophysical turbulent flows with this power spectrum, as well as an analogy between this phenomenon and turbulent percolation on an elastic backbone, are pointed out).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rearden, Bradley T.; Jessee, Matthew Anderson
The SCALE Code System is a widely used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor physics, radiation shielding, radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including 3 deterministic and 3 Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE’s graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results. SCALE 6.2 represents one of the most comprehensive revisions in the history of SCALE, providing several new capabilities and significant improvements in many existing features.
Full-scale fatigue tests of CX-100 wind turbine blades. Part I: testing
NASA Astrophysics Data System (ADS)
Farinholt, Kevin M.; Taylor, Stuart G.; Park, Gyuhae; Ammerman, Curtt M.
2012-04-01
This paper overviews the test setup and experimental methods for structural health monitoring (SHM) of two 9-meter CX-100 wind turbine blades that underwent fatigue loading at the National Renewable Energy Laboratory's (NREL) National Wind Technology Center (NWTC). The first blade was a pristine blade, which was manufactured to standard specifications for the CX-100 design. The second blade was manufactured for the University of Massachusetts, Lowell with intentional simulated defects within the fabric layup. Each blade was instrumented with piezoelectric transducers, accelerometers, acoustic emission sensors, and foil strain gauges. The blades underwent harmonic excitation at their first natural frequency using the Universal Resonant Excitation (UREX) system at NREL. Blades were initially excited at 25% of their design load, and then with steadily increasing loads until each blade reached failure. Data from the sensors were collected between and during fatigue loading sessions. The data were measured over multi-scale frequency ranges using a variety of acquisition equipment, including off-the-shelf systems and specially designed hardware developed at Los Alamos National Laboratory (LANL). The hardware systems were evaluated for their aptness in data collection for effective application of SHM methods to the blades. The results of this assessment will inform the selection of acquisition hardware and sensor types to be deployed on a CX-100 flight test to be conducted in collaboration with Sandia National Laboratory at the U.S. Department of Agriculture's (USDA) Conservation and Production Research Laboratory (CPRL) in Bushland, Texas.
NASA Astrophysics Data System (ADS)
Schlegel, N.; Seroussi, H. L.; Boening, C.; Larour, E. Y.; Limonadi, D.; Schodlok, M.; Watkins, M. M.
2017-12-01
The Jet Propulsion Laboratory-University of California at Irvine Ice Sheet System Model (ISSM) is a thermo-mechanical 2D/3D parallelized finite element software used to physically model the continental-scale flow of ice at high resolutions. Embedded into ISSM are uncertainty quantification (UQ) tools, based on the Design Analysis Kit for Optimization and Terascale Applications (DAKOTA) software. ISSM-DAKOTA offers various UQ methods for the investigation of how errors in model input impact uncertainty in simulation results. We utilize these tools to regionally sample model input and key parameters, based on specified bounds of uncertainty, and run a suite of continental-scale 100-year ISSM forward simulations of the Antarctic Ice Sheet. Resulting diagnostics (e.g., spread in local mass flux and regional mass balance) inform our conclusion about which parameters and/or forcings have the greatest impact on century-scale model simulations of ice sheet evolution. The results allow us to prioritize the key datasets and measurements that are critical for the minimization of ice sheet model uncertainty. Overall, we find that Antarctica's total sea level contribution is strongly affected by grounding line retreat, which is driven by the magnitude of ice shelf basal melt rates and by errors in bedrock topography. In addition, results suggest that after 100 years of simulation, Thwaites glacier is the most significant source of model uncertainty, and its drainage basin has the largest potential for future sea level contribution. This work is performed at and supported by the California Institute of Technology's Jet Propulsion Laboratory. Supercomputing time is also supported through a contract with the National Aeronautics and Space Administration's Cryosphere program.
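The sampling workflow behind this kind of UQ study can be sketched as a simple loop: draw inputs uniformly within stated uncertainty bounds, run a forward model for each draw, and summarize the spread of a diagnostic. The parameter names and the toy "forward model" below are illustrative stand-ins, not ISSM or DAKOTA APIs.

```python
import random

# Hypothetical sketch of a sampling-based UQ loop: uniform draws within
# uncertainty bounds, a forward-model evaluation per draw, and a spread
# diagnostic. The linear toy model stands in for a continental-scale
# ice-sheet run; parameter names are assumptions for illustration.
def sample_uniform(bounds, n, seed=0):
    rng = random.Random(seed)
    return [{k: rng.uniform(lo, hi) for k, (lo, hi) in bounds.items()}
            for _ in range(n)]

def toy_forward_model(p):
    # Placeholder diagnostic: "mass flux" responding to two inputs.
    return 2.0 * p["basal_melt_scale"] + 0.5 * p["bed_topo_error"]

bounds = {"basal_melt_scale": (0.5, 1.5), "bed_topo_error": (-1.0, 1.0)}
samples = sample_uniform(bounds, n=200)
outputs = [toy_forward_model(p) for p in samples]
spread = max(outputs) - min(outputs)
```

Ranking parameters by how much of the output spread they explain (e.g., via one-at-a-time sampling or variance decomposition) is what lets a study like this prioritize which datasets most reduce model uncertainty.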
EMISSIONS OF AIR TOXICS FROM A SIMULATED CHARCOAL KILN EQUIPPED WITH AN AFTERBURNER
The report discusses emissions of air toxics from a simulated charcoal kiln equipped with an afterburner. A laboratory-scale simulator was constructed and tested to determine if it could be used to produce charcoal that was similar to that produced in Missouri-type charcoal kilns...
NASA Astrophysics Data System (ADS)
Arnold, B. W.; Lee, C.; Ma, C.; Knowlton, R. G.
2006-12-01
Taiwan is evaluating representative sites for the potential disposal of low-level radioactive waste (LLW), including consideration of shallow land burial and cavern disposal concepts. A representative site for shallow land burial is on a small island in the Taiwan Strait with basalt bedrock. The shallow land burial concept includes an engineered cover to limit infiltration into the waste disposal cell. A representative site for cavern disposal is located on the southeast coast of Taiwan. The tunnel system for this disposal concept would be several hundred meters below the mountainous land surface in argillite bedrock. The LLW will consist of about 966,000 drums, primarily from the operation and decommissioning of four nuclear power plants. Sandia National Laboratories and the Institute of Nuclear Energy Research have collaborated to develop performance assessment models to evaluate the long-term safety of LLW disposal at these representative sites. Important components of the system models are sub-models of groundwater flow in the natural system and infiltration through the engineered cover for the shallow land burial concept. The FEHM software code was used to simulate groundwater flow in three-dimensional models at both sites. In addition, a higher-resolution two-dimensional model was developed to simulate flow through the engineered tunnel system at the cavern site. The HELP software was used to simulate infiltration through the cover at the island site. The primary objective of these preliminary models is to provide a modeling framework, given the lack of site-specific data and detailed engineering design specifications. The steady-state groundwater flow model at the island site uses a specified recharge boundary at the land surface and specified head at the island shoreline. 
Simulated groundwater flow vectors are extracted from the FEHM model along a cross section through one of the LLW disposal cells for utilization in radionuclide transport simulations in the performance assessment model with the BLT-MS software. Infiltration through the engineered cover is simulated to be about 3 mm/yr and 49 mm/yr, with and without a geomembrane layer, respectively. For the cavern LLW disposal site, the FEHM basin-scale flow model uses specified recharge flux, constant head at the ocean shoreline, and head-dependent flux boundaries along flowing streams. Groundwater flow vectors are extracted along a cross section for use in radionuclide transport simulations. Transport simulations indicate that a significant fraction of contaminants may ultimately discharge to nearby streams. FEHM flow simulations with the drift-scale model indicate that the flow rates within the backfilled tunnels may be more than two orders of magnitude lower than in the host rock. Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy under Contract DE-AC04-94AL85000.
NEW TECHNOLOGY AND PEC PROCESS - COLUMBUS, GA
The presentation will discuss Columbus, Georgia’s Biosolids Flow-through Thermophilic Treatment (BFT3) Process. Site-specific equivalency requires proof. Laboratory-scale pathogen testing must exceed Class A performance criteria while simulating full scale as closely as pos...
NASA Astrophysics Data System (ADS)
Nora, R.; Field, J. E.; Peterson, J. Luc; Spears, B.; Kruse, M.; Humbird, K.; Gaffney, J.; Springer, P. T.; Brandon, S.; Langer, S.
2017-10-01
We present an experimentally corroborated hydrodynamic extrapolation of several recent BigFoot implosions on the National Ignition Facility. An estimate on the value and error of the hydrodynamic scale necessary for ignition (for each individual BigFoot implosion) is found by hydrodynamically scaling a distribution of multi-dimensional HYDRA simulations whose outputs correspond to their experimental observables. The 11-parameter database of simulations, which include arbitrary drive asymmetries, dopant fractions, hydrodynamic scaling parameters, and surface perturbations due to surrogate tent and fill-tube engineering features, was computed on the TRINITY supercomputer at Los Alamos National Laboratory. This simple extrapolation is the first step in providing a rigorous calibration of our workflow to provide an accurate estimate of the efficacy of achieving ignition on the National Ignition Facility. This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Finsterle, S.; Moridis, G.J.; Pruess, K.
1994-01-01
The emplacement of liquids under controlled viscosity conditions is investigated by means of numerical simulations. Design calculations are performed for a laboratory experiment on a decimeter scale, and a field experiment on a meter scale. The purpose of the laboratory experiment is to study the behavior of multiple grout plumes when injected into a porous medium. The calculations for the field trial aim at designing a grout injection test from a vertical well in order to create a grout plume of a significant extent in the subsurface.
Simulation of Initiation in Hexanitrostilbene
NASA Astrophysics Data System (ADS)
Thompson, Aidan; Shan, Tzu-Ray; Yarrington, Cole; Wixom, Ryan
We report on the effect of isolated voids and pairs of nearby voids on hot spot formation, growth and chemical reaction initiation in hexanitrostilbene (HNS) crystals subjected to shock loading. Large-scale, reactive molecular dynamics simulations are performed using the reactive force field (ReaxFF) as implemented in the LAMMPS software. The ReaxFF force field description for HNS has been validated previously by comparing the isothermal equation of state to available diamond anvil cell (DAC) measurements and density functional theory (DFT) calculations. Micron-scale molecular dynamics simulations of a supported shockwave propagating in HNS crystal along the [010] orientation are performed (up = 1.25 km/s, Us = 4.0 km/s, P = 11 GPa). We compare the effect on hot spot formation and growth rate of isolated cylindrical voids up to 0.1 µm in size with that of two 50 nm voids set 100 nm apart. Results from the micron-scale atomistic simulations are compared with hydrodynamics simulations. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. DOE National Nuclear Security Administration under Contract DE-AC04-94AL85000.
NASA Astrophysics Data System (ADS)
Hillman, B. R.; Marchand, R.; Ackerman, T. P.
2016-12-01
Satellite instrument simulators have emerged as a means to reduce errors in model evaluation by producing simulated or pseudo-retrievals from model fields, which account for limitations in the satellite retrieval process. Because of the mismatch in resolved scales between satellite retrievals and large-scale models, model cloud fields must first be downscaled to scales consistent with satellite retrievals. This downscaling is analogous to that required for model radiative transfer calculations. The assumption is often made in both model radiative transfer codes and satellite simulators that the unresolved clouds follow maximum-random overlap with horizontally homogeneous cloud condensate amounts. We examine errors in simulated MISR and CloudSat retrievals that arise due to these assumptions by applying the MISR and CloudSat simulators to cloud resolving model (CRM) output generated by the Super-parameterized Community Atmosphere Model (SP-CAM). Errors are quantified by comparing simulated retrievals performed directly on the CRM fields with those simulated by first averaging the CRM fields to approximately 2-degree resolution, applying a "subcolumn generator" to regenerate pseudo-resolved cloud and precipitation condensate fields, and then applying the MISR and CloudSat simulators on the regenerated condensate fields. We show that errors due to both assumptions of maximum-random overlap and homogeneous condensate are significant (relative to uncertainties in the observations and other simulator limitations). The treatment of precipitation is particularly problematic for CloudSat-simulated radar reflectivity. We introduce an improved subcolumn generator for use with the simulators, and show that these errors can be greatly reduced by replacing the maximum-random overlap assumption with the more realistic generalized overlap and incorporating a simple parameterization of subgrid-scale cloud and precipitation condensate heterogeneity.
Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000. SAND NO. SAND2016-7485 A
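The maximum-random overlap assumption discussed in this abstract can be illustrated with a minimal rank-based subcolumn generator (one standard formulation; the layer cloud fractions below are hypothetical inputs, and the improved generalized-overlap generator the paper introduces is not reproduced here). Adjacent cloudy layers overlap maximally; layers separated by clear air overlap randomly.

```python
import random

# Minimal subcolumn generator under maximum-random overlap. Each layer
# is assigned a rank number; a layer is cloudy in a subcolumn when its
# rank exceeds (1 - cloud fraction). Reusing the previous rank when the
# layer above is cloudy gives maximum overlap; redrawing when it is
# clear gives random overlap across cloud-free gaps.
def maxrand_subcolumns(cloud_frac, n_subcol, seed=0):
    rng = random.Random(seed)
    cols = []
    for _ in range(n_subcol):
        col, prev_rank, prev_cf = [], None, 0.0
        for cf in cloud_frac:
            if prev_rank is not None and prev_rank > 1.0 - prev_cf:
                rank = prev_rank                       # maximum overlap
            else:
                rank = rng.random() * (1.0 - prev_cf)  # random overlap
            col.append(rank > 1.0 - cf)
            prev_rank, prev_cf = rank, cf
        cols.append(col)
    return cols
```

For two adjacent layers with equal cloud fraction this construction makes the cloudy subcolumns identical in both layers (pure maximum overlap), while the per-layer cloudy fraction across many subcolumns converges to the prescribed cloud fraction.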
An Integrated Crustal Dynamics Simulator
NASA Astrophysics Data System (ADS)
Xing, H. L.; Mora, P.
2007-12-01
Numerical modelling offers an outstanding opportunity to gain an understanding of crustal dynamics and complex crustal system behaviour. This presentation describes our long-term, ongoing effort in finite-element-based computational modelling and software development to simulate interacting fault systems for earthquake forecasting. An R-minimum-strategy-based finite-element computational model and software tool, PANDAS, for modelling 3-dimensional nonlinear frictional contact behaviour between multiple deformable bodies with an arbitrarily-shaped contact element strategy has been developed by the authors, which builds up a virtual laboratory to simulate interacting fault systems including crustal boundary conditions and various nonlinearities (e.g. from frictional contact, materials, geometry and thermal coupling). It has been successfully applied to large-scale computing of complex nonlinear phenomena in non-continuum media involving nonlinear frictional instability, multiple material properties and complex geometries on supercomputers, such as the South Australia (SA) interacting fault system, the South California fault model and the Sumatra subduction model. It has also been extended to simulate the hot fractured rock (HFR) geothermal reservoir system in collaboration with Geodynamics Ltd, which is constructing the first geothermal reservoir system in Australia, and to model tsunami generation induced by earthquakes. Both are supported by the Australian Research Council.
A microcomputer-based testing station for dynamic and static testing of protective relay systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, W.J.; Li, R.J.; Gu, J.C.
1995-12-31
Dynamic and static relay performance testing before installation in the field is a subject of great interest to utility relay engineers. The common practice in utility testing of new relays is to put the new unit to be tested in parallel with an existing functioning relay in the system, wait until an actual transient occurs, and then observe and analyze the performance of the new relay. It is impossible to have a thorough test of the protective relay system through this procedure. A piece of equipment, the Microcomputer-Based Testing Station (or PC-Based Testing Station), that can perform both static and dynamic testing of the relay is described in this paper. The Power System Simulation Laboratory at the University of Texas at Arlington is a scaled-down, three-phase, physical power system which correlates well with the important components of a real power system and is an ideal facility for the dynamic and static testing of protective relay systems. A brief introduction to the configuration of this laboratory is presented. Test results of several protective functions obtained by using this laboratory illustrate the usefulness of this test set-up.
Development of a thermal storage module using modified anhydrous sodium hydroxide
NASA Technical Reports Server (NTRS)
Rice, R. E.; Rowny, P. E.
1980-01-01
The laboratory-scale testing of a modified anhydrous NaOH latent heat storage concept for small solar thermal power systems, such as total energy systems utilizing organic Rankine systems, is discussed. A diagnostic test on the thermal energy storage module and an investigation of alternative heat transfer fluids and heat exchange concepts are specifically addressed. A previously developed computer simulation model is modified to predict the performance of the module in a solar total energy system environment. In addition, the computer model is expanded to investigate parametrically the incorporation of a second heat exchanger inside the module which will vaporize and superheat the Rankine cycle power fluid.
NASA Astrophysics Data System (ADS)
Schruff, T.; Liang, R.; Rüde, U.; Schüttrumpf, H.; Frings, R. M.
2018-01-01
The knowledge of structural properties of granular materials such as porosity is highly important in many application-oriented and scientific fields. In this paper we present new results of computer-based packing simulations where we use the non-smooth granular dynamics (NSGD) method to simulate gravitational random dense packing of spherical particles with various particle size distributions and two types of depositional conditions. A bin packing scenario was used to compare simulation results to laboratory porosity measurements and to quantify the sensitivity of the NSGD regarding critical simulation parameters such as time step size. The results of the bin packing simulations agree well with laboratory measurements across all particle size distributions with all absolute errors below 1%. A large-scale packing scenario with periodic side walls was used to simulate the packing of up to 855,600 spherical particles with various particle size distributions (PSD). Simulation outcomes are used to quantify the effect of particle-domain-size ratio on the packing compaction. A simple correction model, based on the coordination number, is employed to compensate for this effect on the porosity and to determine the relationship between PSD and porosity. Promising accuracy and stability results paired with excellent computational performance recommend the application of NSGD for large-scale packing simulations, e.g. to further enhance the generation of representative granular deposits.
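The porosity quantity compared against laboratory measurements above follows directly from the sphere volumes and the bin volume. A minimal sketch (hypothetical radii and domain; a real NSGD packing would supply the settled particle positions and an exact intersection with the bin):

```python
import math

# Toy porosity computation for a bin of spherical particles: porosity
# is one minus the solid volume fraction. Radii and bin volume are
# illustrative; boundary-intersecting spheres are ignored here.
def porosity(radii, bin_volume):
    solid = sum(4.0 / 3.0 * math.pi * r**3 for r in radii)
    return 1.0 - solid / bin_volume
```

For example, 64 unit-diameter spheres in a volume of 100 give a porosity of about 0.665, illustrating why the particle-domain-size ratio matters: spheres cut by the walls must be handled carefully before sub-1% absolute errors, as reported in the paper, are achievable.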
Communication Simulations for Power System Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fuller, Jason C.; Ciraci, Selim; Daily, Jeffrey A.
2013-05-29
New smart grid technologies and concepts, such as dynamic pricing, demand response, dynamic state estimation, and wide area monitoring, protection, and control, are expected to require considerable communication resources. As the cost of retrofit can be high, future power grids will require the integration of high-speed, secure connections with legacy communication systems, while still providing adequate system control and security. While considerable work has been performed to create co-simulators for the power domain with load models and market operations, limited work has been performed in integrating communications directly into a power domain solver. The simulation of communication and power systems will become more important as the two systems become more inter-related. This paper will discuss ongoing work at Pacific Northwest National Laboratory to create a flexible, high-speed power and communication system co-simulator for smart grid applications. The framework for the software will be described, including architecture considerations for modular, high performance computing and large-scale scalability (serialization, load balancing, partitioning, cross-platform support, etc.). The current simulator supports the ns-3 (telecommunications) and GridLAB-D (distribution systems) simulators. Ongoing and future work will be described, including planned future expansions for a traditional transmission solver. A test case using the co-simulator, utilizing a transactive demand response system created for the Olympic Peninsula and AEP gridSMART demonstrations, requiring two-way communication between distributed and centralized market devices, will be used to demonstrate the value and intended purpose of the co-simulation environment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Le Blanc, Katya Lee; Spielman, Zachary Alexander; Rice, Brandon Charles
2016-04-01
This report describes the installation of two advanced control room technologies, an advanced alarm system and a computerized procedure system, into the Human Systems Simulation Laboratory (HSSL). Installation of these technologies enables future phases of this research by providing a platform to systematically evaluate the effect of these technologies on operator and plant performance.
NLS Flight Simulation Laboratory (FSL) documentation
NASA Technical Reports Server (NTRS)
1995-01-01
The Flight Simulation Laboratory (FSL) Electronic Documentation System design consists of modification and utilization of the MSFC Integrated Engineering System (IES), translation of the existing FSL documentation to an electronic format, and generation of new drawings to represent the Engine Flight Simulation Laboratory design and implementation. The intent of the electronic documentation is to provide ease of access, local print/plot capabilities, as well as the ability to correct and/or modify the stored data by network users who are authorized to access this information.
Multibillion-atom Molecular Dynamics Simulations of Plasticity, Spall, and Ejecta
NASA Astrophysics Data System (ADS)
Germann, Timothy C.
2007-06-01
Modern supercomputing platforms, such as the IBM BlueGene/L at Lawrence Livermore National Laboratory and the Roadrunner hybrid supercomputer being built at Los Alamos National Laboratory, are enabling large-scale classical molecular dynamics simulations of phenomena that were unthinkable just a few years ago. Using either the embedded atom method (EAM) description of simple (close-packed) metals, or modified EAM (MEAM) models of more complex solids and alloys with mixed covalent and metallic character, simulations containing billions to trillions of atoms are now practical, reaching volumes in excess of a cubic micron. In order to obtain any new physical insights, however, it is equally important that the analysis of such systems be tractable. This is in fact possible, in large part due to our highly efficient parallel visualization code, which enables the rendering of atomic spheres, Eulerian cells, and other geometric objects in a matter of minutes, even for tens of thousands of processors and billions of atoms. After briefly describing the BlueGene/L and Roadrunner architectures, and the code optimization strategies that were employed, results obtained thus far on BlueGene/L will be reviewed, including: (1) shock compression and release of a defective EAM Cu sample, illustrating the plastic deformation accompanying void collapse as well as the subsequent void growth and linkup upon release; (2) solid-solid martensitic phase transition in shock-compressed MEAM Ga; and (3) Rayleigh-Taylor fluid instability modeled using large-scale direct simulation Monte Carlo (DSMC) simulations. I will also describe our initial experiences utilizing Cell Broadband Engine processors (developed for the Sony PlayStation 3), and planned simulation studies of ejecta and spall failure in polycrystalline metals that will be carried out when the full Petaflop Opteron/Cell Roadrunner supercomputer is assembled in mid-2008.
Australia's marine virtual laboratory
NASA Astrophysics Data System (ADS)
Proctor, Roger; Gillibrand, Philip; Oke, Peter; Rosebrock, Uwe
2014-05-01
In all modelling studies of realistic scenarios, a researcher has to go through a number of steps to set up a model in order to produce a model simulation of value. The steps are generally the same, independent of the modelling system chosen. These steps include determining the time and space scales and processes of the required simulation; obtaining data for the initial set up and for input during the simulation time; obtaining observation data for validation or data assimilation; implementing scripts to run the simulation(s); and running utilities or custom-built software to extract results. These steps are time consuming and resource-hungry, and have to be done every time irrespective of the simulation - the more complex the processes, the more effort is required to set up the simulation. The Australian Marine Virtual Laboratory (MARVL) is a new development in modelling frameworks for researchers in Australia. MARVL uses the TRIKE framework, a Java-based control system developed by CSIRO that allows a non-specialist user to configure and run a model, to automate many of the modelling preparation steps needed to bring the researcher faster to the stage of simulation and analysis. The tool is seen as enhancing the efficiency of researchers and marine managers, and is being considered as an educational aid in teaching.
In MARVL we are developing a web-based open source application which provides a number of model choices and provides search and recovery of relevant observations, allowing researchers to: a) efficiently configure a range of different community ocean and wave models for any region, for any historical time period, with model specifications of their choice, through a user-friendly web application, b) access data sets to force a model and nest a model into, c) discover and assemble ocean observations from the Australian Ocean Data Network (AODN, http://portal.aodn.org.au/webportal/) in a format that is suitable for model evaluation or data assimilation, and d) run the assembled configuration in a cloud computing environment, or download the assembled configuration and packaged data to run on any other system of the user's choice. MARVL is now being applied in a number of case studies around Australia ranging in scale from locally confined estuaries to the Tasman Sea between Australia and New Zealand. In time we expect the range of models offered will include biogeochemical models.
System reliability of randomly vibrating structures: Computational modeling and laboratory testing
NASA Astrophysics Data System (ADS)
Sundar, V. S.; Ammanagi, S.; Manohar, C. S.
2015-09-01
The problem of determining the system reliability of randomly vibrating structures arises in many application areas of engineering. We discuss in this paper approaches based on Monte Carlo simulations and laboratory testing to tackle problems of time-variant system reliability estimation. The strategy we adopt is based on the application of Girsanov's transformation to the governing stochastic differential equations, which enables estimation of the probability of failure with a significantly smaller number of samples than is needed in a direct simulation study. Notably, we show that the ideas from Girsanov's-transformation-based Monte Carlo simulations can be extended to conduct laboratory testing to assess the system reliability of engineering structures with a reduced number of samples and hence with reduced testing times. Illustrative examples include computational studies on a 10-degree-of-freedom nonlinear system model and laboratory/computational investigations on the road load response of an automotive system tested on a four-post test rig.
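The variance-reduction idea behind the Girsanov approach can be illustrated in its simplest discrete analogue: bias the sampling distribution toward the failure region, then reweight each failure sample by the likelihood ratio so the estimate remains unbiased. This sketch uses a scalar Gaussian rare event, not the actual stochastic-differential-equation setting of the paper; the threshold and shift are illustrative.

```python
import math
import random

# Importance-sampling estimate of a rare failure probability
# P(X > threshold) for X ~ N(0, 1): sample under a mean-shifted law
# N(shift, 1), and reweight failures by the likelihood ratio dP/dQ.
# This is the discrete-time analogue of the Girsanov change of measure.
def failure_prob_is(threshold=3.5, shift=3.0, n=20000, seed=1):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(shift, 1.0)          # sample under the biased law
        if x > threshold:                  # "failure" event
            # likelihood ratio of N(0,1) density to N(shift,1) density
            total += math.exp(-shift * x + 0.5 * shift**2)
    return total / n
```

Because the shifted law places most samples near the threshold, almost every draw is informative; a direct Monte Carlo estimate of the same probability (about 2.3e-4) would need millions of samples for comparable accuracy.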
Nanocoatings for High-Efficiency Industrial and Tooling Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blau, P.; Qu, J.; Higdon, C.
This industry-driven project was the result of a successful response by Eaton Corporation to a DOE/ITP Program industry call. It consisted of three phases in which ORNL participated. In addition to Eaton Corporation and ORNL (CRADA), the project team included Ames Laboratory, which developed the underlying concept for aluminum-magnesium-boron based nanocomposite coatings [1], and Greenleaf, a small tooling manufacturer in western Pennsylvania. This report focuses on the portion of the work conducted by ORNL in a CRADA with Eaton Corporation. A comprehensive final report for the entire effort, which ended in September 2010, has been prepared by Eaton Corporation. Phase I, “Proof of Concept”, ran for one year (September 1, 2006 to September 30, 2007), during which the applicability of AlMgB14 single-phase and nanocomposite coatings on hydraulic material coupons and components, as well as on tool inserts, was demonstrated. The coatings were deposited by either pulsed laser deposition (PLD) or physical vapor deposition (PVD). During Phase I, ORNL conducted laboratory-scale pin-on-disk and reciprocating pin-on-flat tests of coatings produced by PLD and PVD. Non-coated M2 tool steel was used as a baseline for comparison, and Type 52100 bearing steel was chosen as the sliding counterface material because it simulated the pump materials. Initial tests were run mainly in a commercial hydraulic fluid (Mobil DTE-24), but some tests were later run in a water-glycol mixture as well. A tribosystem analysis was conducted to define the operating conditions of pump components and to help develop simulative tests in Phase II. Phase II, “Coating Process Scale-up”, was intended to use scaled-up processes to generate prototype parts. This involved both PLD practices at Ames Laboratory and a PVD scale-up study at Eaton using its production-capable equipment. There was also a limited scale-up study at Greenleaf for the tooling application.
ORNL continued to conduct friction and wear tests on process variants and developed tests to better simulate the applications of interest. ORNL also employed existing lubrication models to better understand hydraulic pump frictional behavior and test results. Phase III, “Functional Testing”, focused on finalizing the strategy for commercialization of AlMgB14 coatings for both hydraulic and tooling systems. ORNL continued to provide tribology testing and analysis support for hydraulic pump applications. This included both laboratory-scale coupon testing and the analysis of friction and wear data from full component-level tests performed at Eaton Corp. Laboratory-scale tribology test methods were used to characterize the behavior of the nanocomposite coatings before running them in full-sized hydraulic pumps. This task also included developing tribosystem analyses, both to provide a better understanding of the performance of coated surfaces in alternative hydraulic fluids and to help design useful laboratory protocols. The analysis also included modeling the lubrication conditions and identifying the physical processes by which the wear and friction of the contact interface change over time. This final report summarizes ORNL's portion of the nanocomposite coatings development effort and presents both the generated data and the analyses used in the course of this effort.
Global ice sheet/RSL simulations using the higher-order Ice Sheet System Model.
NASA Astrophysics Data System (ADS)
Larour, E. Y.; Ivins, E. R.; Adhikari, S.; Schlegel, N.; Seroussi, H. L.; Morlighem, M.
2017-12-01
Relative sea-level (RSL) rise is driven by processes that are intimately linked to the evolution of glacial areas and ice sheets in particular. So far, most Earth System models capable of projecting the evolution of RSL on decadal to centennial time scales have relied on offline interactions between RSL and ice sheets. In particular, grounding line and calving front dynamics have not been modeled in a way that is tightly coupled with Elasto-Static Adjustment (ESA) and/or Glacial-Isostatic Adjustment (GIA). Here, we present a new simulation of the entire Earth System in which both the Greenland and Antarctic ice sheets are tightly coupled to an RSL model that includes both ESA and GIA at resolutions and time scales compatible with processes such as grounding line dynamics for Antarctic ice shelves and calving front dynamics for Greenland marine-terminating glaciers. The simulations rely on the Ice Sheet System Model (ISSM) and show the impact of higher-order ice flow dynamics and coupling feedbacks between ice flow and RSL. We quantify the exact impact of including ESA and GIA on grounding line evolution for large ice shelves such as the Ronne and Ross ice shelves, as well as the Amundsen Sea Embayment ice streams, and demonstrate how offline vs. online RSL simulations diverge in the long run, with consequences for predictions of sea-level rise. This work was performed at the California Institute of Technology's Jet Propulsion Laboratory under a contract with the National Aeronautics and Space Administration's Cryosphere Science Program.
Laboratory simulation of space plasma phenomena*
NASA Astrophysics Data System (ADS)
Amatucci, B.; Tejero, E. M.; Ganguli, G.; Blackwell, D.; Enloe, C. L.; Gillman, E.; Walker, D.; Gatling, G.
2017-12-01
Laboratory devices, such as the Naval Research Laboratory's Space Physics Simulation Chamber, are large-scale experiments dedicated to the creation of large-volume plasmas with parameters realistically scaled to those found in various regions of the near-Earth space plasma environment. Such devices make valuable contributions to the understanding of space plasmas by investigating phenomena under carefully controlled, reproducible conditions, allowing for the validation of theoretical models being applied to space data. By working in collaboration with in situ experimentalists to create realistic conditions scaled to those found during the observations of interest, the microphysics responsible for the observed events can be investigated in detail not possible in space. To date, numerous investigations of phenomena such as plasma waves, wave-particle interactions, and particle energization have been successfully performed in the laboratory. In addition to investigations such as plasma wave and instability studies, the laboratory devices can also make valuable contributions to the development and testing of space plasma diagnostics. One example is the plasma impedance probe developed at NRL. Originally developed as a laboratory diagnostic, the sensor has now been flown on a sounding rocket, is included on a CubeSat experiment, and will be included on the DoD Space Test Program's STP-H6 experiment on the International Space Station. In this presentation, we will describe several examples of the laboratory investigation of space plasma waves and instabilities and diagnostic development. *This work supported by the NRL Base Program.
NASA Astrophysics Data System (ADS)
Miller, M. A.; Miller, N. L.; Sale, M. J.; Springer, E. P.; Wesely, M. L.; Bashford, K. E.; Conrad, M. E.; Costigan, K. R.; Kemball-Cook, S.; King, A. W.; Klazura, G. E.; Lesht, B. M.; Machavaram, M. V.; Sultan, M.; Song, J.; Washington-Allen, R.
2001-12-01
A multi-laboratory Department of Energy (DOE) team (Argonne National Laboratory, Brookhaven National Laboratory, Los Alamos National Laboratory, Lawrence Berkeley National Laboratory, Oak Ridge National Laboratory) has begun an investigation of hydrometeorological processes at the Whitewater subbasin of the Walnut River Watershed in Kansas. The Whitewater sub-basin is viewed as a DOE long-term hydrologic research watershed and resides within the well-instrumented Atmospheric Radiation Measurement/Cloud Radiation Atmosphere Testbed (ARM/CART) and the proposed Arkansas-Red River regional hydrologic testbed. The focus of this study is the development and evaluation of coupled regional to watershed scale models that simulate atmospheric, land surface, and hydrologic processes as systems with linkages and feedback mechanisms. This pilot is the precursor to the proposed DOE Water Cycle Dynamics Prediction Program. An important new element is the introduction of water isotope budget equations into mesoscale and hydrologic modeling. Two overarching hypotheses are part of this pilot study: (1) Can the predictability of the regional water balance be improved using high-resolution model simulations that are constrained and validated using new water isotope and hydrospheric water measurements? (2) Can water isotopic tracers be used to segregate different pathways through the water cycle and predict a change in regional climate patterns? Initial results of the pilot will be presented along with a description and copies of the proposed DOE Water Cycle Dynamics Prediction Program.
Thermal Conductivity within Nanoparticle Powder Beds
NASA Astrophysics Data System (ADS)
Wilson, Mark; Chandross, Michael
Non-equilibrium molecular dynamics is used to compute thermal transport properties within nanoparticle powder beds. In the realm of additive manufacturing of metals, the electronic contribution to thermal conduction is critical. To this end, our simulations incorporate the two-temperature model, coupling a continuum representation of the electronic thermal contribution to the atomic phonon system. The direct method is used for conductivity determination, wherein the thermal gradient between heat-flux reservoirs held at two different temperatures is calculated. The approach is demonstrated on several example cases, including 304L stainless steel. The results from size-distribution variations of mono- and poly-disperse systems are extrapolated to predict values at the micron length scale, along with bulk properties at infinite system size. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
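The two-temperature model referenced above evolves an electronic temperature alongside the atomic (phonon) temperature, coupled through an electron-phonon coupling constant. A zero-dimensional sketch with arbitrary, illustrative coefficients (not the parameters or the continuum-field formulation used in the study) shows the basic relaxation behavior:

```python
# Illustrative (not material-specific) parameters
Ce = 0.1   # electronic heat capacity (arbitrary units)
Cl = 1.0   # lattice (phonon) heat capacity
G = 0.5    # electron-phonon coupling constant

def two_temperature(Te0, Tl0, dt=1e-3, steps=20000):
    """Zero-dimensional two-temperature model: electrons and phonons
    exchange energy at rate G*(Te - Tl) and relax toward a common
    equilibrium temperature. Forward-Euler integration."""
    Te, Tl = Te0, Tl0
    for _ in range(steps):
        dT = Te - Tl
        Te -= dt * G / Ce * dT   # electrons lose energy to the lattice
        Tl += dt * G / Cl * dT   # lattice gains the same energy
    return Te, Tl

Te, Tl = two_temperature(1000.0, 300.0)
# The update conserves Ce*Te + Cl*Tl exactly, so both temperatures
# relax to (Ce*Te0 + Cl*Tl0) / (Ce + Cl)
```

In the actual simulations the electronic contribution is a continuum field coupled locally to the molecular-dynamics phonon system; the sketch only illustrates the coupling term G(Te - Tl) and its energy-conserving form.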
Cloud-enabled large-scale land surface model simulations with the NASA Land Information System
NASA Astrophysics Data System (ADS)
Duffy, D.; Vaughan, G.; Clark, M. P.; Peters-Lidard, C. D.; Nijssen, B.; Nearing, G. S.; Rheingrover, S.; Kumar, S.; Geiger, J. V.
2017-12-01
Developed by the Hydrological Sciences Laboratory at NASA Goddard Space Flight Center (GSFC), the Land Information System (LIS) is a high-performance software framework for terrestrial hydrology modeling and data assimilation. LIS provides the ability to integrate satellite and ground-based observational products and advanced modeling algorithms to extract land surface states and fluxes. Through a partnership with the National Center for Atmospheric Research (NCAR) and the University of Washington, the LIS model is currently being extended to include the Structure for Unifying Multiple Modeling Alternatives (SUMMA). With the addition of SUMMA in LIS, meaningful simulations containing a large multi-model ensemble will be enabled and can provide advanced probabilistic continental-domain modeling capabilities at spatial scales relevant for water managers. The resulting LIS/SUMMA application framework is difficult for non-experts to install due to the large number of dependencies on specific versions of operating systems, libraries, and compilers. This has created a significant barrier to entry for domain scientists who are interested in using the software on their own systems or in the cloud. In addition, the requirement to support multiple runtime environments across the LIS community has created a significant burden on the NASA team. To overcome these challenges, LIS/SUMMA has been deployed using Linux containers, which allow an entire software package along with all dependencies to be installed within a working runtime environment, and Kubernetes, which orchestrates the deployment of a cluster of containers. Within a cloud environment, users can now easily create a cluster of virtual machines and run large-scale LIS/SUMMA simulations. Installations that once took weeks or months can now be performed in minutes.
This presentation will discuss the steps required to create a cloud-enabled large-scale simulation, present examples of its use, and describe the potential deployment of this information technology with other NASA applications.
NASA Astrophysics Data System (ADS)
Morikawa, Y.; Murata, K. T.; Watari, S.; Kato, H.; Yamamoto, K.; Inoue, S.; Tsubouchi, K.; Fukazawa, K.; Kimura, E.; Tatebe, O.; Shimojo, S.
2010-12-01
The main methodologies of Solar-Terrestrial Physics (STP) to date have been theoretical, experimental and observational, and computer simulation approaches. Recently, "informatics" has emerged as a new (fourth) approach to STP studies. Informatics is a methodology for analyzing large-scale data (observational data and computer simulation data) to obtain new findings using a variety of data processing techniques. At NICT (National Institute of Information and Communications Technology, Japan) we are now developing a new research environment named "OneSpaceNet". OneSpaceNet is a cloud-computing environment specialized for scientific work, which connects many researchers through a high-speed network (JGN: Japan Gigabit Network). JGN is a wide-area backbone network operated by NICT; it provides a 10G network and many access points (APs) across Japan. OneSpaceNet also provides rich computing resources for research, such as supercomputers, large-scale data storage, licensed applications, visualization devices (such as a tiled display wall: TDW), databases/DBMS, cluster computers (4-8 nodes) for data processing, and communication devices. A remarkable feature of the science cloud is that a user needs only a terminal (a low-cost PC). Once the PC is connected to JGN2plus, the user can make full use of the rich resources of the science cloud. Using communication devices such as video-conference systems, streaming and reflector servers, and media players, users on OneSpaceNet can communicate as if they belonged to a single laboratory: they are members of a virtual laboratory. The specification of the computer resources on OneSpaceNet is as follows. The data storage we have developed so far totals almost 1 PB, and the number of data files managed on the cloud storage continues to grow, now exceeding 40,000,000.
Notably, the disks forming the large-scale storage are distributed across 5 data centers over Japan, yet the storage system performs as one disk. Three supercomputers are allocated on the cloud: one in Tokyo, one in Osaka, and one in Nagoya. Simulation job data from any of the supercomputers are saved to the cloud data storage (the same directory); it is a kind of virtual computing environment. The tiled display wall has 36 panels acting as one display; its pixel (resolution) size is as large as 18000x4300. This size is sufficient to preview and analyze large-scale computer simulation data, and it allows many researchers to view multiple images (e.g., 100 pictures) on one screen together. In our talk we also present a brief report of initial results using OneSpaceNet for global MHD simulations as an example of successful use of our science cloud: (i) ultra-high time resolution visualization of global MHD simulations using the large-scale storage and parallel processing system on the cloud, (ii) a database of real-time global MHD simulations and statistical analyses of the data, and (iii) a 3D web service for global MHD simulations.
Laboratory generated M -6 earthquakes
McLaskey, Gregory C.; Kilgore, Brian D.; Lockner, David A.; Beeler, Nicholas M.
2014-01-01
We consider whether mm-scale earthquake-like seismic events generated in laboratory experiments are consistent with our understanding of the physics of larger earthquakes. This work focuses on a population of 48 very small shocks that are foreshocks and aftershocks of stick–slip events occurring on a 2.0 m by 0.4 m simulated strike-slip fault cut through a large granite sample. Unlike the larger stick–slip events that rupture the entirety of the simulated fault, the small foreshocks and aftershocks are contained events whose properties are controlled by the rigidity of the surrounding granite blocks rather than characteristics of the experimental apparatus. The large size of the experimental apparatus, high fidelity sensors, rigorous treatment of wave propagation effects, and in situ system calibration separates this study from traditional acoustic emission analyses and allows these sources to be studied with as much rigor as larger natural earthquakes. The tiny events have short (3–6 μs) rise times and are well modeled by simple double couple focal mechanisms that are consistent with left-lateral slip occurring on a mm-scale patch of the precut fault surface. The repeatability of the experiments indicates that they are the result of frictional processes on the simulated fault surface rather than grain crushing or fracture of fresh rock. Our waveform analysis shows no significant differences (other than size) between the M -7 to M -5.5 earthquakes reported here and larger natural earthquakes. Their source characteristics such as stress drop (1–10 MPa) appear to be entirely consistent with earthquake scaling laws derived for larger earthquakes.
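The reported magnitudes and stress drops can be connected through two standard relations: the moment-magnitude definition Mw = (2/3)(log10 M0 − 9.1) (M0 in N·m) and the Eshelby circular-crack stress drop Δσ = (7/16) M0/r³. The source radius below is a hypothetical value chosen only to illustrate the scale; it is not taken from the study:

```python
import math

def moment_magnitude(M0):
    """Moment magnitude Mw from seismic moment M0 in N*m:
    Mw = (2/3) * (log10(M0) - 9.1)."""
    return (2.0 / 3.0) * (math.log10(M0) - 9.1)

def stress_drop_circular(M0, radius):
    """Eshelby circular-crack stress drop in Pa:
    delta_sigma = (7/16) * M0 / r^3."""
    return 7.0 * M0 / (16.0 * radius**3)

# Inverting the magnitude relation, an M -6 event has a seismic moment
# of only about 1.3 N*m:
M0 = 10 ** (1.5 * (-6.0) + 9.1)
# For a hypothetical source radius of 5 mm, the implied stress drop falls
# within the 1-10 MPa range reported for the laboratory events:
dsig = stress_drop_circular(M0, 5e-3)
```

This back-of-envelope consistency between tiny moments, mm-scale patches, and earthquake-like stress drops is what the scaling-law comparison in the abstract formalizes.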
NASA Astrophysics Data System (ADS)
Guan, Mingfu; Ahilan, Sangaralingam; Yu, Dapeng; Peng, Yong; Wright, Nigel
2018-01-01
Fine sediment plays crucial and multiple roles in the hydrological, ecological and geomorphological functioning of river systems. This study employs a two-dimensional (2D) numerical model to track the hydro-morphological processes dominated by fine suspended sediment, including the prediction of sediment concentration in flow bodies, and erosion and deposition caused by sediment transport. The model is governed by 2D full shallow water equations with which an advection-diffusion equation for fine sediment is coupled. Bed erosion and sedimentation are updated by a bed deformation model based on local sediment entrainment and settling flux in flow bodies. The model is initially validated with the three laboratory-scale experimental events where suspended load plays a dominant role. Satisfactory simulation results confirm the model's capability in capturing hydro-morphodynamic processes dominated by fine suspended sediment at laboratory-scale. Applications to sedimentation in a stormwater pond are conducted to develop the process-based understanding of fine sediment dynamics over a variety of flow conditions. Urban flows with 5-year, 30-year and 100-year return period and the extreme flood event in 2012 are simulated. The modelled results deliver a step change in understanding fine sediment dynamics in stormwater ponds. The model is capable of quantitatively simulating and qualitatively assessing the performance of a stormwater pond in managing urban water quantity and quality.
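The sediment model described above couples a shallow-water solver to an advection-diffusion equation with entrainment and settling source terms. A minimal one-dimensional explicit sketch (first-order upwind advection, central diffusion, periodic boundaries via np.roll, and hypothetical entrainment/settling rates; not the authors' 2D scheme) illustrates the governing balance:

```python
import numpy as np

def advect_diffuse(c, u, D, dx, dt, source):
    """One explicit step of the 1D advection-diffusion equation
    dc/dt + u*dc/dx = D*d2c/dx2 + source, using first-order upwind
    advection (valid for u > 0) and central differencing for diffusion."""
    adv = -u * (c - np.roll(c, 1)) / dx
    dif = D * (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2
    return c + dt * (adv + dif + source)

# Hypothetical entrainment/settling source: E - ws*c (erosion minus settling)
nx, dx, dt = 100, 1.0, 0.1
u, D, ws, E = 0.5, 0.1, 0.01, 0.002
c = np.zeros(nx)
for _ in range(5000):
    c = advect_diffuse(c, u, D, dx, dt, E - ws * c)
# At steady state the concentration relaxes toward E/ws everywhere
```

The same entrainment-minus-settling flux drives the bed deformation update in the paper's model: where settling exceeds entrainment the bed aggrades, and vice versa.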
3D-PTV around Operational Wind Turbines
NASA Astrophysics Data System (ADS)
Brownstein, Ian; Dabiri, John
2016-11-01
Laboratory studies and numerical simulations of wind turbines are typically constrained in how they can inform operational turbine behavior. Laboratory experiments are usually unable to match both pertinent parameters of full-scale wind turbines, the Reynolds number (Re) and the tip speed ratio, using scaled-down models. Additionally, numerical simulations of the flow around wind turbines are constrained by the large domain sizes and high Re that need to be simulated. When these simulations are performed, turbine geometry is typically simplified, with the result that flow structures near the rotor are not well resolved. To bypass these limitations, a quantitative flow visualization method was developed to take in situ measurements of the flow around wind turbines at the Field Laboratory for Optimized Wind Energy (FLOWE) in Lancaster, CA. The apparatus constructed was able to seed an approximately 9 m x 9 m x 5 m volume in the wake of the turbine using artificial snow. Quantitative measurements were obtained by tracking the evolution of the artificial snow using a four-camera setup. The methodology for calibrating and collecting data, as well as preliminary results detailing the flow around a 2 kW vertical-axis wind turbine (VAWT), will be presented.
A system dynamics approach to analyze laboratory test errors.
Guo, Shijing; Roudsari, Abdul; Garcez, Artur d'Avila
2015-01-01
Although much research has been carried out on laboratory test errors during the last decade, the field still lacks a systemic view, especially one that traces errors through the test process and evaluates potential interventions. This study applies system dynamics modeling to laboratory errors in order to trace error flows and to simulate system behavior while changing internal variable values. A change in a variable may reflect a change in demand or a proposed intervention. A review of the literature on laboratory test errors was conducted and served as the main data source for the system dynamics model. Three "what if" scenarios were selected for testing the model. System behaviors were observed and compared under the different scenarios over a period of time. The results suggest that system dynamics modeling can help in understanding laboratory errors, observing model behaviors, and providing risk-free simulation experiments for possible strategies.
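A system dynamics model of this kind represents error counts as stocks whose levels change through flows, integrated forward in time. The sketch below is a toy stock-and-flow chain (latent, detected, corrected errors) with entirely hypothetical rates; it is not the authors' model, but shows how a "what if" scenario is tested by changing one variable:

```python
def simulate_lab_errors(days=100, dt=1.0,
                        error_rate=50.0,     # new errors introduced per day
                        detect_frac=0.3,     # fraction of latent errors caught per day
                        correct_frac=0.8):   # fraction of detected errors fixed per day
    """Toy stock-and-flow model (forward-Euler integration) of laboratory
    test errors: latent -> detected -> corrected. All rates hypothetical."""
    latent, detected, corrected = 0.0, 0.0, 0.0
    for _ in range(int(days / dt)):
        detection = detect_frac * latent
        correction = correct_frac * detected
        latent += dt * (error_rate - detection)
        detected += dt * (detection - correction)
        corrected += dt * correction
    return latent, detected, corrected

latent, detected, corrected = simulate_lab_errors()
# "What if" scenario: doubling the detection fraction halves the steady-state
# stock of latent (undiscovered) errors
latent2, _, _ = simulate_lab_errors(detect_frac=0.6)
```

At steady state the latent stock settles at error_rate/detect_frac, so interventions that raise the detection fraction shrink the pool of undiscovered errors proportionally, which is the kind of behavior-over-time comparison the study performs.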
Tri-Laboratory Linux Capacity Cluster 2007 SOW
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seager, M
2007-03-22
The Advanced Simulation and Computing (ASC) Program (formerly known as the Accelerated Strategic Computing Initiative, ASCI) has led the world in capability computing for the last ten years. Capability computing is defined as a world-class platform (in the Top10 of the Top500.org list) with scientific simulations running at scale on the platform. Example systems are ASCI Red, Blue-Pacific, Blue-Mountain, White, Q, RedStorm, and Purple. ASC applications have scaled to multiple thousands of CPUs and accomplished a long list of mission milestones on these ASC capability platforms. However, the computing demands of the ASC and Stockpile Stewardship programs also include a vast number of smaller-scale runs for day-to-day simulations. Indeed, every 'hero' capability run requires many hundreds to thousands of much smaller runs in preparation and post-processing activities. In addition, there are many aspects of the Stockpile Stewardship Program (SSP) that can be directly accomplished with these so-called 'capacity' calculations. The need for capacity is now so great within the program that it is increasingly difficult to allocate the computer resources required by the larger capability runs. To rectify the current 'capacity' computing resource shortfall, the ASC program has allocated a large portion of the overall ASC platforms budget to 'capacity' systems. In addition, within the next five to ten years the Life Extension Programs (LEPs) for major nuclear weapons systems must be accomplished. These LEPs and other SSP programmatic elements will further drive the need for capacity calculations and hence 'capacity' systems, as well as future ASC capability calculations on 'capability' systems. To respond to this new workload analysis, the ASC program will be making a large sustained strategic investment in these capacity systems over the next ten years, starting with the United States Government Fiscal Year 2007 (GFY07).
However, given the growing need for 'capability' systems as well, the budget demands are extreme, and new, more cost-effective ways of fielding these systems must be developed. This Tri-Laboratory Linux Capacity Cluster (TLCC) procurement represents the ASC program's first investment vehicle in these capacity systems. It also represents a new strategy for quickly building, fielding, and integrating many Linux clusters of various sizes into classified and unclassified production service through a concept of Scalable Units (SU). The programmatic objective is to dramatically reduce the overall Total Cost of Ownership (TCO) of these 'capacity' systems relative to the best practices in Linux cluster deployments today. This objective only makes sense in the context of these systems quickly becoming very robust and useful production clusters under the crushing load that will be inflicted on them by the ASC and SSP scientific simulation capacity workload.
NASA Astrophysics Data System (ADS)
Wietsma, T. W.; Oostrom, M.; Foster, N. S.
2003-12-01
Intermediate-scale experiments (ISEs) for flow and transport are a valuable tool for simulating subsurface features and conditions encountered in the field at government and private sites. ISEs offer the ability to study, under controlled laboratory conditions, complicated processes characteristic of mixed wastes and heterogeneous subsurface environments, in multiple dimensions and at different scales. ISEs may, therefore, result in major cost savings if employed prior to field studies. A distinct advantage of ISEs is that researchers can design physical and/or chemical heterogeneities in the porous media matrix that better approximate natural field conditions and therefore address research questions that contain the additional complexity of processes often encountered in the natural environment. A new Subsurface Flow and Transport Laboratory (SFTL) has been developed for ISE users in the Environmental Spectroscopy & Biogeochemistry Facility in the Environmental Molecular Sciences Laboratory (EMSL) at Pacific Northwest National Laboratory (PNNL). The SFTL offers a variety of columns and flow cells, a new state-of-the-art dual-energy gamma system, a fully automated saturation-pressure apparatus, and analytical equipment for sample processing. The new facility, including qualified staff, is available for scientists interested in collaboration on conducting high-quality flow and transport experiments, including contaminant remediation. Close linkages exist between the SFTL and numerical modelers to aid in experimental design and interpretation. This presentation will discuss the facility and outline the procedures required to submit a proposal to use this unique facility for research purposes. The W. R. Wiley Environmental Molecular Sciences Laboratory, a national scientific user facility, is sponsored by the U.S. Department of Energy's Office of Biological and Environmental Research and located at Pacific Northwest National Laboratory.
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, Mark D.; McPherson, Brian J.; Grigg, Reid B.
Numerical simulation is an invaluable analytical tool for scientists and engineers making predictions about the fate of carbon dioxide injected into deep geologic formations for long-term storage. Current numerical simulators for assessing storage in deep saline formations have capabilities for modeling strongly coupled processes involving multifluid flow, heat transfer, chemistry, and rock mechanics in geologic media. Except for moderate pressure conditions, numerical simulators for deep saline formations only require the tracking of two immiscible phases and a limited number of phase components, beyond those comprising the geochemical reactive system. The requirements for numerically simulating the utilization and storage of carbon dioxide in partially depleted petroleum reservoirs are more numerous than those for deep saline formations. The minimum number of immiscible phases increases to three, the number of phase components may easily increase fourfold, and the coupled processes of heat transfer, geochemistry, and geomechanics remain. Public and scientific confidence in numerical simulators used for carbon dioxide sequestration in deep saline formations has advanced via a natural progression of the simulators being proven against benchmark problems, code comparisons, laboratory-scale experiments, pilot-scale injections, and commercial-scale injections. This paper describes a new numerical simulator for the scientific investigation of carbon dioxide utilization and storage in partially depleted petroleum reservoirs, with an emphasis on its unique features for scientific investigations, and documents the numerical simulation of carbon dioxide utilization for enhanced oil recovery in the western section of the Farnsworth Unit, which represents an early stage in the progression of numerical simulators for carbon utilization and storage in depleted oil reservoirs.
Gate Set Tomography on two qubits
NASA Astrophysics Data System (ADS)
Nielsen, Erik; Blume-Kohout, Robin; Gamble, John; Rudinger, Kenneth
Gate set tomography (GST) is a method for characterizing quantum gates that does not require pre-calibrated operations, and it has been used to both certify and improve the operation of single qubits. We analyze the performance of GST applied to a simulated two-qubit system and show that Heisenberg scaling is achieved in this case. We present a GST analysis of preliminary two-qubit experimental data and draw comparisons with the simulated-data case. Finally, we will discuss recent theoretical developments that have improved the efficiency of GST estimation procedures and that are particularly beneficial when characterizing two-qubit systems. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under Contract DE-AC04-94AL85000.
NASA Astrophysics Data System (ADS)
Danáčová, Michaela; Valent, Peter; Výleta, Roman
2017-12-01
Nowadays, rainfall simulators are being used by many researchers in field or laboratory experiments. The main objective of most of these experiments is to better understand the underlying runoff generation processes, and to use the results in the process of calibration and validation of hydrological models. Many research groups have assembled their own rainfall simulators, which comply with their understanding of rainfall processes, and the requirements of their experiments. Most often, the existing rainfall simulators differ mainly in the size of the irrigated area, and the way they generate rain drops. They can be characterized by the accuracy, with which they produce a rainfall of a given intensity, the size of the irrigated area, and the rain drop generating mechanism. Rainfall simulation experiments can provide valuable information about the genesis of surface runoff, infiltration of water into soil and rainfall erodibility. Apart from the impact of physical properties of soil, its moisture and compaction on the generation of surface runoff and the amount of eroded particles, some studies also investigate the impact of vegetation cover of the whole area of interest. In this study, the rainfall simulator was used to simulate the impact of the slope gradient of the irrigated area on the amount of generated runoff and sediment yield. In order to eliminate the impact of external factors and to improve the reproducibility of the initial conditions, the experiments were conducted in laboratory conditions. The laboratory experiments were carried out using a commercial rainfall simulator, which was connected to an external peristaltic pump. The pump maintained a constant and adjustable inflow of water, which enabled to overcome the maximum volume of simulated precipitation of 2.3 l, given by the construction of the rainfall simulator, while maintaining constant characteristics of the simulated precipitation. 
In this study, a 12-minute rainfall with a constant intensity of 5 mm/min was used to irrigate a disturbed soil sample. The experiment was undertaken for several different slopes, with no vegetation cover. The results of the rainfall simulation experiments confirmed the expected strong relationship between the slope gradient and the amount of surface runoff generated. The experiments with higher slope gradients produced larger volumes of surface runoff, which also began after shorter times. Experiments with rainfall simulators in both laboratory and field conditions play an important role in better understanding runoff generation processes. The results of such small-scale experiments can be used to estimate some of the parameters of complex hydrological models, which are used to model rainfall-runoff and erosion processes at the catchment scale.
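A common way to quantify such slope experiments is the runoff coefficient: the fraction of applied rainfall that leaves the plot as surface runoff. The sketch below uses the 12-minute, 5 mm/min rainfall from the abstract, but the plot area and runoff volumes are entirely hypothetical (the abstract reports neither):

```python
def runoff_coefficient(rain_mm_per_min, duration_min, area_m2, runoff_l):
    """Fraction of applied rainfall leaving the plot as surface runoff.
    1 mm of rain over 1 m2 equals 1 litre, so the applied volume in
    litres is intensity * duration * area."""
    rain_volume_l = rain_mm_per_min * duration_min * area_m2
    return runoff_l / rain_volume_l

# Hypothetical results for a 12-minute, 5 mm/min simulated rainfall on a
# 0.09 m2 soil sample at three slope gradients (slope in degrees -> litres):
experiments = {5: 1.8, 10: 2.6, 15: 3.3}
coeffs = {slope: runoff_coefficient(5, 12, 0.09, vol)
          for slope, vol in experiments.items()}
```

With these illustrative numbers the applied rainfall is 5.4 litres per run, and the coefficient rises monotonically with slope, which is the qualitative relationship the experiments demonstrated.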
EMISSIONS OF AIR TOXICS FROM A SIMULATED CHARCOAL KILN
The report gives results of experiments in a laboratory-scale charcoal kiln simulator to evaluate emissions of hazardous air pollutants from the production of charcoal in Missouri-type kilns. Fixed combustion gases were measured using continuous monitors. In addition, other pollu...
HPC simulations of grain-scale spallation to improve thermal spallation drilling
NASA Astrophysics Data System (ADS)
Walsh, S. D.; Lomov, I.; Wideman, T. W.; Potter, J.
2012-12-01
Thermal spallation drilling and related hard-rock hole opening techniques are transformative technologies with the potential to dramatically reduce the costs associated with EGS well drilling and improve the productivity of new and existing wells. In contrast to conventional drilling methods that employ mechanical means to penetrate rock, thermal spallation methods fragment rock into small pieces ("spalls") without contact via the rapid transmission of heat to the rock surface. State-of-the-art constitutive models of thermal spallation employ Weibull statistical failure theory to represent the relationship between rock heterogeneity and its propensity to produce spalls when heat is applied to the rock surface. These models have been successfully used to predict such factors as penetration rate, spall-size distribution and borehole radius from drilling jet velocity and applied heat flux. A properly calibrated Weibull model would permit design optimization of thermal spallation drilling under geothermal field conditions. However, although useful for predicting system response in a given context, Weibull models are by their nature empirically derived. In the past, the parameters used in these models were carefully determined from laboratory tests, and thus model applicability was limited by experimental scope. This becomes problematic, for example, if simulating spall production at depths relevant for geothermal energy production, or modeling thermal spallation drilling in new rock types. Nevertheless, with sufficient computational resources, Weibull models could be validated in the absence of experimental data by explicit small-scale simulations that fully resolve rock grains. 
This presentation will discuss how high-fidelity simulations can be used to inform Weibull models of thermal spallation, and what these simulations reveal about the processes driving spallation at the grain-scale - in particular, the role that inter-grain boundaries and micro-pores play in the onset and extent of spallation. This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
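Weibull failure statistics of the kind these constitutive models employ can be sketched as follows; this is a generic illustration with hypothetical parameter values, not properties calibrated to any specific rock:

```python
import math

# Weibull failure probability for a near-surface rock volume under
# thermally induced compressive stress (hedged sketch; every parameter
# value below is illustrative, not calibrated).
E = 50e9        # Young's modulus, Pa
alpha = 8e-6    # thermal expansion coefficient, 1/K
nu = 0.25       # Poisson's ratio
m = 10.0        # Weibull modulus (lower m = more heterogeneous rock)
sigma0 = 300e6  # Weibull scale stress, Pa, for the reference volume V0
V_ratio = 1.0   # V / V0

def thermal_stress(dT):
    """Equibiaxial stress in a heated, laterally constrained surface layer."""
    return E * alpha * dT / (1.0 - nu)

def failure_probability(dT):
    """Weibull probability that a flaw triggers a spall at temperature rise dT."""
    s = thermal_stress(dT)
    return 1.0 - math.exp(-V_ratio * (s / sigma0) ** m)

for dT in (200.0, 400.0, 600.0):
    print(dT, thermal_stress(dT) / 1e6, failure_probability(dT))
```

The steep dependence on the Weibull modulus m is exactly why the abstract stresses calibration: grain-resolving simulations offer a way to estimate m and sigma0 where laboratory tests are unavailable.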
Density functional simulations as a tool to probe molecular interactions in wet supercritical CO2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Glezakou, Vassiliki Alexandra; McGrail, B. Peter
2013-06-03
Recent advances in mixed Gaussian and plane-wave algorithms have made possible the effective use of density functional theory (DFT) in ab initio molecular dynamics (AIMD) simulations for large and chemically complex models of condensed-phase materials. In this chapter, we review recent progress on the modeling and characterization of co-sequestration processes and reactivity in wet supercritical CO2 (sc-CO2). We examine the molecular transformations of mineral and metal components of a sequestration system in contact with water-bearing sc-CO2 media and aim to establish a reliable correspondence between experimental observations and theoretical models with predictive ability and transferability of results to large-scale geomechanical simulators. This work is funded by the Department of Energy, Office of Fossil Energy. A portion of the research was performed using EMSL, a national scientific user facility sponsored by the Department of Energy's Office of Biological and Environmental Research located at Pacific Northwest National Laboratory. Pacific Northwest National Laboratory (PNNL) is operated by Battelle for DOE under contract DE-AC06-76RL01830.
Ethane-xenon mixtures under shock conditions
NASA Astrophysics Data System (ADS)
Flicker, Dawn; Magyar, Rudolph; Root, Seth; Cochrane, Kyle; Mattsson, Thomas
2015-06-01
Mixtures of light and heavy elements arise in inertial confinement fusion and planetary science. We present results on the physics of molecular-scale mixing through a validation study of equation of state (EOS) properties. Density functional theory molecular dynamics (DFT/QMD) at elevated temperature and pressure is used to obtain the properties of pure xenon, ethane, and various compressed mixture compositions along their principal Hugoniots. To validate the QMD simulations, we performed high-precision shock compression experiments using Sandia's Z-Machine. A bond-tracking analysis of the simulations correlates the sharp rise in the Hugoniot curve with completion of dissociation in ethane. DFT-based simulation results compare well with experimental data and are used to provide insight into the dissociation as a function of mixture composition. Interestingly, we find that the compression ratio for complete dissociation is similar for ethane, Xe-ethane, polymethylpentene, and polystyrene, suggesting that a limiting compression exists for C-C bonded systems. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
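The Hugoniot states referred to above follow from the Rankine-Hugoniot jump conditions; a minimal sketch (the relations are standard, but the numerical inputs are illustrative, not the study's Z-Machine data):

```python
# Rankine-Hugoniot jump conditions: from measured shock velocity (Us) and
# particle velocity (up), recover the pressure and compressed density
# behind a steady shock.  All input values below are illustrative.
def hugoniot_state(rho0, Us, up, P0=0.0):
    """Return (pressure in Pa, density in kg/m^3) behind the shock front."""
    P = P0 + rho0 * Us * up          # momentum conservation
    rho = rho0 * Us / (Us - up)      # mass conservation
    return P, rho

rho0 = 570.0             # kg/m^3, assumed initial density of a cryogenic liquid
Us, up = 12.0e3, 6.0e3   # m/s, illustrative shock and particle velocities
P, rho = hugoniot_state(rho0, Us, up)
print(P / 1e9, rho)      # pressure in GPa, compressed density
```

A series of such (Us, up) pairs measured on the Z-Machine maps out the principal Hugoniot against which the QMD simulations are validated.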
Simulating flow in karst aquifers at laboratory and sub-regional scales using MODFLOW-CFP
NASA Astrophysics Data System (ADS)
Gallegos, Josue Jacob; Hu, Bill X.; Davis, Hal
2013-12-01
Groundwater flow in a well-developed karst aquifer occurs dominantly through bedding planes, fractures, conduits, and caves created and/or enlarged by dissolution. Conventional groundwater modeling methods assume that groundwater flow is described by Darcian principles, where primary (i.e., matrix) porosity and laminar flow are dominant. However, in well-developed karst aquifers the assumption of Darcian flow can be questionable. While Darcian flow generally occurs in the matrix portion of the karst aquifer, flow through conduits can be non-laminar, where the relation between specific discharge and hydraulic gradient is non-linear. MODFLOW-CFP is a relatively new modeling program that accounts for both laminar and non-laminar flow in pipes, like karst caves, within an aquifer. In this study, results from MODFLOW-CFP are compared to those from MODFLOW-2000/2005, a numerical code based on Darcy's law, to evaluate the accuracy that CFP can achieve when modeling flows in karst aquifers at laboratory and sub-regional (Woodville Karst Plain, Florida, USA) scales. In comparison with laboratory experiments, the simulation results from MODFLOW-CFP were more accurate than those from MODFLOW-2005. At the sub-regional scale, MODFLOW-CFP was more accurate than MODFLOW-2000 for simulating field measurements of peak flow at one spring and total discharges at two springs for an observed storm event.
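The laminar/non-laminar distinction drawn above can be made concrete with a Reynolds-number estimate for pipe flow; a hedged sketch with illustrative values:

```python
# Why Darcian (laminar) assumptions break down in karst conduits:
# a Reynolds-number check for water flow.  All values are illustrative.
def reynolds(velocity, diameter, kin_visc=1.0e-6):
    """Reynolds number for flow in a circular conduit (water: nu ~ 1e-6 m^2/s)."""
    return velocity * diameter / kin_visc

# A matrix-like pore versus a well-developed karst conduit
re_pore = reynolds(velocity=1e-5, diameter=1e-3)    # slow seepage, mm-scale pore
re_conduit = reynolds(velocity=0.5, diameter=2.0)   # spring-feeding conduit

print(re_pore, re_conduit)
# Pipe flow is laminar below Re ~ 2000; a conduit at Re ~ 1e6 is fully
# turbulent, so discharge no longer varies linearly with hydraulic
# gradient -- the regime MODFLOW-CFP's pipe network is designed to handle.
```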
Preparation of a Frozen Regolith Simulant Bed for ISRU Component Testing in a Vacuum Chamber
NASA Technical Reports Server (NTRS)
Kleinhenz, Julie; Linne, Diane
2013-01-01
In-Situ Resource Utilization (ISRU) systems and components have undergone extensive laboratory and field tests to expose hardware to relevant soil environments. The next step is to combine these soil environments with relevant pressure and temperature conditions. Previous testing has demonstrated how to incorporate large bins of unconsolidated lunar regolith into sufficiently sized vacuum chambers. In order to create the appropriate depth-dependent soil characteristics needed to test drilling operations for the lunar surface, the regolith simulant bed must be properly compacted and frozen. While small cryogenic simulant beds have been created for laboratory tests, this larger-scale effort will allow testing of a full 1-m drill developed for a potential lunar prospector mission. Compacted bulk densities were measured at various moisture contents for GRC-3 and Chenobi regolith simulants. Vibrational compaction methods were compared with the previously used hammer compaction, or "Proctor", method. All testing was done per ASTM standard methods. A full 6.13 m3 simulant bed with 6 percent moisture by weight was prepared, compacted in layers, and frozen in a commercial freezer. Temperature and desiccation data were collected to determine logistics for preparation and transport of the simulant bed for thermal vacuum testing. Once in the vacuum facility, the simulant bed will be cryogenically frozen with liquid nitrogen. These cryogenic vacuum tests are underway, but results will not be included in this manuscript.
PRACTICAL SIMULATION OF COMPOSTING IN THE LABORATORY
A closed incubation system was developed for laboratory simulation of composting conditions at the interior of a large compost pile. A conductive heat flux control system (CHFC) was used to adjust the temperature of the internal wall to that of the compost center and compensate f...
Jia, Qianqian; Xiong, Huilei; Wang, Hui; Shi, Hanchang; Sheng, Xinying; Sun, Run; Chen, Guoqiang
2014-11-01
The generation of polyhydroxyalkanoates (PHA) from excess sludge fermentation liquid (SFL) was studied at lab and pilot scale. A PHA-accumulating bacterial consortium (S-150) was isolated from activated sludge using simulated SFL (S-SFL) containing high concentrations of volatile fatty acids (VFAs) and nitrogen. The maximal PHA content accounted for 59.18% of the dry cell weight (DCW) in S-SFL and dropped to 23.47% in actual SFL (L-SFL) at lab scale. The pilot-scale integrated system comprised an anaerobic fermentation reactor (AFR), a ceramic membrane system (CMS) and a PHA production bioreactor (PHAR). The PHA content from pilot-scale SFL (P-SFL) finally reached 59.47% DCW, with a maximal PHA yield coefficient (YP/S) of 0.17 g PHA/g COD. The results indicated that VFA-containing SFL is suitable for PHA production. The adverse impact of excess nitrogen and non-VFAs in SFL might be eliminated by pilot-scale domestication, which might have resulted in optimization of the community structure and improvement of the substrate-selection ability of S-150. Copyright © 2014 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wantuck, P. J.; Hollen, R. M.
2002-01-01
This paper provides an overview of some design and automation-related projects ongoing within the Applied Engineering Technologies (AET) Group at Los Alamos National Laboratory. AET uses a diverse set of technical capabilities to develop and apply processes and technologies to applications for a variety of customers both internal and external to the Laboratory. The Advanced Recovery and Integrated Extraction System (ARIES) represents a new paradigm for the processing of nuclear material from retired weapon systems in an environment that seeks to minimize the radiation dose to workers. To achieve this goal, ARIES relies upon automation-based features to handle and process the nuclear material. Our Chemical Process Development Team specializes in fuzzy logic and intelligent control systems. Neural network technology has been utilized in some advanced control systems developed by team members. Genetic algorithms and neural networks have often been applied for data analysis. Enterprise modeling, or discrete event simulation, as well as chemical process simulation, has been employed for chemical process plant design. Fuel cell research and development has historically been an active effort within the AET organization. Under the principal sponsorship of the Department of Energy, the Fuel Cell Team is now focusing on technologies required to produce fuel-cell-compatible feed gas from reformation of a variety of conventional fuels (e.g., gasoline, natural gas), principally for automotive applications. This effort involves chemical reactor design and analysis, process modeling, catalyst analysis, as well as full-scale system characterization and testing. The group's Automation and Robotics team has at its foundation many years of experience delivering automated and robotic systems for nuclear, analytical chemistry, and bioengineering applications.
As an integrator of commercial systems and a developer of unique custom-made systems, the team currently supports the automation needs of many Laboratory programs.
Small-scale multi-axial hybrid simulation of a shear-critical reinforced concrete frame
NASA Astrophysics Data System (ADS)
Sadeghian, Vahid; Kwon, Oh-Sung; Vecchio, Frank
2017-10-01
This study presents a numerical multi-scale simulation framework which is extended to accommodate hybrid simulation (numerical-experimental integration). The framework is enhanced with a standardized data exchange format and connected to a generalized controller interface program which facilitates communication with various types of laboratory equipment and testing configurations. A small-scale experimental program was conducted using six degree-of-freedom hydraulic testing equipment to verify the proposed framework and provide additional data for small-scale testing of shear-critical reinforced concrete structures. The specimens were tested in a multi-axial hybrid simulation manner under a reversed cyclic loading condition simulating earthquake forces. The physical models were 1/3.23-scale representations of a beam and two columns. A mixed-type modelling technique was employed to analyze the remainder of the structures. The hybrid simulation results were compared against those obtained from a large-scale test and finite element analyses. The study found that if precautions are taken in preparing model materials and if the shear-related mechanisms are accurately considered in the numerical model, small-scale hybrid simulations can adequately simulate the behaviour of shear-critical structures. Although the findings of the study are promising, additional test data are required to draw general conclusions.
NASA Astrophysics Data System (ADS)
Hristova-Veleva, S.; Chao, Y.; Vane, D.; Lambrigtsen, B.; Li, P. P.; Knosp, B.; Vu, Q. A.; Su, H.; Dang, V.; Fovell, R.; Tanelli, S.; Garay, M.; Willis, J.; Poulsen, W.; Fishbein, E.; Ao, C. O.; Vazquez, J.; Park, K. J.; Callahan, P.; Marcus, S.; Haddad, Z.; Fetzer, E.; Kahn, R.
2007-12-01
In spite of recent improvements in hurricane track forecast accuracy, there are still many unanswered questions about the physical processes that determine hurricane genesis, intensity, track, and impact on the large-scale environment. Furthermore, a significant amount of work remains to be done in validating hurricane forecast models, understanding their sensitivities, and improving their parameterizations. None of this can be accomplished without a comprehensive set of multiparameter observations that are relevant to both the large-scale and the storm-scale processes in the atmosphere and in the ocean. To address this need, we have developed a prototype of a comprehensive hurricane information system of high-resolution satellite, airborne, and in-situ observations and model outputs pertaining to: i) the thermodynamic and microphysical structure of the storms; ii) the air-sea interaction processes; iii) the larger-scale environment as depicted by the SST, ocean heat content, and the aerosol loading of the environment. Our goal was to create a one-stop place that provides researchers with an extensive set of observed hurricane data, and their graphical representation, together with large-scale and convection-resolving model output, all organized so that it is easy to determine when coincident observations from multiple instruments are available. Analysis tools will be developed in the next step. The analysis tools will be used to determine spatial, temporal, and multiparameter covariances that are needed to evaluate model performance, provide information for data assimilation, and characterize and compare observations from different platforms. We envision that the developed hurricane information system will help in the validation of hurricane models, in the systematic understanding of their sensitivities, and in the improvement of the physical parameterizations employed by the models.
Furthermore, it will help in studying the physical processes that affect hurricane development and impact on the large-scale environment. This talk will describe the developed prototype of the hurricane information system. Furthermore, we will use a set of WRF hurricane simulations and compare simulated to observed structures to illustrate how the information system can be used to discriminate between simulations that employ different physical parameterizations. The work described here was performed at the Jet Propulsion Laboratory, California Institute of Technology, under contract with the National Aeronautics and Space Administration.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Im, Piljae; Cho, Heejin; Kim, Dongsu
2016-08-01
This report provides second-year project simulation results for the multi-year project titled “Evaluation of Variable Refrigerant Flow (VRF) System on Oak Ridge National Laboratory (ORNL)’s Flexible Research Platform (FRP).”
Chudnoff, Scott G; Liu, Connie S; Levie, Mark D; Bernstein, Peter; Banks, Erika H
2010-09-01
To assess whether a novel educational curriculum using a simulation teaching laboratory improves resident knowledge of, comfort with, and surgical performance of hysteroscopic sterilization. An educational prospective pretest/posttest study conducted at the Montefiore Institute of Minimally Invasive Surgery Laboratory with thirty-four OB/GYN residents in an academic medical center. The intervention was a hysteroscopic sterilization simulation laboratory and a brief didactic lecture. Outcomes were differences in scores on validated skill-assessment tools (a task-specific checklist, a Global Rating Scale (GRS), and a pass/fail assessment) and on a multiple-choice examination evaluating knowledge and attitude. In the entire cohort, improvements were observed on all evaluation tools after the simulation laboratory: 31 percentage points (SD ±11.5, 95% confidence interval [CI] 27.3-35.3) higher on the written evaluation; 63 percentage points (SD ±15.7, 95% CI 57.8-68.8) higher on the task-specific checklist; and 54 percentage points (SD ±13.6, 95% CI 48.8-58.3) higher on the GRS. Higher PGY status correlated with better pretest performance but not with statistically significant differences in posttest scores. Residents reported improved comfort performing the procedure after the laboratory. Simulation laboratory teaching significantly improved resident knowledge, comfort level, and technical skill performance of hysteroscopic sterilization. Copyright (c) 2010 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.
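As a quick plausibility check, the reported confidence intervals are consistent with a simple normal-approximation interval for n = 34 subjects (a sketch only; the authors' exact statistical method is not stated in the abstract):

```python
import math

# Normal-approximation 95% CI for the mean written-exam improvement
# (31 percentage points, SD 11.5, n = 34), using the figures quoted above.
mean, sd, n = 31.0, 11.5, 34
half_width = 1.96 * sd / math.sqrt(n)
lo, hi = mean - half_width, mean + half_width
print(round(lo, 1), round(hi, 1))
# Close to the reported 27.3-35.3; small differences may reflect the
# exact test or interval method the authors actually used.
```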
Does a surgical simulator improve resident operative performance of laparoscopic tubal ligation?
Banks, Erika H; Chudnoff, Scott; Karmin, Ira; Wang, Cuiling; Pardanani, Setul
2007-11-01
The purpose of this study was to assess whether a surgical skills simulator laboratory improves resident knowledge and operative performance of laparoscopic tubal ligation. Twenty postgraduate year 1 residents were assigned randomly to either a surgical simulator laboratory on laparoscopic tubal ligation together with apprenticeship teaching in the operating room or to apprenticeship teaching alone. Tests that were given before and after the training assessed basic knowledge. Attending physicians who were blinded to resident randomization status evaluated postgraduate year 1 performance on a laparoscopic tubal ligation in the operating room with 3 validated tools: a task-specific checklist, global rating scale, and pass/fail grade. Postgraduate year 1 residents who were assigned randomly to the surgical simulator laboratory performed significantly better than control subjects on all 3 surgical assessment tools (the checklist, the global score, and the pass/fail analysis) and scored significantly better on the knowledge posttest (all P < .0005). Compared with apprenticeship teaching alone, a surgical simulator laboratory on laparoscopic tubal ligation improved resident knowledge and performance in the operating room.
NASA Astrophysics Data System (ADS)
Blázquez, M.; Egizabal, A.; Unzueta, I.
2014-08-01
The LIFE+ project SIRENA, "Simulation of the release of nanomaterials from consumer products for environmental exposure assessment" (LIFE11 ENV/ES/596), has set up a Technological Surveillance System (TSS) to trace technical references worldwide related to nanocomposites and release from nanocomposites. So far a total of seventy-three items of different natures (from peer-reviewed articles to presentations and contributions to congresses) have been selected and classified as "nanomaterials release simulation technologies". In the present document, different approaches for simulating different life-cycle stages through the physical degradation of polymer nanocomposites at laboratory scale are assessed. In the absence of a reference methodology, the comparison of the different protocols used remains a challenge.
Mixing in a stratified shear flow: Energetics and sampling
NASA Technical Reports Server (NTRS)
Ivey, G. N.; Koseff, J. R.; Briggs, D. A.; Ferziger, J. H.
1993-01-01
Direct numerical simulations of the time evolution of homogeneous stably stratified shear flows have been performed for Richardson numbers from 0 to 1 and for Prandtl numbers between 0.1 and 2. The results indicate that the mixing efficiency R(sub f) varies with turbulent Froude number in a manner consistent with laboratory experiments performed at Prandtl numbers of 0.7 and 700. However, unlike the laboratory results, for a particular Froude number the simulations do not show a clear dependence of the magnitude of R(sub f) on Pr. The observed maximum value of R(sub f) is 0.25. When averaged over vertical length scales an order of magnitude greater than either the overturning or Ozmidov scales of the flow, the simulations indicate that the dissipation rate epsilon is only weakly lognormally distributed, with an intermittency of about 0.01, whereas estimated values in the ocean are 3 to 7.
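The quantities above (Ozmidov scale, mixing efficiency) can be sketched with common textbook definitions; note that conventions vary across the literature, and the numbers below are illustrative rather than taken from these simulations:

```python
import math

# Standard stratified-turbulence diagnostics (definitions vary in the
# literature; these are common textbook forms, with illustrative values).
def ozmidov_scale(eps, N):
    """Ozmidov scale L_O = sqrt(eps / N^3): largest overturn stratification allows."""
    return math.sqrt(eps / N**3)

def mixing_efficiency(buoyancy_flux, eps):
    """Flux Richardson number R_f = b / (b + eps)."""
    return buoyancy_flux / (buoyancy_flux + eps)

eps = 1.0e-8      # dissipation rate, W/kg (illustrative ocean value)
N = 5.0e-3        # buoyancy frequency, 1/s
b = 3.0e-9        # buoyancy flux, W/kg

print(ozmidov_scale(eps, N))        # metres
print(mixing_efficiency(b, eps))    # below the ~0.25 maximum noted above
```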
Cooperative Collision Avoidance Technology Demonstration Data Analysis Report
NASA Technical Reports Server (NTRS)
2007-01-01
This report details the National Aeronautics and Space Administration (NASA) Access 5 Project Office Cooperative Collision Avoidance (CCA) Technology Demonstration for unmanned aircraft systems (UAS) conducted from 21 to 28 September 2005. The test platform chosen for the demonstration was the Proteus Optionally Piloted Vehicle operated by Scaled Composites, LLC, flown out of the Mojave Airport, Mojave, CA. A single intruder aircraft, a NASA Gulfstream III, was used during the demonstration to execute a series of near-collision encounter scenarios. Both aircraft were equipped with Traffic Alert and Collision Avoidance System-II (TCAS-II) and Automatic Dependent Surveillance-Broadcast (ADS-B) systems. The objective of this demonstration was to collect flight data to support validation efforts for the Access 5 CCA Work Package Performance Simulation and Systems Integration Laboratory (SIL). Correlation of the flight data with results obtained from the performance simulation serves as the basis for the simulation validation. A similar effort uses the flight data to validate the SIL architecture that contains the same sensor hardware that was used during the flight demonstration.
Rundle, J. B.; Tiampo, K. F.; Klein, W.; Sá Martins, J. S.
2002-01-01
Threshold systems are known to be some of the most important nonlinear self-organizing systems in nature, including networks of earthquake faults, neural networks, superconductors and semiconductors, and the World Wide Web, as well as political, social, and ecological systems. All of these systems have dynamics that are strongly correlated in space and time, and all typically display a multiplicity of spatial and temporal scales. Here we discuss the physics of self-organization in earthquake threshold systems at two distinct scales: (i) The “microscopic” laboratory scale, in which consideration of results from simulations leads to dynamical equations that can be used to derive the results obtained from sliding friction experiments, and (ii) the “macroscopic” earthquake fault-system scale, in which the physics of strongly correlated earthquake fault systems can be understood by using time-dependent state vectors defined in a Hilbert space of eigenstates, similar in many respects to the mathematics of quantum mechanics. In all of these systems, long-range interactions induce the existence of locally ergodic dynamics. The existence of dissipative effects leads to the appearance of a “leaky threshold” dynamics, equivalent to a new scaling field that controls the size of nucleation events relative to the size of background fluctuations. At the macroscopic earthquake fault-system scale, these ideas show considerable promise as a means of forecasting future earthquake activity. PMID:11875204
Gawande, Nitin A; Reinhart, Debra R; Yeh, Gour-Tsyh
2010-02-01
Biodegradation process modeling of municipal solid waste (MSW) bioreactor landfills requires knowledge of the various process reactions and the corresponding kinetic parameters. Mechanistic models available to date are able to simulate biodegradation processes with the help of pre-defined species and reactions. Some of these models consider the effect of critical parameters such as moisture content, pH, and temperature. Biomass concentration is a vital parameter for any biomass growth model and is often not compared with field and laboratory results. A more complex biodegradation model includes a large number of chemical and microbiological species, and increasing the number of species and user-defined process reactions in the simulation requires a robust numerical tool. A generalized microbiological and chemical model, BIOKEMOD-3P, was developed to simulate biodegradation processes in three phases (Gawande et al. 2009). This paper presents the application of this model to simulate laboratory-scale MSW bioreactors under anaerobic conditions. BIOKEMOD-3P was able to closely simulate the experimental data. The results from this study may help in applying the model to full-scale landfill operation.
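Biomass growth models of this kind are typically built on Monod-type kinetics; a minimal, hypothetical sketch (BIOKEMOD-3P's actual multi-phase formulation is far more general, and every parameter value below is illustrative):

```python
# Minimal Monod growth/substrate pair integrated with explicit Euler steps
# (a sketch of the kinetic building block such simulators use; all
# parameter values are hypothetical, not from the cited study).
def monod_step(X, S, dt, mu_max=0.2, Ks=50.0, Y=0.4, kd=0.02):
    """One Euler step for biomass X and substrate S (e.g. mg/L; rates in 1/day)."""
    mu = mu_max * S / (Ks + S)          # Monod specific growth rate
    dX = (mu - kd) * X                  # growth minus endogenous decay
    dS = -(mu / Y) * X                  # substrate consumed per unit growth
    return X + dX * dt, S + dS * dt

X, S = 10.0, 1000.0                     # initial biomass and substrate
for _ in range(1000):                   # 100 days at dt = 0.1 day
    X, S = monod_step(X, S, dt=0.1)
    S = max(S, 0.0)                     # substrate cannot go negative
print(X, S)                             # substrate is largely consumed
```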
INTEGRATION OF PHOTOCATALYTIC OXIDATION WITH AIR STRIPPING OF CONTAMINATED AQUIFERS
Bench-scale laboratory studies and pilot-scale studies in a simulated field-test situation were performed to evaluate the integration of gas-solid ultraviolet (UV) photocatalytic oxidation (PCO) downstream of an air stripper unit as a technology for cost-effectively treating water...
Pradhan, Ranjan; Misra, Manjusri; Erickson, Larry; Mohanty, Amar
2010-11-01
A laboratory-scale simulated composting facility (per ASTM D 5338) was designed and utilized to determine and evaluate the extent of degradation of polylactic acid (PLA), untreated wheat and soy straw, and injection-moulded composites of PLA-wheat straw (70:30) and PLA-soy straw (70:30). The outcomes of the study demonstrated the suitability of the test protocol and the validity of the test system, and defined the compostability of composites of PLA with unmodified natural substrates. The study will help in designing composites using modified soy straw and wheat straw as reinforcement/filler to satisfy ASTM D 6400 specifications. Copyright 2010 Elsevier Ltd. All rights reserved.
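In composting tests of this type, the extent of degradation is commonly expressed as evolved CO2 relative to the sample's theoretical CO2; a hedged sketch with hypothetical measurements (not data from the study):

```python
# Percent biodegradation from cumulative CO2 evolution, the usual metric
# in ASTM D 5338-style composting tests (sketch; measured values below
# are hypothetical).
def percent_biodegradation(co2_evolved_g, sample_carbon_g):
    """Evolved CO2 relative to the theoretical CO2 of the sample's carbon."""
    co2_theoretical = sample_carbon_g * 44.0 / 12.0   # C -> CO2 mass ratio
    return 100.0 * co2_evolved_g / co2_theoretical

carbon_g = 5.0     # assumed carbon content of the test specimen, g
co2_g = 16.5       # assumed cumulative CO2 evolved, g
print(percent_biodegradation(co2_g, carbon_g))   # ~90 percent
```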
NASA Astrophysics Data System (ADS)
Cheng, D. L. C.; Quinn, J. D.; Larour, E. Y.; Halkides, D. J.
2017-12-01
The Virtual Earth System Laboratory (VESL) is a Web application, under continued development at the Jet Propulsion Laboratory and UC Irvine, for the visualization of Earth System data and process simulations. As with any project of its size, we have encountered both successes and challenges during the course of development. Our principal point of success is the fact that VESL users can interact seamlessly with our earth science simulations within their own Web browser. Some of the challenges we have faced include retrofitting the VESL Web application to respond to touch gestures, reducing page load time (especially as the application has grown), and accounting for the differences between the various Web browsers and computing platforms.
Accuracy of finite-difference modeling of seismic waves : Simulation versus laboratory measurements
NASA Astrophysics Data System (ADS)
Arntsen, B.
2017-12-01
The finite-difference technique for numerical modeling of seismic waves remains important and, in some areas, extensively used. For exploration purposes, finite-difference simulation is at the core of both traditional imaging techniques, such as reverse-time migration, and more elaborate full-waveform inversion techniques. The accuracy and fidelity of finite-difference simulation of seismic waves are hard to quantify, and meaningful error analysis is really only readily available for simple media. A possible alternative to theoretical error analysis is to compare finite-difference simulated data with laboratory data created using a scale model. The advantage of this approach is accurate knowledge of the model, within measurement precision, and of the locations of sources and receivers. We use a model made of PVC immersed in water, containing horizontal and tilted interfaces together with several spherical objects, to generate ultrasonic pressure reflection measurements. The physical dimension of the model is of the order of a meter, which after scaling represents a model with dimensions of the order of 10 kilometers and frequencies in the range of one to thirty hertz. We find that for plane horizontal interfaces the laboratory data can be reproduced by the finite-difference scheme with relatively small error, but for steeply tilted interfaces the error increases. For spherical interfaces the discrepancy between laboratory data and simulated data is sometimes much more severe, to the extent that it is not possible to simulate reflections from parts of highly curved bodies. These results are important in view of the fact that finite-difference modeling is often at the core of imaging and inversion algorithms tackling complicated geological areas with highly curved interfaces.
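A minimal version of the technique in question, a second-order explicit finite-difference scheme for the 1-D acoustic wave equation, can be sketched as follows (illustrative grid and medium parameters only; production codes are 3-D, higher order, and use absorbing boundaries):

```python
# Second-order finite-difference scheme for the 1-D acoustic wave
# equation u_tt = c^2 u_xx, the basic building block of seismic FD
# modeling.  All numbers are illustrative.
nx, dx = 300, 10.0            # grid points, spacing (m)
c = 1500.0                    # wave speed (m/s), water-like
dt = 0.4 * dx / c             # time step well inside the CFL limit
nt = 400

prev = [0.0] * nx             # wavefield at step n-1
curr = [0.0] * nx             # wavefield at step n
curr[nx // 2] = 1.0           # impulsive source in the middle of the grid

r2 = (c * dt / dx) ** 2       # squared Courant number (must be <= 1)
for _ in range(nt):
    nxt = [0.0] * nx          # fixed (reflecting) ends at i = 0, nx-1
    for i in range(1, nx - 1):
        nxt[i] = (2 * curr[i] - prev[i]
                  + r2 * (curr[i + 1] - 2 * curr[i] + curr[i - 1]))
    prev, curr = curr, nxt

print(max(abs(v) for v in curr))   # bounded: the scheme is stable at this CFL
```

The error sources the abstract discusses (numerical dispersion on coarse grids, stair-step approximation of tilted and curved interfaces) are properties of exactly this kind of stencil.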
Simulating Laboratory Procedures.
ERIC Educational Resources Information Center
Baker, J. E.; And Others
1986-01-01
Describes the use of computer assisted instruction in a medical microbiology course. Presents examples of how computer assisted instruction can present case histories in which the laboratory procedures are simulated. Discusses an authoring system used to prepare computer simulations and provides one example of a case history dealing with fractured…
Remote sensing and field test capabilities at U.S. Army Dugway Proving Ground
NASA Astrophysics Data System (ADS)
Pearson, James T.; Herron, Joshua P.; Marshall, Martin S.
2011-11-01
U.S. Army Dugway Proving Ground (DPG) is a Major Range and Test Facility Base (MRTFB) with the mission of testing chemical and biological defense systems and materials. DPG facilities include state-of-the-art laboratories, extensive test grids, controlled environment calibration facilities, and a variety of referee instruments for required test measurements. Among these referee instruments, DPG has built up a significant remote sensing capability for both chemical and biological detection. Technologies employed for remote sensing include FTIR spectroscopy, UV spectroscopy, Raman-shifted eye-safe lidar, and other elastic backscatter lidar systems. These systems provide referee data for bio-simulants, chemical simulants, toxic industrial chemicals (TICs), and toxic industrial materials (TIMs). In order to realize a successful large scale open-air test, each type of system requires calibration and characterization. DPG has developed specific calibration facilities to meet this need. These facilities are the Joint Ambient Breeze Tunnel (JABT), and the Active Standoff Chamber (ASC). The JABT and ASC are open ended controlled environment tunnels. Each includes validation instrumentation to characterize simulants that are disseminated. Standoff systems are positioned at typical field test distances to measure characterized simulants within the tunnel. Data from different types of systems can be easily correlated using this method, making later open air test results more meaningful. DPG has a variety of large scale test grids available for field tests. After and during testing, data from the various referee instruments is provided in a visual format to more easily draw conclusions on the results. This presentation provides an overview of DPG's standoff testing facilities and capabilities, as well as example data from different test scenarios.
Enabling Extreme Scale Earth Science Applications at the Oak Ridge Leadership Computing Facility
NASA Astrophysics Data System (ADS)
Anantharaj, V. G.; Mozdzynski, G.; Hamrud, M.; Deconinck, W.; Smith, L.; Hack, J.
2014-12-01
The Oak Ridge Leadership Computing Facility (OLCF), established at the Oak Ridge National Laboratory (ORNL) under the auspices of the U.S. Department of Energy (DOE), welcomes investigators from universities, government agencies, national laboratories and industry who are prepared to perform breakthrough research across a broad domain of scientific disciplines, including earth and space sciences. Titan, the OLCF flagship system, is currently listed as #2 in the Top500 list of supercomputers in the world, and is the largest available for open science. The computational resources are allocated primarily via the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, sponsored by the U.S. DOE Office of Science. In 2014, over 2.25 billion core hours on Titan were awarded via INCITE projects, including 14% of the allocation toward earth sciences. The INCITE competition is also open to research scientists based outside the USA. In fact, international research projects account for 12% of the INCITE awards in 2014. The INCITE scientific review panel also includes 20% participation from international experts. Recent accomplishments in earth sciences at OLCF include the world's first continuous simulation of 21,000 years of earth's climate history (2009); and an unprecedented simulation of a magnitude 8 earthquake over 125 sq. miles. One of the ongoing international projects involves scaling the ECMWF Integrated Forecasting System (IFS) model to over 200K cores of Titan. ECMWF is a partner in the EU-funded Collaborative Research into Exascale Systemware, Tools and Applications (CRESTA) project. The significance of the research carried out within this project is the demonstration of techniques required to scale current-generation Petascale-capable simulation codes toward the performance levels required for running on future Exascale systems.
One of the techniques pursued by ECMWF is to use Fortran2008 coarrays to overlap computations and communications and to reduce the total volume of data communicated. Use of Titan has enabled ECMWF to plan future scalability developments and resource requirements. We will also discuss the best practices developed over the years in navigating logistical, legal and regulatory hurdles involved in supporting the facility's diverse user community.
Modeling unstable alcohol flooding of DNAPL-contaminated columns
NASA Astrophysics Data System (ADS)
Roeder, Eberhard; Falta, Ronald W.
Alcohol flooding, consisting of the injection of a mixture of alcohol and water, is one source-removal technology for dense non-aqueous phase liquids (DNAPLs) currently under investigation. An existing compositional multiphase flow simulator (UTCHEM) was adapted to accurately represent the equilibrium phase behavior of ternary and quaternary alcohol/DNAPL systems. Simulator predictions were compared to laboratory column experiments and the results are presented here. Several experiments involved unstable displacements of the NAPL bank by the alcohol flood, or of the alcohol flood by the following water flood. Unstable displacement led to additional mixing compared to ideal displacement. This mixing was approximated by a large dispersion in one-dimensional simulations, or by including permeability heterogeneities on a very small scale in three-dimensional simulations. Three-dimensional simulations provided the best match. Simulations of unstable displacements require either high-resolution grids, or need to consider the mixing of fluids in a different manner to capture the resulting effects on NAPL recovery.
Nitrate reduction in a simulated free-water surface wetland system.
Misiti, Teresa M; Hajaya, Malek G; Pavlostathis, Spyros G
2011-11-01
The feasibility of using a constructed wetland for treatment of nitrate-contaminated groundwater resulting from the land application of biosolids was investigated for a site in the southeastern United States. Biosolids degradation led to the release of ammonia, which upon oxidation resulted in nitrate concentrations in the upper aquifer in the range of 65-400 mg N/L. A laboratory-scale system was constructed in support of a pilot-scale project to investigate the effect of temperature, hydraulic retention time (HRT) and nitrate and carbon loading on denitrification using soil and groundwater from the biosolids application site. The maximum specific reduction rates (MSRR), measured in batch assays conducted in a reactor open to the atmosphere at four initial nitrate concentrations from 70 to 400 mg N/L, showed that the nitrate reduction rate was not affected by the initial nitrate concentration. The MSRR values at 22 °C for nitrate and nitrite were 1.2 ± 0.2 and 0.7 ± 0.1 mg N/mg VSS(COD)-day, respectively. MSRR values were also measured at 5, 10, 15 and 22 °C, and the temperature coefficient for nitrate reduction was estimated at 1.13. Based on the performance of laboratory-scale continuous-flow reactors and model simulations, wetland performance can be maintained at high nitrogen removal efficiency (>90%) with an HRT of 3 days or higher and at temperatures as low as 5 °C, as long as there is sufficient biodegradable carbon available to achieve complete denitrification. The results of this study show that, given the climate in the southeastern United States, a constructed wetland can be used for the treatment of nitrate-contaminated groundwater to low, acceptable nitrate levels.
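For context, the reported temperature coefficient is normally applied in the modified-Arrhenius form r_T = r_ref * theta^(T - T_ref); a minimal sketch using the abstract's reported values (the functional form is a standard convention, not stated explicitly in the paper):

```python
# Temperature correction of the maximum specific reduction rate (MSRR)
# using the values reported above: MSRR = 1.2 mg N/mg VSS(COD)-day at 22 C
# and temperature coefficient theta = 1.13. The modified Arrhenius form
# r_T = r_ref * theta**(T - T_ref) is an assumed (conventional) model.

def msrr_at_temperature(r_ref, theta, t_celsius, t_ref=22.0):
    """MSRR at temperature t_celsius, given the rate r_ref at t_ref."""
    return r_ref * theta ** (t_celsius - t_ref)

# Rate at the coldest temperature tested (5 C); roughly an 8-fold slowdown.
print(msrr_at_temperature(1.2, 1.13, 5.0))
```

A correction of this kind is what allows laboratory rates to be extrapolated to winter wetland temperatures.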
Multiple-Scale Physics During Magnetic Reconnection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jara-Almonte, Jonathan
Magnetic reconnection is a key fundamental process in magnetized plasmas wherein the global magnetic topology is modified and stored energy is transferred from fields to particles. Reconnection is an inherently local process, and the mechanisms that couple it to global-scale dynamics are not well understood. This dissertation explores two different mechanisms for cross-scale coupling during magnetic reconnection. As one example, we theoretically examine reconnection in a collisionless plasma using particle-in-cell simulations and demonstrate that large-scale reconnection physics can couple to and drive microscopic instabilities, even in two-dimensional systems, if significant scale separation exists between the Debye length and the electron skin depth. The physics underlying these instabilities is explained using simple theoretical models, and their potential connection to existing discrepancies between laboratory experiments and numerical simulations is explored. In three-dimensional systems, these instabilities are shown to generate anomalous resistivity that balances a substantial fraction of the electric field. In contrast, we also use experiments to investigate cross-scale couplings during reconnection in a collisional plasma. A leading candidate for coupling global and local scales is the hierarchical breakdown of elongated, reconnecting current sheets into numerous smaller current sheets, the plasmoid instability. In the Magnetic Reconnection Experiment (MRX), recent hardware improvements have extended the accessible parameter space, allowing for the study of long-lived, elongated current sheets. Moreover, by using argon, reproducible and collisional plasmas are produced, which allow for a detailed statistical study of collisional reconnection. As a result, we have conclusively measured the onset of sub-ion-scale plasmoids during resistive, anti-parallel reconnection for the first time.
The current sheet thickness is intermediate between ion and electron kinetic scales, such that the plasma is in the Hall-MHD regime. Surprisingly, plasmoids are observed at Lundquist numbers below 100, well below theoretical predictions (above 10,000). The number of plasmoids scales with both Lundquist number and current sheet aspect ratio. The Hall quadrupolar fields are shown to suppress plasmoids. Finally, plasmoids are shown to couple local and global physics by enhancing the reconnection rate. These results are compared with prior studies of tearing and plasmoid instability, and implications for astrophysical plasmas, laboratory experiments, and theoretical studies of reconnection are discussed.
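For readers outside the field, the Lundquist number quoted above is S = mu0 * L * v_A / eta, with v_A the Alfvén speed; a minimal sketch (the numerical values below are illustrative placeholders, not MRX parameters):

```python
import math

# Lundquist number S = mu0 * L * v_A / eta for a current sheet of
# half-length L and plasma resistivity eta. All example values below are
# illustrative placeholders, not measurements from MRX.

MU0 = 4e-7 * math.pi  # vacuum permeability, H/m

def alfven_speed(b_tesla, n_per_m3, ion_mass_kg):
    """Alfven speed v_A = B / sqrt(mu0 * n * m_i)."""
    return b_tesla / math.sqrt(MU0 * n_per_m3 * ion_mass_kg)

def lundquist_number(b_tesla, n_per_m3, ion_mass_kg, length_m, eta):
    """S = mu0 * L * v_A / eta."""
    return MU0 * length_m * alfven_speed(b_tesla, n_per_m3, ion_mass_kg) / eta

# Placeholder argon-like parameters: B = 0.01 T, n = 1e19 m^-3, L = 0.1 m.
print(lundquist_number(0.01, 1e19, 6.63e-26, 0.1, 1e-4))
```

Collisional laboratory plasmas sit at far lower S than astrophysical ones, which is what makes plasmoid onset below S = 100 noteworthy.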
Sensitivity of CEAP cropland simulations to the parameterization of the APEX model
USDA-ARS?s Scientific Manuscript database
For large scale applications like the U.S. National Scale Conservation Effects Assessment Project (CEAP), soil hydraulic characteristics data are not readily available and therefore need to be estimated. Field soil water properties are commonly approximated using laboratory soil water retention meas...
NASA Astrophysics Data System (ADS)
Dobson, P. F.; Kneafsey, T. J.
2001-12-01
As part of an ongoing effort to evaluate THC effects on flow in fractured media, we performed a laboratory experiment and numerical simulations to investigate mineral dissolution and precipitation. To replicate mineral dissolution by condensate in fractured tuff, deionized water equilibrated with carbon dioxide was flowed for 1,500 hours through crushed Yucca Mountain tuff at 94 °C. The reacted water was collected and sampled for major dissolved species, total alkalinity, electrical conductivity, and pH. The resulting steady-state fluid composition had a total dissolved solids content of about 140 mg/L; silica was the dominant dissolved constituent. A portion of the steady-state reacted water was flowed at 10.8 mL/hr into a 31.7-cm tall, 16.2-cm wide, vertically oriented planar fracture with a hydraulic aperture of 31 microns in a block of welded Topopah Spring tuff that was maintained at 80 °C at the top and 130 °C at the bottom. The fracture began to seal within five days. A 1-D plug-flow model using the TOUGHREACT code developed at Berkeley Lab was used to simulate mineral dissolution, and a 2-D model was developed to simulate the flow of mineralized water through a planar fracture, where boiling conditions led to mineral precipitation. Predicted concentrations of the major dissolved constituents for the tuff dissolution were within a factor of 2 of the measured average steady-state compositions. The fracture-plugging simulations resulted in the precipitation of amorphous silica at the base of the boiling front, leading to a hundred-fold decrease in fracture permeability in less than 6 days, consistent with the laboratory experiment. These results help validate the use of the TOUGHREACT code for THC modeling of the Yucca Mountain system. The experiment and simulations indicate that boiling and concomitant precipitation of amorphous silica could cause significant reductions in fracture porosity and permeability on a local scale.
The TOUGHREACT code will be used to evaluate the larger-scale silica sealing observed in a portion of the Yellowstone geothermal system, a natural analog of the processes studied in the precipitation experiment.
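The link between aperture loss and the hundred-fold permeability drop can be illustrated with the parallel-plate (cubic-law) idealization k = b^2/12; this is a textbook relation used here for context, not a statement about the TOUGHREACT model itself:

```python
# Parallel-plate (cubic-law) fracture permeability, k = b**2 / 12, used to
# illustrate why narrowing the 31-micron hydraulic aperture sharply reduces
# permeability. A standard idealization, assumed here for illustration only.

def fracture_permeability(aperture_m):
    """Cubic-law permeability (m^2) of a parallel-plate fracture."""
    return aperture_m ** 2 / 12.0

b0 = 31e-6                              # initial aperture from the abstract
ratio = fracture_permeability(b0) / fracture_permeability(b0 / 10.0)
print(ratio)  # a tenfold aperture loss gives a hundred-fold permeability drop
```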
NASA Astrophysics Data System (ADS)
Jakubský, Michal; Lenhard, Richard; Vantúch, Martin; Malcho, Milan
2012-04-01
Under call OPVaV-2008/2.2/01-SORO of the Operational Programme Research and Development (knowledge and technology transfer from research and development into practice, ITMS-26220220057), whose strategic goal is a device to use low-potential geothermal heat without forced circulation of the heat carrier deep in the well, a simulator of low-potential geothermal energy transport with comparative test boreholes was constructed in the Department of Energy Technology laboratory. The article describes a device designed as a scale model of two deep boreholes, each of which extracts the earth's heat using a different heat-transfer technology and heat carrier. The device with forced circulation of the heat carrier corresponds in its construction to equipment currently used to transport heat from deep boreholes; CO2 is used as its heat carrier. The device without forced circulation of the heat carrier represents a new technology, in which ammonia (NH3) is used as the heat carrier.
Moreno-Ger, Pablo; Torrente, Javier; Bustamante, Julián; Fernández-Galaz, Carmen; Fernández-Manjón, Baltasar; Comas-Rengifo, María Dolores
2010-06-01
Practical sessions in undergraduate medical education are often costly and have to face constraints in terms of available laboratory time and practice materials (e.g. blood samples from animals). This makes it difficult to increase the time each student spends at the laboratory. We consider that it would be possible to improve the effectiveness of the laboratory time by providing the students with computer-based simulations for prior rehearsal. However, this approach still presents issues in terms of development costs and distribution to the students. This study investigates the employment of low-cost simulation to allow medical students to rehearse practical exercises through a web-based e-learning environment. The aim is to maximize the efficiency of laboratory time and resources allocated by letting students become familiarized with the equipment and the procedures before they attend a laboratory session, but without requiring large-scale investment. Moreover, students can access the simulation via the Internet and rehearse at their own pace. We have studied the effects of such a simulation in terms of impact on the laboratory session, learning outcomes and student satisfaction. We created a simulation that covers the steps of a practical exercise in a Physiology course (measuring hematocrit in a blood sample). An experimental group (EG, n=66) played the simulation 1 week before the laboratory session. A control group (CG, n=77) attended the laboratory session without playing the simulation. After the session, all students completed a survey about their perception of the difficulty of the exercise on a scale of 1-10 and the HCT final value that they obtained. The students in the EG also completed a survey about their satisfaction with the experience. After the laboratory session, the perceived difficulty of the procedure was lower on average in the EG compared to the CG (3.52 vs. 4.39, 95% CI: 0.16-1.57, P=.016). 
There was no significant difference in terms of perceived difficulty using the equipment. The HCT measures reported by the EG also showed a much lower dispersion, indicating higher reliability in determining the HCT value (3.10 vs. 26.94, SD; variances significantly different, P<.001, F: 75.25, Dfd: 68.19 for EG and CG). In the satisfaction test, the majority of the students in the EG reported that the experience was positive or very positive (80.7%) and reported that it had helped them to identify and use the equipment (78%) and to perform the exercise (66%). The simulation was well received by students in the EG, who felt more comfortable during the laboratory session, and it helped them to perform the exercise better, obtaining more accurate results, which indicates more effective training. EG students perceived the procedure as easier to perform, but did not report an improvement in the perceived difficulty of using the equipment. The increased reliability demonstrates that low-cost simulations are a good complement to laboratory sessions.
USDA-ARS?s Scientific Manuscript database
Accurate determination of predicted environmental concentrations (PECs) is a continuing and often elusive goal of pesticide risk assessment. PECs are typically derived using simulation models that depend on laboratory generated data for key input parameters (t1/2, Koc, etc.). Model flexibility in ...
Computer Laboratory for Multi-scale Simulations of Novel Nanomaterials
2014-09-15
schemes for multiscale modeling of polymers. Permselective ion-exchange membranes for protective clothing, fuel cells, and batteries are of special... polyelectrolyte membranes (PEM) with chemical warfare agents (CWA) and their simulants and (2) development of new simulation methods and computational... chemical potential using the gauge cell method and calculation of density profiles. However, the code does not run in parallel environments. For mesoscale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kraucunas, Ian P.; Clarke, Leon E.; Dirks, James A.
2015-04-01
The Platform for Regional Integrated Modeling and Analysis (PRIMA) is an innovative modeling system developed at Pacific Northwest National Laboratory (PNNL) to simulate interactions among natural and human systems at scales relevant to regional decision making. PRIMA brings together state-of-the-art models of regional climate, hydrology, agriculture, socioeconomics, and energy systems using a flexible coupling approach. The platform can be customized to inform a variety of complex questions and decisions, such as the integrated evaluation of mitigation and adaptation options across a range of sectors. Research into stakeholder decision support needs underpins the platform's application to regional issues, including uncertainty characterization. Ongoing numerical experiments are yielding new insights into the interactions among human and natural systems on regional scales, with an initial focus on the energy-land-water nexus in the upper U.S. Midwest. This paper focuses on PRIMA's functional capabilities and describes some lessons learned to date about integrated regional modeling.
Direct numerical simulations of magmatic differentiation at the microscopic scale
NASA Astrophysics Data System (ADS)
Sethian, J.; Suckale, J.; Elkins-Tanton, L. T.
2010-12-01
A key question in the context of magmatic differentiation and fractional crystallization is the ability of crystals to decouple from the ambient fluid and sink or rise. Field data indicates a complex spectrum of behavior ranging from rapid sedimentation to continued entrainment. Theoretical and laboratory studies paint a similarly rich picture. The goal of this study is to provide a detailed numerical assessment of the competing effects of sedimentation and entrainment at the scale of individual crystals. The decision to simulate magmatic differentiation at the grain scale comes at the price of not being able to simultaneously solve for the convective velocity field at the macroscopic scale, but has the crucial advantage of enabling us to fully resolve the dynamics of the systems from first principles without requiring any simplifying assumptions. The numerical approach used in this study is a customized computational methodology developed specifically for simulations of solid-fluid coupling in geophysical systems. The algorithm relies on a two-step projection scheme: In the first step, we solve the multiple-phase Navier-Stokes or Stokes equation in both domains. In the second step, we project the velocity field in the solid domain onto a rigid-body motion by enforcing that the deformation tensor in the respective domain is zero. This procedure is also used to enforce the no-slip boundary-condition on the solid-fluid interface. We have extensively validated and benchmarked the method. Our preliminary results indicate that, not unexpectedly, the competing effects of sedimentation and entrainment depend sensitively on the size distribution of the crystals, the aspect ratio of individual crystals and the vigor of the ambient flow field. We provide a detailed scaling analysis and quantify our results in terms of the relevant non-dimensional numbers.
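The two-step scheme described above can be sketched for its second, rigid-body-projection step; this is a minimal 2-D illustration with assumed names and a uniform-mass discretization, not the authors' code:

```python
import numpy as np

# Second step of the scheme described above: project the velocity field in
# the solid (crystal) domain onto a rigid-body motion u = U + omega x r.
# 2-D, equal-mass nodes; names and discretization are illustrative assumptions.

def project_to_rigid_body(pos, vel):
    """pos, vel: (N, 2) arrays of node positions and velocities in the solid."""
    center = pos.mean(axis=0)
    r = pos - center
    u_trans = vel.mean(axis=0)          # translational component
    v_rel = vel - u_trans
    # scalar angular velocity: angular momentum / moment of inertia
    omega = (r[:, 0] * v_rel[:, 1] - r[:, 1] * v_rel[:, 0]).sum() / (r ** 2).sum()
    # rigid-body velocity field: U + omega x r (2-D cross product)
    return u_trans + omega * np.c_[-r[:, 1], r[:, 0]]
```

Forcing zero deformation inside each crystal in this way is what enforces solid-fluid coupling without empirical drag laws, in the spirit of the first-principles approach the abstract describes.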
NASA Astrophysics Data System (ADS)
Buechner, J.; Jain, N.; Sharma, A.
2013-12-01
The four s/c of the Magnetospheric Multiscale (MMS) mission, to be launched in 2014, will use the Earth's magnetosphere as a laboratory to study the microphysics of three fundamental plasma processes. One of them is magnetic reconnection, an essentially multi-scale process. While laboratory experiments and past theoretical investigations have shown that important processes necessary to understand magnetic reconnection take place at electron scales, the MMS mission will for the first time be able to resolve these scales through observations in space. For the measurement strategy of MMS it is important to make specific predictions of the behavior of current sheets with a thickness of the order of the electron skin depth, which play an important role in the evolution of collisionless magnetic reconnection. Since these processes are highly nonlinear and non-local, numerical simulation is needed to specify the current sheet evolution. Here we present new results on the nonlinear evolution of electron-scale current sheets, starting from the linear stage and using 3-D electron-magnetohydrodynamic (EMHD) simulations. The growth rates of the simulated instabilities compare well with the growth rates obtained from linear theory. Mechanisms and conditions of the formation of flux ropes and of current filamentation will be discussed in comparison with the results of fully kinetic simulations. In 3D, the X- and O-point configurations of the magnetic field formed in reconnection planes alternate along the out-of-reconnection-plane direction with the wavelength of the unstable mode. In the presence of multiple reconnection sites, the out-of-plane magnetic field can develop a nested structure of quadrupoles in reconnection planes, similar to the 2-D case, but now with variations in the out-of-plane direction. The structures of the electron flow and magnetic field in 3-D simulations will be compared with those in 2-D simulations to discriminate the essentially 3D features.
We also discuss the influence of guide fields, as in the magnetopause case, and show how the 3-D evolution of an electron current sheet is influenced by the strength of the guide field. This is unlike the 2-D case, where reconnection takes place only in a plane. This work was partially funded by the Max-Planck/Princeton Center for Plasma Physics and the National Science Foundation.
A Simulated Research Problem for Undergraduate Metamorphic Petrology.
ERIC Educational Resources Information Center
Amenta, Roddy V.
1984-01-01
Presents a laboratory problem in metamorphic petrology designed to simulate a research experience. The problem deals with data on scales ranging from a geologic map to hand specimens to thin sections. Student analysis includes identifying metamorphic index minerals, locating their isograds on the map, and determining the folding sequence. (BC)
DOT National Transportation Integrated Search
2017-03-01
A number of full-scale tests have been carried out in the laboratory focused on the shear : performance of simulated precast concrete deck panels (PCP). Shear tests were carried out to : simulate the type of loading that will be applied to the deck p...
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
Critical infrastructures around the world are at constant risk from earthquakes. Most of these critical structures are designed using archaic seismic simulation methods built for early digital computers in the 1970s. Idaho National Laboratory's Seismic Research Group is working to modernize these simulation methods through computational research and large-scale laboratory experiments.
SAGE Validations of Volcanic Jet Simulations
NASA Astrophysics Data System (ADS)
Peterson, A. H.; Wohletz, K. H.; Ogden, D. E.; Gisler, G.; Glatzmaier, G.
2006-12-01
The SAGE (SAIC Adaptive Grid Eulerian) code employs adaptive mesh refinement in solving Eulerian equations of complex fluid flow, desirable for simulation of volcanic eruptions. Preliminary eruption simulations demonstrate its ability to resolve multi-material flows over large domains where dynamics are concentrated in small regions. In order to validate further application of this code to numerical simulation of explosive eruption phenomena, we focus on one of the fundamental physical processes important to the problem, namely the dynamics of an underexpanded jet. Observations of volcanic eruption plumes and laboratory experiments on analog systems document the eruption of overpressured fluid in a supersonic jet that is governed by vent diameter and level of overpressure. The jet is dominated by inertia (very high Reynolds number) and feeds a thermally convective plume controlled by turbulent admixture of the atmosphere. The height above the vent at which the jet loses its inertia is important to know for convective plume predictions that are used to calculate atmospheric dispersal of volcanic products. We simulate a set of well documented laboratory experiments that provide detail on underexpanded jet structure by gas density contours, showing the shape and size of the Mach stem. SAGE results are within several percent of the experiments for position and density of the incident (intercepting) and reflected shocks, slip lines, shear layers, and Mach disk. The simulations also resolve vorticity at the jet margins near the Mach disk, showing turbulent velocity fields down to a scale of 30 micrometers. Benchmarking these results with those of CFDLib (Los Alamos National Laboratory), which solves the full Navier-Stokes equations (includes the viscous stress tensor), shows close agreement, indicating that the adaptive mesh refinement used in SAGE may offset the need for explicit calculation of viscous dissipation.
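For scale, the Mach-disk height of an underexpanded jet is often estimated with the empirical Ashkenas-Sherman correlation x_m/d = 0.67 * sqrt(p0/p_ambient); a minimal sketch (the correlation is standard for laboratory jets, but the abstract does not state it was used in the SAGE validation):

```python
import math

# Empirical Ashkenas-Sherman estimate of the Mach-disk height above the vent:
# x_m / d = 0.67 * sqrt(p0 / p_ambient). Illustrative context for the jet
# structure discussed above; not taken from the SAGE validation itself.

def mach_disk_height(d_vent, p_stagnation, p_ambient):
    """Mach-disk height in the units of d_vent; pressures in any common unit."""
    return 0.67 * d_vent * math.sqrt(p_stagnation / p_ambient)

# e.g. a 10:1 overpressured jet from a 1 m vent
print(mach_disk_height(1.0, 10.0, 1.0))
```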
Observations and laboratory simulations of tornadoes in complex topographical regions
NASA Astrophysics Data System (ADS)
Karstens, Christopher Daniel
Aerial photos taken along the damage paths of the Joplin, MO, and Tuscaloosa-Birmingham, AL, tornadoes of 2011 captured and preserved several unique patterns of damage. In particular, a few distinct tree-fall patterns were noted along the Tuscaloosa-Birmingham tornado track that appeared highly influenced by the underlying topography. One such region was the focus of a damage survey and motivated laboratory vortex simulations with a 3-D foam representation of the underlying topography, in addition to simulations performed with idealized 2D topographic features, using Iowa State University's tornado simulator. The purpose of this dissertation is to explore various aspects related to the interaction of a tornado or a tornado-like vortex with its underlying topography. Three topics are examined: 1) Analysis of tornado-induced tree-fall using aerial photography from the Joplin, MO, and Tuscaloosa-Birmingham, AL, tornadoes of 2011, 2) Laboratory investigation of topographical influences on a simulated tornado-like vortex, and 3) On the use of non-standard EF-scale damage indicators to categorize tornadoes.
NASA Astrophysics Data System (ADS)
Javidi, Giti
2005-07-01
This study was designed to investigate an alternative to the use of traditional physical laboratory activities in a communication systems course. Specifically, this study examined whether, as an alternative, computer simulation is as effective as physical laboratory activities in teaching college-level electronics engineering education students about the concepts of signal transmission, modulation and demodulation. Eighty undergraduate engineering students participated in the study, which was conducted at a southeastern four-year university. The students were randomly assigned to two groups. The groups were compared on understanding the concepts, remembering the concepts, completion time of the lab experiments and perception toward the laboratory experiments. The physical group's (n = 40) treatment was to conduct laboratory experiments in a physical laboratory. The students in this group used equipment in a controlled electronics laboratory. The simulation group's (n = 40) treatment was to conduct similar experiments in a PC laboratory. The students in this group used a simulation program in a controlled PC lab. At the completion of the treatment, scores on a validated conceptual test were collected once after the treatment and again three weeks after the treatment. Attitude surveys and a qualitative study were administered at the completion of the treatment. The findings revealed significant differences, in favor of the simulation group, between the two groups on both the conceptual post-test and the follow-up test. The findings also revealed a significant correlation between the simulation group's attitude toward the simulation program and their post-test scores. Moreover, there was a significant difference between the two groups on their attitude toward their laboratory experience, in favor of the simulation group. In addition, there was a significant difference between the two groups on their lab completion time, in favor of the simulation group.
At the same time, the qualitative research uncovered several issues not explored by the quantitative research. It was concluded that incorporating the recommendations from the qualitative research into the laboratory pedagogy, especially adding hardware experience to avoid a lack of hands-on skills, should help improve students' experience regardless of the environment in which the laboratory is conducted.
A comparison of refuse attenuation in laboratory and field scale lysimeters.
Youcai, Zhao; Luochun, Wang; Renhua, Hua; Dimin, Xu; Guowei, Gu
2002-01-01
For this study, small and middle scale laboratory lysimeters, and a large scale field lysimeter in situ in Shanghai Refuse Landfill, with refuse weights of 187 kg, 600 kg and 10,800,000 kg, respectively, were constructed. These lysimeters are compared in terms of leachate quality (pH, concentrations of COD, BOD and NH3-N), refuse composition (biodegradable matter and volatile solid) and surface settlement for a monitoring period of 0-300 days. The objectives of this study were to explore both the similarities and disparities between laboratory and field scale lysimeters, and to compare degradation behaviors of refuse at the intensive reaction phase in the different scale lysimeters. Quantitative relationships of leachate quality and refuse composition with placement time show that degradation behaviors of refuse depend heavily on the scales of the lysimeters and the parameters of concern, especially in the starting period of 0-6 months. However, some similarities exist between laboratory and field lysimeters after 4-6 months of placement, because COD and BOD concentrations in leachate in the field lysimeter decrease regularly in a pattern parallel to those in the laboratory lysimeters. NH3-N, volatile solid (VS) and biodegradable matter (BDM) also gradually decrease in parallel in this intensive reaction phase for all scale lysimeters as refuse ages. Although the absolute values differ among the different scale lysimeters, laboratory lysimeters of sufficient scale appear suitable for a rough simulation of a real landfill, especially for illustrating the degradation pattern and mechanism. Settlement of the refuse surface is roughly proportional to the initial refuse height.
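The "parallel pattern" of COD decline noted above is what first-order decay with a shared rate constant looks like on a log scale; a minimal sketch (model form and all numbers are illustrative assumptions, not fitted values from the study):

```python
import math

# First-order decay C(t) = C0 * exp(-k * t): lysimeters of different scale
# that share the rate constant k stay a constant distance apart in log space,
# i.e. the curves fall "in parallel". C0 and k are illustrative placeholders.

def cod(c0, k_per_day, t_days):
    """Leachate COD at time t under assumed first-order decay."""
    return c0 * math.exp(-k_per_day * t_days)

lab_c0, field_c0, k = 20000.0, 50000.0, 0.01
for t in (0, 100, 200):
    gap = math.log(cod(field_c0, k, t)) - math.log(cod(lab_c0, k, t))
    print(t, round(gap, 6))  # the log-gap stays constant when k is shared
```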
A study of facilities and fixtures for testing of a high speed civil transport wing component
NASA Technical Reports Server (NTRS)
Cerro, J. A.; Vause, R. F.; Bowman, L. M.; Jensen, J. K.; Martin, C. J., Jr.; Stockwell, A. E.; Waters, W. A., Jr.
1996-01-01
A study was performed to determine the feasibility of testing a large-scale High Speed Civil Transport wing component in the Structures and Materials Testing Laboratory in Building 1148 at NASA Langley Research Center. The report includes a survey of the electrical and hydraulic resources and identifies the backing structure and floor hard points which would be available for reacting the test loads. The backing structure analysis uses a new finite element model of the floor and backstop support system in the Structures Laboratory. Information on the data acquisition system and the thermal power requirements is also presented. The study identified the hardware that would be required to test a typical component, including the number and arrangement of hydraulic actuators required to simulate expected flight loads. Load introduction and reaction structure concepts were analyzed to investigate the effects of experimentally induced boundary conditions.
Laboratory analogue of a supersonic accretion column in a binary star system.
Cross, J E; Gregori, G; Foster, J M; Graham, P; Bonnet-Bidaud, J-M; Busschaert, C; Charpentier, N; Danson, C N; Doyle, H W; Drake, R P; Fyrth, J; Gumbrell, E T; Koenig, M; Krauland, C; Kuranz, C C; Loupias, B; Michaut, C; Mouchet, M; Patankar, S; Skidmore, J; Spindloe, C; Tubman, E R; Woolsey, N; Yurchak, R; Falize, É
2016-06-13
Astrophysical flows exhibit rich behaviour resulting from the interplay of different forms of energy: gravitational, thermal, magnetic and radiative. For magnetic cataclysmic variable stars, material from a late, main sequence star is pulled onto a highly magnetized (B>10 MG) white dwarf. The magnetic field is sufficiently large to direct the flow as an accretion column onto the poles of the white dwarf, a star subclass known as AM Herculis. A stationary radiative shock is expected to form 100-1,000 km above the surface of the white dwarf, far too small to be resolved with current telescopes. Here we report the results of a laboratory experiment showing the evolution of a reverse shock when both ionization and radiative losses are important. We find that the stand-off position of the shock agrees with radiation hydrodynamic simulations and is consistent, when scaled to AM Herculis star systems, with theoretical predictions.
Development of an Indexing Media Filtration System for Long Duration Space Missions
NASA Technical Reports Server (NTRS)
Agui, Juan H.; Vijayakumar, R.
2013-01-01
The effective maintenance of air quality aboard spacecraft cabins will be vital to future human exploration missions. A key component will be the air cleaning filtration system, which will need to remove a broad size range of particles derived from multiple biological and material sources. In addition, during surface missions any extraterrestrial planetary dust, including dust generated by near-by ISRU equipment, which is tracked into the habitat will also need to be managed by the filtration system inside the pressurized habitat compartments. An indexing media filter system is being developed to meet the demand for long-duration missions; it will result in dramatic increases in filter service life and loading capacity, and will require minimal crew involvement. The filtration system consists of three stages: an inertial impactor stage, an indexing media stage, and a high-efficiency filter stage, packaged in a stacked modular cartridge configuration. Each stage will target a specific range of particle sizes that optimizes the filtration and regeneration performance of the system. A 1/8th-scale and a full-scale prototype of the filter system have been fabricated and tested in laboratory and reduced-gravity environments that simulate conditions on spacecraft, landers and habitats. Results from recent laboratory and reduced-gravity flight tests will be presented. The features of the new filter system may also benefit other closed systems, such as submarines, and remote-location terrestrial installations where servicing and replacement of filter units is not practical.
Defense Waste Processing Facility Simulant Chemical Processing Cell Studies for Sludge Batch 9
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Tara E.; Newell, J. David; Woodham, Wesley H.
The Savannah River National Laboratory (SRNL) received a technical task request from Defense Waste Processing Facility (DWPF) and Saltstone Engineering to perform simulant tests to support the qualification of Sludge Batch 9 (SB9) and to develop the flowsheet for SB9 in the DWPF. These efforts pertained to the DWPF Chemical Process Cell (CPC). CPC experiments were performed using SB9 simulant (SB9A) to qualify SB9 for sludge-only and coupled processing using the nitric-formic flowsheet in the DWPF. Two simulant batches were prepared, one representing SB8 Tank 40H and another representing SB9 Tank 51H. The simulant used for SB9 qualification testing was prepared by blending the SB8 Tank 40H and SB9 Tank 51H simulants. The blended simulant is referred to as SB9A. Eleven CPC experiments were run with an acid stoichiometry ranging between 105% and 145% of the Koopman minimum acid equation (KMA), which is equivalent to 109.7% and 151.5% of the Hsu minimum acid factor. Three runs were performed in the 1-L laboratory-scale setup, whereas the remainder were in the 4-L laboratory-scale setup. Sludge Receipt and Adjustment Tank (SRAT) and Slurry Mix Evaporator (SME) cycles were performed on nine of the eleven. The other two were SRAT cycles only. One coupled flowsheet and one extended run were performed for SRAT and SME processing. Samples of the condensate, sludge, and off-gas were taken to monitor the chemistry of the CPC experiments.
The Spider Center Wide File System; From Concept to Reality
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shipman, Galen M; Dillow, David A; Oral, H Sarp
2009-01-01
The Leadership Computing Facility (LCF) at Oak Ridge National Laboratory (ORNL) has a diverse portfolio of computational resources, ranging from a petascale XT4/XT5 simulation system (Jaguar) to numerous other systems supporting development, visualization, and data analytics. In order to support the vastly different I/O needs of these systems, Spider, a Lustre-based center-wide file system, was designed and deployed to provide over 240 GB/s of aggregate throughput with over 10 petabytes of formatted capacity. A multi-stage InfiniBand network, dubbed the Scalable I/O Network (SION), with over 889 GB/s of bisectional bandwidth, was deployed as part of Spider to provide connectivity to our simulation, development, visualization, and other platforms. To our knowledge, at the time of writing, Spider is the largest and fastest POSIX-compliant parallel file system in production. This paper details the overall architecture of the Spider system, the challenges in deploying and initial testing of a file system of this scale, and novel solutions to these challenges that offer key insights into future file system design.
Development of space simulation / net-laboratory system
NASA Astrophysics Data System (ADS)
Usui, H.; Matsumoto, H.; Ogino, T.; Fujimoto, M.; Omura, Y.; Okada, M.; Ueda, H. O.; Murata, T.; Kamide, Y.; Shinagawa, H.; Watanabe, S.; Machida, S.; Hada, T.
A research project for the development of a space simulation / net-laboratory system was approved by the Japan Science and Technology Corporation (JST) in the category of Research and Development for Applying Advanced Computational Science and Technology (ACT-JST) in 2000. This research project, which continues for three years, is a collaboration with an astrophysical simulation group as well as other space simulation groups that use MHD and hybrid models. In this project, we are developing a prototype of a unique simulation system which enables us to perform simulation runs by providing or selecting plasma parameters through a Web-based interface on the internet. We are also developing an on-line database system for space simulation from which we will be able to search and extract various information, such as simulation methods and programs, manuals, and typical simulation results in graphic or ASCII format. This unique system will help simulation beginners start simulation studies without much difficulty or effort, and will contribute to the promotion of simulation studies in the STP field. In this presentation, we will report the overview and the current status of the project.
Seasonal-Scale Optimization of Conventional Hydropower Operations in the Upper Colorado System
NASA Astrophysics Data System (ADS)
Bier, A.; Villa, D.; Sun, A.; Lowry, T. S.; Barco, J.
2011-12-01
Sandia National Laboratories is developing the Hydropower Seasonal Concurrent Optimization for Power and the Environment (Hydro-SCOPE) tool to examine basin-wide conventional hydropower operations at seasonal time scales. This tool is part of an integrated, multi-laboratory project designed to explore different aspects of optimizing conventional hydropower operations. The Hydro-SCOPE tool couples a one-dimensional reservoir model with a river routing model to simulate hydrology and water quality. An optimization engine wraps around this model framework to solve for long-term operational strategies that best meet the specific objectives of the hydrologic system while honoring operational and environmental constraints. The optimization routines are provided by Sandia's open source DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) software. Hydro-SCOPE allows for multi-objective optimization, which can be used to gain insight into the trade-offs that must be made between objectives. The Hydro-SCOPE tool is being applied to the Upper Colorado Basin hydrologic system. This system contains six reservoirs, each with its own set of objectives (such as maximizing revenue, optimizing environmental indicators, meeting water use needs, or other objectives) and constraints. This leads to a large optimization problem with strong connectedness between objectives. The systems-level approach used by the Hydro-SCOPE tool allows simultaneous analysis of these objectives, as well as understanding of potential trade-offs related to different objectives and operating strategies. The seasonal-scale tool will be tightly integrated with the other components of this project, which examine day-ahead and real-time planning, environmental performance, hydrologic forecasting, and plant efficiency.
Intrinsic frame transport for a model of nematic liquid crystal
NASA Astrophysics Data System (ADS)
Cozzini, S.; Rull, L. F.; Ciccotti, G.; Paolini, G. V.
1997-02-01
We present a computer simulation study of the dynamical properties of a nematic liquid crystal model. The diffusional motion of the nematic director is taken into account in our calculations in order to give a proper estimate of the transport coefficients. Unlike other groups, we do not attempt to stabilize the director through rigid constraints or applied external fields. We instead define an intrinsic frame which moves along with the director at each step of the simulation. The transport coefficients computed in the intrinsic frame are then compared against those calculated in the fixed laboratory frame, to show the inadequacy of the latter for systems with fewer than 500 molecules. Using this general scheme on the Gay-Berne liquid crystal model, we demonstrate the natural motion of the director and attempt to quantify its intrinsic time scale and size dependence. Through extended simulations of systems of different sizes, we calculate the diffusion and viscosity coefficients of this model and compare our results with values previously obtained with a fixed director.
Manufacturing Laboratory | Energy Systems Integration Facility | NREL
Researchers in the Energy Systems Integration Facility's Manufacturing Laboratory develop methods and technologies to scale up renewable energy technology manufacturing capabilities.
NASA Astrophysics Data System (ADS)
Hidalgo, J. J.; MacMinn, C. W.; Cueto-Felgueroso, L.; Fe, J.
2011-12-01
Dissolution by convective mixing is one of the main trapping mechanisms during CO2 sequestration in saline aquifers. The free-phase CO2 tends to rise due to buoyancy, accumulate beneath the caprock and dissolve into the brine, initially by diffusion. The CO2-brine mixture, however, is denser than the two initial fluids, leading to a Rayleigh-Bénard-type instability known as convective mixing, which greatly accelerates CO2 dissolution. Although this is a well-known process, it remains unclear how convective mixing scales with the governing parameters of the system and its impact on the actual mixing of CO2 and brine. Here, we perform high-resolution numerical simulations and laboratory experiments with an analogue fluid system (water and propylene glycol) to explore the dependence of the CO2 dissolution flux on the nonlinearity of the density and viscosity of the fluid mixture. We find that the convective flux depends strongly on the value of the concentration for which the density of the mixture is maximum, and on the viscosity contrast between the fluids. From the experimental and simulation results we elucidate the scaling behavior of convective mixing, and clarify the role of nonlinear density and viscosity feedbacks in the interpretation of the analogue-fluid experiments.
Experiments and High-resolution Simulations of Density and Viscosity Feedbacks on Convective Mixing
NASA Astrophysics Data System (ADS)
Hidalgo, Juan J.; Fe, Jaime; MacMinn, Christopher W.; Cueto-Felgueroso, Luis; Juanes, Ruben
2011-11-01
Dissolution by convective mixing is one of the main trapping mechanisms during CO2 sequestration in saline aquifers. Initially, the buoyant CO2 dissolves into the underlying brine by diffusion. The CO2-brine mixture is denser than the two initial fluids, leading to a Rayleigh-Bénard-type instability known as convective mixing, which greatly accelerates CO2 dissolution. Although this is a well-known process, it remains unclear how convective mixing scales with the governing parameters of the system and its impact on the actual mixing of CO2 and brine. We explore the dependence of the CO2 dissolution flux on the nonlinearity of the density and viscosity of the fluid mixture by means of high-resolution numerical simulations and laboratory experiments with an analogue fluid system (water and propylene glycol). We find that the value of the concentration for which the density of the mixture is maximum, and the viscosity contrast between the fluids, both exert a powerful control on the convective flux. From the experimental and simulation results, we obtain the scaling behavior of convective mixing, and clarify the role of nonlinear density and viscosity feedbacks. JJH acknowledges the support from the FP7 Marie Curie Actions of the European Commission, via the CO2-MATE project (PIOF-GA-2009-253678).
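The instability described above hinges on the mixture being denser than either starting fluid, with a density maximum at some intermediate concentration. The sketch below uses entirely assumed parameter values (not the measured water/propylene-glycol data) to show one way such a non-monotonic density law can be encoded and its maximum located:

```python
# Illustrative non-monotonic density law for an analogue fluid pair whose
# mixture is denser than either pure fluid. All numbers are assumptions
# for illustration, not the measured water/propylene-glycol curve.
RHO_WATER = 998.0    # kg/m^3, density of the buoyant fluid
RHO_GLYCOL = 1036.0  # kg/m^3, density of the denser fluid
DRHO_EXCESS = 15.0   # assumed peak excess density of the mixture
C_MAX = 0.26         # assumed mass fraction where the excess term peaks

def mixture_density(c):
    """Density of the mixture at water mass fraction c in [0, 1].

    A linear blend of the end members plus an excess term shaped like
    c**a * (1-c)**b, which vanishes for the pure fluids and peaks at
    c = C_MAX (the peak of c**a * (1-c)**b lies at a / (a + b)).
    """
    a = 1.0
    b = a * (1.0 - C_MAX) / C_MAX
    peak = C_MAX ** a * (1.0 - C_MAX) ** b  # normalise peak to DRHO_EXCESS
    linear = (1.0 - c) * RHO_GLYCOL + c * RHO_WATER
    excess = DRHO_EXCESS * (c ** a) * ((1.0 - c) ** b) / peak
    return linear + excess

# Locate the density maximum on a fine grid.
cs = [i / 1000.0 for i in range(1001)]
densities = [mixture_density(c) for c in cs]
c_peak = cs[densities.index(max(densities))]
print(f"density maximum at c ~ {c_peak:.2f} ({max(densities):.1f} kg/m^3)")
```

Because the maximum exceeds both end-member densities, a layer of mixture overlying less-dense fluid forms, which is the Rayleigh-Bénard-type configuration that triggers the fingering.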
Modelling and scale-up of chemical flooding
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pope, G.A.; Lake, L.W.; Sepehrnoori, K.
1990-03-01
The objective of this research is to develop, validate, and apply a comprehensive chemical flooding simulator for chemical recovery processes involving surfactants, polymers, and alkaline chemicals in various combinations. This integrated program includes laboratory experiments, physical property modelling, scale-up theory, and numerical analysis as necessary and integral components of the simulation activity. We have continued to develop, test, and apply our chemical flooding simulator (UTCHEM) to a wide variety of laboratory and reservoir problems involving tracers, polymers, polymer gels, surfactants, and alkaline agents. Part I is an update on the Application of Higher-Order Methods in Chemical Flooding Simulation. This update focuses on the comparison of grid orientation effects for four different numerical methods implemented in UTCHEM. Part II, on Simulation Design Studies, is a continuation of Saad's Big Muddy surfactant pilot simulation study reported last year. Part III reports on the Simulation of Gravity Effects under conditions similar to those of some of the oil reservoirs in the North Sea. Part IV is on Determining Oil Saturation from Interwell Tracers; UTCHEM is used for large-scale interwell tracer tests. A systematic procedure for estimating oil saturation from interwell tracer data is developed, and a specific example based on actual field data provided by Sun E P Co. is given. Part V reports on the Application of Vectorization and Microtasking for Reservoir Simulation. Part VI reports on Alkaline Simulation. The alkaline/surfactant/polymer flood compositional simulator (UTCHEM) reported last year is further extended to include reactions involving chemical species containing magnesium, aluminium, and silicon as constituent elements. Part VII reports on the permeability and trapping of microemulsion.
Biological and chemical terrorism scenarios and implications for detection systems needs
NASA Astrophysics Data System (ADS)
Gordon, Susanna P.; Chumfong, Isabelle; Edwards, Donna M.; Gleason, Nathaniel J.; West, Todd; Yang, Lynn
2007-04-01
Terrorists intent on causing many deaths and severe disruption to our society could, in theory, cause hundreds to tens of thousands of deaths and significant contamination of key urban facilities by using chemical or biological (CB) agents. The attacks that have occurred to date, such as the 1995 Aum Shinrikyo CB attacks and the 2001 anthrax letters, have been very small on the scale of what is possible. In order to defend against and mitigate the impacts of large-scale terrorist attacks, defensive systems for protection of urban areas and high-value facilities from biological and chemical threats have been deployed. This paper reviews analyses of such scenarios and of the efficacy of potential response options, discusses defensive systems that have been deployed and detectors that are being developed, and finally outlines the detection systems that will be needed for improved CB defense in the future. Sandia's collaboration with San Francisco International Airport on CB defense will also be briefly reviewed, including an overview of airport facility defense guidelines produced in collaboration with Lawrence Berkeley National Laboratory. The analyses that will be discussed were conducted by Sandia National Laboratories' Systems Studies Department in support of the U.S. Department of Homeland Security (DHS) Science and Technology Directorate, and include quantitative analyses utilizing simulation models developed through close collaboration with subject matter experts, such as public health officials in urban areas and biological defense experts.
Comparison of Monte Carlo simulated and measured performance parameters of miniPET scanner
NASA Astrophysics Data System (ADS)
Kis, S. A.; Emri, M.; Opposits, G.; Bükki, T.; Valastyán, I.; Hegyesi, Gy.; Imrek, J.; Kalinka, G.; Molnár, J.; Novák, D.; Végh, J.; Kerek, A.; Trón, L.; Balkay, L.
2007-02-01
In vivo imaging of small laboratory animals is a valuable tool in the development of new drugs. For this purpose, miniPET, an easily scalable, modular small-animal PET camera, has been developed at our institutes. The system has four modules, which makes it possible to rotate the whole detector system around the axis of the field of view. Data collection and image reconstruction are performed using a data acquisition (DAQ) module with an Ethernet communication facility and a computer cluster of commercial PCs. Performance tests were carried out to determine system parameters such as energy resolution, sensitivity and noise-equivalent count rate. A modified GEANT4-based GATE Monte Carlo software package was used to simulate PET data analogous to those of the performance measurements. GATE was run on a Linux cluster of 10 processors (64-bit, 3.0-GHz Xeon) controlled by a SUN grid engine. The application of this special computer cluster reduced the time necessary for the simulations by an order of magnitude. The simulated energy spectra, maximum rate of true coincidences and sensitivity of the camera were in good agreement with the measured parameters.
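The energy-resolution figure quoted in such performance tests is conventionally FWHM divided by the photopeak energy. As a minimal illustration of how that number is extracted from simulated data, the toy model below blurs the 511 keV annihilation line with a Gaussian detector response and reads the resolution back off a histogram; the response model and its parameters are assumptions, not the GATE detector model used in the paper:

```python
import random

PHOTOPEAK_KEV = 511.0  # annihilation photon energy

def simulate_spectrum(n_events, fwhm_frac=0.25, rng=random):
    """Simulated photopeak energies under a crude Gaussian detector response.

    fwhm_frac is the assumed fractional energy resolution; FWHM = 2.355 * sigma.
    """
    sigma = fwhm_frac * PHOTOPEAK_KEV / 2.355
    return [rng.gauss(PHOTOPEAK_KEV, sigma) for _ in range(n_events)]

def measured_resolution(energies, bin_kev=2.0):
    """Estimate FWHM / peak-energy from a histogram of simulated energies."""
    counts = {}
    for e in energies:
        b = round(e / bin_kev)
        counts[b] = counts.get(b, 0) + 1
    peak_bin = max(counts, key=counts.get)
    half_max = counts[peak_bin] / 2.0
    above = sorted(b for b, c in counts.items() if c >= half_max)
    fwhm_kev = (above[-1] - above[0]) * bin_kev
    return fwhm_kev / (peak_bin * bin_kev)

rng = random.Random(7)
res = measured_resolution(simulate_spectrum(200_000, rng=rng))
print(f"energy resolution ~ {100 * res:.1f}% FWHM")
```

Recovering the assumed 25% FWHM from the histogram is the same round trip (simulation in, performance parameter out) that the paper performs at full fidelity with GATE.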
Development of an Indexing Media Filtration System for Long Duration Space Missions
NASA Technical Reports Server (NTRS)
Agui, Juan H.; Vijayakumar, R.
2013-01-01
The effective maintenance of air quality aboard spacecraft cabins will be vital to future human exploration missions. A key component will be the air cleaning filtration system, which will need to remove a broad size range of particles including skin flakes, hair and clothing fibers, other biological matter, and particulate matter derived from material and equipment wear. In addition, during surface missions any extraterrestrial planetary dust, including dust generated by near-by ISRU equipment, which is tracked into the habitat will also need to be managed by the filtration system inside the pressurized habitat compartments. An indexing media filter system is being developed to meet the demand for long-duration missions; it will result in dramatic increases in filter service life and loading capacity, and will require minimal crew involvement. These features may also benefit other closed systems, such as submarines, and remote-location terrestrial installations where servicing and replacement of filter units is not practical. The filtration system consists of three stages: an inertial impactor stage, an indexing media stage, and a high-efficiency filter stage, packaged in a stacked modular cartridge configuration. Each stage will target a specific range of particle sizes that optimizes the filtration and regeneration performance of the system. A 1/8th-scale and a full-scale prototype of the filter system have been fabricated and tested in laboratory and reduced-gravity environments that simulate conditions on spacecraft, landers and habitats. Results from recent laboratory and reduced-gravity flight tests will be presented.
Atomistic Simulation of Initiation in Hexanitrostilbene
NASA Astrophysics Data System (ADS)
Shan, Tzu-Ray; Wixom, Ryan; Yarrington, Cole; Thompson, Aidan
2015-06-01
We report on the effect of cylindrical voids on hot spot formation, growth and chemical reaction initiation in hexanitrostilbene (HNS) crystals subjected to shock. Large-scale, reactive molecular dynamics simulations are performed using the reactive force field (ReaxFF) as implemented in the LAMMPS software. The ReaxFF force field description for HNS has been validated previously by comparing the isothermal equation of state to available diamond anvil cell (DAC) measurements and density functional theory (DFT) calculations, and by comparing the primary dissociation pathway to ab initio calculations. Micron-scale molecular dynamics simulations of a supported shockwave propagating through the HNS crystal along the [010] orientation are performed with an impact velocity (or particle velocity) of 1.25 km/s, resulting in shockwave propagation at 4.0 km/s in the bulk material and a bulk shock pressure of ~11 GPa. The effect of cylindrical void sizes varying from 0.02 to 0.1 μm on hot spot formation and growth rate has been studied. Interaction between multiple voids in the HNS crystal and its effect on hot spot formation will also be addressed. Results from the micron-scale atomistic simulations are compared with hydrodynamics simulations. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. DOE National Nuclear Security Administration under Contract DE-AC04-94AL85000.
Simulation of General Physics laboratory exercise
NASA Astrophysics Data System (ADS)
Aceituno, P.; Hernández-Aceituno, J.; Hernández-Cabrera, A.
2015-01-01
Laboratory exercises are an important part of general Physics teaching, both during the last years of high school and the first year of college education. Given the need to acquire enough laboratory equipment for all students, and the widespread availability of computer rooms in teaching, we propose the development of computer-simulated laboratory exercises. A representative exercise in general Physics is the calculation of the gravitational acceleration through the free-fall motion of a metal ball. Using a model of the real exercise, we have developed an interactive system which allows students to alter the starting height of the ball to obtain different fall times. The simulation was programmed in ActionScript 3, so that it can be freely executed on any operating system; to ensure the accuracy of the calculations, all the input parameters of the simulations were modelled using digital measurement units, and to allow statistical treatment of the resulting data, measurement errors are simulated through limited randomization.
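The exercise described above can be sketched directly: simulate noisy fall times from h = ½gt² and invert for g. The timer error, drop heights, and repeat counts below are illustrative assumptions, not the parameters of the actual ActionScript simulation:

```python
import random
import statistics

G_TRUE = 9.81  # m/s^2, the value the simulated "apparatus" embodies

def simulated_fall_time(height_m, timer_error_s=0.005, rng=random):
    """Simulated measured fall time for a ball dropped from height_m.

    The ideal time follows t = sqrt(2h/g); a small Gaussian error models
    the limited resolution of a digital timer (the "limited randomization").
    """
    t_ideal = (2.0 * height_m / G_TRUE) ** 0.5
    return t_ideal + rng.gauss(0.0, timer_error_s)

def estimate_g(heights_m, repeats=20, rng=random):
    """Estimate g from repeated simulated drops at several starting heights."""
    samples = []
    for h in heights_m:
        for _ in range(repeats):
            t = simulated_fall_time(h, rng=rng)
            samples.append(2.0 * h / t ** 2)  # invert h = (1/2) g t^2
    return statistics.mean(samples), statistics.stdev(samples)

rng = random.Random(42)
g_est, g_err = estimate_g([0.5, 1.0, 1.5, 2.0], rng=rng)
print(f"g = {g_est:.2f} +/- {g_err:.2f} m/s^2")
```

Letting students vary the starting heights and repeat counts shows directly how averaging over more drops tightens the estimate of g.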
Preliminary design, analysis, and costing of a dynamic scale model of the NASA space station
NASA Technical Reports Server (NTRS)
Gronet, M. J.; Pinson, E. D.; Voqui, H. L.; Crawley, E. F.; Everman, M. R.
1987-01-01
The difficulty of testing the next generation of large flexible space structures on the ground places an emphasis on other means for validating predicted on-orbit dynamic behavior. Scale model technology represents one way of verifying analytical predictions with ground test data. This study investigates the preliminary design, scaling and cost trades for a Space Station dynamic scale model. The scaling of nonlinear joint behavior is studied from theoretical and practical points of view. Suspension system interaction trades are conducted for the ISS Dual Keel Configuration and Build-Up Stages suspended in the proposed NASA/LaRC Large Spacecraft Laboratory. Key issues addressed are scaling laws, replication vs. simulation of components, manufacturing, suspension interactions, joint behavior, damping, articulation capability, and cost. These issues are the subject of parametric trades versus the scale model factor. The results of these detailed analyses are used to recommend scale factors for four different scale model options, each with varying degrees of replication. Potential problems in constructing and testing the scale model are identified, and recommendations for further study are outlined.
Delvigne, Frank; Takors, Ralf; Mudde, Rob; van Gulik, Walter; Noorman, Henk
2017-09-01
Efficient optimization of microbial processes is a critical issue for achieving a number of sustainable development goals, considering the impact of microbial biotechnology on the agrofood, environmental, biopharmaceutical and chemical industries. Many of these applications require scale-up after proof of concept. However, the behaviour of microbial systems remains (at least partially) unpredictable when shifting from laboratory-scale to industrial conditions. Robust microbial systems are thus strongly needed in this context, as is a better understanding of the interactions between fluid mechanics and cell physiology. For that purpose, a full scale-up/down computational framework is already available. This framework links computational fluid dynamics (CFD), metabolic flux analysis and agent-based modelling (ABM) for a better understanding of cell lifelines in a heterogeneous environment. Ultimately, this framework can be used for the design of scale-down simulators and/or metabolically engineered cells able to cope with the environmental fluctuations typically found in large-scale bioreactors. However, this framework still needs some refinements, such as a better integration of gas-liquid flows in CFD and taking into account intrinsic biological noise in ABM. © 2017 The Authors. Microbial Biotechnology published by John Wiley & Sons Ltd and Society for Applied Microbiology.
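The notion of a cell "lifeline" can be made concrete with a deliberately minimal agent-based sketch: each agent records the substrate concentration it sees as it is shuttled between a feed zone and the bulk. The zone fraction and substrate levels are assumptions for illustration only, far simpler than the CFD-coupled ABM described in the review:

```python
import random

# Assumed reactor structure: a small glucose-rich feed zone and a
# substrate-depleted bulk. All numbers are illustrative.
FEED_ZONE_FRACTION = 0.1   # fraction of time steps spent near the feed point
S_FEED, S_BULK = 5.0, 0.05 # g/L glucose experienced in each zone (assumed)

def simulate_lifeline(n_steps, rng):
    """Substrate concentration a single cell experiences over time.

    Each step, the cell is placed in the feed zone with probability
    FEED_ZONE_FRACTION (a crude stand-in for CFD-resolved circulation).
    """
    return [S_FEED if rng.random() < FEED_ZONE_FRACTION else S_BULK
            for _ in range(n_steps)]

rng = random.Random(1)
lifelines = [simulate_lifeline(1000, rng) for _ in range(100)]
mean_exposure = sum(sum(l) for l in lifelines) / (100 * 1000)
print(f"mean substrate exposure ~ {mean_exposure:.2f} g/L")
```

Even this toy version shows why population averages mislead: the mean exposure is moderate, while every individual lifeline alternates between feast and famine, which is the fluctuation regime scale-down simulators try to reproduce.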
NASA Astrophysics Data System (ADS)
Egbers, Christoph; Futterer, Birgit; Zaussinger, Florian; Harlander, Uwe
2014-05-01
Baroclinic waves are responsible for the transport of heat and momentum in the oceans, in the Earth's atmosphere and in other planetary atmospheres. The talk will give an overview of possibilities for simulating such large-scale structures, as well as co-existing small-scale structures, with the help of well-defined laboratory experiments such as the baroclinic wave tank (annulus experiment). The analogy between the Earth's atmosphere and the rotating cylindrical annulus experiment, driven only by rotation and differential heating between polar and equatorial regions, is obvious. From time to time, single vortices separate from the Gulf Stream. The same dynamics, and the co-existence and separation of small- and large-scale structures, can also be observed in laboratory experiments such as the rotating cylindrical annulus. This experiment represents mid-latitude dynamics quite well and serves as a central reference experiment in the Germany-wide DFG priority research programme ("METSTRÖM", SPP 1276), acting as a benchmark for many different numerical methods. On the other hand, such laboratory experiments in cylindrical geometry are limited by the fact that the surface, and the real interaction between polar and equatorial regions with their different dynamics, cannot really be studied. Therefore, I demonstrate how to use the very successful Geoflow I and Geoflow II space experiment hardware on the ISS, with future modifications, for simulations of small- and large-scale planetary atmospheric motion in spherical geometry, with differential heating between the inner and outer spheres as well as between the polar and equatorial regions. References: Harlander, U., Wenzel, J., Wang, Y., Alexandrov, K. & Egbers, Ch., 2012, Simultaneous PIV- and thermography measurements of partially blocked flow in a heated rotating annulus, Exp. in Fluids, 52 (4), 1077-1087; Futterer, B., Krebs, A., Plesa, A.-C., Zaussinger, F., Hollerbach, R., Breuer, D. 
& Egbers, Ch., 2013, Sheet-like and plume-like thermal flow in a spherical convection experiment performed under microgravity, J. Fluid Mech., vol. 75, p 647-683
Li, Ting; Petrini, Marcia A; Stone, Teresa E
2018-02-01
The study aim was to identify baccalaureate nursing students' perspectives on peer tutoring in the simulation laboratory. Insight into the nursing students' experiences, and baseline data on their perceptions of peer tutoring, will help improve nursing education. Q methodology was applied to explore the students' perspectives on peer tutoring in the simulation laboratory. A convenience P-sample of 40 baccalaureate nursing students was used. Each participant classified fifty-eight selected Q statements into the shape of a normal distribution using an 11-point bipolar scale ranging from -5 to +5. PQ Method software analyzed the collected data. Three discrete factors emerged: Factor I ("Facilitate or empower" knowledge acquisition), Factor II ("Safety net" supportive environment), and Factor III ("Mentoring" learning how to learn). The findings indicate that peer tutoring is an effective supplementary strategy for promoting baccalaureate students' knowledge acquisition, establishing a supportive safety net, and facilitating their ability to learn in the simulation laboratory. Copyright © 2017 Elsevier Ltd. All rights reserved.
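The forced-distribution sorting at the heart of a Q-methodological design is easy to make concrete. The column frequencies below are an assumption (the study does not report them), chosen only so that 58 statements fill an 11-point quasi-normal grid; the correlation between two completed sorts is the starting point for the by-person factor analysis that software such as PQ Method performs:

```python
from collections import Counter

# Assumed forced quasi-normal distribution for 58 statements on the
# 11-point (-5..+5) scale; the actual column frequencies used in the
# study are not reported, so these counts are illustrative.
FORCED_COLUMNS = {-5: 3, -4: 4, -3: 5, -2: 6, -1: 7, 0: 8,
                   1: 7,  2: 6,  3: 5,  4: 4,  5: 3}  # sums to 58

def is_valid_sort(scores):
    """A Q-sort is valid if its 58 scores exactly fill the forced columns."""
    return len(scores) == 58 and Counter(scores) == Counter(FORCED_COLUMNS)

def pearson(a, b):
    """Correlation between two Q-sorts (the input to by-person factoring)."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

# One example sort: expand the forced columns into a list of 58 scores.
example = [s for s, n in sorted(FORCED_COLUMNS.items()) for _ in range(n)]
print(is_valid_sort(example), round(pearson(example, example), 3))
```

Participants who load on the same factor are those whose sorts correlate highly with one another, which is how the three perspectives in the study were extracted.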
Multi-scale modeling of multi-component reactive transport in geothermal aquifers
NASA Astrophysics Data System (ADS)
Nick, Hamidreza M.; Raoof, Amir; Wolf, Karl-Heinz; Bruhn, David
2014-05-01
In deep geothermal systems, heat and chemical stresses can cause physical alterations, which may have a significant effect on flow and reaction rates; as a consequence, mineral precipitation and dissolution change the permeability and porosity of the formations. Large-scale modeling of reactive transport in such systems is still challenging. A large source of uncertainty is how the pore-scale information controlling flow and reaction behaves at larger scales. A possible choice is to use constitutive relationships relating, for example, the evolution of permeability and porosity to changes in pore geometry. While determining such relationships through laboratory experiments may be limited, pore-network modeling provides an alternative. In this work, we introduce a new workflow in which a hybrid finite-element finite-volume method [1,2] and a pore-network modeling approach [3] are employed. Using the pore-scale model, relevant constitutive relations are developed; these relations are then embedded in the continuum-scale model. This approach enables us to study non-isothermal reactive transport in porous media while accounting for micro-scale features under realistic conditions. The performance and applicability of the proposed model are explored for different flow and reaction regimes. References: 1. Matthäi, S.K., et al.: Simulation of solute transport through fractured rock: a higher-order accurate finite-element finite-volume method permitting large time steps. Transport in Porous Media 83.2 (2010): 289-318. 2. Nick, H.M., et al.: Reactive dispersive contaminant transport in coastal aquifers: numerical simulation of a reactive Henry problem. Journal of Contaminant Hydrology 145 (2012), 90-104. 3. Raoof, A., et al.: PoreFlow: a complex pore-network model for simulation of reactive transport in variably saturated porous media. Computers & Geosciences 61 (2013), 160-174.
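The kind of porosity-permeability constitutive relation described above can be sketched with a classical Kozeny-Carman form. This is only an illustrative stand-in (the paper derives its own relations from pore-network simulations), and all numerical values below are hypothetical:

```python
def permeability_update(k0, phi0, phi):
    """Kozeny-Carman-type porosity-permeability relation:
    k/k0 = (phi/phi0)**3 * ((1 - phi0)/(1 - phi))**2.
    Illustrative only; pore-network modeling would supply a
    system-specific relation instead."""
    return k0 * (phi / phi0) ** 3 * ((1.0 - phi0) / (1.0 - phi)) ** 2

# Dissolution raising porosity from 0.20 to 0.25 roughly doubles k
k_new = permeability_update(1e-14, 0.20, 0.25)
```

A continuum-scale simulator would call such a function every time step to update the local permeability field as minerals precipitate or dissolve.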
Huff, G.F.
2004-01-01
The tendency of solutes in input water to precipitate efficiency-lowering scale deposits on the membranes of reverse osmosis (RO) desalination systems is an important factor in determining the suitability of input water for desalination. Simulated evaporation of input water can be used to quantitatively assess the potential for scale formation in RO desalination systems. The technique was demonstrated by simulating the increase in solute concentrations required to form calcite, gypsum, and amorphous silica scales at 25 °C and 40 °C from 23 desalination input waters taken from the literature. Simulation results can be used to quantitatively assess the potential of a given input water to form scale, or to compare the potential of a number of input waters to form scale during RO desalination. Simulated evaporation of input waters cannot accurately predict the conditions under which scale will form, owing to the effects of potentially stable supersaturated solutions, solution velocity, and residence time inside RO systems. However, the simulated scale-forming potential of proposed input waters can be compared with the simulated scale-forming potentials and actual scale-forming properties of input waters having documented operational histories in RO systems. This may provide a technique to estimate the actual performance and suitability of proposed input waters during RO.
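The evaporative-concentration idea can be illustrated with a saturation-index calculation for one of the scales mentioned (gypsum). This sketch crudely approximates ion activities by concentrations, whereas real assessments (e.g. geochemical codes like PHREEQC) use full activity models; the input-water composition below is hypothetical:

```python
import math

def saturation_index(ca, so4, log_ksp=-4.58):
    """SI = log10(IAP/Ksp) for gypsum (CaSO4·2H2O); SI > 0 indicates
    supersaturation and hence scale-forming potential. Activities are
    approximated by molal concentrations, a deliberate simplification."""
    return math.log10(ca * so4) - log_ksp

# Simulated evaporation: concentrate a hypothetical input water 10x
ca0, so4_0 = 2e-3, 3e-3                               # mol/kg
si_initial = saturation_index(ca0, so4_0)             # undersaturated
si_evaporated = saturation_index(10 * ca0, 10 * so4_0)  # supersaturated
```

The concentration factor at which SI crosses zero gives a quantitative measure of how much recovery an RO system could achieve before gypsum scale becomes thermodynamically possible.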
Moving from Batch to Field Using the RT3D Reactive Transport Modeling System
NASA Astrophysics Data System (ADS)
Clement, T. P.; Gautam, T. R.
2002-12-01
The public-domain reactive transport code RT3D (Clement, 1997) is a general-purpose numerical code for solving coupled, multi-species reactive transport in saturated groundwater systems. The code uses MODFLOW to simulate flow and several modules of MT3DMS to simulate advection and dispersion. RT3D employs an operator-split strategy that allows the code to solve the coupled reactive transport problem in a modular fashion. The coupling between reaction and transport is defined through a separate module where the reaction equations are specified. The code supports a versatile user-defined reaction option that allows users to define their own reaction system through a Fortran-90 subroutine known as the RT3D reaction package. Further, a utility code known as BATCHRXN allows users to independently test and debug their reaction package. To analyze a new reaction system at batch scale, users should first run BATCHRXN to test the ability of their reaction package to model the batch data. After testing, the reaction package can simply be ported to the RT3D environment to study the model response under 1-, 2-, or 3-dimensional transport conditions. This paper presents example problems that demonstrate methods for moving from batch- to field-scale simulations using the BATCHRXN and RT3D codes. The first example describes a simple first-order reaction system for simulating the sequential degradation of tetrachloroethene (PCE) and its daughter products. The second example uses a relatively complex reaction system for describing the multiple degradation pathways of tetrachloroethane (PCA) and its daughter products. References: 1) Clement, T.P., RT3D - A modular computer code for simulating reactive multi-species transport in 3-dimensional groundwater aquifers, Battelle Pacific Northwest National Laboratory Research Report, PNNL-SA-28967, September 1997. Available at: http://bioprocess.pnl.gov/rt3d.htm.
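The operator-split strategy and the first example's sequential first-order decay chain can be sketched as follows. This is a minimal 1-D illustration, not RT3D's actual numerics: the upwind advection stands in for the MODFLOW/MT3DMS transport operators, stoichiometric yield coefficients are omitted, and all rate values are hypothetical:

```python
import numpy as np

def transport_step(c, v, dx, dt):
    """Explicit upwind advection of one species (transport operator)."""
    cn = c.copy()
    cn[1:] -= v * dt / dx * (c[1:] - c[:-1])
    cn[0] -= v * dt / dx * c[0]          # open inlet, zero inflow
    return cn

def reaction_step(conc, rates, dt):
    """Sequential first-order decay PCE -> TCE -> DCE -> VC, applied
    cell by cell: the role of the user-defined reaction package."""
    new, gain = [], np.zeros_like(conc[0])
    for c, k in zip(conc, rates):
        new.append(c + dt * (gain - k * c))
        gain = k * c                     # daughter source term
    return new

# Operator-split time loop: transport each species, then react
nx, dx, dt, v = 50, 1.0, 0.1, 0.5
rates = [0.10, 0.05, 0.02, 0.0]          # 1/day, hypothetical
conc = [np.zeros(nx) for _ in rates]
conc[0][0] = 1.0                         # unit PCE pulse at the inlet
for _ in range(100):
    conc = [transport_step(c, v, dx, dt) for c in conc]
    conc = reaction_step(conc, rates, dt)
```

Because the chain only transfers mass from parent to daughter and VC does not decay here, total mass is conserved until the plume reaches the outlet, which is a convenient check when debugging a reaction package in batch mode.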
An Integrated Unix-based CAD System for the Design and Testing of Custom VLSI Chips
NASA Technical Reports Server (NTRS)
Deutsch, L. J.
1985-01-01
A computer-aided design (CAD) system that is being used at the Jet Propulsion Laboratory for the design of custom and semicustom very large scale integrated (VLSI) chips is described. The system consists of a Digital Equipment Corporation VAX computer with the UNIX operating system and a collection of software tools for the layout, simulation, and verification of microcircuits. Most of these tools were written by the academic community and are therefore available to JPL at little or no cost. Some small pieces of software have been written in-house in order to make all the tools interact with each other with a minimal amount of effort on the part of the designer.
Space Laboratory on a Table Top: A Next Generative ECLSS design and diagnostic tool
NASA Technical Reports Server (NTRS)
Ramachandran, N.
2005-01-01
This paper describes the development plan for a comprehensive research and diagnostic tool for aspects of advanced life support systems in space-based laboratories. Specifically, it aims to build a high-fidelity tabletop model that can be used for risk mitigation, failure-mode analysis, contamination tracking, and reliability testing. We envision a comprehensive approach involving experimental work coupled with numerical simulation: a 10%-scale transparent model of a space platform such as the International Space Station that operates with water, or a specific matched-index-of-refraction liquid, as the working fluid. This allows the scaling of a 10 ft x 10 ft x 10 ft room with air flow to a 1 ft x 1 ft x 1 ft tabletop model with water/liquid flow. Dynamic similitude for this length scale dictates model velocities of 67% of full scale, and thereby a model time scale of 15% of the full-scale system, meaning identical processes in the model are completed in 15% of the full-scale time. The use of an index-matching fluid (one that matches the refractive index of cast acrylic, the model material) allows the entire model, with its complex internal geometry, to be made transparent and hence conducive to non-intrusive optical diagnostics. Using such a system, one can test environment-control parameters such as core (axial) flows and cross flows (from registers and diffusers); investigate potential problem areas such as flow short circuits, inadequate oxygen content, and build-up of other gases beyond desirable levels; test mixing processes within the system at local nodes or compartments; and assess overall system performance. The system allows quantitative measurement of contaminants introduced into the system and supports testing and optimization of contaminant tracking and removal. The envisaged system will be modular and hence flexible for quick configuration changes and subsequent testing.
The data and inferences from the tests will allow for improvements in the development and design of next generation life support systems and configurations. Preliminary experimental and modeling work in this area will be presented. This involves testing of a single inlet-exit model with detailed 3-D flow visualization and quantitative diagnostics and computational modeling of the system.
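The similitude numbers quoted above (67% velocity, 15% time scale) follow from matching the Reynolds number between the air-filled full-scale room and the 1/10-scale water model. A short check, using typical kinematic viscosities rather than values from the paper:

```python
# Reynolds-number similitude: Re = V*L/nu must match between
# full scale (air) and model (water). Viscosities are typical
# room-temperature values in m^2/s, not taken from the paper.
nu_air, nu_water = 1.5e-5, 1.0e-6
scale = 0.1                                  # model/full-scale length ratio

v_ratio = (1 / scale) * (nu_water / nu_air)  # model/full-scale velocity
t_ratio = scale / v_ratio                    # model/full-scale time
```

With these viscosities the velocity ratio comes out at about 0.67 and the time ratio at about 0.15, reproducing the abstract's figures.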
An Object-Oriented Finite Element Framework for Multiphysics Phase Field Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Michael R Tonks; Derek R Gaston; Paul C Millett
2012-01-01
The phase field approach is a powerful and popular method for modeling microstructure evolution. In this work, advanced numerical tools are used to create a phase field framework that facilitates rapid model development. This framework, called MARMOT, is based on Idaho National Laboratory's finite-element Multiphysics Object-Oriented Simulation Environment. In MARMOT, the system of phase field partial differential equations (PDEs) is solved simultaneously with PDEs describing additional physics, such as solid mechanics and heat conduction, using the Jacobian-free Newton-Krylov method. An object-oriented architecture is created by taking advantage of commonalities in phase field models to facilitate development of new models with very little written code. In addition, MARMOT provides access to mesh and time-step adaptivity, reducing the cost of performing simulations with large disparities in both spatial and temporal scales. In this work, phase separation simulations are used to show the numerical performance of MARMOT. Deformation-induced grain growth and void growth simulations are included to demonstrate the multiphysics capability.
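To make the type of PDE concrete: the simplest phase field model is the Allen-Cahn equation for a non-conserved order parameter. The sketch below uses an explicit 1-D finite-difference step purely to show the equation's structure; MARMOT instead solves fully coupled systems implicitly with JFNK on finite-element meshes, and all parameters here are illustrative:

```python
import numpy as np

def allen_cahn_step(eta, dx, dt, kappa=1.0, mobility=1.0):
    """One explicit Euler step of the Allen-Cahn equation
    d(eta)/dt = -M * (f'(eta) - kappa * laplacian(eta)),
    with double-well free energy f = eta^2 (1 - eta)^2 and
    zero-flux boundaries."""
    ep = np.pad(eta, 1, mode="edge")
    lap = (ep[2:] - 2.0 * ep[1:-1] + ep[:-2]) / dx**2
    fprime = 2.0 * eta * (1.0 - eta) * (1.0 - 2.0 * eta)
    return eta + dt * (-mobility * (fprime - kappa * lap))

x = np.linspace(0.0, 10.0, 101)
eta = 0.5 * (1.0 + np.tanh(x - 5.0))   # diffuse interface at x = 5
for _ in range(200):
    eta = allen_cahn_step(eta, dx=x[1] - x[0], dt=1e-3)
```

The order parameter relaxes toward the two well minima (0 and 1) on either side of a diffuse interface, the basic behavior that phase field frameworks build on.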
Teaching Engineering Design in a Laboratory Setting
ERIC Educational Resources Information Center
Hummon, Norman P.; Bullen, A. G. R.
1974-01-01
Discusses the establishment of an environmental systems laboratory at the University of Pittsburgh with the support of the Sloan Foundation. Indicates that the "real world" can be brought into the laboratory by simulation on computers, software systems, and data bases. (CC)
NASA Technical Reports Server (NTRS)
Martinez, Debbie; Davidson, Paul C.; Kenney, P. Sean; Hutchinson, Brian K.
2004-01-01
The Flight Simulation and Software Branch (FSSB) at NASA Langley Research Center (LaRC) maintains the unique national asset identified as the Transport Research Facility (TRF). The TRF is a group of facilities and integration laboratories used to support LaRC's simulation-to-flight concept. This concept incorporates common software, hardware, and processes for both ground-based flight simulators and LaRC's B-757-200 flying laboratory, identified as the Airborne Research Integrated Experiments System (ARIES). These assets provide government, industry, and academia with an efficient way to develop and test new technology concepts to enhance the capacity, safety, and operational needs of the ever-changing national airspace system. The integration of the TRF enables a smooth, continuous flow of research from simulation to actual flight test.
Towards laboratory detection of topological vortices in superfluid phases of QCD
NASA Astrophysics Data System (ADS)
Das, Arpan; Dave, Shreyansh S.; de, Somnath; Srivastava, Ajit M.
2017-10-01
Topological defects arise in a variety of systems, from vortices in superfluid helium to cosmic strings in the early universe. There is indirect evidence of neutron superfluid vortices from glitches in pulsars. One also expects that topological defects may arise in various high-baryon-density phases of quantum chromodynamics (QCD), e.g. superfluid topological vortices in the color-flavor-locked (CFL) phase. Though vastly different in energy/length scales, there are universal features in the formation of all these defects. Utilizing this universality, we investigate the possibility of detecting these topological superfluid vortices in laboratory experiments, namely heavy-ion collisions (HICs). Using hydrodynamic simulations, we show that vortices can qualitatively affect the power spectrum of flow fluctuations. This can give an unambiguous signal for a superfluid transition resulting in vortices, allowing for a check of defect formation theories in a relativistic quantum field theory system and for the detection of superfluid phases of QCD. Detection of nucleonic superfluid vortices in low-energy HICs would give the opportunity for laboratory-controlled study of their properties, providing crucial inputs for the physics of pulsars.
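The "power spectrum of flow fluctuations" referred to above is built from azimuthal Fourier coefficients of the particle distribution. A toy sketch, with a synthetic elliptic-flow-like event rather than any simulation output from the paper:

```python
import numpy as np

def flow_power(phis, nmax=6):
    """Power |v_n|^2 in the azimuthal Fourier modes of a particle
    distribution, with v_n = <exp(i n phi)>. Vortices are predicted
    to leave an imprint on this spectrum."""
    return np.array([abs(np.mean(np.exp(1j * n * phis))) ** 2
                     for n in range(1, nmax + 1)])

# Toy event with modulation dN/dphi ~ 1 + 2*v2*cos(2*phi), v2 = 0.25
rng = np.random.default_rng(1)
v2 = 0.25
phi = rng.uniform(-np.pi, np.pi, 300_000)
accepted = phi[rng.uniform(0.0, 1.5, phi.size) < 1.0 + 2.0 * v2 * np.cos(2.0 * phi)]
spectrum = flow_power(accepted)
```

In this toy event the n = 2 mode dominates the spectrum; the paper's point is that vortex formation would distort such spectra in a characteristic way.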
Scaled laboratory experiments explain the kink behaviour of the Crab Nebula jet
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, C. K.; Tzeferacos, P.; Lamb, D.
X-ray images from the Chandra X-ray Observatory show that the South-East jet in the Crab nebula changes direction every few years. This remarkable phenomenon is also observed in jets associated with pulsar wind nebulae and other astrophysical objects, and is therefore a fundamental feature of astrophysical jet evolution that needs to be understood. Theoretical modeling and numerical simulations have suggested that this phenomenon may be a consequence of magnetic fields (B) and current-driven magnetohydrodynamic (MHD) instabilities in the jet, but until now there has been no verification of this process in a controlled laboratory environment. Here we report the first such experiments, using scaled laboratory plasma jets generated by high-power lasers to model the Crab jet, and monoenergetic-proton radiography to provide direct visualization and measurement of the magnetic fields and their behavior. The toroidal magnetic field embedded in the supersonic jet triggered plasma instabilities and resulted in considerable deflections throughout the jet propagation, mimicking the kinks in the Crab jet. We also demonstrated that these kinks are stabilized by high jet velocity, consistent with the observation that instabilities alter the jet orientation but do not disrupt the overall jet structure. We successfully modeled these laboratory experiments with a validated three-dimensional (3D) numerical simulation, which, in conjunction with the experiments, provides compelling evidence that we have an accurate model of the most important physics of magnetic fields and MHD instabilities in the observed, kinked jet of the Crab nebula. The experiments initiate a novel approach in the laboratory for visualizing fields and instabilities associated with jets observed in various astrophysical objects, ranging from stellar to extragalactic systems. We expect that future work along this line will have an important impact on the study and understanding of such fundamental astrophysical phenomena.
Scaled laboratory experiments explain the kink behaviour of the Crab Nebula jet
Li, C. K.; Tzeferacos, P.; Lamb, D.; ...
2016-10-07
NASA Astrophysics Data System (ADS)
Quinn, J. D.; Larour, E. Y.; Cheng, D. L. C.; Halkides, D. J.
2016-12-01
The Virtual Earth System Laboratory (VESL) is a Web-based tool, under development at the Jet Propulsion Laboratory and UC Irvine, for the visualization of Earth system data and process simulations. It contains features geared toward a range of applications spanning research and outreach. It offers an intuitive user interface in which model inputs are changed using sliders and other interactive components. Current capabilities include simulation of polar ice sheet responses to climate forcing, based on NASA's Ice Sheet System Model (ISSM). We believe that the visualization of data is most effective when tailored to the target audience, and that many of the best practices of modern Web design and development can be applied directly to data visualization: use of negative space, color schemes, typography, accessibility standards, tooltips, et cetera. We present our prototype website and invite input from potential users, including researchers, educators, and students.
Multi-Modal Transportation System Simulation
DOT National Transportation Integrated Search
1971-01-01
The present status of a laboratory being developed for real-time simulation of command and control functions in transportation systems is discussed. Details are given on the simulation models and on programming techniques used in defining and evaluat...
Dynamic computer simulations of electrophoresis: three decades of active research.
Thormann, Wolfgang; Caslavska, Jitka; Breadmore, Michael C; Mosher, Richard A
2009-06-01
Dynamic models for electrophoresis are based upon model equations derived from transport concepts in solution, together with user-input conditions. They can theoretically predict the movement of ions and are as such the most versatile tool for exploring the fundamentals of electrokinetic separations. Since its inception three decades ago, dynamic computer simulation software and its use have progressed significantly, and Electrophoresis played a pivotal role in that endeavor, as a large proportion of the fundamental and application papers were published in this periodical. Software is available that simulates all basic electrophoretic systems, including moving-boundary electrophoresis, zone electrophoresis, ITP, IEF and EKC, and their combinations, under almost exactly the same conditions used in the laboratory. This has been employed to reveal the detailed mechanisms of many of the fundamental phenomena that occur in electrophoretic separations. Dynamic electrophoretic simulations are relevant for separations on any scale and in any instrumental format, including free-fluid preparative, gel, capillary and chip electrophoresis. This review includes a historical overview, a survey of current simulators, simulation examples and a discussion of the applications and achievements of dynamic simulation.
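The "model equations derived from transport concepts in solution" reduce, in their simplest form, to advection of each ion by electromigration plus diffusion. The sketch below propagates one analyte zone under a constant field; real simulators additionally recompute the field self-consistently from the local conductivity at every step, and all parameter values here are illustrative:

```python
import numpy as np

def electrophoresis_step(c, mu, E, D, dx, dt):
    """Explicit finite-difference step of the 1-D transport equation
    dc/dt = -mu*E*dc/dx + D*d2c/dx2 (upwind advection for mu*E > 0,
    zero-gradient boundaries)."""
    cp = np.pad(c, 1, mode="edge")
    adv = -mu * E * (cp[1:-1] - cp[:-2]) / dx
    dif = D * (cp[2:] - 2.0 * cp[1:-1] + cp[:-2]) / dx**2
    return c + dt * (adv + dif)

# A Gaussian analyte zone migrating along a 1 cm channel
nx, dx = 200, 5e-5                                   # m
x = np.arange(nx) * dx
c = np.exp(-((x - 2e-3) ** 2) / (2 * (2e-4) ** 2))   # peak at 2 mm
for _ in range(100):
    c = electrophoresis_step(c, mu=5e-8, E=1e4, D=1e-9, dx=dx, dt=0.05)
```

With mobility 5e-8 m²/(V·s) and a 10 kV/m field, the zone moves at 0.5 mm/s, so after the simulated 5 s the peak has migrated from 2 mm to about 4.5 mm.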
Laboratory Modelling of Volcano Plumbing Systems: a review
NASA Astrophysics Data System (ADS)
Galland, Olivier; Holohan, Eoghan P.; van Wyk de Vries, Benjamin; Burchardt, Steffi
2015-04-01
Earth scientists have, since the 19th century, tried to replicate or model geological processes in controlled laboratory experiments. In particular, laboratory modelling has been used to study the development of volcanic plumbing systems, which set the stage for volcanic eruptions. Volcanic plumbing systems involve complex processes that act at length scales of microns to thousands of kilometres and at time scales from milliseconds to billions of years, and laboratory models appear very suitable to address them. This contribution reviews laboratory models dedicated to studying the dynamics of volcano plumbing systems (Galland et al., Accepted). The foundation of laboratory models is the choice of relevant model materials, for both rock and magma. We outline the broad range of suitable model materials used in the literature. These materials exhibit very diverse rheological behaviours, so their careful choice is a crucial first step in proper experiment design. The second step is model scaling, which successively calls upon (1) the principle of dimensional analysis and (2) the principle of similarity. Dimensional analysis aims to identify the dimensionless physical parameters that govern the underlying processes. The principle of similarity states that "a laboratory model is equivalent to its geological analogue if the dimensionless parameters identified in the dimensional analysis are identical, even if the values of the governing dimensional parameters differ greatly" (Barenblatt, 2003). The application of these two steps ensures a solid understanding and the geological relevance of the laboratory models. In addition, this procedure shows that laboratory models are designed not to exactly mimic a given geological system, but to understand the underlying generic processes, either individually or in combination, and to identify or demonstrate the physical laws that govern them.
From this perspective, we review the numerous applications of laboratory models to understand the distinct key features of volcanic plumbing systems: dykes, cone sheets, sills, laccoliths, caldera-related structures, ground deformation, magma/fault interactions, and explosive vents. Barenblatt, G.I., 2003. Scaling. Cambridge University Press, Cambridge. Galland, O., Holohan, E.P., van Wyk de Vries, B., Burchardt, S., Accepted. Laboratory modelling of volcanic plumbing systems: A review, in: Breitkreuz, C., Rocchi, S. (Eds.), Laccoliths, sills and dykes: Physical geology of shallow level magmatic systems. Springer.
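The two-step scaling procedure (dimensional analysis, then similarity) can be illustrated with one commonly used dimensionless group: the ratio of gravitational stress to rock cohesion. The group choice and all material values below are illustrative, not taken from the review:

```python
def pi_gravity_cohesion(rho, g, length, cohesion):
    """Dimensionless ratio of gravitational stress to cohesion,
    Pi = rho*g*L/C, a typical similarity group in analogue
    modelling; the relevant groups depend on the process studied."""
    return rho * g * length / cohesion

# Prototype: km-scale crust; model: ~10 cm of a weak granular material
pi_nature = pi_gravity_cohesion(2700.0, 9.81, 1000.0, 1.0e7)
pi_model = pi_gravity_cohesion(1400.0, 9.81, 0.1, 3.0e2)
```

Although every dimensional input differs by orders of magnitude between nature and the model, the Pi values are of the same order, which is exactly what the principle of similarity requires for the experiment to be geologically relevant.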
Regeneration of Exhausted Arsenic Adsorptive media of a Full Scale Treatment System
This presentation will describe the method and results of laboratory tests showing the feasibility of regenerating exhausted, iron-based adsorptive media, and the results of a follow-up regeneration test at a full-scale system in Twentynine Palms, CA. The laboratory studies on se...
In a recently completed test program, bench-scale laboratory studies at Arizona State University (ASU) in Tempe, AZ, and pilot-scale studies in a simulated field test situation at Zentox Corp in Ocala, FL, were performed to evaluate the integration of gas-solid ultraviolet (UV) p...
Generation of dense plume fingers in saturated-unsaturated homogeneous porous media
NASA Astrophysics Data System (ADS)
Cremer, Clemens J. M.; Graf, Thomas
2015-02-01
Flow under variable-density conditions is widespread, occurring in geothermal reservoirs, at waste disposal sites, and due to saltwater intrusion. The migration of dense plumes typically results in the formation of vertical plume fingers, which are known to be triggered by material heterogeneity or by variations in the source concentration that drives the density contrast. Using a numerical groundwater model, six perturbation methods are tested under saturated and unsaturated flow conditions to mimic pore-scale heterogeneity and concentration variations in order to realistically generate dense fingers. A laboratory-scale sand-tank experiment is numerically simulated, and the perturbation methods are evaluated by comparing plume fingers obtained from the laboratory experiment with numerically simulated fingers. Dense plume fingering for saturated flow is best reproduced with a spatially random, time-constant perturbation of the solute source. For unsaturated flow, a spatially and temporally random noise of solute concentration, or a random conductivity field, adequately simulates plume fingering.
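The winning method for saturated flow, a spatially random but time-constant source perturbation, is simple to express. Amplitude, seed, and the 35 g/l concentration below are illustrative choices, not values from the paper:

```python
import numpy as np

def perturbed_source(c0, n_cells, amplitude=0.01, seed=42):
    """Spatially random, time-constant perturbation of the source
    concentration along the inflow boundary. A fixed seed makes the
    noise pattern identical at every time step, which is the point:
    the perturbation varies in space but not in time."""
    rng = np.random.default_rng(seed)
    return c0 * (1.0 + rng.uniform(-amplitude, amplitude, n_cells))

# The same perturbed boundary condition is reapplied every time step
source = perturbed_source(35.0, 100)     # e.g. a 35 g/l salt solution
```

In a groundwater model this array would replace the uniform source concentration on the boundary cells, seeding fingers at fixed locations the way pore-scale variability does in the sand tank.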
Laboratory flow experiments for visualizing carbon dioxide-induced, density-driven brine convection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kneafsey, T.; Pruess, K.
2009-09-01
Injection of carbon dioxide (CO2) into saline aquifers confined by low-permeability cap rock will result in a layer of CO2 overlying the brine. Dissolution of CO2 into the brine increases the brine density, resulting in an unstable situation in which more-dense brine overlies less-dense brine. This gravitational instability could give rise to density-driven convection of the fluid, a favorable process of practical interest for CO2 storage security because it accelerates the transfer of buoyant CO2 into the aqueous phase, where it is no longer subject to an upward buoyant drive. Laboratory flow visualization tests in transparent Hele-Shaw cells have been performed to elucidate the processes and rates of this CO2 solute-driven convection (CSC). Upon introduction of CO2 into the system, a layer of CO2-laden brine forms at the CO2-water interface. Subsequently, small convective fingers form, which coalesce, broaden, and penetrate into the test cell. Images and time-series data of finger lengths and wavelengths are presented. The observed CO2 uptake of the convection system indicates that the CO2 dissolution rate is approximately constant for each test and far greater than expected for a diffusion-only scenario. Numerical simulations of our system show good agreement with the experiments for the onset time of convection and the advancement of convective fingers. There are differences as well, the most prominent being the absence of cell-scale convection in the numerical simulations. The cell-scale convection observed in the experiments is probably initiated by a small temperature gradient induced by the cell illumination.
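Whether such solute-driven convection sets in at all is governed by a solutal Rayleigh number. The sketch below evaluates it for a generic water-filled Hele-Shaw cell (where the equivalent permeability is k = b²/12 for gap width b); all values are illustrative and not taken from the paper:

```python
def solutal_rayleigh(k, delta_rho, g, height, phi, mu, diff):
    """Solutal Rayleigh number Ra = k*delta_rho*g*H/(phi*mu*D),
    which controls the onset of density-driven convection in a
    porous layer or its Hele-Shaw analogue."""
    return k * delta_rho * g * height / (phi * mu * diff)

b = 1.0e-3                                    # 1 mm cell gap
ra = solutal_rayleigh(k=b**2 / 12.0,          # Hele-Shaw permeability
                      delta_rho=10.0,         # ~1% density increase, kg/m^3
                      g=9.81, height=0.25,    # 25 cm cell
                      phi=1.0, mu=1.0e-3, diff=2.0e-9)
```

With these numbers Ra is of order 10^6, far above the critical value (of order 4π²), which is consistent with the vigorous fingering the experiments visualize rather than a diffusion-only regime.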
Parachute Models Used in the Mars Science Laboratory Entry, Descent, and Landing Simulation
NASA Technical Reports Server (NTRS)
Cruz, Juan R.; Way, David W.; Shidner, Jeremy D.; Davis, Jody L.; Powell, Richard W.; Kipp, Devin M.; Adams, Douglas S.; Witkowski, Al; Kandis, Mike
2013-01-01
An end-to-end simulation of the Mars Science Laboratory (MSL) entry, descent, and landing (EDL) sequence was created at the NASA Langley Research Center using the Program to Optimize Simulated Trajectories II (POST2). This simulation is capable of providing numerous MSL system and flight software responses, including Monte Carlo-derived statistics of these responses. The MSL POST2 simulation includes models of EDL system elements, including those related to the parachute system. Among these are models for the parachute geometry, mass properties, deployment, inflation, opening force, area oscillations, aerodynamic coefficients, apparent mass, interaction with the main landing engines, and off-loading. These models were kept as simple as possible, considering the overall objectives of the simulation. The main purpose of this paper is to describe these parachute system models to the extent necessary to understand how they work, along with some of their limitations. A list of lessons learned during the development of the models and simulation is provided. Future improvements to the parachute system models are proposed.
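As a point of reference for the opening-force model mentioned above, the textbook infinite-mass estimate relates peak load to dynamic pressure, drag area, and an opening-load factor. This is far simpler than the MSL simulation models, and every number below is illustrative rather than an MSL value:

```python
def opening_force(rho, v, cd_s, ck):
    """Infinite-mass estimate of peak parachute opening force,
    F = q * (CD*S) * Ck, with dynamic pressure q = 0.5*rho*v^2
    and opening-load factor Ck."""
    return 0.5 * rho * v**2 * cd_s * ck

# Hypothetical supersonic deployment in a thin Martian-like atmosphere:
# rho in kg/m^3, v in m/s, drag area CD*S in m^2, dimensionless Ck
force = opening_force(rho=0.01, v=400.0, cd_s=350.0, ck=1.3)   # newtons
```

An end-to-end EDL simulation replaces the single factor Ck with time-resolved inflation, apparent-mass, and area-oscillation models, but this estimate shows which inputs dominate the peak load.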
The Australian Computational Earth Systems Simulator
NASA Astrophysics Data System (ADS)
Mora, P.; Muhlhaus, H.; Lister, G.; Dyskin, A.; Place, D.; Appelbe, B.; Nimmervoll, N.; Abramson, D.
2001-12-01
Numerical simulation of the physics and dynamics of the entire Earth system offers an outstanding opportunity for advancing earth system science and technology, but represents a major challenge due to the range of scales and physical processes involved, as well as the magnitude of the software engineering effort required. However, new simulation and computer technologies are bringing this objective within reach. Under a special competitive national funding scheme to establish new Major National Research Facilities (MNRF), the Australian government, together with a consortium of universities and research institutions, has funded construction of the Australian Computational Earth Systems Simulator (ACcESS). The Simulator, or computational virtual earth, will provide the research infrastructure that the Australian earth systems science community requires for simulations of dynamic earth processes at scales ranging from microscopic to global. It will consist of thematic supercomputer infrastructure and an earth systems simulation software system. The Simulator models and software will be constructed over a five-year period by a multi-disciplinary team of computational scientists, mathematicians, earth scientists, civil engineers and software engineers. The construction team will integrate numerical simulation models (3D discrete-element/lattice solid models, particle-in-cell large-deformation finite-element methods, stress reconstruction models, multi-scale continuum models, etc.) with geophysical, geological and tectonic models, through advanced software engineering and visualization technologies.
When fully constructed, the Simulator aims to provide the software and hardware infrastructure needed to model solid earth phenomena including global scale dynamics and mineralisation processes, crustal scale processes including plate tectonics, mountain building, interacting fault system dynamics, and micro-scale processes that control the geological, physical and dynamic behaviour of earth systems. ACcESS represents a part of Australia's contribution to the APEC Cooperation for Earthquake Simulation (ACES) international initiative. Together with other national earth systems science initiatives including the Japanese Earth Simulator and US General Earthquake Model projects, ACcESS aims to provide a driver for scientific advancement and technological breakthroughs including: quantum leaps in understanding of earth evolution at global, crustal, regional and microscopic scales; new knowledge of the physics of crustal fault systems required to underpin the grand challenge of earthquake prediction; new understanding and predictive capabilities of geological processes such as tectonics and mineralisation.
NASA Astrophysics Data System (ADS)
Li, Xiaorong; Li, Ming; Wolf, Judith
2017-04-01
As a response to worldwide climate change, clean, non-carbon renewable energy resources have been gaining significant attention. Among a range of renewable alternatives, tidal stream energy is considered very promising due to its consistent predictability and availability. To investigate the impacts of tidal stream devices on their surroundings, prototype experiments involving small-scale laboratory studies have been implemented. Computational fluid dynamics (CFD) modelling is also commonly applied to study turbine behaviour. However, these studies focus on impacts of the turbine at the near-field scale. As a result, in order to study and predict the far-field impacts caused by the operation of turbines, large-scale 2D and 3D numerical oceanography models have been used, with routines added to reflect the impacts of turbines. In comparison to 2D models, 3D models are advantageous in providing a complete prediction of vertical flow structures, and hence of mixing in the wake of a turbine. This research aims to deliver a thorough 3D tidal stream turbine simulation system that considers the major coastal processes (currents, waves, and sediment transport), based on a fully coupled 3D wave-current-sediment numerical oceanography model, the Unstructured Grid Finite Volume Community Ocean Model (FVCOM). The energy extraction of turbines is simulated by adding a body force to the momentum equations. Across the water depth, the coefficient of this body force takes different values according to the turbine configuration and operation, to reflect the vertical variation of the turbine's impact on the passing flow. Three turbulence perturbation terms are added to the turbulence closure to simulate turbine-induced turbulence generation, dissipation, and interference with the turbulence length scale. Impacts of turbine operation on surface waves are also considered by modifying the wave energy flux across the device.
A thorough validation study is carried out in which the developed model is tested against a combination of laboratory measurements and CFD simulation results. The developed turbine simulation system is then applied to the Anglesey coast, North Wales, UK, as a case study. The validation suggests that the system accurately simulates both the hydrodynamics and the wave dynamics in the turbine wake. The case study, with 18 turbines (15 m diameter) modelled individually in the waterway between north-west Anglesey and the Skerries, reveals the impacts of the turbine farm on free-surface elevation, flow field, turbulence kinetic energy (TKE), surface waves, bottom shear stress and suspended sediment transport. The wake is observable up to 4.5 km downstream of the device farm. Flow near the bed in the wake is accelerated, leading to enhanced bottom shear stress. The device farm has a strong influence on TKE and hence on the vertical mixing of suspended sediment in the wake. Further, the eastward residual sediment transport along the north coast of Anglesey is found to be weakened by the turbine farm.
Brunk, Elizabeth; Ashari, Negar; Athri, Prashanth; Campomanes, Pablo; de Carvalho, F Franco; Curchod, Basile F E; Diamantis, Polydefkis; Doemer, Manuel; Garrec, Julian; Laktionov, Andrey; Micciarelli, Marco; Neri, Marilisa; Palermo, Giulia; Penfold, Thomas J; Vanni, Stefano; Tavernelli, Ivano; Rothlisberger, Ursula
2011-01-01
The Laboratory of Computational Chemistry and Biochemistry is active in the development and application of first-principles based simulations of complex chemical and biochemical phenomena. Here, we review some of our recent efforts in extending these methods to larger systems, longer time scales and increased accuracies. Their versatility is illustrated with a diverse range of applications, ranging from the determination of the gas phase structure of the cyclic decapeptide gramicidin S, to the study of G protein coupled receptors, the interaction of transition metal based anti-cancer agents with protein targets, the mechanism of action of DNA repair enzymes, the role of metal ions in neurodegenerative diseases and the computational design of dye-sensitized solar cells. Many of these projects are done in collaboration with experimental groups from the Institute of Chemical Sciences and Engineering (ISIC) at the EPFL.
d'Entremont, Anna; Corgnale, Claudio; Hardy, Bruce; ...
2018-01-11
Concentrating solar power plants can achieve low-cost, efficient renewable electricity production if equipped with adequate thermal energy storage systems. Metal hydride based thermal energy storage systems are appealing candidates due to their demonstrated potential for very high volumetric energy densities, high exergetic efficiencies, and low costs. The feasibility and performance of a thermal energy storage system based on NaMgH2F hydride paired with TiCr1.6Mn0.2 is examined, discussing its integration with a solar-driven ultra-supercritical steam power plant. The simulated storage system is based on a laboratory-scale experimental apparatus. It is analyzed using a detailed transport model accounting for the thermochemical hydrogen absorption and desorption reactions, including kinetics expressions adequate for the current metal hydride system. The results show that the proposed metal hydride pair can suitably be integrated with a high-temperature steam power plant. The thermal energy storage system achieves output energy densities of 226 kWh/m3, 9 times the DOE SunShot target, with moderate temperature and pressure swings. Simulations also indicate significant scope for performance improvement via heat-transfer enhancement strategies.
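The equilibrium behaviour that lets a hydride pair like this shuttle hydrogen between a hot and a cool bed follows the van 't Hoff relation. The sketch below uses placeholder enthalpy and entropy magnitudes typical of high-temperature hydrides, not fitted NaMgH2F or TiCr1.6Mn0.2 data.

```python
import math

def vant_hoff_pressure(T, dH=-100e3, dS=-130.0):
    """Equilibrium H2 pressure (bar, relative to P0 = 1 bar) from van 't Hoff.

    For the absorption reaction M + H2 -> MH2, ln(P_eq/P0) = dH/(R*T) - dS/R,
    with dH (J/mol H2) and dS (J/(mol H2 K)) both negative for absorption.
    The defaults are placeholder magnitudes, not measured values.
    """
    R = 8.314  # J/(mol K)
    return math.exp(dH / (R * T) - dS / R)
```

Because the equilibrium pressure rises steeply with temperature, charging the hot bed pushes hydrogen to the low-temperature hydride, and reheating reverses the flow, which is the mechanism behind the storage cycle described above.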
NASA Astrophysics Data System (ADS)
Rundle, P. B.; Rundle, J. B.; Morein, G.; Donnellan, A.; Turcotte, D.; Klein, W.
2004-12-01
The research community is rapidly moving towards the development of an earthquake forecast technology based on the use of complex, system-level earthquake fault system simulations. Using these topologically and dynamically realistic simulations, it is possible to develop ensemble forecasting methods similar to those used in weather and climate research. To effectively carry out such a program, one needs 1) a topologically realistic model to simulate the fault system; 2) data sets to constrain the model parameters through a systematic program of data assimilation; 3) a computational technology making use of modern paradigms of high-performance and parallel computing systems; and 4) software to visualize and analyze the results. In particular, we focus attention on a new version of our code Virtual California (version 2001) in which we model all of the major strike-slip faults in California, from the Mexico-California border to the Mendocino Triple Junction. Virtual California is a "backslip model", meaning that the long-term rate of slip on each fault segment in the model is matched to the observed rate. We use the historic data set of earthquakes larger than magnitude M > 6 to define the frictional properties of 650 fault segments (degrees of freedom) in the model. To compute the dynamics and the associated surface deformation, we use message passing as implemented in the MPICH standard distribution on a Beowulf cluster consisting of >10 CPUs. We will also report results from implementing the code on significantly larger machines, so that we can begin to examine much finer spatial scales of resolution and assess the scaling properties of the code. We present results of simulations both as static images and as mpeg movies, so that the dynamical aspects of the computation can be assessed by the viewer. We compute a variety of statistics from the simulations, including magnitude-frequency relations, and compare these with data from real fault systems.
We report recent results on use of Virtual California for probabilistic earthquake forecasting for several sub-groups of major faults in California. These methods have the advantage that system-level fault interactions are explicitly included, as well as laboratory-based friction laws.
NASA Technical Reports Server (NTRS)
Groom, N. J.; Woolley, C. T.; Joshi, S. M.
1981-01-01
A linear analysis and the results of a nonlinear simulation of a magnetic bearing suspension system which uses permanent magnet flux biasing are presented. The magnetic bearing suspension is part of a 4068 N-m-s (3000 lb-ft-sec) laboratory model annular momentum control device (AMCD). The simulation includes rigid body rim dynamics, linear and nonlinear axial actuators, linear radial actuators, axial and radial rim warp, and power supply and power driver current limits.
Molecular-Level Simulations of the Turbulent Taylor-Green Flow
NASA Astrophysics Data System (ADS)
Gallis, M. A.; Bitter, N. P.; Koehler, T. P.; Plimpton, S. J.; Torczynski, J. R.; Papadakis, G.
2017-11-01
The Direct Simulation Monte Carlo (DSMC) method, a statistical, molecular-level technique that provides accurate solutions to the Boltzmann equation, is applied to the turbulent Taylor-Green vortex flow. The goal of this work is to investigate whether DSMC can accurately simulate energy decay in a turbulent flow. If so, then simulating turbulent flows at the molecular level can provide new insights because the energy decay can be examined in detail from molecular to macroscopic length scales, thereby directly linking molecular relaxation processes to macroscopic transport processes. The DSMC simulations are performed on half a million cores of Sequoia, the 17 Pflop platform at Lawrence Livermore National Laboratory, and the kinetic-energy dissipation rate and the energy spectrum are computed directly from the molecular velocities. The DSMC simulations are found to reproduce the Kolmogorov -5/3 law and to agree with corresponding Navier-Stokes simulations obtained using a spectral method. Sandia National Laboratories is a multimission laboratory managed and operated by National Technology and Engineering Solutions of Sandia, LLC., a wholly owned subsidiary of Honeywell International, Inc., for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-NA0003525.
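A spectrum-slope check like the Kolmogorov comparison above reduces to a least-squares fit in log-log space. The standalone helper below is an illustration of that check, not the authors' analysis code.

```python
import math

def loglog_slope(k, E):
    """Least-squares slope of log E versus log k.

    For an energy spectrum E(k) in the inertial range, a slope near -5/3
    indicates agreement with Kolmogorov's k^(-5/3) law.
    """
    xs = [math.log(x) for x in k]
    ys = [math.log(y) for y in E]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))
```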
NETL Extreme Drilling Laboratory Studies High Pressure High Temperature Drilling Phenomena
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lyons, K.D.; Honeygan, S.; Moroz, T.H.
2008-12-01
The U.S. Department of Energy's National Energy Technology Laboratory (NETL) established the Extreme Drilling Laboratory to engineer effective and efficient drilling technologies viable at depths greater than 20,000 ft. This paper details the challenges of ultradeep drilling, documents reports of decreased drilling rates as a result of increasing fluid pressure and temperature, and describes NETL's research and development activities. NETL is invested in laboratory-scale physical simulation. Its physical simulator will have the capability to circulate drilling fluids at 30,000 psi and 480°F around a single drill cutter. This simulator is not yet operational; therefore, the results will be limited to the identification of leading hypotheses of drilling phenomena and NETL's test plans to validate or refute such theories. Of particular interest to the Extreme Drilling Laboratory's studies are the combinatorial effects of drilling fluid pressure, drilling fluid properties, rock properties, pore pressure, and drilling parameters such as cutter rotational speed, weight on bit, and the hydraulics associated with drilling fluid introduction at the rock-cutter interface. A detailed discussion of how each variable is controlled in a laboratory setting will be part of the conference paper and presentation.
Design and test of a simulation system for autonomous optic-navigated planetary landing
NASA Astrophysics Data System (ADS)
Cai, Sheng; Yin, Yanhe; Liu, Yanjun; He, Fengyun
2018-02-01
In this paper, a simulation system based on a commercial projector is proposed to test optical navigation algorithms for autonomous planetary landing in laboratory scenarios. The design of the optics, mechanics and synchronization control is carried out, and the complete simulation system is set up and tested. Through calibration of the system, two main problems are settled: synchronization between the projector and the CCD, and pixel-level shifting caused by the low repeatability of the DMD used in the projector. The experimental results show that the RMS errors of the pitch, yaw and roll angles are 0.78', 0.48', and 2.95' compared with the theoretical calculation, which fulfills the requirements of laboratory simulation of planetary landing.
NASA Astrophysics Data System (ADS)
Huang, Shiquan; Yi, Youping; Li, Pengchuan
2011-05-01
In recent years, multi-scale simulation of metal forming has been gaining significant attention for predicting the whole deformation process and the microstructure evolution of the product. Advances in macro-scale numerical simulation of metal forming are remarkable, and commercial FEM software such as Deform2D/3D has found wide application in the field. However, multi-scale simulation has seen little application, due to the non-linearity of microstructure evolution during forming and the difficulty of modeling at the micro-scale level. This work deals with the modeling of microstructure evolution and a new method of multi-scale simulation of the forging process. The aviation material 7050 aluminum alloy is used as an example for modeling microstructure evolution. The corresponding thermal simulation experiments were performed on a Gleeble 1500 machine, and the tested specimens were analyzed to model dislocation density and the nucleation and growth of dynamic recrystallization (DRX). A source program using the cellular automaton (CA) method was developed to simulate grain nucleation and growth, in which the change of grain topology caused by the metal deformation is considered. The physical fields at the macro-scale level, such as the temperature, stress and strain fields obtained with the commercial software Deform 3D, are coupled with the micro-scale stored deformation energy through a dislocation model to realize the multi-scale simulation. The method is demonstrated on a forging-process simulation of an aircraft wheel hub. By coupling the Deform 3D results with the CA results, the forging deformation progress and the microstructure evolution at any point of the forging could be simulated. To verify the simulation, aircraft wheel hub forging experiments were performed in the laboratory, and the comparison of simulation and experimental results is discussed in detail.
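A toy version of the cellular-automaton nucleation-and-growth step can be sketched as below. The probabilistic nucleation rule and the one-cell-per-step growth are deliberate simplifications for illustration; they omit the dislocation-density and stored-energy coupling that drives nucleation in the work described above.

```python
import numpy as np

def ca_recrystallize(n=64, p_nucleate=0.002, steps=40, seed=0):
    """Minimal 2D cellular-automaton sketch of nucleation and grain growth.

    Cells hold a grain ID (0 = deformed, unrecrystallized matrix).  Each step,
    deformed cells nucleate a new grain with probability p_nucleate, and any
    deformed cell touching a recrystallized von Neumann neighbour joins that
    grain, so nuclei grow outward roughly one cell per step.
    """
    rng = np.random.default_rng(seed)
    grid = np.zeros((n, n), dtype=int)
    next_id = 1
    for _ in range(steps):
        # nucleation in still-deformed cells
        nucleate = (grid == 0) & (rng.random((n, n)) < p_nucleate)
        for i, j in zip(*np.nonzero(nucleate)):
            grid[i, j] = next_id
            next_id += 1
        # growth: deformed cells capture the ID of a recrystallized neighbour
        new = grid.copy()
        for i, j in zip(*np.nonzero(grid == 0)):
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = (i + di) % n, (j + dj) % n
                if grid[ni, nj] != 0:
                    new[i, j] = grid[ni, nj]
                    break
        grid = new
    return grid, next_id - 1
```

Tracking the recrystallized fraction `(grid != 0).mean()` over the steps gives the familiar sigmoidal DRX kinetics; a physically coupled model would make `p_nucleate` and the growth rule depend on the local stored energy from the FEM fields.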
Inertial Fusion and High-Energy-Density Science in the United States
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tarter, C B
2001-09-06
Inertial fusion and high-energy-density science worldwide is poised to take a great leap forward. In the US, programs at the University of Rochester, Sandia National Laboratories, Los Alamos National Laboratory, Lawrence Livermore National Laboratory (LLNL), the Naval Research Laboratory, and many smaller laboratories have laid the groundwork for building a facility in which fusion ignition can be studied in the laboratory for the first time. The National Ignition Facility (NIF) is being built by the Department of Energy's National Nuclear Security Administration to provide an experimental test bed for the US Stockpile Stewardship Program (SSP) to ensure the dependability of the country's nuclear deterrent without underground nuclear testing. NIF and other large laser systems being planned, such as the Laser MegaJoule (LMJ) in France, will also make important contributions to basic science, the development of inertial fusion energy, and other scientific and technological endeavors. NIF will be able to produce extreme temperatures and pressures in matter. This will allow simulating astrophysical phenomena (on a tiny scale) and measuring the equation of state of materials under conditions that exist in planetary cores.
NASA Astrophysics Data System (ADS)
Duda, Mandy; Bracke, Rolf; Stöckhert, Ferdinand; Wittig, Volker
2017-04-01
A fundamental problem of technological applications related to the exploration and provision of geothermal energy is the inaccessibility of subsurface processes. As a result, actual reservoir properties can only be determined using (a) indirect measurement techniques such as seismic surveys, machine feedback and geophysical borehole logging, (b) laboratory experiments capable of simulating in-situ properties but failing to preserve temporal and spatial scales, or vice versa, and (c) numerical simulations. Moreover, technological applications related to the drilling process, the completion and cementation of a wellbore, or the stimulation and exploitation of the reservoir are exposed to high pressure and temperature conditions as well as to corrosive environments resulting from both rock formation and geofluid characteristics. To address fundamental and applied questions in the context of geothermal energy provision and subsurface exploration in general, one of Europe's largest geoscientific laboratory infrastructures is introduced. The in-situ Borehole and Geofluid Simulator (i.BOGS) makes it possible to simulate quasi scale-preserving processes at reservoir conditions corresponding to depths of up to 5000 m, and represents a large-scale pressure vessel for iso-/hydrostatic and pore pressures up to 125 MPa and temperatures from -10°C to 180°C. The autoclave can either be filled with large rock core samples (25 cm in diameter, up to 3 m length) or with fluids and technical borehole devices (e.g. pumps, sensors). The pressure vessel is equipped with an ultrasound system for active transmission and passive recording of acoustic emissions, and can be complemented by additional sensors. The i.BOGS forms the basic module for the Match.BOGS, which will finally consist of three modules, i.e.
(A) the i.BOGS, (B) the Drill.BOGS, a drilling module to be attached to the i.BOGS capable of applying realistic torques and contact forces to a drilling device that enters the i.BOGS, and (C) the Fluid.BOGS, a geofluid reactor for the composition of highly corrosive geofluids serving as synthetic groundwater / pore fluid in the i.BOGS. The i.BOGS will support scientists and engineers in developing instruments and applications such as drilling tooling and drillstrings, borehole cements and cementation procedures, geophysical tooling and sensors, or logging/measuring while drilling equipment, but will also contribute to optimized reservoir exploitation methods, for example related to stimulation techniques, pumping equipment and long-term reservoir accessibility.
Lessons Learned from the Puerto Rico Battery Energy Storage System
DOE Office of Scientific and Technical Information (OSTI.GOV)
BOYES, JOHN D.; DE ANA, MINDI FARBER; TORRES, WENCESLANO
1999-09-01
The Puerto Rico Electric Power Authority (PREPA) installed a distributed battery energy storage system in 1994 at a substation near San Juan, Puerto Rico. It was patterned after two other large energy storage systems operated by electric utilities in California and Germany. The U.S. Department of Energy (DOE) Energy Storage Systems Program at Sandia National Laboratories has followed the progress of all stages of the project since its inception. It directly supported the critical battery room cooling system design by conducting laboratory thermal testing of a scale model of the battery under simulated operating conditions. The Puerto Rico facility is at present the largest operating battery storage system in the world and is successfully providing frequency control, voltage regulation, and spinning reserve to the Caribbean island. The system further proved its usefulness to the PREPA network in the fall of 1998 in the aftermath of Hurricane Georges. The owner-operator, PREPA, and the architect/engineer, vendors, and contractors learned many valuable lessons during all phases of project development and operation. In documenting these lessons, this report will help PREPA and other utilities in planning to build large energy storage systems.
Annual Report: Carbon Capture Simulation Initiative (CCSI) (30 September 2012)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, David C.; Syamlal, Madhava; Cottrell, Roger
2012-09-30
The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and academic institutions that is developing and deploying state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technologies from discovery to development, demonstration, and ultimately the widespread deployment to hundreds of power plants. The CCSI Toolset will provide end users in industry with a comprehensive, integrated suite of scientifically validated models, with uncertainty quantification (UQ), optimization, risk analysis and decision making capabilities. The CCSI Toolset incorporates commercial and open-source software currently in use by industry and is also developing new software tools as necessary to fill technology gaps identified during execution of the project. Ultimately, the CCSI Toolset will (1) enable promising concepts to be more quickly identified through rapid computational screening of devices and processes; (2) reduce the time to design and troubleshoot new devices and processes; (3) quantify the technical risk in taking technology from laboratory-scale to commercial-scale; and (4) stabilize deployment costs more quickly by replacing some of the physical operational tests with virtual power plant simulations. CCSI is organized into 8 technical elements that fall under two focus areas. The first focus area (Physicochemical Models and Data) addresses the steps necessary to model and simulate the various technologies and processes needed to bring a new Carbon Capture and Storage (CCS) technology into production. The second focus area (Analysis & Software) is developing the software infrastructure to integrate the various components and implement the tools that are needed to make quantifiable decisions regarding the viability of new CCS technologies. CCSI also has an Industry Advisory Board (IAB). 
By working closely with industry from the inception of the project to identify industrial challenge problems, CCSI ensures that the simulation tools are developed for the carbon capture technologies of most relevance to industry. CCSI is led by the National Energy Technology Laboratory (NETL) and leverages the Department of Energy (DOE) national laboratories' core strengths in modeling and simulation, bringing together the best capabilities at NETL, Los Alamos National Laboratory (LANL), Lawrence Berkeley National Laboratory (LBNL), Lawrence Livermore National Laboratory (LLNL), and Pacific Northwest National Laboratory (PNNL). The CCSI's industrial partners provide representation from the power generation industry, equipment manufacturers, technology providers and engineering and construction firms. The CCSI's academic participants (Carnegie Mellon University, Princeton University, West Virginia University, and Boston University) bring unparalleled expertise in multiphase flow reactors, combustion, process synthesis and optimization, planning and scheduling, and process control techniques for energy processes. During Fiscal Year (FY) 12, CCSI released its first set of computational tools and models. This pre-release, a year ahead of the originally planned first release, is the result of intense industry interest in getting early access to the tools and the phenomenal progress of the CCSI technical team. These initial components of the CCSI Toolset provide new models and computational capabilities that will accelerate the commercial development of carbon capture technologies as well as related technologies, such as those found in the power, refining, chemicals, and gas production industries. 
The release consists of new tools for process synthesis and optimization to help identify promising concepts more quickly, new physics-based models of potential capture equipment and processes that will reduce the time to design and troubleshoot new systems, a framework to quantify the uncertainty of model predictions, and various enabling tools that provide new capabilities such as creating reduced order models (ROMs) from reacting multiphase flow simulations and running thousands of process simulations concurrently for optimization and UQ.
NASA Astrophysics Data System (ADS)
Higgins, N.; Lapusta, N.
2014-12-01
Many large earthquakes on natural faults are preceded by smaller events, often termed foreshocks, that occur close in time and space to the larger event that follows. Understanding the origin of such events is important for understanding earthquake physics. Unique laboratory experiments of earthquake nucleation in a meter-scale slab of granite (McLaskey and Kilgore, 2013; McLaskey et al., 2014) demonstrate that sample-scale nucleation processes are also accompanied by much smaller seismic events. One potential explanation for these foreshocks is that they occur on small asperities, or bumps, on the fault interface, which may also be locations with smaller critical nucleation sizes. We explore this possibility through 3D numerical simulations of a heterogeneous 2D fault embedded in a homogeneous elastic half-space, in an attempt to qualitatively reproduce the laboratory observations of foreshocks. In our model, the simulated fault interface is governed by rate-and-state friction with laboratory-relevant frictional properties, fault loading, and fault size. To create favorable locations for foreshocks, the fault surface heterogeneity is represented as patches of increased normal stress, decreased characteristic slip distance L, or both. Our simulation results indicate that a rate-and-state model can qualitatively reproduce the experimental observations. Models with a combination of higher normal stress and lower L at the patches come closest to matching the laboratory observations of foreshocks in moment magnitude, source size, and stress drop. In particular, we find that, when the local compression is increased, foreshocks can occur on patches that are smaller than theoretical critical nucleation size estimates. 
The additional inclusion of lower L for these patches helps to keep stress drops within the range observed in experiments, and is compatible with the asperity model of foreshock sources, since one would expect more compressed spots to be smoother (and hence have lower L). In this heterogeneous rate-and-state fault model, the foreshocks interact with each other and with the overall nucleation process through their postseismic slip. The interplay amongst foreshocks, and between foreshocks and the larger-scale nucleation process, is a topic of our future work.
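The dependence of critical nucleation size on normal stress and L invoked above can be illustrated with a standard rate-and-state scaling estimate. The prefactor varies between published estimates (and some scale with (b-a) squared rather than (b-a)), so the function and its parameter values are illustrative assumptions only.

```python
def critical_halfwidth(G, L, sigma_e, a, b, prefactor=1.0):
    """Order-of-magnitude critical nucleation half-length for rate-and-state
    friction, h* ~ prefactor * G * L / (sigma_e * (b - a)), for a velocity-
    weakening patch (b > a).  G is the shear modulus (Pa), L the
    characteristic slip distance (m), sigma_e the effective normal stress
    (Pa).  Illustrative only: it captures that raising normal stress or
    lowering L shrinks the nucleation size, as assumed for foreshock patches.
    """
    return prefactor * G * L / (sigma_e * (b - a))
```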
Hierarchical Modelling Of Mobile, Seeing Robots
NASA Astrophysics Data System (ADS)
Luh, Cheng-Jye; Zeigler, Bernard P.
1990-03-01
This paper describes the implementation of a hierarchical robot simulation which supports the design of robots with vision and mobility. A seeing robot applies a classification expert system for visual identification of laboratory objects. The visual data acquisition algorithm used by the robot vision system has been developed to exploit multiple viewing distances and perspectives. Several different simulations have been run testing the visual logic in a laboratory environment. Much work remains to integrate the vision system with the rest of the robot system.
Heat transfer analysis of a lab scale solar receiver using the discrete ordinates model
NASA Astrophysics Data System (ADS)
Dordevich, Milorad C. W.
This thesis documents the development, implementation and simulation outcomes of the Discrete Ordinates Radiation Model in ANSYS FLUENT, simulating the radiative heat transfer occurring in the San Diego State University lab-scale Small Particle Heat Exchange Receiver. In tandem, it also serves to document how well the Discrete Ordinates Radiation Model results compare with those from the in-house developed Monte Carlo Ray Trace Method in a number of simplified geometries. A secondary goal of this study was the inclusion of new physics, specifically buoyancy. Implementation of an additional Monte Carlo Ray Trace Method software package known as VEGAS, which was specifically developed to model lab-scale solar simulators and provide directional, flux and beam-spread information for the aperture boundary condition, was also a goal of this study. Upon establishment of the model, test cases were run to understand its predictive capabilities. Agreement within 15% was obtained against laboratory measurements made in the San Diego State University Combustion and Solar Energy Laboratory, with the metrics of comparison being the thermal efficiency and the outlet, wall and aperture quartz temperatures. Parametric testing additionally showed that the thermal efficiency of the system was very dependent on the mass flow rate and particle loading. It was also shown that the orientation of the small particle heat exchange receiver was important in attaining optimal efficiency, because buoyancy-induced effects could not be neglected. The analyses presented in this work were all performed on the lab-scale small particle heat exchange receiver, which is 0.38 m in diameter by 0.51 m tall and operated with an input irradiation flux of 3 kWth and a nominal mass flow rate of 2 g/s with a suspended particle mass loading of 2 g/m3. 
Finally, based on insight gained during the implementation and development of the model, a modified design was simulated to predict how the efficiency of the small particle heat exchange receiver could be improved through a few simple internal geometry modifications. It was shown that the theoretical efficiency of the receiver could be improved from 64% to 87% with adjustments to the internal geometry, mass flow rate, and mass loading.
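Monte Carlo ray-trace methods of the kind compared against in this thesis rest on sampling absorption path lengths from Beer-Lambert attenuation. The minimal single-slab sketch below illustrates the sampling idea only; it is not the VEGAS package or the in-house code, and the coefficients are arbitrary.

```python
import math
import random

def mc_absorbed_fraction(kappa, depth, n_rays=200_000, seed=1):
    """Toy Monte Carlo ray trace: fraction of normally incident rays absorbed
    in a uniform slab of absorption coefficient kappa (1/m) and thickness
    depth (m).  Each ray's free path is sampled from the Beer-Lambert
    distribution p(s) = kappa*exp(-kappa*s); rays whose path exceeds the
    slab thickness escape.  Analytic answer: 1 - exp(-kappa*depth).
    """
    rng = random.Random(seed)
    absorbed = 0
    for _ in range(n_rays):
        s = -math.log(1.0 - rng.random()) / kappa  # sampled free path
        if s < depth:
            absorbed += 1
    return absorbed / n_rays
```

A real receiver model additionally tracks scattering, wall reflection and re-emission, but the path-length sampling step is the same primitive.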
Laboratory simulation of cratering on small bodies
NASA Technical Reports Server (NTRS)
Schmidt, Robert M.
1991-01-01
A new technique using external pressure was developed to simulate the lithostatic pressure due to the self-gravity of small bodies. A 13-in. diameter cylindrical test chamber with an L/D of 1 was fabricated to accommodate firing explosive charges with gas overpressures of up to 6000 psi; the chamber was hydrotested to 9000 psi. The method allows much larger scale factors than can be obtained with existing centrifuges and has the correct spherical geometry of self-gravity. A simulant for jointed rock to be used in this fixture was developed using weakly cemented basalt. Various strength/pressure scaling theories can now be examined and tested.
NASA Technical Reports Server (NTRS)
Bartkus, Tadas; Tsao, Jen-Ching; Struk, Peter
2017-01-01
This paper builds on previous work that compares numerical simulations of mixed-phase icing clouds with experimental data. The model couples the thermal interaction between the ice particles and water droplets of the icing cloud and the flowing air of an icing wind tunnel, for simulation of NASA Glenn Research Center's (GRC) Propulsion Systems Laboratory (PSL). Measurements were taken during the Fundamentals of Ice Crystal Icing Physics tests at the PSL tunnel in March 2016. The tests simulated the ice-crystal and mixed-phase icing conditions that relate to ice accretions within turbofan engines.
Estimation of Confined Peak Strength of Crack-Damaged Rocks
NASA Astrophysics Data System (ADS)
Bahrani, Navid; Kaiser, Peter K.
2017-02-01
It is known that the unconfined compressive strength of rock decreases with increasing density of geological features such as micro-cracks, fractures, and veins at both the laboratory specimen and rock block scales. This article deals with the confined peak strength of laboratory-scale rock specimens containing grain-scale strength-dominating features such as micro-cracks. A grain-based distinct element model, whereby the rock is simulated with grains that are allowed to deform and break, is used to investigate the influence of the density of cracks on the rock strength under unconfined and confined conditions. A grain-based specimen calibrated to the unconfined and confined strengths of intact and heat-treated Wombeyan marble is used to simulate rock specimens with varying crack densities. It is demonstrated how such cracks affect the peak strength, stress-strain curve, and failure mode with increasing confinement. The results of numerical simulations in terms of unconfined and confined peak strengths are used to develop semi-empirical relations that relate the difference in strength between the intact and crack-damaged rocks to the confining pressure. It is shown how these relations can be used to estimate the confined peak strength of a rock with micro-cracks when the unconfined and confined strengths of the intact rock and the unconfined strength of the crack-damaged rock are known. This approach for estimating the confined strength of crack-damaged rock specimens, called the strength degradation approach, is then verified by application to published laboratory triaxial test data.
Earthquake stress drop and laboratory-inferred interseismic strength recovery
Beeler, N.M.; Hickman, S.H.; Wong, T.-F.
2001-01-01
We determine the scaling relationships between earthquake stress drop and recurrence interval t_r that are implied by laboratory-measured fault strength. We assume that repeating earthquakes can be simulated by stick-slip sliding using a spring and slider block model. Simulations with static/kinetic strength, time-dependent strength, and rate- and state-variable-dependent strength indicate that the relationship between loading velocity and recurrence interval can be adequately described by the power law V_L ∝ t_r^n, where n = −1. Deviations from n = −1 arise from second-order effects on strength, with n > −1 corresponding to apparent time-dependent strengthening and n < −1 corresponding to weakening. Simulations with rate- and state-variable equations show that dynamic shear stress drop Δτ_d scales with recurrence as dΔτ_d/d ln t_r ∝ σ_e(b − a), where σ_e is the effective normal stress, μ = τ/σ_e, and (a − b) = dμ_ss/d ln V is the steady-state slip rate dependence of strength. In addition, accounting for seismic energy radiation, we suggest that the static shear stress drop Δτ_s scales as dΔτ_s/d ln t_r ∝ σ_e(1 + ξ)(b − a), where ξ is the fractional overshoot. The variation of Δτ_s with ln t_r for earthquake stress drops is somewhat larger than implied by room temperature laboratory values of ξ and b − a. However, the uncertainty associated with the seismic data is large and the discrepancy between the seismic observations and the rate of strengthening predicted by room temperature experiments is less than an order of magnitude. Copyright 2001 by the American Geophysical Union.
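Taking the abstract's static stress-drop scaling, dΔτ_s/d ln t_r ∝ σ_e(1 + ξ)(b − a), as an equality for illustration, a quick calculation shows the size of the predicted effect. The parameter values below are generic room-temperature laboratory figures chosen for illustration, not values from the paper:

```python
import math

def stress_drop_slope(sigma_e, b_minus_a, overshoot):
    """Static stress-drop scaling from the abstract, taken as an
    equality: d(stress drop)/d(ln t_r) = sigma_e * (1 + xi) * (b - a)."""
    return sigma_e * (1.0 + overshoot) * b_minus_a

# Generic laboratory-scale values (illustrative only):
sigma_e = 100e6     # effective normal stress, Pa
b_minus_a = 0.005   # steady-state rate dependence of friction
xi = 0.25           # fractional overshoot

per_decade = stress_drop_slope(sigma_e, b_minus_a, xi) * math.log(10)
print(f"static stress drop grows ~{per_decade / 1e6:.1f} MPa per decade of recurrence")
```

A growth of only a megapascal or so per decade of recurrence interval is why, as the abstract notes, seismically observed variations sit within an order of magnitude of the laboratory prediction.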
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flueck, Alex
The “High Fidelity, Faster than Real-Time Simulator for Predicting Power System Dynamic Behavior” was designed and developed by Illinois Institute of Technology with critical contributions from Electrocon International, Argonne National Laboratory, Alstom Grid and McCoy Energy. Also essential to the project were our two utility partners: Commonwealth Edison and AltaLink. The project was a success due to several major breakthroughs in the area of large-scale power system dynamics simulation, including (1) a validated faster-than-real-time simulation of both stable and unstable transient dynamics in a large-scale positive-sequence transmission grid model, (2) a three-phase unbalanced simulation platform for modeling new grid devices, such as independently controlled single-phase static var compensators (SVCs), (3) the world’s first high-fidelity three-phase unbalanced dynamics and protection simulator based on Electrocon’s CAPE program, and (4) a first-of-its-kind implementation of a single-phase induction motor model with stall capability. The simulator results will aid power grid operators in their true time of need, when there is a significant risk of cascading outages. The simulator will accelerate performance and enhance accuracy of dynamics simulations, enabling operators to maintain reliability and steer clear of blackouts. In the long term, the simulator will form the backbone of the newly conceived hybrid real-time protection and control architecture that will coordinate local controls, wide-area measurements, wide-area controls and advanced real-time prediction capabilities.
The nation’s citizens will benefit in several ways, including (1) less down time from power outages due to the faster-than-real-time simulator’s predictive capability, (2) higher levels of reliability due to the detailed dynamics plus protection simulation capability, and (3) more resiliency due to the three-phase unbalanced simulator’s ability to model three-phase and single-phase networks and devices.
Javaherchi, Teymour
2016-06-08
Attached are the .cas and .dat files along with the required User Defined Functions (UDFs) and the look-up table of lift and drag coefficients for the Reynolds-Averaged Navier-Stokes (RANS) simulation of three coaxially located lab-scaled DOE RM1 turbines implemented in the ANSYS FLUENT CFD package. The lab-scaled DOE RM1 is a re-designed geometry, based on the full-scale DOE RM1 design, that produces the same power output as the full-scale model while operating at matched Tip Speed Ratio values at laboratory-achievable Reynolds numbers (see attached paper). In this case study the flow field around and in the wake of the lab-scaled DOE RM1 turbines in a coaxial array is simulated using the Blade Element Model (a.k.a. Virtual Blade Model) by solving the RANS equations coupled with the k-ω turbulence closure model. It should be highlighted that in this simulation the actual geometry of the rotor blade is not modeled; the effect of the rotating turbine blades is modeled using Blade Element Theory. This simulation provides an accurate estimate of the performance of each device and the structure of their turbulent far wake. The results of these simulations were validated against in-house experimental data. Simulations for other turbine configurations are available upon request.
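The tip-speed-ratio matching described above can be illustrated with a short sketch. The relation λ = ωR/U is standard; the flow speed and rotor radius below are hypothetical placeholders, not the RM1 report's numbers:

```python
import math

def rotor_rpm_for_tsr(tsr, flow_speed, rotor_radius):
    """Rotor speed (RPM) that holds a target tip-speed ratio
    lambda = omega * R / U at a given inflow speed U."""
    omega = tsr * flow_speed / rotor_radius          # rad/s
    return omega * 60.0 / (2.0 * math.pi)

def diameter_reynolds(flow_speed, diameter, nu=1.0e-6):
    """Diameter-based Reynolds number Re = U*D/nu (water, nu ~ 1e-6 m^2/s)."""
    return flow_speed * diameter / nu

# Hypothetical illustration values (not the RM1 report's numbers):
tsr, u_lab, r_lab = 7.0, 1.0, 0.25   # target TSR, flume speed m/s, radius m
print(f"lab rotor speed: {rotor_rpm_for_tsr(tsr, u_lab, r_lab):.0f} RPM")
print(f"lab Re_D: {diameter_reynolds(u_lab, 2 * r_lab):.1e}")
```

Matching TSR fixes the rotor speed once the flume speed and radius are chosen; the Reynolds number then follows and is necessarily lower than at full scale, which is why the lab geometry had to be re-designed rather than simply shrunk.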
Development of the electromagnetic technology for broken rail detection from a mobile platform
NASA Astrophysics Data System (ADS)
Plotnikov, Yuri; Raghunathan, Arun; Kumar, Ajith; Noffsinger, Joseph; Fries, Jeffrey; Ehret, Steven; Frangieh, Tannous; Palanganda, Samhitha
2016-02-01
Timely detection of breaks in running rails remains a topic of significant importance for the railroad industry. GE has been investigating new ideas for Rail Integrity Monitoring (RIM) technology that can be implemented on a wide range of rolling stock platforms including locomotives, passenger and freight cars. The focus of the project is to establish a simple, non-contact, and inexpensive means of nondestructive inspection by fusing known solutions with new technology development that can result in detection with high reliability. A scaled-down model of a typical locomotive-track system has been developed at GE Global Research for detailed study of the detection process. In addition, a finite element model has been established and used to understand the distribution of the magnetic field and currents in such a system. Both models have used the rail and wheel-axle geometry to establish a realistic model that provides electric current and magnetic field distributions close to the real-world phenomenon. Initial magnetic field maps were obtained by scanning a 1:15 model constructed of steel bars using a 3D scanner and an inductive coil. Sensitivity to a broken rail located between two locomotive axles, simulated by an opening in this metallic frame, was demonstrated. Further investigation and optimization were conducted on a larger, 1:3 scale, physical model and by running mathematical simulations. Special attention was paid to consistency between the finite element and physical model results. The obtained results allowed establishment of a working frequency range, a method of inductive current injection into the rail-wheel-axle loop, and a means of measuring the electromagnetic response to a broken rail. Verification and full-scale system prototype tests are following the laboratory experiments and mathematical simulations.
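One generic way to see why the working frequency matters for scaled physical models: to preserve the ratio of conductor size to electromagnetic skin depth in a geometrically scaled model made of the same material, the excitation frequency must rise by the square of the scale factor. This is a standard similitude argument, not a statement of GE's actual procedure, and the 100 Hz base frequency below is an assumed placeholder:

```python
import math

MU0 = 4.0e-7 * math.pi  # vacuum permeability, H/m

def skin_depth(resistivity, freq_hz, mu_r=1.0):
    """Electromagnetic skin depth: delta = sqrt(2*rho / (omega*mu0*mu_r))."""
    omega = 2.0 * math.pi * freq_hz
    return math.sqrt(2.0 * resistivity / (omega * MU0 * mu_r))

def model_frequency(full_scale_freq, scale):
    """Keep the size/skin-depth ratio constant in a 1:`scale` model by
    raising frequency by scale**2 (same material assumed)."""
    return full_scale_freq * scale**2

f_full = 100.0  # Hz, an assumed full-scale injection frequency
print(f"1:15 model: {model_frequency(f_full, 15):.0f} Hz")
print(f"1:3  model: {model_frequency(f_full, 3):.0f} Hz")
```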
NASA Technical Reports Server (NTRS)
Rochelle, W. C.; Liu, D. K.; Nunnery, W. J., Jr.; Brandli, A. E.
1975-01-01
This paper describes the application of the SINDA (systems improved numerical differencing analyzer) computer program to simulate the operation of the NASA/JSC MIUS integration and subsystems test (MIST) laboratory. The MIST laboratory is designed to test the integration capability of the following subsystems of a modular integrated utility system (MIUS): (1) electric power generation, (2) space heating and cooling, (3) solid waste disposal, (4) potable water supply, and (5) waste water treatment. The SINDA/MIST computer model is designed to simulate the response of these subsystems to externally impressed loads. The computer model determines the amount of recovered waste heat from the prime mover exhaust, water jacket and oil/aftercooler and from the incinerator. This recovered waste heat is used in the model for potable water heating, space heating, absorption air conditioning, waste water sterilization, and thermal storage. The details of the thermal and fluid simulation of MIST, including the system configuration, modes of operation modeled, SINDA model characteristics and the results of several analyses, are described.
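The waste-heat accounting described above can be caricatured as a simple priority allocation of recovered heat to thermal loads. All stream and load values below are hypothetical, and the fixed priority order is an assumption for illustration, not the SINDA/MIST model's actual logic:

```python
def recovered_heat(exhaust_kw, jacket_kw, aftercooler_kw, incinerator_kw):
    """Total recoverable waste heat from the prime mover streams and the
    incinerator (all values hypothetical, in kW)."""
    return exhaust_kw + jacket_kw + aftercooler_kw + incinerator_kw

def allocate(heat_kw, loads_kw):
    """Serve thermal loads in a fixed priority order; any surplus is
    booked to thermal storage (an assumed rule, not SINDA's logic)."""
    served, remaining = {}, heat_kw
    for name, demand in loads_kw:
        q = min(remaining, demand)
        served[name] = q
        remaining -= q
    served["thermal storage"] = remaining
    return served

total = recovered_heat(120.0, 80.0, 30.0, 50.0)
loads = [("potable water heating", 60.0), ("space heating", 100.0),
         ("absorption A/C", 90.0), ("waste water sterilization", 20.0)]
print(allocate(total, loads))
```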
Support Services for Ceramic Fiber-Ceramic Matrix Composites
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hurley, J.P.; Crocker, C.R.
2000-06-28
Structural and functional materials used in solid- and liquid-fueled energy systems are subject to gas- and condensed-phase corrosion and erosion by entrained particles. For a given material, its temperature and the composition of the corrodents determine the corrosion rates, while gas flow conditions and particle aerodynamic diameters determine erosion rates. Because there are several mechanisms by which corrodents deposit on a surface, the corrodent composition depends not only on the composition of the fuel, but also on the temperature of the material and the size range of the particles being deposited. In general, it is difficult to simulate under controlled laboratory conditions all of the possible corrosion and erosion mechanisms to which a material may be exposed in an energy system. Therefore, with funding from the Advanced Research Materials Program, the University of North Dakota Energy and Environmental Research Center (EERC) is coordinating with NCC Engineering and the National Energy Technology Laboratory (NETL) to provide researchers with no-cost opportunities to expose materials in pilot-scale systems to conditions of corrosion and erosion similar to those occurring in commercial power systems.
Turbulence dissipation challenge: particle-in-cell simulations
NASA Astrophysics Data System (ADS)
Roytershteyn, V.; Karimabadi, H.; Omelchenko, Y.; Germaschewski, K.
2015-12-01
We discuss the application of three particle-in-cell (PIC) codes to problems relevant to the turbulence dissipation challenge. VPIC is a fully kinetic code extensively used to study a variety of diverse problems ranging from laboratory plasmas to astrophysics. PSC is a flexible fully kinetic code offering a variety of algorithms that can be advantageous for turbulence simulations, including high-order particle shapes, dynamic load balancing, and the ability to run efficiently on Graphics Processing Units (GPUs). Finally, HYPERS is a novel hybrid (kinetic ions + fluid electrons) code, which utilizes asynchronous time advance and a number of other advanced algorithms. We present examples drawn both from large-scale turbulence simulations and from the test problems outlined by the turbulence dissipation challenge. Special attention is paid to such issues as the small-scale intermittency of inertial-range turbulence, the mode content of the sub-proton range of scales, the formation of electron-scale current sheets and the role of magnetic reconnection, as well as the numerical challenges of applying PIC codes to simulations of astrophysical turbulence.
Scaling laws in magnetized plasma turbulence
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boldyrev, Stanislav
2015-06-28
Interactions of plasma motion with magnetic fields occur in nature and in the laboratory in an impressively broad range of scales, from megaparsecs in astrophysical systems to centimeters in fusion devices. The fact that such an enormous array of phenomena can be effectively studied lies in the existence of fundamental scaling laws in plasma turbulence, which allow one to scale the results of analytic and numerical modeling to the sizes of galaxies, the velocities of supernova explosions, or the magnetic fields in fusion devices. Magnetohydrodynamics (MHD) provides the simplest framework for describing magnetic plasma turbulence. Recently, a number of new features of MHD turbulence have been discovered and an impressive array of thought-provoking phenomenological theories have been put forward. However, these theories have conflicting predictions, and the currently available numerical simulations are not able to resolve the contradictions. MHD turbulence exhibits a variety of regimes unusual in regular hydrodynamic turbulence. Depending on the strength of the guide magnetic field, it can be dominated by weakly interacting Alfvén waves or strongly interacting wave packets. At small scales such turbulence is locally anisotropic and imbalanced (cross-helical). In stark contrast with hydrodynamic turbulence, which tends to "forget" global constraints and become uniform and isotropic at small scales, MHD turbulence becomes progressively more anisotropic and imbalanced at small scales. The magnetic field plays a fundamental role in turbulent dynamics. Even when such a field is not imposed by external sources, it is self-consistently generated by the magnetic dynamo action. This project aims at a comprehensive study of universal regimes of magnetic plasma turbulence, combining modern analytic approaches with state-of-the-art numerical simulations.
The proposed study focuses on three topics: weak MHD turbulence, which is relevant for laboratory devices, the solar wind, solar corona heating, and planetary magnetospheres; strong MHD turbulence, which is relevant for fusion devices, star formation, cosmic ray acceleration, scattering and trapping in galaxies, as well as many aspects of the dynamics, distribution and composition of space plasmas; and the process of magnetic dynamo action, which explains the generation and the structure of magnetic fields in turbulent plasmas. The planned work will aim at developing new analytic approaches, conducting new numerical simulations at currently unmatched resolution, and training students in the methods of the modern theory of plasma turbulence. The work will be performed at the University of Wisconsin-Madison.
NASA Astrophysics Data System (ADS)
Kouznetsova, I.; Gerhard, J. I.; Mao, X.; Barry, D. A.; Robinson, C.; Brovelli, A.; Harkness, M.; Fisher, A.; Mack, E. E.; Payne, J. A.; Dworatzek, S.; Roberts, J.
2008-12-01
A detailed model to simulate trichloroethene (TCE) dechlorination in anaerobic groundwater systems has been developed and implemented through PHAST, a robust and flexible geochemical modeling platform. The approach is comprehensive but retains flexibility such that models of varying complexity can be used to simulate TCE biodegradation in the vicinity of nonaqueous phase liquid (NAPL) source zones. The complete model considers a full suite of biological (e.g., dechlorination, fermentation, sulfate and iron reduction, electron donor competition, toxic inhibition, pH inhibition), physical (e.g., flow and mass transfer) and geochemical processes (e.g., pH modulation, gas formation, mineral interactions). Example simulations with the model demonstrated that the feedback between biological, physical, and geochemical processes is critical. Successful simulation of a thirty-two-month column experiment with site soil and complex groundwater chemistry, exhibiting both anaerobic dechlorination and endogenous respiration, provided confidence in the modeling approach. A comprehensive suite of batch simulations was then conducted to estimate the sensitivity of predicted TCE degradation to the 36 model input parameters. A local sensitivity analysis was first employed to rank the importance of parameters, revealing that 5 parameters consistently dominated model predictions across a range of performance metrics. A global sensitivity analysis was then performed to evaluate the influence of a variety of full parameter data sets available in the literature. The modeling study was performed as part of the SABRE (Source Area BioREmediation) project, a public/private consortium whose charter is to determine if enhanced anaerobic bioremediation can result in effective and quantifiable treatment of chlorinated solvent DNAPL source areas.
The modelling conducted has provided valuable insight into the complex interactions between processes in the evolving biogeochemical systems, particularly at the laboratory scale.
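A local (one-at-a-time) sensitivity ranking of the kind described above can be sketched in a few lines. The toy Monod-style expression below stands in for the real PHAST/TCE model, whose 36 parameters are not reproduced here:

```python
def local_sensitivity(model, params, delta=0.01):
    """One-at-a-time local sensitivity: relative change in model output
    per relative change in each parameter (a dimensionless elasticity)."""
    base = model(params)
    sens = {}
    for name in params:
        perturbed = dict(params)
        perturbed[name] = params[name] * (1.0 + delta)
        sens[name] = (model(perturbed) - base) / (abs(base) * delta)
    return sens

# Toy Monod-style stand-in for the TCE degradation model (NOT the
# 36-parameter PHAST model from the study):
def toy_model(p):
    return p["k_max"] * p["S"] / (p["K_s"] + p["S"])

params = {"k_max": 1.0, "S": 2.0, "K_s": 0.5}
ranked = sorted(local_sensitivity(toy_model, params).items(),
                key=lambda kv: -abs(kv[1]))
print(ranked)  # parameters ordered from most to least influential
```

Ranking the absolute elasticities is exactly how a handful of dominant parameters can be identified before committing to a more expensive global analysis.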
Yan, Shoubao; Chen, Xiangsong; Wu, Jingyong; Wang, Pingchao
2013-07-01
The aim of this study was to develop a bioprocess to produce ethanol from food waste at laboratory, semipilot, and pilot scales. Laboratory tests demonstrated that ethanol fermentation with a reducing sugar concentration of 200 g/L, an inoculum size of 2 % (initial cell number 2 × 10⁶ CFU/mL) and the addition of YEP (3 g/L of yeast extract and 5 g/L of peptone) was the best choice. The maximum ethanol concentration at laboratory scale (93.86 ± 1.15 g/L) was in satisfactory agreement with that at semipilot scale (93.79 ± 1.11 g/L), but lower than that (96.46 ± 1.12 g/L) at pilot scale. Similar ethanol yields and volumetric ethanol productivities of 0.47 ± 0.02 g/g, 1.56 ± 0.03 g/L/h and 0.47 ± 0.03 g/g, 1.56 ± 0.03 g/L/h were obtained after 60 h of fermentation in the laboratory and semipilot fermentors, respectively; however, both were lower than those (0.48 ± 0.02 g/g, 1.79 ± 0.03 g/L/h) of the pilot reactor. In addition, simple models were developed to predict the fermentation kinetics during the scale-up process, and they were successfully applied to simulate the experimental results.
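The reported yield and productivity figures follow directly from the measured concentrations, assuming (as the abstract implies) that the 200 g/L of reducing sugar is essentially consumed over the 60-h fermentation:

```python
def ethanol_yield(ethanol_g_l, sugar_consumed_g_l):
    """Product yield Y_P/S in g ethanol per g sugar consumed."""
    return ethanol_g_l / sugar_consumed_g_l

def volumetric_productivity(ethanol_g_l, hours):
    """Volumetric productivity in g ethanol per liter per hour."""
    return ethanol_g_l / hours

e_lab = 93.86  # g/L, laboratory-scale maximum from the abstract
print(f"yield: {ethanol_yield(e_lab, 200.0):.2f} g/g")
print(f"productivity: {volumetric_productivity(e_lab, 60.0):.2f} g/L/h")
```

Both computed values (0.47 g/g and 1.56 g/L/h) reproduce the laboratory-scale figures quoted in the abstract, which is consistent with near-complete sugar consumption.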
NASA Astrophysics Data System (ADS)
Ding, Dong; Benson, David A.; Fernández-Garcia, Daniel; Henri, Christopher V.; Hyndman, David W.; Phanikumar, Mantha S.; Bolster, Diogo
2017-12-01
Measured (or empirically fitted) reaction rates at groundwater remediation sites are typically much lower than those found in the same material at the batch or laboratory scale. The reduced rates are commonly attributed to poorer mixing at the larger scales. A variety of methods have been proposed to account for this scaling effect in reactive transport. In this study, we use the Lagrangian particle-tracking and reaction (PTR) method to simulate a field bioremediation experiment at the Schoolcraft, MI site. A denitrifying bacterium, Pseudomonas stutzeri strain KC (KC), was injected into the aquifer, along with sufficient substrate, to degrade the contaminant, carbon tetrachloride (CT), under anaerobic conditions. The PTR method simulates chemical reactions through probabilistic rules of particle collisions, interactions, and transformations to address the scale effect (lower apparent reaction rates at each level of upscaling, from batch to column to field scale). In contrast to a prior Eulerian reaction model, the PTR method is able to match the field-scale experiment using the rate coefficients obtained from batch experiments.
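The core idea of probabilistic particle reactions can be sketched with a deliberately simplified well-mixed rule, in which each A particle reacts in a time step with probability P = 1 − exp(−k·C_B·Δt). This illustrates the probabilistic-rule concept only; it is not the collision-based PTR algorithm used in the study:

```python
import math
import random

def react_step(nA, nB, k, dt, volume, mass, rng):
    """One step of a simplified well-mixed particle reaction A + B -> C:
    each A particle reacts with probability P = 1 - exp(-k * C_B * dt),
    where C_B is the concentration carried by the B particles."""
    cB = nB * mass / volume
    p = 1.0 - math.exp(-k * cB * dt)
    reacted = sum(1 for _ in range(nA) if rng.random() < p)
    reacted = min(reacted, nB)
    return nA - reacted, nB - reacted

rng = random.Random(1)
nA = nB = 10_000
k, dt, volume, mass = 1.0, 0.1, 1.0, 1.0e-4  # so C_A(0) = C_B(0) = 1.0
for _ in range(50):
    nA, nB = react_step(nA, nB, k, dt, volume, mass, rng)
print(nA, nB)  # second-order kinetics predict roughly n0/(1 + k*C0*t)
```

With stoichiometric initial conditions this scheme tracks the classic second-order decay; the full PTR method instead makes the reaction probability depend on particle separation, which is what lets apparent rates fall with scale.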
PIC Simulations of Hypersonic Plasma Instabilities
NASA Astrophysics Data System (ADS)
Niehoff, D.; Ashour-Abdalla, M.; Niemann, C.; Decyk, V.; Schriver, D.; Clark, E.
2013-12-01
The plasma sheaths formed around hypersonic aircraft (Mach number, M > 10) are relatively unexplored and of interest today to both further the development of new technologies and solve long-standing engineering problems. Both laboratory experiments and analytical/numerical modeling are required to advance the understanding of these systems; it is advantageous to perform these tasks in tandem. There has already been some work done to study these plasmas by experiments that create a rapidly expanding plasma through ablation of a target with a laser. In combination with a preformed magnetic field, this configuration leads to a magnetic "bubble" formed behind the front as particles travel at about Mach 30 away from the target. Furthermore, the experiment was able to show the generation of fast electrons which could be due to instabilities on electron scales. To explore this, future experiments will have more accurate diagnostics capable of observing time- and length-scales below typical ion scales, but simulations are a useful tool to explore these plasma conditions theoretically. Particle in Cell (PIC) simulations are necessary when phenomena are expected to be observed at these scales, and also have the advantage of being fully kinetic with no fluid approximations. However, if the scales of the problem are not significantly below the ion scales, then the initialization of the PIC simulation must be very carefully engineered to avoid unnecessary computation and to select the minimum window where structures of interest can be studied. One method of doing this is to seed the simulation with either experiment or ion-scale simulation results. Previous experiments suggest that a useful configuration for studying hypersonic plasma configurations is a ring of particles rapidly expanding transverse to an external magnetic field, which has been simulated on the ion scale with an ion-hybrid code. 
This suggests that the PIC simulation should have an equivalent configuration; however, modeling a plasma expanding radially in every direction is computationally expensive. In order to reduce the computational expense, we use a radial density profile from the hybrid simulation results to seed a self-consistent PIC simulation in one direction (x), while creating a current in the direction (y) transverse to both the drift velocity and the magnetic field (z) to create the magnetic bubble observed in experiment. The simulation will be run in two spatial dimensions but retain three velocity dimensions, and the results will be used to explore the growth of micro-instabilities present in hypersonic plasmas in the high-density region as it moves through the simulation box. This will still require a significantly large box in order to compare with experiment, as the experiments are being performed over distances of 10⁴ λ_De and durations of 10⁵ ω_pe⁻¹.
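The box and run-time requirements quoted above translate into physical units once a density and electron temperature are chosen. The values below are generic laser-ablation plasma parameters assumed for illustration, not numbers from the experiment:

```python
import math

EPS0 = 8.854e-12   # vacuum permittivity, F/m
QE   = 1.602e-19   # elementary charge, C
ME   = 9.109e-31   # electron mass, kg

def debye_length(n_e, t_e_ev):
    """Electron Debye length: sqrt(eps0 * kT_e / (n_e * e^2)),
    with the temperature supplied directly in eV (kT_e = T_e[eV] * e)."""
    return math.sqrt(EPS0 * t_e_ev * QE / (n_e * QE**2))

def plasma_frequency(n_e):
    """Electron plasma frequency: sqrt(n_e * e^2 / (eps0 * m_e)), rad/s."""
    return math.sqrt(n_e * QE**2 / (EPS0 * ME))

# Assumed (illustrative) ablation-plasma parameters:
n_e, t_e = 1.0e19, 10.0           # density in m^-3, temperature in eV
print(f"10^4 Debye lengths: {1e4 * debye_length(n_e, t_e):.3f} m")
print(f"10^5 / omega_pe:    {1e5 / plasma_frequency(n_e):.2e} s")
```

For these assumed parameters the required domain is of order centimeters and the duration well under a microsecond, which makes clear why the simulation window must be chosen carefully rather than covering the full experiment.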
SHynergie: Development of a virtual project laboratory for monitoring hydraulic stimulations
NASA Astrophysics Data System (ADS)
Renner, Jörg; Friederich, Wolfgang; Meschke, Günther; Müller, Thomas; Steeb, Holger
2016-04-01
Hydraulic stimulations are the primary means of developing subsurface reservoirs regarding the extent of fluid transport in them. The associated creation or conditioning of a system of hydraulic conduits involves a range of hydraulic and mechanical processes, but chemical reactions, such as dissolution and precipitation, may also affect the stimulation result on time scales as short as hours. In light of the extent and complexity of these processes, the steering potential for the operator of a stimulation critically depends on the ability to integrate the maximum amount of site-specific information with profound process understanding and a large spectrum of experience. We report on the development of a virtual project laboratory for monitoring hydraulic stimulations within the project SHynergie (http://www.ruhr-uni-bochum.de/shynergie/). The laboratory is envisioned as a product that constitutes a preparatory and accompanying instrument rather than a post-processing one, ultimately accessible to the persons responsible for a project via a web repository. The virtual laboratory consists of a data base, a toolbox, and a model-building environment. Entries in the data base are of two categories. On the one hand, selected mineral and rock properties are provided from the literature. On the other hand, project-specific entries of any format can be made, which are assigned attributes regarding their use in the stimulation problem at hand. The toolbox is interactive and allows the user to perform calculations of effective properties and simulations of different types (e.g., wave propagation in a reservoir, hydraulic tests). The model component is also hybrid. The laboratory provides a library of models reflecting a range of scenarios but also allows the user to develop a site-specific model constituting the basis for simulations. The laboratory offers the option to use its components following the typical workflow of a stimulation project.
The toolbox incorporates simulation instruments developed in the course of the SHynergie project that account for the experimental and modeling results of the various sub-projects.
System Dynamics Modeling of Transboundary Systems: The Bear River Basin Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerald Sehlke; Jake Jacobson
2005-09-01
System dynamics is a computer-aided approach to evaluating the interrelationships of different components and activities within complex systems. Recently, system dynamics models have been developed in areas such as policy design, biological and medical modeling, energy and environmental analysis, and in various other areas in the natural and social sciences. The Idaho National Engineering and Environmental Laboratory, a multi-purpose national laboratory managed by the Department of Energy, has developed a system dynamics model in order to evaluate its utility for modeling large complex hydrological systems. We modeled the Bear River Basin, a transboundary basin that includes portions of Idaho, Utah and Wyoming. We found that system dynamics modeling is very useful for integrating surface water and groundwater data and for simulating the interactions between these sources within a given basin. In addition, we also found system dynamics modeling is useful for integrating complex hydrologic data with other information (e.g., policy, regulatory and management criteria) to produce a decision support system. Such decision support systems can allow managers and stakeholders to better visualize the key hydrologic elements and management constraints in the basin, which enables them to better understand the system via the simulation of multiple “what-if” scenarios. Although system dynamics models can be developed to conduct traditional hydraulic/hydrologic surface water or groundwater modeling, we believe that their strength lies in their ability to quickly evaluate trends and cause–effect relationships in large-scale hydrological systems; for integrating disparate data; for incorporating output from traditional hydraulic/hydrologic models; and for integration of interdisciplinary data, information and criteria to support better management decisions.
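The stock-and-flow character of a system dynamics water model can be sketched with a single-reservoir toy: one storage stock, a monthly inflow, a demand-driven release, and spill at capacity. The numbers are hypothetical, and the model is far simpler than the Bear River Basin model described above:

```python
def simulate_reservoir(inflows, demand, initial_storage, capacity):
    """Single-stock system dynamics sketch: each period the inflow fills
    the stock, a fixed demand is released if available, excess spills."""
    storage, releases, spills = initial_storage, [], []
    for q_in in inflows:
        storage += q_in
        release = min(demand, storage)
        storage -= release
        spill = max(0.0, storage - capacity)
        storage -= spill
        releases.append(release)
        spills.append(spill)
    return storage, releases, spills

# Hypothetical monthly inflow volumes (arbitrary units):
inflows = [120, 150, 300, 420, 260, 90]
end, rel, sp = simulate_reservoir(inflows, demand=180.0,
                                  initial_storage=200.0, capacity=500.0)
print(end, rel, sp)
```

Swapping the fixed demand for policy rules, or chaining several stocks for connected surface water and groundwater bodies, is exactly the kind of "what-if" extension that makes the approach attractive as a decision support tool.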
The flight robotics laboratory
NASA Technical Reports Server (NTRS)
Tobbe, Patrick A.; Williamson, Marlin J.; Glaese, John R.
1988-01-01
The Flight Robotics Laboratory of the Marshall Space Flight Center is described in detail. This facility, containing an eight degree of freedom manipulator, precision air bearing floor, teleoperated motion base, reconfigurable operator's console, and VAX 11/750 computer system, provides simulation capability to study human/system interactions of remote systems. The facility hardware, software and subsequent integration of these components into a real time man-in-the-loop simulation for the evaluation of spacecraft proximity and contact dynamics are described.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ronald L. Boring; Vivek Agarwal; Kirk Fitzgerald
2013-03-01
The U.S. Department of Energy’s Light Water Reactor Sustainability program has developed a control room simulator in support of control room modernization at nuclear power plants in the U.S. This report highlights the recent completion of this reconfigurable, full-scale, full-scope control room simulator buildout at the Idaho National Laboratory. The simulator is fully reconfigurable, meaning it supports multiple plant models developed by different simulator vendors. The simulator is full-scale, using glasstop virtual panels to display the analog control boards found at current plants. The present installation features 15 glasstop panels, uniquely achieving a complete control room representation. The simulator is also full-scope, meaning it uses the same plant models used for training simulators at actual plants. Unlike in the plant training simulators, the deployment on glasstop panels allows a high degree of customization of the panels, allowing the simulator to be used for research on the design of new digital control systems for control room modernization. This report includes separate sections discussing the glasstop panels, their layout to mimic control rooms at actual plants, technical details on creating a multi-plant and multi-vendor reconfigurable simulator, and current efforts to support control room modernization at U.S. utilities. The glasstop simulator provides an ideal testbed for prototyping and validating new control room concepts. Equally importantly, it is helping create a standardized and vetted human factors engineering process that can be used across the nuclear industry to ensure control room upgrades maintain and even improve current reliability and safety.
Ruano, M V; Ribes, J; Seco, A; Ferrer, J
2011-01-01
This paper presents a computer tool called DSC (Simulation-based Controllers Design) that enables easy design of control systems and strategies applied to wastewater treatment plants (WWTPs). Although the control systems are developed and evaluated by simulation, this tool aims to facilitate direct implementation of the designed control system on the PC of the full-scale WWTP. The designed control system can be programmed in a dedicated control application and connected to either the simulation software or the SCADA of the plant. To this end, the developed DSC incorporates an OPC (OLE for Process Control) server, which provides an open-standard communication protocol for different industrial process applications. The potential capabilities of the DSC tool are illustrated through the example of a full-scale application. An aeration control system applied to a nutrient-removing WWTP was designed, tuned and evaluated with the DSC tool before its implementation in the full-scale plant. The control parameters obtained by simulation were suitable for the full-scale plant, with only a few modifications needed to improve control performance. With the DSC tool, control system performance can be easily evaluated by simulation. Once developed and tuned by simulation, the control systems can be directly applied to the full-scale WWTP.
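The kind of aeration controller that gets designed and pre-tuned in simulation before deployment can be sketched in a few lines. The first-order dissolved-oxygen tank model, the gains, and the saturation value below are illustrative assumptions for a generic PI loop, not the DSC tool's actual configuration:

```python
# Hypothetical sketch: a PI controller tracking a dissolved-oxygen (DO)
# setpoint in an aeration basin, of the kind tuned by simulation before
# transfer to a full-scale plant. Model and gains are illustrative only.

def simulate_do_control(kp=2.0, ki=0.1, setpoint=2.0, dt=0.01, steps=5000):
    """Integrate a first-order DO mass balance with a PI-controlled air flow."""
    do = 0.5           # initial dissolved oxygen, mg/L
    integral = 0.0
    our = 1.5          # assumed constant oxygen uptake rate, mg/(L h)
    for _ in range(steps):
        error = setpoint - do
        integral += error * dt
        air = max(0.0, kp * error + ki * integral)  # blower cannot go negative
        kla = 0.8 * air                             # transfer coefficient, 1/h
        # oxygen transfer proportional to the saturation deficit (~9 mg/L)
        do += dt * (kla * (9.0 - do) - our)
    return do
```

With the integral term active, the simulated DO settles close to the 2.0 mg/L setpoint; in the workflow described above, gains found this way would then be carried to the plant with only minor retuning.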
Human System Simulation in Support of Human Performance Technical Basis at NPPs
DOE Office of Scientific and Technical Information (OSTI.GOV)
David Gertman; Katya Le Blanc; Alan Mecham
2010-06-01
This paper focuses on strategies and progress toward establishing the Idaho National Laboratory's (INL's) Human Systems Simulator Laboratory at the Center for Advanced Energy Studies (CAES), a consortium of Idaho's state universities. The INL is one of the national laboratories of the US Department of Energy. One of the first planned applications for the Human Systems Simulator Laboratory is implementation of a dynamic nuclear power plant (NPP) simulation, where studies of operator workload, situation awareness, performance and preference will be carried out in simulated control rooms, including nuclear power plant control rooms. Simulation offers a means by which to review operational concepts, improve design practices and provide a technical basis for licensing decisions. In preparation for the next-generation power plant and current government and industry efforts in support of light water reactor sustainability, human operators will be attached to a suite of physiological measurement instruments and, in combination with traditional human factors measurement techniques, will carry out control room tasks in simulated advanced digital and hybrid analog/digital control rooms. The current focus of the Human Systems Simulator Laboratory is building core competence in quantitative and qualitative measurements of situation awareness and workload. Of particular interest is whether the introduction of digital systems, including automated procedures, has the potential to reduce workload and enhance safety while improving situation awareness, or whether workload is merely shifted and situation awareness is modified in yet-to-be-determined ways. Data analysis is carried out by engineers and scientists and includes measures of the physical and neurological correlates of human performance.
The current approach supports a user-centered design philosophy (see ISO 13407, "Human-centred design processes for interactive systems," 1999), wherein the context for task performance and the requirements of the end user are taken into account during the design process, and the validity of the design is determined through testing with real end users.
Laboratory observations and simulations of phase reddening
NASA Astrophysics Data System (ADS)
Schröder, S. E.; Grynko, Ye.; Pommerol, A.; Keller, H. U.; Thomas, N.; Roush, T. L.
2014-09-01
The visible reflectance spectrum of many Solar System bodies changes with changing viewing geometry for reasons not fully understood. It is often observed to redden (increasing spectral slope) with increasing solar phase angle, an effect known as phase reddening. Only once, in an observation of the Martian surface by the Viking 1 lander, was reddening observed up to a certain phase angle with bluing beyond, making the reflectance ratio as a function of phase angle shaped like an arch. However, in laboratory experiments this arch shape is frequently encountered. To investigate this, we measured the bidirectional reflectance of particulate samples of several common rock types in the 400-1000 nm wavelength range and performed ray-tracing simulations. We confirm the occurrence of the arch for surfaces that are forward scattering, i.e. are composed of semi-transparent particles and are smooth on the scale of the particles, and for which the reflectance increases from the lower to the higher wavelength in the reflectance ratio. The arch shape is reproduced by the simulations, which assume a smooth surface. However, surface roughness on the scale of the particles, such as the Hapke and van Horn (Hapke, B., van Horn, H. [1963]. J. Geophys. Res. 68, 4545-4570) fairy castles that can spontaneously form when sprinkling a fine powder, leads to monotonic reddening. A further consequence of this form of microscopic roughness (being indistinct without the use of a microscope) is a flattening of the disk function at visible wavelengths, i.e. Lommel-Seeliger-type scattering. The experiments further reveal monotonic reddening for reflectance ratios at near-IR wavelengths. The simulations fail to reproduce this particular reddening, and we suspect that it results from roughness on the surface of the particles.
Given that the regolith of atmosphereless Solar System bodies is composed of small particles, our results indicate that the prevalence of monotonic reddening and Lommel-Seeliger-type scattering for these bodies results from microscopic roughness, both in the form of structures built by the particles and roughness on the surface of the particles themselves. It follows from the singular Viking 1 observation that the surface in front of the lander was composed of semi-transparent particles, and was smooth on the scale of the particle size.
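The arch-versus-monotonic distinction drawn above is easy to state operationally: the reflectance ratio either peaks at an interior phase angle or keeps rising to the largest angle observed. A short sketch, using hypothetical band-ratio values (the wavelengths and numbers are illustrative, not the paper's data):

```python
# Illustrative sketch (hypothetical numbers): classifying a reflectance-ratio
# phase curve as an "arch" (reddening then bluing, as in the Viking 1
# observation) or monotonic reddening.

def phase_curve_shape(phase_angles, red_over_blue):
    """Return ('arch'|'monotonic', phase angle of the ratio maximum)."""
    i_max = max(range(len(red_over_blue)), key=red_over_blue.__getitem__)
    if 0 < i_max < len(red_over_blue) - 1:
        return "arch", phase_angles[i_max]
    return "monotonic", phase_angles[i_max]

angles = [10, 30, 50, 70, 90, 110]              # degrees, hypothetical
ratios = [1.05, 1.12, 1.20, 1.24, 1.18, 1.10]   # e.g. R(950 nm)/R(550 nm), hypothetical
shape, turnover = phase_curve_shape(angles, ratios)
# here the ratio peaks at an interior angle, so the curve is an arch
```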
NASA Astrophysics Data System (ADS)
Schellart, Wouter P.; Strak, Vincent
2016-10-01
We present a review of the analogue modelling method, which has been used for 200 years, and continues to be used, to investigate geological phenomena and geodynamic processes. We particularly focus on the following four components: (1) the different fundamental modelling approaches that exist in analogue modelling; (2) the scaling theory and scaling of topography; (3) the different materials and rheologies that are used to simulate the complex behaviour of rocks; and (4) a range of recording techniques that are used for qualitative and quantitative analyses and interpretations of analogue models. Furthermore, we apply these four components to laboratory-based subduction models and describe some of the issues at hand with modelling such systems. Over the last 200 years, a wide variety of analogue materials have been used with different rheologies, including viscous materials (e.g. syrups, silicones, water), brittle materials (e.g. granular materials such as sand, microspheres and sugar), plastic materials (e.g. plasticine), visco-plastic materials (e.g. paraffin, waxes, petrolatum) and visco-elasto-plastic materials (e.g. hydrocarbon compounds and gelatins). These materials have been used in many different set-ups to study processes from the microscale, such as porphyroclast rotation, to the mantle scale, such as subduction and mantle convection. Despite the wide variety of modelling materials and great diversity in model set-ups and processes investigated, all laboratory experiments can be classified into one of three different categories based on three fundamental modelling approaches that have been used in analogue modelling: (1) The external approach, (2) the combined (external + internal) approach, and (3) the internal approach. In the external approach and combined approach, energy is added to the experimental system through the external application of a velocity, temperature gradient or a material influx (or a combination thereof), and so the system is open. 
In the external approach, all deformation in the system is driven by the externally imposed condition, while in the combined approach, part of the deformation is driven by buoyancy forces internal to the system. In the internal approach, all deformation is driven by buoyancy forces internal to the system, and so the system is closed and no energy is added during an experimental run. In the combined approach, the externally imposed force or added energy is generally neither quantified nor compared to the internal buoyancy force or potential energy of the system, and so it is not known whether these experiments are properly scaled with respect to nature. The scaling theory requires that analogue models are geometrically, kinematically and dynamically similar to the natural prototype. Direct scaling of topography in laboratory models indicates that it is often significantly exaggerated. This can be ascribed to: (1) the lack of isostatic compensation, which causes topography to be too high; (2) the lack of erosion, which causes topography to be too high; (3) the incorrect scaling of topography when density contrasts are scaled (rather than densities): in isostatically supported models, scaling of density contrasts requires an adjustment of the scaled topography by applying a topographic correction factor; and (4) the incorrect scaling of externally imposed boundary conditions in isostatically supported experiments using the combined approach: when externally imposed forces are too high, this creates topography that is too high. Other processes that also affect surface topography in laboratory models but not in nature (or only in a negligible way) include surface tension (for models using fluids) and shear zone dilatation (for models using granular material), but these will generally only affect the model surface topography on relatively short horizontal length scales of the order of several mm across material boundaries and shear zones, respectively.
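The similarity bookkeeping behind the scaling theory can be illustrated with a short sketch. The relations used here (buoyancy stress scaling as density times gravity times length, and a viscous time scale of viscosity over stress) are the standard ones for buoyancy-driven viscous models; the numerical ratios are purely illustrative:

```python
# A minimal sketch of dynamic-similarity bookkeeping for a buoyancy-driven
# (internal-approach) viscous laboratory model. All ratios are model/nature;
# the example numbers are illustrative, not from any specific experiment.

def scaling_ratios(L, rho, g, eta):
    """Given length, density, gravity, and viscosity ratios (model/nature),
    return the implied stress, time, and velocity ratios."""
    stress = rho * g * L      # buoyancy stress scales as rho * g * L
    time = eta / stress       # viscous rheology: t* = eta* / sigma*
    velocity = L / time       # kinematic consistency: v* = L* / t*
    return stress, time, velocity

L = 1e-7      # e.g. a 10 cm model slab standing in for a 1000 km slab
rho = 0.1     # illustrative density(-contrast) ratio
g = 1.0       # normal-gravity experiment
eta = 1e-18   # e.g. syrup (~100 Pa s) versus upper mantle (~1e20 Pa s)
stress, time, velocity = scaling_ratios(L, rho, g, eta)
```

Choosing any three ratios fixes the rest; this is why, as noted above, unquantified external forcing in the combined approach can silently break dynamic similarity.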
Laboratory development and testing of spacecraft diagnostics
NASA Astrophysics Data System (ADS)
Amatucci, William; Tejero, Erik; Blackwell, Dave; Walker, Dave; Gatling, George; Enloe, Lon; Gillman, Eric
2017-10-01
The Naval Research Laboratory's Space Chamber experiment is a large-scale laboratory device dedicated to the creation of large-volume plasmas with parameters scaled to realistic space plasmas. Such devices make valuable contributions to the investigation of space plasma phenomena under controlled, reproducible conditions, allowing for the validation of theoretical models being applied to space data. However, in addition to investigations such as plasma wave and instability studies, such devices can also make valuable contributions to the development and testing of space plasma diagnostics. One example is the plasma impedance probe developed at NRL. Originally developed as a laboratory diagnostic, the sensor has now been flown on a sounding rocket, is included on a CubeSat experiment, and will be included on the DoD Space Test Program's STP-H6 experiment on the International Space Station. In this talk, we will describe how the laboratory simulation of space plasmas made this development path possible. Work sponsored by the US Naval Research Laboratory Base Program.
Ejector subassembly for dual wall air drilling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kolle, J.J.
1996-09-01
The dry drilling system developed for the Yucca Mountain Site Characterization Project incorporates a surface vacuum system to prevent drilling air and cuttings from contaminating the borehole wall during coring operations. As the drilling depth increases, however, there is a potential for borehole contamination because of the limited volume of air which can be removed by the vacuum system. A feasibility analysis has shown that an ejector subassembly mounted in the drill string above the core barrel could significantly enhance the depth capacity of the dry drilling system. The ejector subassembly would use a portion of the air supplied to the core bit to maintain a vacuum on the hole bottom. The results of a design study, including performance testing of a laboratory-scale ejector simulator, are presented here.
NEAMS Update. Quarterly Report for October - December 2011.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bradley, K.
2012-02-16
The Advanced Modeling and Simulation Office within the DOE Office of Nuclear Energy (NE) has been charged with revolutionizing the design tools used to build nuclear power plants during the next 10 years. To accomplish this, the DOE has brought together the national laboratories, U.S. universities, and the nuclear energy industry to establish the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Program. The mission of NEAMS is to modernize computer modeling of nuclear energy systems and improve the fidelity and validity of modeling results using contemporary software environments and high-performance computers. NEAMS will create a set of engineering-level codes aimed at designing and analyzing the performance and safety of nuclear power plants and reactor fuels. The truly predictive nature of these codes will be achieved by modeling the governing phenomena at the spatial and temporal scales that dominate the behavior. These codes will be executed within a simulation environment that orchestrates code integration with respect to spatial meshing, computational resources, and execution to give the user a common 'look and feel' for setting up problems and displaying results. NEAMS is building upon a suite of existing simulation tools, including those developed by the federal Scientific Discovery through Advanced Computing and Advanced Simulation and Computing programs. NEAMS also draws upon existing simulation tools for materials and nuclear systems, although many of these are limited in terms of scale, applicability, and portability (their ability to be integrated into contemporary software and hardware architectures). NEAMS investments have directly and indirectly supported additional NE research and development programs, including those devoted to waste repositories, safeguarded separations systems, and long-term storage of used nuclear fuel. NEAMS is organized into two broad efforts, each comprising four elements.
The quarterly highlights for October-December 2011 are: (1) Version 1.0 of AMP, the fuel assembly performance code, was tested on the JAGUAR supercomputer and released on November 1, 2011; a detailed discussion of this new simulation tool is given. (2) A coolant sub-channel model and a preliminary UO2 smeared-cracking model were implemented in BISON, the single-pin fuel code; more information on how these models were developed and benchmarked is given. (3) The Object Kinetic Monte Carlo model was implemented to account for nucleation events in meso-scale simulations, and a discussion of the significance of this advance is given. (4) The SHARP neutronics module, PROTEUS, was expanded to be applicable to all types of reactors, and a discussion of the importance of PROTEUS is given. (5) A plan has been finalized for integrating the high-fidelity, three-dimensional reactor code SHARP with both the systems-level code RELAP7 and the fuel assembly code AMP; this is a new initiative. (6) Work began to evaluate the applicability of AMP to the problem of dry storage of used fuel and to define a relevant problem to test the applicability. (7) A code to obtain phonon spectra from the force-constant matrix for a crystalline lattice has been completed; this important bridge between subcontinuum and continuum phenomena is discussed. (8) Benchmarking was begun on the meso-scale, finite-element fuels code MARMOT to validate its new variable splitting algorithm. (9) A very computationally demanding simulation of diffusion-driven nucleation of new microstructural features has been completed.
An explanation of the difficulty of this simulation is given. (10) Experiments were conducted with deformed steel to validate a crystal plasticity finite-element code for body-centered cubic iron. (11) The Capability Transfer Roadmap was completed and published as an internal laboratory technical report. (12) The AMP fuel assembly code input generator was integrated into the NEAMS Integrated Computational Environment (NiCE); more details on the planned NEAMS computing environment are given. (13) The NEAMS program website (neams.energy.gov) is nearly ready to launch.
Impact Flash Physics: Modeling and Comparisons With Experimental Results
NASA Astrophysics Data System (ADS)
Rainey, E.; Stickle, A. M.; Ernst, C. M.; Schultz, P. H.; Mehta, N. L.; Brown, R. C.; Swaminathan, P. K.; Michaelis, C. H.; Erlandson, R. E.
2015-12-01
Hypervelocity impacts frequently generate an observable "flash" of light with two components: a short-duration spike due to emissions from vaporized material, and a long-duration peak due to thermal emissions from expanding hot debris. The intensity and duration of these peaks depend on the impact velocity, angle, and the target and projectile mass and composition. Thus remote sensing measurements of planetary impact flashes have the potential to constrain the properties of impacting meteors and improve our understanding of impact flux and cratering processes. Interpreting impact flash measurements requires a thorough understanding of how flash characteristics correlate with impact conditions. Because planetary-scale impacts cannot be replicated in the laboratory, numerical simulations are needed to provide this insight for the solar system. Computational hydrocodes can produce detailed simulations of the impact process, but they lack the radiation physics required to model the optical flash. The Johns Hopkins University Applied Physics Laboratory (APL) developed a model to calculate the optical signature from the hot debris cloud produced by an impact. While the phenomenology of the optical signature is understood, the details required to accurately model it are complicated by uncertainties in material and optical properties and the simplifications required to numerically model radiation from large-scale impacts. Comparisons with laboratory impact experiments allow us to validate our approach and to draw insight regarding processes that occur at all scales in impact events, such as melt generation. We used Sandia National Lab's CTH shock physics hydrocode along with the optical signature model developed at APL to compare with a series of laboratory experiments conducted at the NASA Ames Vertical Gun Range. 
The experiments used Pyrex projectiles to impact pumice powder targets with velocities ranging from 1 to 6 km/s at angles of 30 and 90 degrees with respect to horizontal. High-speed radiometer measurements were made of the time-dependent impact flash at wavelengths of 350-1100 nm. We will present comparisons between these measurements and the output of APL's model. The results of this validation allow us to determine basic relationships between observed optical signatures and impact conditions.
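The long-duration component of the flash described above is essentially thermal emission from hot debris, so its in-band signal can be approximated by integrating the Planck function over the radiometer band. The sketch below does exactly that; the debris temperatures are illustrative, and no emissivity, geometry, or cooling physics are included, so this is not APL's actual signature model:

```python
# Hedged sketch: band-integrated blackbody radiance over the 350-1100 nm
# radiometer band used in the experiments. Debris temperatures are
# illustrative; emissivity and source geometry are deliberately omitted.

import math

H = 6.626e-34    # Planck constant, J s
C = 2.998e8      # speed of light, m/s
KB = 1.381e-23   # Boltzmann constant, J/K

def planck(lam, temp):
    """Spectral radiance B_lambda(T) in W / (m^2 sr m)."""
    return (2.0 * H * C**2 / lam**5) / math.expm1(H * C / (lam * KB * temp))

def band_radiance(temp, lam_lo=350e-9, lam_hi=1100e-9, n=500):
    """Trapezoidal integral of the Planck radiance across the band."""
    dlam = (lam_hi - lam_lo) / n
    vals = [planck(lam_lo + i * dlam, temp) for i in range(n + 1)]
    return dlam * (sum(vals) - 0.5 * (vals[0] + vals[-1]))

# hotter debris is disproportionately brighter in this visible/near-IR band,
# which is why flash intensity is so sensitive to impact velocity
bright = band_radiance(3000.0)
dim = band_radiance(1500.0)
```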
Towards a unified Global Weather-Climate Prediction System
NASA Astrophysics Data System (ADS)
Lin, S. J.
2016-12-01
The Geophysical Fluid Dynamics Laboratory has been developing a unified regional-global modeling system with variable-resolution capabilities that can be used for severe weather predictions and kilometer-scale regional climate simulations within a unified global modeling system. The foundation of this flexible modeling system is the nonhydrostatic Finite-Volume Dynamical Core on the Cubed-Sphere (FV3). A unique aspect of FV3 is that it is "vertically Lagrangian" (Lin 2004), essentially reducing the equation set to two dimensions; this is the single most important reason why FV3 outperforms other non-hydrostatic cores. Owing to its accuracy, adaptability, and computational efficiency, FV3 has been selected as the "engine" for NOAA's Next Generation Global Prediction System (NGGPS). We have built into the modeling system a stretched grid, a two-way regional-global nested grid, and an optimal combination of the stretched and two-way nest capabilities, making kilometer-scale regional simulations within a global modeling system feasible. Our main scientific goal is to enable simulations of high-impact weather phenomena (such as tornadoes, thunderstorms, and category-5 hurricanes) within an IPCC-class climate modeling system, previously regarded as impossible. In this presentation I will demonstrate that, with FV3, it is computationally feasible to simulate not only supercell thunderstorms but also the subsequent genesis of tornado-like vortices using a global model that was originally designed for climate simulations. The development and tuning strategies of traditional weather and climate models are fundamentally different due to different metrics. We were able to adapt and use traditional "climate" metrics or standards, such as angular momentum conservation, energy conservation, and flux balance at the top of the atmosphere, to gain insight into problems of traditional models for medium-range weather prediction, and vice versa.
Therefore, the unification in weather and climate models can happen not just at the algorithm or parameterization level, but also in the metric and tuning strategy used for both applications, and ultimately, with benefits to both weather and climate applications.
Simulation of thermomechanical fatigue in solder joints
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fang, H.E.; Porter, V.L.; Fye, R.M.
1997-12-31
Thermomechanical fatigue (TMF) is a very complex phenomenon in electronic component systems and has been identified as one prominent degradation mechanism for surface mount solder joints in the stockpile. In order to precisely predict the TMF-related effects on the reliability of electronic components in weapons, a multi-level simulation methodology is being developed at Sandia National Laboratories. This methodology links simulation codes of continuum mechanics (JAS3D), microstructural mechanics (GLAD), and microstructural evolution (PARGRAIN) to treat the disparate length scales that exist between the macroscopic response of the component and the microstructural changes occurring in its constituent materials. JAS3D is used to predict strain/temperature distributions in the component due to environmental variable fluctuations. GLAD identifies damage initiation and accumulation in detail based on the spatial information provided by JAS3D. PARGRAIN simulates the changes of material microstructure, such as the heterogeneous coarsening in Sn-Pb solder, when the component's service environment varies.
NASA Astrophysics Data System (ADS)
Hall, Carlton Raden
A major objective of remote sensing is determination of biochemical and biophysical characteristics of plant canopies utilizing high spectral resolution sensors. Canopy reflectance signatures are dependent on absorption and scattering processes of the leaf, canopy properties, and the ground beneath the canopy. This research investigates, through field and laboratory data collection and computer model parameterization and simulations, the relationships between leaf optical properties, canopy biophysical features, and the nadir-viewed above-canopy reflectance signature. Emphasis is placed on parameterization and application of an existing irradiance radiative transfer model developed for aquatic systems. Data and model analyses provide knowledge on the relative importance of leaves and canopy biophysical features in estimating the diffuse absorption a(lambda,m-1), diffuse backscatter b(lambda,m-1), beam attenuation alpha(lambda,m-1), and beam-to-diffuse conversion c(lambda,m-1) coefficients of the two-flow irradiance model. Data sets include field and laboratory measurements from three plant species, live oak (Quercus virginiana), Brazilian pepper (Schinus terebinthifolius) and grapefruit (Citrus paradisi), sampled at Cape Canaveral Air Force Station and Kennedy Space Center, Florida, in March and April of 1997. Features measured were depth h (m), projected foliage coverage PFC, leaf area index LAI, and zenith leaf angle. Optical measurements, collected with a Spectron SE 590 high-sensitivity narrow-bandwidth spectrograph, included above-canopy reflectance, internal canopy transmittance and reflectance, and bottom reflectance. Leaf samples were returned to the laboratory, where optical, physical, and chemical measurements of leaf thickness, leaf area, leaf moisture and pigment content were made. A new term, the leaf volume correction index LVCI, was developed and demonstrated in support of model coefficient parameterization.
The LVCI is based on angle-adjusted leaf thickness Ltadj, LAI, and h (m). Its function is to translate leaf-level estimates of diffuse absorption and backscatter to the canopy scale, allowing the leaf optical properties to directly influence above-canopy estimates of reflectance. The model was successfully modified and parameterized to operate in a canopy-scale and a leaf-scale mode. Canopy-scale model simulations produced the best results. Simulations based on leaf-derived coefficients produced calculated above-canopy reflectance errors of 15% to 18%. A comprehensive sensitivity analysis indicated the most important parameters were beam-to-diffuse conversion c(lambda, m-1), diffuse absorption a(lambda, m-1), diffuse backscatter b(lambda, m-1), h (m), Q, and direct and diffuse irradiance. Sources of error include the estimation procedure for the direct beam-to-diffuse conversion and attenuation coefficients and other field and laboratory measurement and analysis errors. Applications of the model include creation of synthetic reflectance data sets for remote sensing algorithm development, simulations of stress and drought effects on vegetation reflectance signatures, and the potential to estimate leaf moisture and chemical status.
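The two-flow irradiance balance behind the a and b coefficients discussed above can be sketched numerically. Keeping only the diffuse absorption a and backscatter b terms (dropping the beam terms alpha and c for brevity, so this is a simplification of the model, not the parameterized version used in the study), the reflectance ratio R = Eu/Ed obeys a Riccati equation that can be integrated upward from the canopy bottom:

```python
# Minimal diffuse-only sketch of a two-flow (Kubelka-Munk-type) irradiance
# model. With downwelling Ed and upwelling Eu obeying (z measured downward)
#   dEd/dz = -(a + b) Ed + b Eu,    dEu/dz = (a + b) Eu - b Ed,
# the reflectance R = Eu/Ed satisfies a Riccati equation, integrated here
# from the bottom boundary upward. The values of a, b, h, Rb are illustrative.

def canopy_reflectance(a, b, h, r_bottom, dz=1e-3):
    """Integrate dR/d(-z) = b + b R^2 - 2 (a + b) R from depth h to the top."""
    r = r_bottom
    for _ in range(int(h / dz)):
        r += dz * (b + b * r * r - 2.0 * (a + b) * r)
    return r

# For an optically deep canopy the top-of-canopy reflectance approaches the
# Kubelka-Munk limit R_inf = 1 + a/b - sqrt((a/b)**2 + 2*a/b), independent
# of the bottom reflectance.
r = canopy_reflectance(a=1.0, b=1.0, h=10.0, r_bottom=0.1)
```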
NASA Astrophysics Data System (ADS)
Gintautas, Vadas; Hubler, Alfred
2006-03-01
As worldwide computer resources increase in power and decrease in cost, real-time simulations of physical systems are becoming increasingly prevalent, from laboratory models to stock market projections and entire "virtual worlds" in computer games. Often, these systems are meticulously designed to match real-world systems as closely as possible. We study the limiting behavior of a virtual horizontally driven pendulum coupled to its real-world counterpart, where the interaction occurs on a time scale that is much shorter than the time scale of the dynamical system. We find that if the physical parameters of the virtual system match those of the real system within a certain tolerance, there is a qualitative change in the behavior of the two-pendulum system as the strength of the coupling is increased. Applications include a new method to measure the physical parameters of a real system and the use of resonance spectroscopy to refine a computer model. As virtual systems better approximate real ones, even very weak interactions may produce unexpected and dramatic behavior. The research is supported by National Science Foundation Grants NSF PHY 01-40179, NSF DMS 03-25939 ITR, and NSF DGE 03-38215.
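The flavor of such a real-virtual pair can be sketched with two damped, identically driven pendulums coupled through their angular velocities. This is an illustrative toy, not the authors' apparatus or coupling scheme; all parameter values are assumptions:

```python
# Illustrative sketch: two damped, horizontally driven pendulums, one standing
# in for the "real" system and one for its "virtual" model, coupled through
# their angular-velocity difference. With matched parameters, the coupling
# pulls the two trajectories together. Parameters are illustrative only.

import math

def simulate(coupling, mismatch=0.0, dt=1e-3, steps=20000):
    """Return |theta1 - theta2| after integrating the coupled pair."""
    g, length, gamma, amp, omega = 9.81, 1.0, 0.5, 0.5, 2.0
    th1, w1 = 0.3, 0.0     # "real" pendulum
    th2, w2 = -0.2, 0.0    # "virtual" pendulum, with optional length mismatch
    for i in range(steps):
        drive = amp * math.cos(omega * i * dt)   # common horizontal drive
        a1 = -(g / length) * math.sin(th1) - gamma * w1 + drive \
             + coupling * (w2 - w1)
        a2 = -(g / (length + mismatch)) * math.sin(th2) - gamma * w2 + drive \
             + coupling * (w1 - w2)
        w1 += a1 * dt; th1 += w1 * dt            # semi-implicit Euler step
        w2 += a2 * dt; th2 += w2 * dt
    return abs(th1 - th2)
```

Running this with a nonzero `mismatch` and sweeping `coupling` gives a crude feel for the tolerance effect described in the abstract.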
Numerical assessment of bureau of mines electric arc melter
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paik, S.; Hawkes, G.; Nguyen, H.D.
1994-12-31
An electric arc melter used for the waste treatment process at Idaho National Engineering Laboratory (INEL) in cooperation with the U.S. Bureau of Mines (USBM) has been numerically studied. The arc melter is being used for vitrification of thermally oxidized, buried, transuranic (TRU) contaminated wastes by INEL in conjunction with the USBM as a part of the Buried Waste Integrated Demonstration project. The purpose of this study is to numerically investigate the performance of the laboratory-scale arc melter simulating the USBM arc melter. Initial results of modeling the full-scale USBM arc melter are also reported in this paper.
An innovative exercise method to simulate orbital EVA work - Applications to PLSS automatic controls
NASA Technical Reports Server (NTRS)
Lantz, Renee; Vykukal, H.; Webbon, Bruce
1987-01-01
An exercise method has been proposed which may satisfy the current need for a laboratory simulation representative of muscular, cardiovascular, respiratory, and thermoregulatory responses to work during orbital extravehicular activity (EVA). The simulation incorporates arm crank ergometry with a unique body support mechanism that allows all body position stabilization forces to be reacted at the feet. By instituting this exercise method in laboratory experimentation, an advanced portable life support system (PLSS) thermoregulatory control system can be designed to more accurately reflect the specific work requirements of orbital EVA.
Liu, Jianjun; Song, Rui; Cui, Mengmeng
2014-01-01
A novel approach to simulating hydromechanical coupling in pore-scale models of porous media is presented in this paper. Parameters of the sandstone samples, such as the stress-strain curve, Poisson's ratio, and permeability under different pore and confining pressures, are tested at laboratory scale. A micro-CT scanner is employed to scan the samples for three-dimensional images, used as input to construct the model. Accordingly, four physical models possessing the same pore and rock-matrix characteristics as the natural sandstones are developed. Based on the micro-CT images, three-dimensional finite element models of both the rock matrix and the pore space are established on the MIMICS and ICEM software platforms. The Navier-Stokes equation and an elastic constitutive equation are used as the mathematical model for simulation. A hydromechanical coupling analysis in the pore-scale finite element model of porous media is carried out with the ANSYS and CFX software, and the permeability of the sandstone samples under different pore and confining pressures is thereby predicted. The simulation results agree well with the benchmark data. By reproducing the stress state underground, the accuracy of pore-scale permeability prediction is improved. Consequently, the effects of pore pressure and confining pressure on permeability are revealed from the microscopic view.
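Whatever the pore-scale solver, the permeability prediction ultimately reduces to a Darcy-style fit: the simulated volumetric flux through the sample under a known pressure drop. A minimal sketch with illustrative numbers (not the paper's samples):

```python
# Sketch: recovering permeability from a simulated (or measured) steady flow
# through a core sample via Darcy's law. The example numbers are illustrative.

def darcy_permeability(q, mu, length, area, dp):
    """k = Q * mu * L / (A * dP); all SI units, k returned in m^2."""
    return q * mu * length / (area * dp)

# e.g. water (mu ~ 1e-3 Pa s) at 1e-9 m^3/s through a 2 cm core of
# 1 cm^2 cross-section under a 10 kPa pressure drop:
k = darcy_permeability(q=1e-9, mu=1e-3, length=0.02, area=1e-4, dp=1e4)
# k is on the order of 2e-14 m^2, i.e. roughly 20 millidarcy
```

Repeating the simulation at several pore and confining pressures and fitting k each time yields the pressure-dependent permeability trends the abstract describes.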
The Cell Collective: Toward an open and collaborative approach to systems biology
2012-01-01
Background: Despite decades of new discoveries in biomedical research, the overwhelming complexity of cells has been a significant barrier to a fundamental understanding of how cells work as a whole. As such, the holistic study of biochemical pathways requires computer modeling. Due to the complexity of cells, it is not feasible for one person or group to model the cell in its entirety. Results: The Cell Collective is a platform that allows the world-wide scientific community to create these models collectively. Its interface enables users to build and use models without specifying any mathematical equations or computer code, addressing one of the major hurdles in computational research. In addition, this platform allows scientists to simulate and analyze the models in real time on the web, including the ability to simulate loss/gain of function and test what-if scenarios. Conclusions: The Cell Collective is a web-based platform that enables laboratory scientists from across the globe to collaboratively build large-scale models of various biological processes and simulate/analyze them in real time. In this manuscript, we show examples of its application to a large-scale model of signal transduction. PMID:22871178
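Cell Collective models are logical (Boolean) networks under the hood: each node's next state is a logic function of its regulators. A minimal synchronous-update sketch with an invented three-node motif (the node names and rules are hypothetical, not from any Cell Collective model):

```python
# Toy Boolean network (hypothetical nodes, not a Cell Collective model):
# ligand activates receptor, receptor activates kinase,
# and kinase feeds back to inhibit the receptor.
rules = {
    "receptor": lambda s: s["ligand"] and not s["kinase"],
    "kinase":   lambda s: s["receptor"],
    "ligand":   lambda s: s["ligand"],   # input node, held constant
}

def step(state):
    """One synchronous update: every node evaluates its rule on the old state."""
    return {node: bool(rule(state)) for node, rule in rules.items()}

state = {"ligand": True, "receptor": False, "kinase": False}
for t in range(6):
    state = step(state)
    print(t, state)   # the negative feedback produces a period-4 oscillation
```

Setting "ligand" to False in the initial state emulates the kind of loss-of-function what-if scenario described in the abstract: receptor and kinase then switch off permanently.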
NASA Astrophysics Data System (ADS)
Liu, C.; Yang, X.; Bailey, V. L.; Bond-Lamberty, B. P.; Hinkle, C.
2013-12-01
Mathematical representations of hydrological and biogeochemical processes in soil, plant, aquatic, and atmospheric systems vary with scale. Process-rich models are typically used to describe hydrological and biogeochemical processes at the pore and small scales, while empirical, correlation-based approaches are often used at the watershed and regional scales. A major challenge for multi-scale modeling is that water flow, biogeochemical processes, and reactive transport are described using different physical laws and/or expressions at the different scales. For example, flow is governed by the Navier-Stokes equations at the pore scale in soils, by Darcy's law in soil columns and aquifers, and by the Navier-Stokes equations again in open water bodies (ponds, lakes, rivers) and the atmospheric surface layer. This research explores whether the physical laws at the different scales and in different physical domains can be unified into a unified multi-scale model (UMSM) to systematically investigate the cross-scale, cross-domain behavior of fundamental processes at different scales. This presentation will discuss our research on the concept, mathematical equations, and numerical execution of the UMSM. Three-dimensional, multi-scale hydrological processes at the Disney Wilderness Preservation (DWP) site, Florida, will be used as an example demonstrating the application of the UMSM. In this research, the UMSM was used to simulate hydrological processes in rooting zones at the pore and small scales, including water migration in soils under saturated and unsaturated conditions, root-induced hydrological redistribution, and the role of rooting-zone biogeochemical properties (e.g., root exudates and microbial mucilage) in water storage and wetting/draining. The small-scale simulation results were used to estimate effective water retention properties in soil columns that were superimposed on the bulk soil water retention properties at the DWP site.
The UMSM, parameterized from the smaller-scale simulations, was then used to simulate coupled flow and moisture migration in soils in the saturated and unsaturated zones, surface water and groundwater exchange, and surface water flow in streams and lakes at the DWP site under dynamic precipitation conditions. Laboratory measurements of soil hydrological and biogeochemical properties are used to parameterize the UMSM at the small scales, and field measurements are used to evaluate the UMSM.
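The "effective water retention properties" mentioned above are commonly expressed in closed form with the van Genuchten (1980) model. A sketch; the parameter values below are generic sandy-soil numbers chosen for illustration, not DWP site calibrations:

```python
def van_genuchten_theta(h, theta_r, theta_s, alpha, n):
    """Volumetric water content theta(h) for suction head h >= 0 (m).
    van Genuchten: theta = theta_r + (theta_s - theta_r) /
    (1 + (alpha*h)**n)**m, with m = 1 - 1/n."""
    if h <= 0:                       # saturated
        return theta_s
    m = 1.0 - 1.0 / n
    return theta_r + (theta_s - theta_r) / (1.0 + (alpha * h) ** n) ** m

# Illustrative sandy-soil parameters: alpha in 1/m, n dimensionless
for h in (0.0, 0.1, 1.0, 10.0):
    print(h, round(van_genuchten_theta(h, 0.05, 0.40, 3.5, 2.5), 3))
```

The curve drains from saturation (theta_s) toward the residual content (theta_r) as suction increases, which is the behavior an upscaled retention parameterization has to capture.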
RANS Simulation (Rotating Reference Frame Model [RRF]) of Single Lab-Scaled DOE RM1 MHK Turbine
Javaherchi, Teymour; Stelzenmuller, Nick; Aliseda, Alberto; Seydel, Joseph
2014-04-15
Attached are the .cas and .dat files for the Reynolds-Averaged Navier-Stokes (RANS) simulation of a single lab-scaled DOE RM1 turbine implemented in the ANSYS FLUENT CFD package. The lab-scaled DOE RM1 is a re-designed geometry, based on the full-scale DOE RM1 design, that produces the same power output as the full-scale model while operating at matched Tip Speed Ratio values at achievable laboratory Reynolds numbers (see attached paper). In this case study, taking advantage of the symmetry of the lab-scaled DOE RM1 geometry, only half of the geometry is modeled using the (Single) Rotating Reference Frame [RRF] model. In this model the RANS equations, coupled with the k-ω turbulence closure model, are solved in the rotating reference frame. The actual geometry of the turbine blade is included, and the turbulent boundary layer along the blade span is simulated using a wall-function approach. The rotation of the blade is modeled by applying periodic boundary conditions to the planes of symmetry. This case study simulates the performance and flow field in the near and far wake of the device at the desired operating conditions. The results of these simulations were validated against in-house experimental data; please see the attached paper.
Scaling effects in direct shear tests
Orlando, A.D.; Hanes, D.M.; Shen, H.H.
2009-01-01
Laboratory experiments of the direct shear test were performed on spherical particles of different materials and diameters. Results of the bulk friction vs. non-dimensional shear displacement are presented as a function of the non-dimensional particle diameter. Simulations of the direct shear test were performed using the Discrete Element Method (DEM). The simulation results show considerable differences from the physical experiments. Particle-level material properties, such as the coefficients of static friction, restitution, and rolling friction, need to be known a priori in order to guarantee that the simulation results are an accurate representation of the physical phenomenon. Furthermore, the laboratory results show a clear size dependency, with smaller particles having a higher bulk friction than larger ones. © 2009 American Institute of Physics.
Global view of Venus from Magellan, Pioneer, and Venera data
1991-10-29
This global view of Venus, centered at 270 degrees east longitude, is a compilation of data from several sources. Magellan synthetic aperture radar mosaics from the first cycle of Magellan mapping are mapped onto a computer-simulated globe to create the image. Data gaps are filled with Pioneer Venus Orbiter data or a constant mid-range value. Simulated color is used to enhance small-scale structure. The simulated hues are based on color images recorded by the Soviet Venera 13 and 14 spacecraft. The image was produced at the Jet Propulsion Laboratory (JPL) Multimission Image Processing Laboratory and is a single frame from a video released at the JPL news conference of 10-29-91. View provided by JPL with alternate number P-39225 MGN81.
Preliminary assessment of the Mars Science Laboratory entry, descent, and landing simulation
NASA Astrophysics Data System (ADS)
Way, David W.
On August 5, 2012, the Mars Science Laboratory rover, Curiosity, successfully landed inside Gale Crater. This landing was the seventh successful landing and fourth rover to be delivered to Mars. Weighing nearly one metric ton, Curiosity is the largest and most complex rover ever sent to investigate another planet. Safely landing such a large payload required an innovative Entry, Descent, and Landing system, which included the first guided entry at Mars, the largest supersonic parachute ever flown at Mars, and the novel Sky Crane landing system. A complete, end-to-end, six degree-of-freedom, multi-body computer simulation of the Mars Science Laboratory Entry, Descent, and Landing sequence was developed at the NASA Langley Research Center. In-flight data gathered during the successful landing is compared to pre-flight statistical distributions, predicted by the simulation. These comparisons provide insight into both the accuracy of the simulation and the overall performance of the Entry, Descent, and Landing system.
X-ray Micro-Tomography of Ablative Heat Shield Materials
NASA Technical Reports Server (NTRS)
Panerai, Francesco; Ferguson, Joseph; Borner, Arnaud; Mansour, Nagi N.; Barnard, Harold S.; MacDowell, Alastair A.; Parkinson, Dilworth Y.
2016-01-01
X-ray micro-tomography is a non-destructive characterization technique that allows imaging of materials structures with voxel sizes in the micrometer range. This level of resolution makes the technique very attractive for imaging porous ablators used in hypersonic entry systems. Besides providing a high fidelity description of the material architecture, micro-tomography enables computations of bulk material properties and simulations of micro-scale phenomena. This presentation provides an overview of a collaborative effort between NASA Ames Research Center and Lawrence Berkeley National Laboratory, aimed at developing micro-tomography experiments and simulations for porous ablative materials. Measurements are carried out using x-rays from the Advanced Light Source at Berkeley Lab on different classes of ablative materials used in NASA entry systems. Challenges, strengths and limitations of the technique for imaging materials such as lightweight carbon-phenolic systems and woven textiles are discussed. Computational tools developed to perform numerical simulations based on micro-tomography are described. These enable computations of material properties such as permeability, thermal and radiative conductivity, tortuosity and other parameters that are used in ablator response models. Finally, we present the design of environmental cells that enable imaging materials under simulated operational conditions, such as high temperature, mechanical loads and oxidizing atmospheres. Keywords: micro-tomography, porous media, ablation
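The simplest bulk property computed from a segmented micro-tomography volume is porosity, the void-voxel fraction. A sketch on a synthetic volume, with a single spherical pore standing in for a real scan (the geometry is invented for illustration):

```python
import numpy as np

def porosity(segmented):
    """Porosity of a segmented micro-CT volume: fraction of void voxels.
    `segmented` is a boolean array, True = pore space, False = solid."""
    return float(segmented.mean())

# Synthetic 64^3 volume: a solid block containing one spherical pore.
n = 64
z, y, x = np.ogrid[:n, :n, :n]
pore = (x - n/2)**2 + (y - n/2)**2 + (z - n/2)**2 < (n/4)**2
phi = porosity(pore)
print(f"porosity = {phi:.4f}")   # ~ (4/3)*pi*(n/4)^3 / n^3 ~ 0.065
```

Transport properties such as permeability and tortuosity require flow or diffusion solvers on the same voxel grid, but they start from exactly this kind of segmented array.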
Thermal Pretreatment For TRU Waste Sorting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sasaki, T.; Aoyama, Y.; Miyamoto, Y.
2008-07-01
The Japan Atomic Energy Agency conducted a study on thermal treatment of TRU waste to develop a removal technology for materials that are forbidden for disposal. The thermal pretreatment, in which hot nitrogen and/or air is introduced to the waste, is a process for removing combustibles, liquids, and low-melting-point metals from PVC-wrapped TRU waste. In this study, thermal pretreatment of simulated waste was conducted using a desktop thermal treatment vessel and a laboratory-scale thermal pretreatment system. Combustibles and low-melting-point metals are effectively separated from the waste by choosing an appropriate temperature for the flowing gases. Combustibles such as papers, PVC, and oil were removed, and low-melting-point metals such as zinc, lead, and aluminum were separated from the simulated waste by the thermal pretreatment. (authors)
NASA Astrophysics Data System (ADS)
Agaoglu, Berken; Scheytt, Traugott; Copty, Nadim K.
2012-10-01
This study examines the mechanistic processes governing multiphase flow of a water-cosolvent-NAPL system in saturated porous media. Laboratory batch and column flushing experiments were conducted to determine the equilibrium properties of pure NAPL and synthetically prepared NAPL mixtures, as well as NAPL recovery mechanisms for different water-ethanol contents. The effect of contact time was investigated by considering different steady and intermittent flow velocities. A modified version of the multiphase flow simulator UTCHEM was used to compare the multiphase model simulations with the column experiment results. The effect of employing different grid geometries (1D, 2D, 3D), heterogeneity, and different initial NAPL saturation configurations was also examined in the model. It is shown that the change in velocity affects the mass transfer rate between phases as well as the ultimate NAPL recovery percentage. The experiments with low flow rate flushing of pure NAPL and the 3D UTCHEM simulations gave similar effluent concentrations and NAPL cumulative recoveries. Model simulations overestimated NAPL recovery for high specific discharges and rate-limited mass transfer, suggesting that a constant mass transfer coefficient for the entire flushing experiment may not be valid. When multi-component NAPLs are present, the dissolution rate of individual organic compounds (namely, toluene and benzene) into the ethanol-water flushing solution is found not to correlate with their equilibrium solubility values.
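The rate-limited mass transfer questioned in the closing sentences is usually written as a linear-driving-force model, dC/dt = kLa (Cs - C). A sketch with an assumed constant coefficient (the solubility and kLa values below are illustrative, not fitted to these experiments):

```python
def dissolve(C0, Cs, kLa, dt, steps):
    """Explicit-Euler integration of dC/dt = kLa * (Cs - C):
    first-order (linear driving force) dissolution toward solubility Cs."""
    C = C0
    out = [C]
    for _ in range(steps):
        C += kLa * (Cs - C) * dt
        out.append(C)
    return out

# Illustrative numbers: a toluene-like solubility of 515 mg/L and
# kLa = 0.05 1/min, integrated over 120 minutes of flushing.
conc = dissolve(C0=0.0, Cs=515.0, kLa=0.05, dt=1.0, steps=120)
print(round(conc[-1], 1))   # effluent concentration approaching Cs
```

The abstract's finding is precisely that a single constant kLa of this kind may not hold over an entire flushing experiment, especially at high specific discharge.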
Reflectivity of the atmosphere-inhomogeneous surfaces system Laboratory simulation
NASA Technical Reports Server (NTRS)
Mekler, Y.; Kaufman, Y. J.; Fraser, R. S.
1984-01-01
Theoretical two- and three-dimensional solutions of the radiative transfer equation have been applied to the earth-atmosphere system, but such solutions have not been verified experimentally. A laboratory experiment simulates such a system to test the theory. The atmosphere was simulated by latex spheres suspended in water and the ground by a nonuniform surface, half white and half black. A stable radiation source provided uniform illumination over the hydrosol. The upward radiance along a line orthogonal to the boundary between the two half-fields was recorded for different amounts of the hydrosol. The simulation is a well-defined radiative transfer experiment for testing radiative transfer models involving nonuniform surfaces. Good agreement is obtained between the measured and theoretical results.
ΔE/ΔE Measurements of Energetic Ions Using CVD Diamond Detectors
Alghamdi, Ahmed; Heilbronn, Lawrence; Castellanos, Luis A.; ...
2018-06-20
Experimental and computational results of a ΔE/ΔE diamond detection system are presented. The ΔE/ΔE detection system was evaluated using energetic proton and iron beams striking thick polyethylene targets at the NASA Space Radiation Laboratory (NSRL) at Brookhaven National Laboratory (BNL). The measured data for diamond sensor A show good agreement with the Geant4 simulation. In addition, simulations have demonstrated the ability to identify hydrogen isotopes using a diamond detection system.
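Telescope detectors separate isotopes because, for a thin transmission stage, the non-relativistic Bethe scaling gives an energy loss roughly proportional to m z^2 / E, so the product of energy loss and total energy clusters by mass number. A toy sketch of that principle (the constant C lumps detector thickness and stopping power and is simply set to 1 here; this is not the Geant4 analysis from the paper):

```python
def delta_e(mass_amu, z, E_total, C=1.0):
    """Approximate thin-detector energy loss: dE ~ C * m * z^2 / E
    (non-relativistic Bethe scaling; C is a lumped detector constant)."""
    return C * mass_amu * z ** 2 / E_total

# At equal total energy, hydrogen isotopes separate in dE * E, which is
# proportional to the mass number (1, 2, 3 for p, d, t when C = 1):
E = 50.0                                   # MeV, illustrative
for name, m in (("p", 1), ("d", 2), ("t", 3)):
    print(name, round(delta_e(m, 1, E) * E, 2))
```

Plotting dE against E for many events produces the familiar hyperbolic isotope bands that the diamond telescope resolves.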
Velocity fields and optical turbulence near the boundary in a strongly convective laboratory flow
NASA Astrophysics Data System (ADS)
Matt, Silvia; Hou, Weilin; Goode, Wesley; Hellman, Samuel
2016-05-01
Boundary layers around moving underwater vehicles or other platforms can be a limiting factor for optical communication. Turbulence in the boundary layer of a body moving through a stratified medium can lead to small variations in the index of refraction, which impede optical signals. As a first step towards investigating this boundary layer effect on underwater optics, we study the flow near the boundary in the Rayleigh-Bénard laboratory tank at the Naval Research Laboratory Stennis Space Center. The tank is set up to generate temperature-driven (i.e., convective) turbulence and allows control of the turbulence intensity. This controlled turbulence environment is complemented by computational fluid dynamics simulations to visualize and quantify multi-scale flow patterns. The boundary layer dynamics in the laboratory tank are quantified using a state-of-the-art Particle Image Velocimetry (PIV) system to examine the boundary layer velocities and turbulence parameters. The velocity fields and flow dynamics from the PIV are compared to the numerical model and show the model to accurately reproduce the velocity range and flow dynamics. The temperature variations, and thus optical turbulence effects, can then be inferred from the model temperature data. Optical turbulence is also visible in the raw data from the PIV system. The newly collected data are consistent with previously reported measurements from high-resolution Acoustic Doppler Velocimeter profilers (Nortek Vectrino), as well as fast thermistor probes and novel next-generation fiber-optics temperature sensors. This multi-level approach to studying optical turbulence near a boundary, combining in-situ measurements, optical techniques, and numerical simulations, can provide new insight and aid in mitigating turbulence impacts on underwater optical signal transmission.
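Controlling convective turbulence intensity in a Rayleigh-Bénard tank comes down to controlling the Rayleigh number. A sketch with standard water properties and an assumed tank depth and temperature difference (illustrative values, not the NRL tank's actual operating point):

```python
def rayleigh(g, alpha, dT, H, nu, kappa):
    """Rayleigh number for a fluid layer of depth H heated from below:
    Ra = g * alpha * dT * H^3 / (nu * kappa)."""
    return g * alpha * dT * H ** 3 / (nu * kappa)

# Water near 20 C (standard property values), assumed 0.5 m deep layer
# with a 5 K temperature difference across it:
Ra = rayleigh(g=9.81, alpha=2.07e-4, dT=5.0, H=0.5,
              nu=1.0e-6, kappa=1.43e-7)
print(f"Ra = {Ra:.2e}")   # far above the ~1708 onset of convection
```

Raising dT raises Ra linearly, which is the knob such a tank uses to set turbulence intensity.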
A modern space simulation facility to accommodate high production acceptance testing
NASA Technical Reports Server (NTRS)
Glover, J. D.
1986-01-01
A space simulation laboratory that supports acceptance testing of spacecraft and associated subsystems at throughput rates as high as nine per year is discussed. The laboratory includes a computer-operated 27 ft by 30 ft space simulation chamber, a 20 ft by 20 ft by 20 ft thermal cycle chamber, and an eight-station thermal cycle/thermal vacuum test system. The design philosophy and unique features of each system are discussed. The development of operating procedures, test team requirements, test team integration, and other peripheral activation details are described. A discussion of special accommodations for the efficient utilization of the systems in support of high-rate production is presented.
Investigation related to hydrogen isotopes separation by cryogenic distillation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bornea, A.; Zamfirache, M.; Stefanescu, I.
2008-07-15
Research conducted over the last fifty years has shown that one of the most efficient techniques for removing tritium from the heavy water used as moderator and coolant in CANDU reactors (such as that operated at Cernavoda, Romania) is hydrogen cryogenic distillation. Designing and implementing the concept of cryogenic distillation columns requires experiments to be conducted as well as computer simulations. In particular, computer simulations are of great importance when designing and evaluating the performance of a column or a series of columns. Experimental data collected from laboratory work will be used as input for computer simulations run at a larger scale (for the Pilot Plant for Tritium and Deuterium Separation) in order to increase confidence in the simulated results. The studies carried out were focused on the following: quantitative analyses of important parameters such as the number of theoretical plates, inlet area, reflux flow, extraction flow-rates, and working pressure; and columns connected in series in such a way as to fulfil the separation requirements. Experiments were carried out on a laboratory-scale installation to investigate the performance of contact elements with continuous packing. The packing was manufactured in our institute. (authors)
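For the column-design parameter highlighted above, the number of theoretical plates, the classical lower bound at total reflux is the Fenske equation. A sketch with an invented relative volatility and product purities (these numbers are illustrative, not from the study):

```python
import math

def fenske_min_stages(xD, xB, alpha):
    """Minimum number of theoretical stages at total reflux (Fenske):
    N_min = ln[(xD/(1-xD)) * ((1-xB)/xB)] / ln(alpha)."""
    return math.log((xD / (1 - xD)) * ((1 - xB) / xB)) / math.log(alpha)

# Illustrative separation: enrich the light key from 1% in the bottoms
# to 99% in the distillate with an assumed relative volatility of 1.5.
N = fenske_min_stages(xD=0.99, xB=0.01, alpha=1.5)
print(round(N, 1))
```

A real column design then adds stages above this minimum according to the chosen reflux ratio, which is where the reflux-flow and working-pressure parameters in the abstract enter.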
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pope, G.A.; Lake, L.W.; Sepehrnoori, K.
1988-11-01
The objective of this research is to develop, validate, and apply a comprehensive chemical flooding simulator for chemical recovery processes involving surfactants, polymers, and alkaline chemicals in various combinations. This integrated program includes laboratory experiments, physical property modelling, scale-up theory, and numerical analysis as necessary and integral components of the simulation activity. Development, testing, and application of the chemical flooding simulator (UTCHEM) to a wide variety of laboratory and reservoir problems involving tracers, polymers, polymer gels, surfactants, and alkaline agents has continued. Improvements in both the physical-chemical and numerical aspects of UTCHEM have been made which enhance its versatility, accuracy, and speed. Supporting experimental studies during the past year include relative permeability and trapping of microemulsion, tracer flow studies, oil recovery in cores using alcohol-free surfactant slugs, and microemulsion viscosity measurements. These have enabled model improvement and simulator testing. Another code called PROPACK has also been developed, which is used as a preprocessor for UTCHEM. Specifically, it is used to evaluate input to UTCHEM by computing and plotting key physical properties such as phase behavior and interfacial tension.
Modelling runoff on ceramic tile roofs using the kinematic wave equations
NASA Astrophysics Data System (ADS)
Silveira, Alexandre; Abrantes, João; de Lima, João; Lira, Lincoln
2016-04-01
Rainwater harvesting is a water-saving alternative strategy that presents many advantages and can provide solutions to major water resources problems, such as fresh water scarcity, urban stream degradation, and flooding. In recent years, these problems have become global challenges due to climate change, population growth, and increasing urbanisation. Generally, roofs are the first surfaces to come into contact with rainwater; thus, they are the best candidates for rainwater harvesting. In this context, the correct evaluation of roof runoff quantity and quality is essential to effectively design rainwater harvesting systems. Despite this, many studies focus on the qualitative aspects to the detriment of the quantitative ones. Laboratory studies using rainfall simulators have been widely used to investigate rainfall-runoff processes. These studies enable a detailed exploration and systematic replication of a large range of hydrologic conditions, such as rainfall spatial and temporal characteristics, providing a fast way to obtain precise and consistent data that can be used to calibrate and validate numerical models. This study aims to evaluate the performance of a kinematic-wave-based numerical model in simulating runoff on sloping roofs, by comparing the numerical results with those obtained from laboratory rainfall simulations on a real-scale ceramic tile roof (Lusa tiles). For all studied slopes, the simulated discharge hydrographs showed a good fit to the observed ones, with coefficient of determination and Nash-Sutcliffe efficiency values close to 1.0. In particular, peak discharges, times to peak, and peak durations were very well simulated.
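The goodness-of-fit measures quoted above are straightforward to compute; the Nash-Sutcliffe efficiency compares model error against the variance of the observations. A sketch on a synthetic hydrograph (the discharge values are invented, not the Lusa-tile data):

```python
def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - sum((obs-sim)^2) / sum((obs-mean)^2).
    NSE = 1 is a perfect fit; NSE <= 0 means no better than the mean."""
    mean = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean) ** 2 for o in obs)
    return 1.0 - num / den

# Synthetic hydrograph (illustrative), with a slightly mis-predicted peak:
obs = [0.0, 0.4, 1.2, 2.0, 1.5, 0.8, 0.3]
sim = [0.0, 0.5, 1.1, 1.9, 1.5, 0.7, 0.3]
nse = nash_sutcliffe(obs, sim)
print(round(nse, 3))   # close to 1.0, as reported in the study
```

Unlike the coefficient of determination, NSE penalizes bias as well as scatter, which is why hydrological studies usually report both.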
NASA Technical Reports Server (NTRS)
Morris, Philip J.; McLaughlin, Dennis K.; Gabrielson, Thomas B.; Boluriaan, Said
2004-01-01
This report describes the activities completed under a grant from the NASA Langley Research Center to develop a plan for the assessment, improvement, and deployment of a Radar Acoustic Sounding System (RASS) for the detection of wake vortices. A brief review is provided of existing alternative instruments for wake vortex detection. This is followed by a review of previous implementations and assessments of a RASS. As a result of this review, it is concluded that the basic features of a RASS have several advantages over other commonly used wake vortex detection and measurement systems. Most important of these features are the good fidelity of the measurements and the potential for all-weather operation. To realize the full potential of this remote sensing instrument, a plan for the development of a RASS designed specifically for wake vortex detection and measurement has been prepared. To keep costs to a minimum, this program would start with the development of an inexpensive laboratory-scale version of a RASS system. The new instrument would be developed in several stages, each allowing for a critical assessment of the instrument's potential and limitations. The instrument, in its initial stages of development, would be tested in a controlled laboratory environment. A jet vortex simulator, a prototype version of which has already been fabricated, would be interrogated by the RASS system. The details of the laboratory vortex would be measured using a Particle Image Velocimetry (PIV) system. In the early development stages, the scattered radar signal would be digitized and the signal post-processed to determine how extensively and accurately the RASS could measure properties of the wake vortex. If the initial tests prove to be successful, a real-time, digital signal processing system would be developed as a component of the RASS system.
At each stage of the instrument development and testing, the implications of the scaling required for a full-scale instrument would be considered. It is concluded that a RASS system, developed for the specific application of wake vortex detection, could become part of a robust Aircraft Vortex Spacing System (AVOSS). This system, in turn, could contribute to Reduced Spacing Operations (RSO) in US airports and improvements in Terminal Area Productivity (TAP).
Probing free-space quantum channels with laboratory-based experiments
NASA Astrophysics Data System (ADS)
Bohmann, M.; Kruse, R.; Sperling, J.; Silberhorn, C.; Vogel, W.
2017-06-01
Atmospheric channels are a promising candidate to establish secure quantum communication on a global scale. However, due to their turbulent nature, it is crucial to understand the impact of the atmosphere on the quantum properties of light and examine it experimentally. In this paper, we introduce a method to probe atmospheric free-space links with quantum light on a laboratory scale. In contrast to previous works, our method models arbitrary intensity losses caused by turbulence to emulate general atmospheric conditions. This allows us to characterize turbulent quantum channels in a well-controlled manner. To implement this technique, we perform a series of measurements with different constant attenuations and simulate the fluctuating losses by combining the obtained data. We directly test the proposed method with an on-chip source of nonclassical light and a time-bin-multiplexed detection system. With the obtained data, we characterize the nonclassicality of the generated states for different atmospheric noise models and analyze a postselection protocol. This general technique in atmospheric quantum optics allows for studying turbulent quantum channels and predicting their properties for future applications.
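The core of the proposed method is combining runs at fixed attenuations, weighted by a probability distribution of the transmission (PDT), to emulate a fluctuating channel. A minimal numeric sketch (the transmission grid, count rates, and PDT weights below are all invented for illustration):

```python
def fluctuating_channel_mean(values_by_T, pdt_weights):
    """Emulate a turbulent channel from constant-attenuation runs:
    average a measured quantity over the probability distribution of
    the transmission (PDT), as in ensemble-combining the data."""
    return sum(w * v for v, w in zip(values_by_T, pdt_weights))

# Illustrative: mean photocounts measured at four fixed transmissions T,
# assumed to scale linearly with T, combined under an assumed PDT.
T = [0.2, 0.4, 0.6, 0.8]
counts = [t * 100.0 for t in T]
weights = [0.1, 0.3, 0.4, 0.2]       # hypothetical PDT, sums to 1
mean_counts = fluctuating_channel_mean(counts, weights)
print(mean_counts)
```

Swapping in a different weight vector emulates a different atmospheric noise model, which is exactly how the paper characterizes nonclassicality under several turbulence assumptions from one data set.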
Improved Pyrolysis Micro-Reactor Design via Computational Fluid Dynamics Simulations
2017-05-23
Ghanshyam L. Vaghjiani, Aerospace Systems Directorate, Air Force Research Laboratory (AFRL/RQRS), Edwards AFB, CA 93524. Email: ghanshyam.vaghjiani@us.af.mil. DISTRIBUTION A: Approved for public release.
HOMER Economic Models - US Navy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bush, Jason William; Myers, Kurt Steven
This letter report has been prepared by Idaho National Laboratory for US Navy NAVFAC EXWC to support the testing of pre-commercial SIREN (Simulated Integration of Renewable Energy Networks) computer software models. In the logistics mode, SIREN software simulates the combination of renewable power sources (solar arrays, wind turbines, and energy storage systems) in supplying an electrical demand. NAVFAC EXWC will create SIREN software logistics models of existing or planned renewable energy projects at five Navy locations (San Nicolas Island, AUTEC, New London, & China Lake), and INL will deliver additional HOMER computer models for comparative analysis. In the transient mode, SIREN simulates the short time-scale variation of electrical parameters when a power outage or other destabilizing event occurs. In the HOMER model, a variety of inputs are entered, such as location coordinates, generators, PV arrays, wind turbines, batteries, converters, grid costs/usage, solar resources, wind resources, temperatures, fuels, and electric loads. HOMER's optimization and sensitivity analysis algorithms then evaluate the economic and technical feasibility of these technology options and account for variations in technology costs, electric load, and energy resource availability. The Navy can then use HOMER's optimization and sensitivity results for comparison with those of the SIREN model. The U.S. Department of Energy (DOE) Idaho National Laboratory (INL) possesses unique expertise and experience in the software, hardware, and systems design for the integration of renewable energy into the electrical grid. NAVFAC EXWC will draw upon this expertise to complete mission requirements.
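The economic core of a HOMER-style comparison is annualizing capital with the capital recovery factor and dividing total annualized cost by the energy served. A simplified sketch (the cost figures below are invented; HOMER's actual optimization handles many more cost components and sensitivity cases):

```python
def crf(i, n):
    """Capital recovery factor: annualizes a present cost over n years
    at real interest rate i: CRF = i(1+i)^n / ((1+i)^n - 1)."""
    f = (1.0 + i) ** n
    return i * f / (f - 1.0)

def levelized_cost(capital, om_per_yr, i, n, kwh_per_yr):
    """Simplified levelized cost of energy ($/kWh): annualized capital
    plus O&M, divided by annual energy served."""
    return (capital * crf(i, n) + om_per_yr) / kwh_per_yr

# Illustrative PV project: $2M capital, $20k/yr O&M, 6% rate,
# 25-year lifetime, 1.5 GWh/yr served.
lcoe = levelized_cost(2.0e6, 2.0e4, 0.06, 25, 1.5e6)
print(round(lcoe, 3))
```

Running the same calculation for each candidate configuration and ranking by cost is, in essence, the comparison the Navy would make between the HOMER and SIREN outputs.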
Realtime monitoring of bridge scour using remote monitoring technology
DOT National Transportation Integrated Search
2011-02-01
The research performed in this project focuses on the application of instruments including accelerometers and tiltmeters to monitor bridge scour. First, two large-scale laboratory experiments were performed. One experiment is the simulation of a ...
NASA Astrophysics Data System (ADS)
Bolhuis, Peter
Important reaction-diffusion processes, such as biochemical networks in living cells, or self-assembling soft matter, span many orders in length and time scales. In these systems, the reactants' spatial dynamics at mesoscopic length and time scales of microns and seconds is coupled to the reactions between the molecules at microscopic length and time scales of nanometers and milliseconds. This wide range of length and time scales makes these systems notoriously difficult to simulate. While mean-field rate equations cannot describe such processes, the mesoscopic Green's Function Reaction Dynamics (GFRD) method enables efficient simulation at the particle level provided the microscopic dynamics can be integrated out. Yet, many processes exhibit non-trivial microscopic dynamics that can qualitatively change the macroscopic behavior, calling for an atomistic, microscopic description. The recently developed multiscale Molecular Dynamics Green's Function Reaction Dynamics (MD-GFRD) approach combines GFRD for simulating the system at the mesoscopic scale where particles are far apart, with microscopic Molecular (or Brownian) Dynamics, for simulating the system at the microscopic scale where reactants are in close proximity. The association and dissociation of particles are treated with rare event path sampling techniques. I will illustrate the efficiency of this method for patchy particle systems. Replacing the microscopic regime with a Markov State Model avoids the microscopic regime completely. The MSM is then pre-computed using advanced path-sampling techniques such as multistate transition interface sampling. I illustrate this approach on patchy particle systems that show multiple modes of binding. MD-GFRD is generic, and can be used to efficiently simulate reaction-diffusion systems at the particle level, including the orientational dynamics, opening up the possibility for large-scale simulations of e.g. protein signaling networks.
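The well-mixed limit that GFRD refines spatially is the chemical master equation, which the Gillespie algorithm simulates exactly. A minimal sketch for a single bimolecular reaction (the rate constant and copy numbers are toy values chosen for illustration, not from any of the patchy-particle systems discussed):

```python
import random

def gillespie_ab(nA, nB, k, t_end, seed=1):
    """Gillespie SSA for the single reaction A + B -> C with stochastic
    rate constant k (well-mixed limit, the regime GFRD refines spatially)."""
    rng = random.Random(seed)
    t, nC = 0.0, 0
    while nA > 0 and nB > 0:
        a = k * nA * nB                 # total propensity
        t += rng.expovariate(a)         # exponential wait to next reaction
        if t > t_end:
            break
        nA, nB, nC = nA - 1, nB - 1, nC + 1
    return nA, nB, nC

res = gillespie_ab(nA=50, nB=50, k=0.01, t_end=100.0)
print(res)   # most A and B converted to C by t_end
```

GFRD goes beyond this picture by propagating particle positions with Green's functions, so the propensity is no longer a single well-mixed rate but depends on pair separations.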
DOE Office of Scientific and Technical Information (OSTI.GOV)
Long, M. S.; Keene, William C.; Zhang, J.
2016-11-08
Primary marine aerosol (PMA) is emitted into the atmosphere via breaking wind waves on the ocean surface. Most parameterizations of PMA emissions use 10-meter wind speed as a proxy for wave action. This investigation coupled the third-generation prognostic WAVEWATCH-III wind-wave model within a coupled Earth system model (ESM) to drive PMA production using the wave energy dissipation rate (analogous to whitecapping) in place of 10-meter wind speed. The wind-speed parameterization did not capture basin-scale variability in the relations between wind and wave fields. Overall, the wave parameterization did not improve the comparison between simulated and measured AOD or Na+, thus highlighting large remaining uncertainties in model physics. Results confirm the efficacy of prognostic wind-wave models for air-sea exchange studies coupled with laboratory- and field-based characterizations of the primary physical drivers of PMA production. No discernible correlations were evident between simulated PMA fields and observed chlorophyll or sea surface temperature.
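The 10-meter wind-speed proxy discussed above usually enters through a whitecap-coverage power law; the commonly used Monahan-type form is W = 3.84e-6 * U10^3.41. A sketch of the emission scaling this implies (the specific source function used in the study may differ):

```python
def whitecap_fraction(u10):
    """Monahan-type whitecap coverage: W = 3.84e-6 * U10**3.41
    (dimensionless fraction of sea surface; the usual proxy behind
    wind-speed-based PMA source functions)."""
    return 3.84e-6 * u10 ** 3.41

# Relative PMA emission scaling between 5 m/s and 15 m/s winds:
ratio = whitecap_fraction(15.0) / whitecap_fraction(5.0)
print(round(ratio, 1))   # ~ 3**3.41, roughly 40x more whitecap area
```

The steep 3.41 exponent is why basin-scale differences between the wind field and the actual wave-dissipation field, as noted in the abstract, can translate into large PMA emission differences.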
Energy Systems Integration News | Energy Systems Integration Facility |
Power Grid Simulation at a Distance: NREL and Idaho National Laboratory (INL) have successfully connected power-grid simulations running at the two laboratories. The work was presented in the "Bus.py: A GridLAB-D Communication Interface for Smart Modeling and Simulation" session at the IEEE PES General Meeting in Denver, Colorado.
PILOT-SCALE REMOVAL OF FLUORIDE FROM LEGACY PLUTONIUM MATERIALS USING VACUUM SALT DISTILLATION
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pierce, R. A.; Pak, D. J.
2012-09-11
Between September 2009 and January 2011, the Savannah River National Laboratory (SRNL) and HB-Line designed, developed, tested, and successfully deployed a system for the distillation of chloride salts. In 2011, SRNL adapted the technology for the removal of fluoride from fluoride-bearing salts. The method involved an in situ reaction between potassium hydroxide (KOH) and the fluoride salt to yield potassium fluoride (KF) and the corresponding oxide. The KF and excess KOH can be distilled below 1000 °C using vacuum salt distillation (VSD). The apparatus for vacuum distillation contains a zone heated by a furnace and a zone actively cooled using either recirculated water or compressed air. During a vacuum distillation operation, a sample boat containing the feed material is placed into the apparatus while it is cool, and the system is sealed. The system is evacuated using a vacuum pump. Once a sufficient vacuum is attained, heating begins. Volatile salts distill from the heated zone to the cooled zone, where they condense, leaving behind the non-volatile material in the feed boat. Studies discussed in this report used non-radioactive simulants in small-scale and pilot-scale systems, as well as radioactive testing of a small-scale system with plutonium-bearing materials. Aspects of interest include removable liner design considerations, boat materials, in-line moisture absorption, and salt deposition.
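The in situ conversion described above can be written out for a representative alkaline-earth fluoride; calcium fluoride is used here purely as an illustrative example, not a compound named in the report:

```latex
\mathrm{CaF_2} + 2\,\mathrm{KOH} \;\rightarrow\; 2\,\mathrm{KF} + \mathrm{CaO} + \mathrm{H_2O}
```

The volatile KF and any excess KOH then distill to the cooled zone, while the oxide remains with the non-volatile material in the feed boat.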
PDC-bit performance under simulated borehole conditions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, E.E.; Azar, J.J.
1993-09-01
Laboratory drilling tests were used to investigate the effects of pressure on polycrystalline-diamond-compact (PDC) drill-bit performance. Catoosa shale core samples were drilled with PDC and roller-cone bits at up to 1,750-psi confining pressure. All tests were conducted in a controlled environment with a full-scale laboratory drilling system. Test results indicate that, under similar operating conditions, increases in confining pressure reduce PDC-bit performance as much as or more than conventional-rock-bit performance. Specific energy calculations indicate that a combination of rock strength, chip hold-down, and bit balling may have reduced performance. Quantifying the degree to which pressure reduces PDC-bit performance will help researchers interpret test results and improve bit designs, and will help drilling engineers run PDC bits more effectively in the field.
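The specific energy referred to above is commonly computed from drilling parameters via Teale's mechanical specific energy relation (axial term plus rotary term). The formula below is the standard textbook form; the numerical inputs in the test are illustrative, not data from these experiments:

```python
import math

def mechanical_specific_energy(wob_N, torque_Nm, rpm, rop_m_per_hr, bit_diam_m):
    """Teale-style mechanical specific energy in Pa:
    MSE = WOB/A + (2*pi*N*T)/(A*ROP), with N in rev/s and ROP in m/s."""
    area = math.pi * bit_diam_m**2 / 4.0   # bit cross-sectional area, m^2
    n = rpm / 60.0                         # rotary speed, rev/s
    rop = rop_m_per_hr / 3600.0            # rate of penetration, m/s
    return wob_N / area + (2.0 * math.pi * n * torque_Nm) / (area * rop)
```

A rise in MSE at constant operating parameters is one way the pressure-induced performance loss described in the abstract shows up in practice.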
TREATMENT OF INORGANIC CONTAMINANTS USING PERMEABLE REACTIVE BARRIERS
Permeable reactive barriers are an emerging alternative to traditional pump and treat systems for groundwater remediation. This technique has progressed rapidly over the past decade from laboratory bench-scale studies to full-scale implementation. Laboratory studies indicate the ...
Fernández-Barrera, Andrés H; Castro-Fresno, Daniel; Rodriguez-Hernandez, Jorge; Vega-Zamanillo, Angel
2011-01-30
Runoff contamination has motivated the development of different systems for its treatment in order to decrease the pollutant load that is discharged into natural water bodies. In the long term, these systems may undergo operational problems. This paper presents the results obtained in a laboratory study with a 1:1 scale prototype of a System of Catchment, Pre-treatment and Treatment (SCPT) of runoff waters. The analysis aims to establish the operational behaviour of the SCPT in the long term with respect to oil degradation and hydraulic conductivity in the geotextile filter. It is concluded that bio-degradation processes take place inside the SCPT and that hydraulic conductivity of the geotextile filtration system decreases slowly with successive simulated runoff events. Copyright © 2010 Elsevier B.V. All rights reserved.
Large Eddy Simulation of a Turbulent Jet
NASA Technical Reports Server (NTRS)
Webb, A. T.; Mansour, Nagi N.
2001-01-01
Here we present the results of a Large Eddy Simulation of a non-buoyant jet issuing from a circular orifice in a wall and developing in neutral surroundings. The effects of the subgrid scales on the large eddies have been modeled with the dynamic large eddy simulation model applied to the fully 3D domain in spherical coordinates. The simulation captures the unsteady motions of the large scales within the jet as well as the laminar motions in the entrainment region surrounding the jet. The computed time-averaged statistics (mean velocity, concentration, and turbulence parameters) compare well with laboratory data without invoking an empirical entrainment coefficient as employed by line integral models. The use of the large eddy simulation technique allows examination of unsteady and inhomogeneous features such as the evolution of eddies and the details of the entrainment process.
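For reference, time-averaged statistics of such a round jet are usually compared against the standard far-field self-similar decay laws (textbook forms, not taken from this abstract; $B_u$ and $B_c$ are empirical decay constants and $x_0$ a virtual origin):

```latex
\frac{U_c(x)}{U_j} = B_u\,\frac{d}{x - x_0}, \qquad
\frac{C_c(x)}{C_j} = B_c\,\frac{d}{x - x_0}
```

Here $U_c$ and $C_c$ are the centerline mean velocity and concentration, $U_j$ and $C_j$ the orifice exit values, and $d$ the orifice diameter.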
Dual Arm Work Package performance estimates and telerobot task network simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Draper, J.V.; Blair, L.M.
1997-02-01
This paper describes the methodology and results of a network simulation study of the Dual Arm Work Package (DAWP), to be employed for dismantling the Argonne National Laboratory CP-5 reactor. The development of the simulation model was based upon the results of a task analysis for the same system. This study was performed by the Oak Ridge National Laboratory (ORNL), in the Robotics and Process Systems Division. Funding was provided by the US Department of Energy's Office of Technology Development, Robotics Technology Development Program (RTDP). The RTDP is developing methods of computer simulation to estimate telerobotic system performance. Data were collected to provide point estimates to be used in a task network simulation model. Three skilled operators performed six repetitions of a pipe cutting task representative of typical teleoperation cutting operations.
Biohazards Assessment in Large-Scale Zonal Centrifugation
Baldwin, C. L.; Lemp, J. F.; Barbeito, M. S.
1975-01-01
A study was conducted to determine the biohazards associated with use of the large-scale zonal centrifuge for purification of moderate-risk oncogenic viruses. To safely and conveniently assess the hazard, coliphage T3 was substituted for the virus in a typical processing procedure performed in a National Cancer Institute contract laboratory. Risk of personnel exposure was found to be minimal during optimal operation, but a definite potential for virus release from a number of centrifuge components during mechanical malfunction was shown by assay of surface, liquid, and air samples collected during the processing. High concentrations of phage were detected in the turbine air exhaust and the seal coolant system when faulty seals were employed. The simulant virus was also found on both centrifuge chamber interior and rotor surfaces. PMID:1124921
Next-generation genome-scale models for metabolic engineering.
King, Zachary A; Lloyd, Colton J; Feist, Adam M; Palsson, Bernhard O
2015-12-01
Constraint-based reconstruction and analysis (COBRA) methods have become widely used tools for metabolic engineering in both academic and industrial laboratories. By employing a genome-scale in silico representation of the metabolic network of a host organism, COBRA methods can be used to predict optimal genetic modifications that improve the rate and yield of chemical production. A new generation of COBRA models and methods is now being developed, encompassing many biological processes and simulation strategies, and these next-generation models enable new types of predictions. Here, three key examples of applying COBRA methods to strain optimization are presented and discussed. Then, an outlook is provided on the next generation of COBRA models and the new types of predictions they will enable for systems metabolic engineering. Copyright © 2014 Elsevier Ltd. All rights reserved.
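The core prediction step behind COBRA methods, flux balance analysis, can be illustrated on a toy three-reaction network: maximize a biomass flux subject to steady-state mass balance S·v = 0 and flux bounds. The network, bounds, and use of `scipy.optimize.linprog` are illustrative assumptions; genome-scale models use dedicated COBRA packages and thousands of reactions:

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: uptake v1 (A_ext -> A), conversion v2 (A -> B),
# biomass drain v3 (B -> ). Rows of S are metabolites A and B.
S = np.array([[1.0, -1.0,  0.0],   # A: produced by v1, consumed by v2
              [0.0,  1.0, -1.0]])  # B: produced by v2, consumed by v3
c = np.array([0.0, 0.0, -1.0])     # maximize v3 == minimize -v3
bounds = [(0.0, 10.0),             # uptake capped at 10 (the "medium")
          (0.0, 1000.0),
          (0.0, 1000.0)]
res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
```

As expected for this chain, the optimal biomass flux equals the uptake cap; a COBRA-style knockout or bound change is simulated by editing `bounds` and re-solving.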
Model Parameter Variability for Enhanced Anaerobic Bioremediation of DNAPL Source Zones
NASA Astrophysics Data System (ADS)
Mao, X.; Gerhard, J. I.; Barry, D. A.
2005-12-01
The objective of the Source Area Bioremediation (SABRE) project, an international collaboration of twelve companies, two government agencies and three research institutions, is to evaluate the performance of enhanced anaerobic bioremediation for the treatment of chlorinated ethene source areas containing dense, non-aqueous phase liquids (DNAPL). This four-year, 5.7-million-dollar research effort focuses on a pilot-scale demonstration of enhanced bioremediation at a trichloroethene (TCE) DNAPL field site in the United Kingdom, and includes a significant program of laboratory and modelling studies. Prior to field implementation, a large-scale, multi-laboratory microcosm study was performed to determine the optimal system properties to support dehalogenation of TCE in site soil and groundwater. This statistically based suite of experiments measured the influence of key variables (electron donor, nutrient addition, bioaugmentation, TCE concentration and sulphate concentration) in promoting the reductive dechlorination of TCE to ethene. In addition, a comprehensive biogeochemical numerical model was developed for simulating the anaerobic dehalogenation of chlorinated ethenes. An appropriate (reduced) version of this model was combined with a parameter estimation method based on fitting of the experimental results. Each of over 150 individual microcosm calibrations involved matching predicted and observed time-varying concentrations of all chlorinated compounds. This study focuses on an analysis of this suite of fitted model parameter values, including the statistical correlation between parameters typically employed in standard Michaelis-Menten-type rate descriptions (e.g., maximum dechlorination rates, half-saturation constants) and the key experimental variables. The analysis provides insight into the degree to which aqueous-phase TCE and cis-DCE inhibit dechlorination of less-chlorinated compounds.
Overall, this work provides a database of the numerical modelling parameters typically employed for simulating TCE dechlorination relevant to a range of system conditions (e.g., bioaugmented, high TCE concentrations, etc.). The significance of the obtained variability of parameters is illustrated with one-dimensional simulations of enhanced anaerobic bioremediation of residual TCE DNAPL.
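The Michaelis-Menten-type rate descriptions referred to above have the generic form r = Vmax·C/(K + C), often extended with a competitive-inhibition term to represent the kind of inhibition by TCE and cis-DCE discussed in the abstract. The functional form below is the standard one; all parameter values are illustrative, not fitted values from the study:

```python
def dechlorination_rate(c, vmax, k_half, inhibitor=0.0, k_i=float("inf")):
    """Michaelis-Menten rate with optional competitive inhibition:
    r = vmax * c / (k_half * (1 + I/k_i) + c).
    c, vmax, k_half, inhibitor, k_i are all in consistent (illustrative) units;
    the default k_i = inf disables the inhibition term."""
    return vmax * c / (k_half * (1.0 + inhibitor / k_i) + c)
```

At c = k_half with no inhibitor the rate is Vmax/2, which is the defining property of the half-saturation constant estimated in the microcosm calibrations.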
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yuan, Haomin; Solberg, Jerome; Merzari, Elia
2017-08-01
This paper describes a numerical study of flow-induced vibration in a helical coil steam generator experiment conducted at Argonne National Laboratory in the 1980s. In the experiment, a half-scale sector model of a steam generator helical coil tube bank was subjected to still and flowing air and water, and the vibrational characteristics were recorded. The research detailed in this document utilizes the multi-physics simulation toolkit SHARP, developed at Argonne National Laboratory in cooperation with Lawrence Livermore National Laboratory, to simulate the experiment. SHARP uses the spectral element code Nek5000 for fluid dynamics analysis and the finite element code DIABLO for structural analysis. The flow around the coil tubes is modeled in Nek5000 using a large eddy simulation turbulence model. Transient pressure data on the tube surfaces are sampled and transferred to DIABLO for the structural simulation. The structural response is simulated in DIABLO via an implicit time-marching algorithm and a combination of continuum elements and structural shells. Tube vibration data (acceleration and frequency) are sampled and compared with the experimental data. Currently, only one-way coupling is used: pressure loads from the fluid simulation are transferred to the structural simulation, but the resulting structural displacements are not fed back to the fluid simulation.
2010-08-04
airway management practices in the PACU has been deemed successful by KMC anesthesia management. SUBJECT TERMS: Human Patient Simulation; Emergency...of South Alabama and KMC Clinical Research Laboratory (CRL) were received. The training sessions were planned for two 4-hour sessions in the HPS...assistance of the KMC CRL research statistician. Findings: Results of the NLN Simulation Design Scale surveys showed seven of eight nurses in the
NASA Astrophysics Data System (ADS)
Scarlat, Raluca O.; Peterson, Per F.
2014-01-01
The fluoride salt cooled high temperature reactor (FHR) is a class of fission reactor designs that use liquid fluoride salt coolant, TRISO coated particle fuel, and graphite moderator. Heavy ion fusion (HIF) can likewise make use of liquid fluoride salts to create thick or thin liquid layers that protect structures in the target chamber from ablation by target X-rays and damage from fusion neutron irradiation. This presentation summarizes ongoing work in support of design development and safety analysis of FHR systems. Development work for fluoride salt systems with application to both FHR and HIF includes thermal-hydraulic modeling and experimentation, salt chemistry control, tritium management, salt corrosion of metallic alloys, and development of major components (e.g., pumps, heat exchangers) and gas-Brayton cycle power conversion systems. In support of FHR development, a thermal-hydraulic experimental test bay for separate effects tests (SETs) and integral effects tests (IETs) was built at UC Berkeley, and a second IET facility is under design. The experiments investigate heat transfer and fluid dynamics and make use of oils as simulant fluids at reduced scale, temperature, and power relative to the prototypical salt-cooled system. With direct application to HIF, vortex tube flow was investigated in scaled experiments with mineral oil. Liquid jet response to impulse loading was likewise studied using water as a simulant fluid. A set of four workshops engaging industry and national laboratory experts was completed in 2012, with the goal of developing a technology pathway to the design and licensing of a commercial FHR. The pathway will include experimental and modeling efforts at universities and national laboratories, requirements for a component test facility for reliability testing of fluoride salt equipment at prototypical conditions, requirements for an FHR test reactor, and development of a pre-conceptual design for a commercial reactor.
NASA Astrophysics Data System (ADS)
Krauland, Christine; Drake, R.; Loupias, B.; Falize, E.; Busschaert, C.; Ravasio, A.; Yurchak, R.; Pelka, A.; Koenig, M.; Kuranz, C. C.; Plewa, T.; Huntington, C. M.; Kaczala, D. N.; Klein, S.; Sweeney, R.; Villete, B.; Young, R.; Keiter, P. A.
2012-05-01
We present results from high-energy-density (HED) laboratory experiments that explore the contribution of radiative shock waves to the evolving dynamics of the cataclysmic variable (CV) systems in which they reside. CVs can be classified under two main categories, non-magnetic and magnetic. In the process of accretion, both types involve strongly radiating shocks that provide the main source of radiation in the binary systems. This radiation can cause varying structure to develop depending on the optical properties of the material on either side of the shock. The ability of high-intensity lasers to create large energy densities in targets of millimeter-scale volume makes it feasible to create similar radiative shocks in the laboratory. We provide an overview of both CV systems and their connection to the designed and executed laboratory experiments performed on two laser facilities. Available data and accompanying simulations will likewise be shown. Funded by the NNSA-DS and SC-OFES Joint Prog. in High-Energy-Density Lab. Plasmas, by the Nat. Laser User Facility Prog. in NNSA-DS and by the Predictive Sci. Acad. Alliances Prog. in NNSA-ASC, under grant numbers DE-FG52-09NA29548, DE-FG52-09NA29034, and DE-FC52-08NA28616.
Virtual and remote robotic laboratory using EJS, MATLAB and LabVIEW.
Chaos, Dictino; Chacón, Jesús; Lopez-Orozco, Jose Antonio; Dormido, Sebastián
2013-02-21
This paper describes the design and implementation of a virtual and remote laboratory based on Easy Java Simulations (EJS) and LabVIEW. The main application of this laboratory is to improve the study of sensors in Mobile Robotics, dealing with the problems that arise in real-world experiments. This laboratory allows users to work from their homes, tele-operating a real robot that takes measurements from its sensors in order to obtain a map of its environment. In addition, the application allows interacting with a robot simulation (virtual laboratory) or with a real robot (remote laboratory), with the same simple and intuitive graphical user interface in EJS. Thus, students can develop signal processing and control algorithms for the robot in simulation and then deploy them on the real robot for testing purposes. Practical examples of application of the laboratory on the inter-University Master of Systems Engineering and Automatic Control are presented.
Establishment and assessment of a novel cleaner production process of corn grain fuel ethanol.
Wang, Ke; Zhang, Jianhua; Tang, Lei; Zhang, Hongjian; Zhang, Guiying; Yang, Xizhao; Liu, Pei; Mao, Zhonggui
2013-11-01
An integrated corn ethanol-methane fermentation system was proposed to solve the problem of stillage handling, where thin stillage was treated by anaerobic digestion and then reused to make mash for the following ethanol fermentation. This system was evaluated at laboratory and pilot scale. Anaerobic digestion of thin stillage ran steadily with total chemical oxygen demand removal efficiency of 98% at laboratory scale and 97% at pilot scale. Ethanol production was not influenced by recycling anaerobic digestion effluent at laboratory and pilot scale. Compared with dried distillers' grains with solubles produced in conventional process, dried distillers' grains in the proposed system exhibited higher quality because of increased protein concentration and decreased salts concentration. Energetic assessment indicated that application of this novel process enhanced the net energy balance ratio from 1.26 (conventional process) to 1.76. In conclusion, the proposed system possessed technical advantage over the conventional process for corn fuel ethanol production. Copyright © 2013 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Doan, Minh; Padricelli, Claudrio; Obi, Shinnosuke; Totsuka, Yoshitaka
2017-11-01
We present the torque and power measurement of laboratory-scale counter-rotating vertical-axis hydrokinetic turbines, built around a magnetic hysteresis brake as the speed controller and a Hall-effect sensor as the rotational speed transducer. A pair of straight-bladed turbines, each with three blades, were linked through a transmission of spur gears and timing pulleys and coupled to the electronic instrumentation via flexible shaft couplers. A total of eight experiments in two configurations were conducted in the water channel facility (4 m long, 0.3 m wide, and 0.15 m deep). Power generation of the turbines (0.06 m rotor diameter) was measured and compared with that of single turbines of the same size. The wakes generated in these experiments were also measured by particle image velocimetry (PIV) and numerically simulated by unsteady Reynolds-averaged Navier-Stokes (URANS) simulation using OpenFOAM. Preliminary results from the wake measurements indicated the mechanism behind the enhanced power production of the counter-rotating configuration of vertical-axis turbines.
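The torque-based power measurement described above reduces to two simple relations: shaft power P = Tω from the brake torque and rotational speed, and a power coefficient normalizing by the kinetic power through the rotor's frontal area. A minimal sketch; all numbers are illustrative, and the frontal area used for Cp is an assumption, since conventions vary for vertical-axis rotors:

```python
import math

def rotor_power(torque_Nm, rev_per_s):
    """Shaft power from measured torque and rotational speed: P = T * omega."""
    return torque_Nm * 2.0 * math.pi * rev_per_s

def power_coefficient(power_W, rho, frontal_area_m2, u_m_per_s):
    """Cp: extracted power over the kinetic power 0.5*rho*A*U^3 passing
    through the rotor's frontal area."""
    return power_W / (0.5 * rho * frontal_area_m2 * u_m_per_s**3)
```

Comparing Cp of the counter-rotating pair against a single rotor of the same size is exactly the normalization that makes the reported power comparison meaningful.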
NASA Astrophysics Data System (ADS)
Clark, Stephen; Winske, Dan; Schaeffer, Derek; Everson, Erik; Bondarenko, Anton; Constantin, Carmen; Niemann, Christoph
2014-10-01
We present 3D hybrid simulations of laser-produced expanding debris clouds propagating through a magnetized ambient plasma in the context of magnetized collisionless shocks. New results from the 3D code are compared to previously obtained simulation results using a 2D hybrid code. The 3D code is an extension of a 2D code previously developed at Los Alamos National Laboratory. It has been parallelized and ported to execute in a cluster environment. The new simulations are used to verify scaling relationships, such as shock onset time and coupling parameter (Rm/ρd), developed via 2D simulations. Previous 2D results focus primarily on laboratory shock formation relevant to experiments being performed on the Large Plasma Device, where the shock propagates across the magnetic field. The new 3D simulations show wave structure and dynamics oblique to the magnetic field that introduce new physics to be considered in future experiments.
NASA Technical Reports Server (NTRS)
Stevens, N. J.
1979-01-01
Cases where the charged-particle environment acts on the spacecraft (e.g., spacecraft charging phenomena) and cases where a system on the spacecraft causes the interaction (e.g., high voltage space power systems) are considered. Both categories were studied in ground simulation facilities to understand the processes involved and to measure the pertinent parameters. Computer simulations are based on the NASA Charging Analyzer Program (NASCAP) code. Analytical models are developed in this code and verified against the experimental data. Extrapolations from the small test samples to space conditions are made with this code. Typical results from laboratory and computer simulations are presented for both types of interactions. Extrapolations from these simulations to performance in space environments are discussed.
NASA Astrophysics Data System (ADS)
Mori, H.; Trevisan, L.; Sakaki, T.; Cihan, A.; Smits, K. M.; Illangasekare, T. H.
2013-12-01
Multiphase flow models can be used to improve our understanding of the complex behavior of supercritical CO2 (scCO2) in deep saline aquifers and to make predictions for stable storage strategies. These models rely on constitutive relationships such as capillary pressure (Pc)-saturation (Sw) and relative permeability (kr)-saturation (Sw) as input parameters. However, for practical application of these models, such relationships for scCO2-brine systems are not readily available for geological formations. This is due to the complicated and expensive traditional methods often used to obtain these relationships in the laboratory through high-pressure and/or high-temperature controls. A method that has the potential to overcome the difficulty of conducting such experiments is to replicate scCO2 and brine with surrogate fluids that capture the density and viscosity effects, in order to obtain the constitutive relationships under ambient conditions. This study presents an investigation conducted to evaluate this method. An assessment of the method allows us to evaluate the prediction accuracy of multiphase models using the constitutive relationships developed from this approach. With this as a goal, the study reports multiple laboratory column experiments conducted to measure these relationships. The obtained relationships were then used in the multiphase flow simulator TOUGH2/T2VOC to explore capillary trapping mechanisms of scCO2. A comparison of the model simulation to experimental observation was used to assess the accuracy of the measured constitutive relationships. Experimental data confirmed, as expected, that the scaling method cannot be used to obtain the residual and irreducible saturations. The results also showed that the van Genuchten-Mualem model was not able to match the independently measured kr data obtained from column experiments. Simulated results of fluid saturations were compared with saturation measurements obtained using x-ray attenuation.
This comparison demonstrated that the experimentally derived constitutive relationships matched the experimental data more accurately than the simulation using constitutive relationships derived from scaling methods and van Genuchten - Mualem model. However, simulated imbibition fronts did not match well, suggesting the need for further study. In general, the study demonstrated the feasibility of using surrogate fluids to obtain both Pc - Sw and kr - Sw relationships to be used in multiphase models of scCO2 migration and entrapment.
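The van Genuchten-Mualem relative permeability model tested above has a standard closed form: kr(Se) = Se^(1/2) * [1 - (1 - Se^(1/m))^m]^2 with m = 1 - 1/n. A minimal sketch of this textbook relationship; the parameter value in the test is illustrative, not fitted to the study's data:

```python
def vg_mualem_kr(se, n):
    """Wetting-phase relative permeability from effective saturation Se
    (0 < Se <= 1) using the van Genuchten-Mualem closed form, m = 1 - 1/n."""
    m = 1.0 - 1.0 / n
    return se**0.5 * (1.0 - (1.0 - se**(1.0 / m))**m) ** 2
```

The mismatch reported in the abstract is against independently measured kr data; the closed form above is what the model predicts once Pc-Sw fitting fixes n.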
Physical Interpretation of Laboratory Friction Laws in the Context of Damage Physics
NASA Astrophysics Data System (ADS)
Rundle, J. B.; Tiampo, K. F.; Martins, J. S.; Klein, W.
2002-12-01
Friction on sliding surfaces is ultimately related to processes of surface damage, and can be understood in the context of the physics of dynamical threshold systems. Threshold systems are known to be some of the most important nonlinear, self-organizing systems in nature, including networks of earthquake faults, neural networks, superconductors and semiconductors, and the World Wide Web, as well as political, social, and ecological systems. All of these systems have dynamics that are strongly correlated in space and time, and all typically display a multiplicity of spatial and temporal scales. Here we discuss the physics of self-organization and damage in earthquake threshold systems at the "microscopic" laboratory scale, in which consideration of results from simulations leads to dynamical equations that can be used to derive results obtained from sliding friction experiments, specifically the empirical "rate-and-state" friction equations of Ruina. Paradoxically, in all of these dissipative systems, long-range interactions induce the existence of locally ergodic dynamics, even though dissipation of energy is involved. The existence of dissipative effects leads to the appearance of a "leaky threshold" dynamics, equivalent to a new scaling field that controls the size of nucleation events relative to the size of the background fluctuations. The corresponding appearance of a mean field spinodal leads to a general coarse-grained equation, which expresses the balance between the rate of stress supplied and the rate of stress dissipated in the processes leading to surface damage. We can use ideas from the thermodynamics and kinetics of phase transitions to develop the exact form of the rate-and-state equations, giving clear physical meaning to all terms and variables. Ultimately, the self-organizing dynamics arise from the appearance of an energy landscape in these systems, which in turn arises from the strong correlations and mean field nature of the physics.
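For context, the empirical rate-and-state friction equations referred to above are conventionally written as follows (standard forms from the friction literature, reproduced here for reference, not taken from this abstract):

```latex
\mu = \mu_0 + a \ln\!\frac{V}{V_0} + b \ln\!\frac{V_0\,\theta}{D_c},
\qquad
\dot{\theta} = 1 - \frac{V\theta}{D_c} \ \ \text{(aging law)},
\qquad
\dot{\theta} = -\frac{V\theta}{D_c}\,\ln\!\frac{V\theta}{D_c} \ \ \text{(Ruina slip law)}
```

Here $V$ is the slip velocity, $V_0$ a reference velocity, $\theta$ the state variable, $D_c$ the characteristic slip distance, and $a$, $b$ empirical constants; the abstract's program is to derive the physical meaning of these terms from damage physics.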
Fire tests for airplane interior materials
NASA Technical Reports Server (NTRS)
Tustin, E. A.
1980-01-01
Large-scale, simulated fire tests of aircraft interior materials were carried out in a salvaged airliner fuselage. Two "design" fire sources were selected: Jet A fuel ignited in the fuselage midsection, and a trash bag fire. Comparison with six established laboratory fire tests shows that some laboratory tests can rank materials according to heat and smoke production, but existing tests do not characterize toxic gas emissions accurately. The report includes test parameters and test details.
Tuncer, Necibe; Gulbudak, Hayriye; Cannataro, Vincent L; Martcheva, Maia
2016-09-01
In this article, we discuss the structural and practical identifiability of a nested immuno-epidemiological model of arbovirus diseases, where host-vector transmission rate, host recovery, and disease-induced death rates are governed by the within-host immune system. We incorporate the newest ideas and the most up-to-date features of numerical methods to fit multi-scale models to multi-scale data. For an immunological model, we use Rift Valley Fever Virus (RVFV) time-series data obtained from livestock under laboratory experiments, and for an epidemiological model we incorporate a human compartment to the nested model and use the number of human RVFV cases reported by the CDC during the 2006-2007 Kenya outbreak. We show that the immunological model is not structurally identifiable for the measurements of time-series viremia concentrations in the host. Thus, we study the non-dimensionalized and scaled versions of the immunological model and prove that both are structurally globally identifiable. After fixing estimated parameter values for the immunological model derived from the scaled model, we develop a numerical method to fit observable RVFV epidemiological data to the nested model for the remaining parameter values of the multi-scale system. For the given (CDC) data set, Monte Carlo simulations indicate that only three parameters of the epidemiological model are practically identifiable when the immune model parameters are fixed. Alternatively, we fit the multi-scale data to the multi-scale model simultaneously. Monte Carlo simulations for the simultaneous fitting suggest that the parameters of the immunological model and the parameters of the immuno-epidemiological model are practically identifiable. We suggest that analytic approaches for studying the structural identifiability of nested models are a necessity, so that identifiable parameter combinations can be derived to reparameterize the nested model to obtain an identifiable one. 
This is a crucial step in developing multi-scale models which explain multi-scale data.
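The Monte Carlo practical-identifiability check described above can be sketched in a few lines: simulate noisy data from known "true" parameters, refit repeatedly, and inspect the average relative error (ARE) of each estimate. The logistic model, parameter values, and noise level below are illustrative stand-ins, not the RVFV model from the article.

```python
import numpy as np
from scipy.optimize import curve_fit

def viral_load(t, r, K):
    """Logistic within-host growth: V(0) = 1, carrying capacity K, rate r."""
    return K / (1.0 + (K - 1.0) * np.exp(-r * t))

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 25)
true = np.array([1.2, 50.0])              # "true" (r, K), purely illustrative
clean = viral_load(t, *true)

n_runs, sigma = 500, 0.05                 # 5% multiplicative noise
estimates = []
for _ in range(n_runs):
    noisy = clean * (1.0 + sigma * rng.standard_normal(t.size))
    fit, _ = curve_fit(viral_load, t, noisy, p0=[1.0, 40.0], maxfev=5000)
    estimates.append(fit)

# Average relative error per parameter; a large ARE flags a parameter
# that is practically unidentifiable at this noise level.
are = 100.0 * np.mean(np.abs(np.array(estimates) - true) / true, axis=0)
print(f"ARE(r) = {are[0]:.1f}%, ARE(K) = {are[1]:.1f}%")
```

In the article's setting the refits are against the full nested ODE system rather than a closed-form curve, but the ARE bookkeeping is the same.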
NASA Astrophysics Data System (ADS)
Torrealba, V.; Karpyn, Z.; Yoon, H.; Hart, D. B.; Klise, K. A.
2013-12-01
The pore-scale dynamics that govern multiphase flow under variable stress conditions are not well understood. This lack of fundamental understanding limits our ability to quantitatively predict multiphase flow and fluid distributions in natural geologic systems. In this research, we focus on pore-scale, single and multiphase flow properties that impact displacement mechanisms and residual trapping of non-wetting phase under varying stress conditions. X-ray micro-tomography is used to image pore structures and distribution of wetting and non-wetting fluids in water-wet synthetic granular packs, under dynamic load. Micro-tomography images are also used to determine structural features such as medial axis, surface area, and pore body and throat distribution; while the corresponding transport properties are determined from Lattice-Boltzmann simulations performed on lattice replicas of the imaged specimens. Results are used to investigate how inter-granular deformation mechanisms affect fluid displacement and residual trapping at the pore-scale. This will improve our understanding of the dynamic interaction of mechanical deformation and fluid flow during enhanced oil recovery and geologic CO2 sequestration. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
Modern and Unconventional Approaches to Karst Hydrogeology
NASA Astrophysics Data System (ADS)
Sukop, M. C.
2017-12-01
Karst hydrogeology is frequently approached from a hydrograph/statistical perspective in which precipitation/recharge inputs are converted to output hydrographs and the conversion process reflects the hydrology of the system. Karst catchments show hydrological response to short-term meteorological events and to long-term variation of large-scale atmospheric circulation. Modern approaches to analysis of these data include, for example, multiresolution wavelet techniques applied to understand relations between karst discharge and climate fields. Much less effort has been directed towards direct simulation of flow fields and transport phenomena in karst settings, primarily because of the lack of information on the detailed physical geometry of most karst systems. New mapping, sampling, and modeling techniques are beginning to enable direct simulation of flow and transport. A Conduit Flow Process (CFP) add-on to the USGS MODFLOW model became available in 2007. FEFLOW and similar models are able to represent flows in individual conduits. Lattice Boltzmann models have also been applied to flow modeling in karst systems. Regarding quantitative measurement of karst system geometry, at scales up to ~0.1 m, X-ray computed tomography enables good detection of detailed (sub-millimeter) pore space in karstic rocks. Three-dimensional printing allows reconstruction of fragile high-porosity rocks, and surrogate samples generated this way can then be subjected to laboratory testing. Borehole scales can be accessed with high-resolution (~0.001 m) Digital Optical Borehole Imaging technologies, which can provide virtual samples more representative of the true nature of karst aquifers than can be obtained from coring. Subsequent extrapolation of such samples can generate three-dimensional models suitable for direct modeling of flow and transport. Finally, new cave mapping techniques are beginning to provide information that can be applied to direct simulation of flow.
Due to flow rates and cave diameter, very high Reynolds number flows may be encountered.
A Wireless Communications Systems Laboratory Course
ERIC Educational Resources Information Center
Guzelgoz, Sabih; Arslan, Huseyin
2010-01-01
A novel wireless communications systems laboratory course is introduced. The course teaches students how to design, test, and simulate wireless systems using modern instrumentation and computer-aided design (CAD) software. One of the objectives of the course is to help students understand the theoretical concepts behind wireless communication…
Spectral deconvolution and operational use of stripping ratios in airborne radiometrics.
Allyson, J D; Sanderson, D C
2001-01-01
Spectral deconvolution using stripping ratios for a set of pre-defined energy windows is the simplest means of reducing the most important part of gamma-ray spectral information. In this way, the effective interferences between the measured peaks are removed, leading, through a calibration, to clear estimates of radionuclide inventory. While laboratory measurements of stripping ratios are relatively easy to acquire, with detectors placed above small-scale calibration pads of known radionuclide concentrations, the extrapolation to measurements at the altitudes where airborne survey detectors are used brings difficulties such as air-path attenuation and greater uncertainties in knowing ground-level inventories. Stripping ratios are altitude dependent, and laboratory measurements using various absorbers to simulate the air path have been used with some success. Full-scale measurements from an aircraft require a suitable location where radionuclide concentrations vary little over the field of view of the detector (which may be hundreds of metres). Monte Carlo simulations offer the potential of full-scale reproduction of gamma-ray transport and detection mechanisms. Investigations have been made to evaluate stripping ratios using experimental and Monte Carlo methods.
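The window-stripping correction itself reduces to a small linear solve. The sketch below uses an invented 3x3 stripping matrix for K, U, and Th energy windows; real coefficients come from calibration-pad measurements and, as noted above, vary with altitude.

```python
import numpy as np

# Columns: pure-K, pure-U, pure-Th sources; rows: K, U, Th energy windows.
# Entry (i, j) = count rate seen in window i per unit count rate of source j.
# These coefficients are invented for illustration only.
S = np.array([
    [1.00, 0.85, 0.45],   # K window receives downscatter from U and Th
    [0.00, 1.00, 0.60],   # U window receives downscatter from Th
    [0.00, 0.05, 1.00],   # Th window (small "upward" U interference)
])

measured = np.array([120.0, 40.0, 15.0])   # window count rates (cps)

# Stripped (interference-corrected) count rates: solve S @ x = measured.
stripped = np.linalg.solve(S, measured)
print("stripped K/U/Th rates:", stripped)
```

A calibration factor per window then converts the stripped rates to ground concentrations.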
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baggu, Murali
2017-01-01
This project will enable effective utilization of high penetrations of photovoltaics (PV) in islanded microgrids, increasing overall system efficiency, decreasing fuel costs, and improving the resiliency of the overall system, to help meet the SunShot goals of enhancing system integration methods to increase penetration of PV. The National Renewable Energy Laboratory (NREL) will collaborate with San Diego Gas & Electric (SDG&E) to provide research and testing support to address their needs in energy storage sizing and placement, Integrated Test Facility (ITF) development, Real Time Digital Simulator (RTDS) modeling and simulation support at the ITF, visualization and virtual connection to the Energy Systems Integration Facility (ESIF), and microgrid simulation and testing. Specifically, in this project a real microgrid scenario with high penetration of PV (existing in SDG&E territory) is tested in the ESIF laboratory. Multiple control cases for firming PV using storage in a microgrid scenario will be investigated and tested in the laboratory setup.
Power Systems Integration Laboratory | Energy Systems Integration Facility
Key infrastructure: grid simulator, load bank, Opal-RT real-time simulator, battery, inverter mounting racks, house power, and PV simulator; supports testing of volt-var, frequency-watt, and grid anomaly ride-through functions for inverters.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2014-10-01
A pilot community was constructed by builder-partner Wathen-Castanos Hybrid Homes (WCHH) based on a single occupied test house that was designed to achieve greater than 30% energy savings with respect to the House Simulation Protocols (Hendron, Robert; Engebrecht, Cheryn (2010). Building America House Simulation Protocols. Golden, CO: National Renewable Energy Laboratory). Builders face several key problems when implementing a whole-house systems integrated measures package (SIMP) from a single test house into multiple houses. Although a technical solution already may have been evaluated and validated in an individual test house, the potential exists for constructability failures at the community scale. This report addresses factors of implementation and scalability at the community scale and proposes methodologies by which community-scale energy evaluations can be performed based on results at the occupied test house level. Research focused on the builder and trade implementation of a SIMP and the actual utility usage in the houses at the community scale of production. Five occupants participated in this community-scale research by providing utility bills and information on occupancy and miscellaneous gas and electric appliance use for their houses. IBACOS used these utility data and background information to analyze the actual energy performance of the houses. Verification with measured data is an important component in predictive energy modeling. The actual utility bill readings were compared to projected energy consumption using BEopt with actual weather and thermostat set points for normalization.
Remote Control Laboratory Using EJS Applets and TwinCAT Programmable Logic Controllers
ERIC Educational Resources Information Center
Besada-Portas, E.; Lopez-Orozco, J. A.; de la Torre, L.; de la Cruz, J. M.
2013-01-01
This paper presents a new methodology to develop remote laboratories for systems engineering and automation control courses, based on the combined use of TwinCAT, a laboratory Java server application, and Easy Java Simulations (EJS). The TwinCAT system is used to close the control loop for the selected plants by means of programmable logic…
Simulation of the dynamic environment for missile component testing: Demonstration
NASA Technical Reports Server (NTRS)
Chang, Kurng Y.
1989-01-01
The problems in defining a realistic test requirement for missile and space vehicle components can be classified into two categories: (1) definition of the test environment representing the expected service condition, and (2) simulation of the desired environment in the test laboratory. Recently, a new three-dimensional (3-D) test facility was completed at the U.S. Army Harry Diamond Laboratory (HDL) to simulate triaxial vibration input to a test specimen. The vibration test system is designed to support multi-axial vibration tests over the frequency range of 5 to 2000 Hertz. The availability of this 3-D test system motivates the development of new methodologies addressing environmental definition and simulation.
NASA Technical Reports Server (NTRS)
Pomerantz, M. I.; Lim, C.; Myint, S.; Woodward, G.; Balaram, J.; Kuo, C.
2012-01-01
The Jet Propulsion Laboratory's Entry, Descent and Landing (EDL) Reconstruction Task has developed a software system that provides mission operations personnel and analysts with a real-time telemetry-based live display, playback, and post-EDL reconstruction capability that leverages the existing high-fidelity, physics-based simulation framework and modern game-engine-derived 3D visualization system developed in the JPL Dynamics and Real Time Simulation (DARTS) Lab. Developed as a multi-mission solution, the EDL Telemetry Visualization (ETV) system has been used for a variety of projects, including NASA's Mars Science Laboratory (MSL), NASA's Low Density Supersonic Decelerator (LDSD), and JPL's MoonRise lunar sample return proposal.
A Hardware-in-the-Loop Simulator for Software Development for a Mars Airplane
NASA Technical Reports Server (NTRS)
Slagowski, Stefan E.; Vican, Justin E.; Kenney, P. Sean
2007-01-01
Draper Laboratory recently developed a Hardware-In-The-Loop Simulator (HILSIM) to provide a simulation of the Aerial Regional-scale Environmental Survey (ARES) airplane executing a mission in the Martian environment. The HILSIM was used to support risk mitigation activities under the Planetary Airplane Risk Reduction (PARR) program. PARR supported NASA Langley Research Center's (LaRC) ARES proposal efforts for the Mars Scout 2011 opportunity. The HILSIM software was a successful integration of two simulation frameworks, Draper's CSIM and NASA LaRC's Langley Standard Real-Time Simulation in C++ (LaSRS++).
Real-Time Hardware-in-the-Loop Simulation of Ares I Launch Vehicle
NASA Technical Reports Server (NTRS)
Tobbe, Patrick; Matras, Alex; Walker, David; Wilson, Heath; Fulton, Chris; Alday, Nathan; Betts, Kevin; Hughes, Ryan; Turbe, Michael
2009-01-01
The Ares Real-Time Environment for Modeling, Integration, and Simulation (ARTEMIS) has been developed for use by the Ares I launch vehicle System Integration Laboratory at the Marshall Space Flight Center. The primary purpose of the Ares System Integration Laboratory is to test the vehicle avionics hardware and software in a hardware-in-the-loop environment to certify that the integrated system is prepared for flight. ARTEMIS has been designed to be the real-time simulation backbone to stimulate all required Ares components for verification testing. ARTEMIS provides high-fidelity dynamics, actuator, and sensor models to simulate an accurate flight trajectory in order to ensure realistic test conditions. ARTEMIS has been designed to take advantage of the advances in underlying computational power now available to support hardware-in-the-loop testing to achieve real-time simulation with unprecedented model fidelity. A modular real-time design relying on a fully distributed computing architecture has been implemented.
NASA Astrophysics Data System (ADS)
Magyar, Rudolph
2013-06-01
We report a computational and validation study of equation of state (EOS) properties of liquid/dense plasma mixtures of xenon and ethane to explore and to illustrate the physics of the molecular-scale mixing of light elements with heavy elements. Accurate EOS models are crucial to achieve high-fidelity hydrodynamics simulations of many high-energy-density phenomena such as inertial confinement fusion and strong shock waves. While the EOS is often tabulated for separate species, the equation of state for arbitrary mixtures is generally not available, requiring properties of the mixture to be approximated by combining physical properties of the pure systems. The main goal of this study is to assess how accurate this approximation is under shock conditions. Density functional theory molecular dynamics (DFT-MD) at elevated temperature and pressure is used to assess the thermodynamics of the xenon-ethane mixture. The simulations are unbiased as to elemental species and therefore provide comparable accuracy when describing total energies, pressures, and other physical properties of mixtures as they do for pure systems. In addition, we have performed shock compression experiments using the Sandia Z-accelerator on pure xenon, ethane, and various mixture ratios thereof. The Hugoniot results are compared to the DFT-MD results and the predictions of different rules for combining EOS tables. The DFT-based simulation results compare well with the experimental points, and it is found that a mixing rule based on pressure equilibration performs reliably well for the mixtures considered. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
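A pressure-equilibration mixing rule of the kind found to work well here can be sketched as an additive-volume construction: invert each pure-species EOS at the common pressure, then sum mass-weighted specific volumes. The two power-law EOS forms and all numbers below are invented stand-ins, not xenon or ethane tables.

```python
import numpy as np
from scipy.optimize import brentq

def p_heavy(rho):      # stand-in EOS for the heavy species, P(rho)
    return 0.02 * rho**2.2

def p_light(rho):      # stand-in EOS for the light species
    return 0.15 * rho**1.7

def mixture_density(P, x_heavy, p1=p_heavy, p2=p_light):
    """Density of the mixture at pressure P for mass fraction x_heavy,
    assuming both species equilibrate to the same pressure."""
    rho1 = brentq(lambda r: p1(r) - P, 1e-6, 1e4)    # invert each EOS at P
    rho2 = brentq(lambda r: p2(r) - P, 1e-6, 1e4)
    v_mix = x_heavy / rho1 + (1.0 - x_heavy) / rho2  # additive volumes
    return 1.0 / v_mix

print(f"mixture density at P=5: {mixture_density(5.0, 0.5):.3f}")
```

With tabulated EOS data the inversions become interpolations into the tables, but the additive-volume step is unchanged.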
NASA Astrophysics Data System (ADS)
Guo, L.; Huang, H.; Gaston, D.; Redden, G. D.; Fox, D. T.; Fujita, Y.
2010-12-01
Inducing mineral precipitation in the subsurface is one potential strategy for immobilizing trace metal and radionuclide contaminants. Generating mineral precipitates in situ can be achieved by manipulating chemical conditions, typically through injection or in situ generation of reactants. How these reactants transport, mix, and react within the medium controls the spatial distribution and composition of the resulting mineral phases. Multiple processes, including fluid flow, dispersive/diffusive transport of reactants, biogeochemical reactions, and changes in porosity-permeability, are tightly coupled over a number of scales. Numerical modeling can be used to investigate the nonlinear coupling effects of these processes, which are quite challenging to explore experimentally. Many subsurface reactive transport simulators employ a de-coupled or operator-splitting approach in which transport equations and batch chemistry reactions are solved sequentially. However, such an approach has limited applicability for biogeochemical systems with fast kinetics and strong coupling between chemical reactions and medium properties. A massively parallel, fully coupled, fully implicit Reactive Transport simulator (referred to as “RAT”) based on a parallel multi-physics object-oriented simulation framework (MOOSE) has been developed at the Idaho National Laboratory. Within this simulator, systems of transport and reaction equations can be solved simultaneously in a fully coupled, fully implicit manner using the Jacobian-Free Newton-Krylov (JFNK) method with additional advanced computing capabilities such as (1) physics-based preconditioning for solution convergence acceleration, (2) massively parallel computing and scalability, and (3) adaptive mesh refinement for 2D and 3D structured and unstructured meshes.
The simulator was first tested against analytical solutions, then applied to simulating induced calcium carbonate mineral precipitation in 1D columns and 2D flow cells as analogs to homogeneous and heterogeneous porous media, respectively. In 1D columns, calcium carbonate mineral precipitation was driven by urea hydrolysis catalyzed by urease enzyme, and in 2D flow cells, calcium carbonate mineral forming reactants were injected sequentially, forming migrating reaction fronts that are typically highly nonuniform. The RAT simulation results for the spatial and temporal distributions of precipitates, reaction rates and major species in the system, and also for changes in porosity and permeability, were compared to both laboratory experimental data and computational results obtained using other reactive transport simulators. The comparisons demonstrate the ability of RAT to simulate complex nonlinear systems and the advantages of fully coupled approaches, over de-coupled methods, for accurate simulation of complex, dynamic processes such as engineered mineral precipitation in subsurface environments.
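The fully coupled, fully implicit approach can be illustrated in miniature: a single backward-Euler step for two diffusing, reacting species solved simultaneously with a Jacobian-free Newton-Krylov method. This is a toy sketch using SciPy's `newton_krylov`, not the RAT/MOOSE implementation; the grid size, rate constant, and diffusivity are arbitrary.

```python
import numpy as np
from scipy.optimize import newton_krylov

n, dt, D, k = 50, 1e-3, 1e-2, 10.0
x = np.linspace(0.0, 1.0, n)
dx = x[1] - x[0]
u0 = np.where(x < 0.5, 1.0, 0.0)      # reactant u injected on the left
v0 = np.where(x >= 0.5, 1.0, 0.0)     # reactant v resident on the right

def lap(c):
    """1D Laplacian with zero-flux (ghost-cell) boundaries."""
    padded = np.concatenate(([c[0]], c, [c[-1]]))
    return (padded[:-2] - 2.0 * c + padded[2:]) / dx**2

def residual(w):
    """Backward-Euler residual for both species stacked in one vector."""
    u, v = w[:n], w[n:]
    rate = k * u * v                  # u + v -> mineral (consumes both)
    ru = (u - u0) / dt - D * lap(u) + rate
    rv = (v - v0) / dt - D * lap(v) + rate
    return np.concatenate([ru, rv])

# One fully coupled implicit step: transport and reaction solved together,
# with the Jacobian action approximated inside the Krylov solver.
w = newton_krylov(residual, np.concatenate([u0, v0]), f_tol=1e-8)
u1, v1 = w[:n], w[n:]
print("u consumed by reaction this step:", (u0 - u1).sum() * dx)
```

An operator-splitting code would instead transport u and v first and then react them, which is exactly the sequencing the abstract argues breaks down for fast kinetics.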
1988-04-13
Simulation: An Artificial Intelligence Approach to System Modeling and Automating the Simulation Life Cycle. Mark S. Fox, Nizwer Husain, Malcolm McRoberts, and Y. V. Reddy. CMU-RI-TR-88-5, Intelligent Systems Laboratory, The Robotics Institute, Carnegie Mellon University, Pittsburgh, Pennsylvania. The report summarizes years of research in the application of Artificial Intelligence to Simulation, with a focus in two areas: the use of AI knowledge representation...
Particle-In-Cell Modeling For MJ Dense Plasma Focus with Varied Anode Shape
NASA Astrophysics Data System (ADS)
Link, A.; Halvorson, C.; Schmidt, A.; Hagen, E. C.; Rose, D.; Welch, D.
2014-10-01
Megajoule-scale dense plasma focus (DPF) Z-pinches with deuterium gas fill are compact devices capable of producing 10^12 neutrons per shot, but past predictive models of large-scale DPF have not included kinetic effects such as ion beam formation or anomalous resistivity. We report on progress in developing a predictive DPF model by extending our 2D axisymmetric collisional kinetic particle-in-cell (PIC) simulations to the 1 MJ, 2 MA Gemini DPF using the PIC code LSP. These new simulations incorporate electrodes and an external pulsed-power driver circuit, and model the plasma from insulator lift-off through the pinch phase. The simulations were performed using a new hybrid fluid-to-kinetic model transitioning from a fluid description to a fully kinetic PIC description during the run-in phase. Simulations are advanced through the final pinch phase using an adaptive variable time step to capture the fs and sub-mm scales of the kinetic instabilities involved in ion beam formation and neutron production. Results will be presented on the predicted effects of different anode configurations. This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory (LLNL) under Contract DE-AC52-07NA27344 and supported by the Laboratory Directed Research and Development Program (11-ERD-063) and the Computing Grand Challenge program at LLNL. This work supported by Office of Defense Nuclear Nonproliferation Research and Development within the U.S. Department of Energy's National Nuclear Security Administration.
ERIC Educational Resources Information Center
Terry, John
1987-01-01
Discusses the feasibility of using fermenters in secondary school laboratories. Includes discussions of equipment, safety, and computer interfacing. Describes how a simple fermenter could be used to simulate large-scale processes. Concludes that, although teachers and technicians will require additional training, the prospects for biotechnology in…
NASA Astrophysics Data System (ADS)
Gorrick, S.; Rodriguez, J. F.
2011-12-01
A movable-bed physical model was designed in a laboratory flume to simulate both bed and suspended load transport in a mildly sinuous sand-bed stream. Model simulations investigated the impact of different vegetation arrangements along the outer bank to evaluate rehabilitation options. Preserving similitude in the 1:16 laboratory model was very important. In this presentation, the scaling approach and the successes and challenges of the strategy are outlined. First, a near-bankfull flow event was chosen for laboratory simulation. In nature, bankfull events at the field site deposit new in-channel features but cause only small amounts of bank erosion. Thus the fixed banks in the model were not a drastic simplification. Next, and as in other studies, the flow velocity and turbulence measurements were collected in separate fixed-bed experiments. The scaling of flow in these experiments was simply maintained by matching the Froude number and roughness levels. The subsequent movable-bed experiments were then conducted under similar hydrodynamic conditions. In nature, the sand-bed stream is fairly typical; in high flows most sediment transport occurs in suspension and migrating dunes cover the bed. To achieve similar dynamics in the model, equivalent values of the dimensionless bed shear stress and the particle Reynolds number were important. Close values of the two dimensionless numbers were achieved with lightweight sediments (R=0.3), including coal and apricot pips, with a particle size distribution similar to that of the field site. Overall the movable-bed experiments were able to replicate the dominant sediment dynamics present in the stream during a bankfull flow and yielded relevant information for the analysis of the effects of riparian vegetation. There was a potential conflict in the strategy, in that grain roughness was exaggerated with respect to nature.
The advantage of this strategy is that although grain roughness is exaggerated, the similarity of bedforms and resulting drag can return similar levels of roughness to those in the field site.
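Froude similitude fixes every other scale ratio once the 1:16 length ratio is set, since gravity is common to model and prototype. A quick sketch of the conversion factors (the example model flow numbers are illustrative, not taken from the study):

```python
import math

L_r = 16.0                      # prototype/model length ratio (1:16 model)

# Matching the Froude number U / sqrt(g L) with g unchanged implies:
velocity_r  = math.sqrt(L_r)    # U_p / U_m = L_r^(1/2)
time_r      = math.sqrt(L_r)    # T_p / T_m = L_r^(1/2)
discharge_r = L_r ** 2.5        # Q_p / Q_m = L_r^(5/2)

# Example: a hypothetical model bankfull flow of 0.05 m^3/s at 0.4 m/s.
U_model, Q_model = 0.4, 0.05
print(f"prototype velocity  ~ {U_model * velocity_r:.2f} m/s")
print(f"prototype discharge ~ {Q_model * discharge_r:.1f} m^3/s")
```

The lightweight sediments (R=0.3) are what let the model match the dimensionless bed shear stress and particle Reynolds number despite the reduced velocities these ratios impose.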
RANS Simulation (Virtual Blade Model [VBM]) of Single Lab Scaled DOE RM1 MHK Turbine
Javaherchi, Teymour; Stelzenmuller, Nick; Aliseda, Alberto; Seydel, Joseph
2014-04-15
Attached are the .cas and .dat files for the Reynolds-Averaged Navier-Stokes (RANS) simulation of a single lab-scaled DOE RM1 turbine implemented in the ANSYS FLUENT CFD package. The lab-scaled DOE RM1 is a re-designed geometry, based on the full-scale DOE RM1 design, that produces the same power output as the full-scale model while operating at matched Tip Speed Ratio values at laboratory-reachable Reynolds numbers (see attached paper). In this case study the flow field around and in the wake of the lab-scaled DOE RM1 turbine is simulated using the Blade Element Model (a.k.a. Virtual Blade Model) by solving the RANS equations coupled with the k-ω turbulence closure model. It should be highlighted that in this simulation the actual geometry of the rotor blade is not modeled. The effect of the rotating turbine blades is modeled using Blade Element Theory. This simulation provides an accurate estimate of the performance of the device and the structure of its turbulent far wake. Due to the simplifications implemented for modeling the rotating blades, VBM cannot capture details of the flow field in the near-wake region of the device. The required User Defined Functions (UDFs) and the look-up table of lift and drag coefficients are included along with the .cas and .dat files.
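The blade-element bookkeeping behind VBM-style source terms can be sketched as below. This is a bare-bones version with no induction or tip-loss corrections, and the rotor dimensions, operating point, and airfoil polar are invented stand-ins, not the DOE RM1 values or its look-up table.

```python
import numpy as np

rho, U_inf, omega, n_blades, chord = 1000.0, 1.0, 7.0, 2, 0.08  # water, SI
r = np.linspace(0.2, 0.5, 30)                       # radial stations (m)
twist = np.radians(np.linspace(20.0, 5.0, r.size))  # local pitch angle
dr = r[1] - r[0]

def polar(alpha):
    """Toy lift/drag polar: thin-airfoil lift, quadratic drag."""
    cl = 2.0 * np.pi * alpha
    cd = 0.01 + 0.02 * alpha**2
    return cl, cd

W = np.hypot(U_inf, omega * r)        # local relative speed
phi = np.arctan2(U_inf, omega * r)    # inflow angle
alpha = phi - twist                   # local angle of attack
cl, cd = polar(alpha)

q = 0.5 * rho * W**2 * chord * dr     # dynamic pressure x element area
# Resolve lift/drag into axial (thrust) and tangential (torque) components.
dT = n_blades * q * (cl * np.cos(phi) - cd * np.sin(phi))
dQ = n_blades * q * (cl * np.sin(phi) + cd * np.cos(phi)) * r

print(f"thrust ~ {dT.sum():.0f} N, torque ~ {dQ.sum():.0f} N m, "
      f"power ~ {omega * dQ.sum():.0f} W")
```

In the FLUENT implementation these per-element forces are distributed as momentum sources over the cells swept by the rotor disk rather than summed into scalars.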
The Magnetic Reconnection Code: an AMR-based fully implicit simulation suite
NASA Astrophysics Data System (ADS)
Germaschewski, K.; Bhattacharjee, A.; Ng, C.-S.
2006-12-01
Extended MHD models, which incorporate two-fluid effects, are promising candidates to enhance understanding of collisionless reconnection phenomena in laboratory, space, and astrophysical plasma physics. In this paper, we introduce two simulation codes in the Magnetic Reconnection Code suite which integrate reduced and full extended MHD models. Numerical integration of these models comes with two challenges. First, small-scale spatial structures, e.g. thin current sheets, develop and must be well resolved by the code. Adaptive mesh refinement (AMR) is employed to provide high resolution where needed while maintaining good performance. Second, the two-fluid effects in extended MHD give rise to dispersive waves, which lead to a very stringent CFL condition for explicit codes, while reconnection happens on a much slower time scale. We use a fully implicit Crank-Nicolson time-stepping algorithm. Since no efficient preconditioners are available for our system of equations, we instead use a direct solver to handle the inner linear solves. This requires us to actually compute the Jacobian matrix, which is handled by a code generator that calculates the derivative symbolically and then outputs code to calculate it.
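The payoff of implicit time stepping can be shown on a toy stiff operator: a Crank-Nicolson step stays stable at time steps far beyond the explicit limit. The sketch below uses a 1D periodic diffusion matrix as a stand-in for the stiff dispersive terms of extended MHD, and a dense solve in place of the direct sparse solver mentioned above.

```python
import numpy as np

n, nu = 64, 1.0
dx = 1.0 / n
dt = 50 * dx**2 / (2 * nu)          # 50x the explicit stability limit

# Second-difference operator with periodic boundaries.
A = (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
     + np.diag(np.ones(n - 1), -1))
A[0, -1] = A[-1, 0] = 1.0
A *= nu / dx**2

I = np.eye(n)
# Crank-Nicolson: (I - dt/2 A) u_new = (I + dt/2 A) u_old
step = np.linalg.solve(I - 0.5 * dt * A, I + 0.5 * dt * A)

u = np.sin(2 * np.pi * np.arange(n) * dx)
for _ in range(100):                # explicit Euler would blow up here
    u = step @ u
print("max |u| after 100 large steps:", np.abs(u).max())
```

For the nonlinear extended MHD system the "step matrix" is not fixed: each step requires Newton iterations, which is why the Jacobian generation and inner direct solves described above matter.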
Pore-scale modeling of moving contact line problems in immiscible two-phase flow
NASA Astrophysics Data System (ADS)
Kucala, Alec; Noble, David; Martinez, Mario
2016-11-01
Accurate modeling of moving contact line (MCL) problems is imperative in predicting capillary pressure vs. saturation curves, permeability, and preferential flow paths for a variety of applications, including geological carbon storage (GCS) and enhanced oil recovery (EOR). Here, we present a model for the moving contact line using pore-scale computational fluid dynamics (CFD) which solves the full, time-dependent Navier-Stokes equations using the Galerkin finite-element method. The MCL is modeled as a surface traction force proportional to the surface tension, dependent on the static properties of the immiscible fluid/solid system. We present a variety of verification test cases for simple two- and three-dimensional geometries to validate the current model, including threshold pressure predictions in flows through pore-throats for a variety of wetting angles. Simulations involving more complex geometries are also presented to be used in future simulations for GCS and EOR problems. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
Modeling and Simulation of Ballistic Penetration of Ceramic-Polymer-Metal Layered Systems
2016-01-01
ARL-RP-0562, January 2016, US Army Research Laboratory.
CALIBRATION OF FULL-SCALE OZONATION SYSTEMS WITH CONSERVATIVE AND REACTIVE TRACERS
A full-scale ozonation reactor was characterized with respect to the overall oxidation budget by coupling laboratory kinetics with reactor hydraulics. The ozone decomposition kinetics and the ratio of the OH radical to the ozone concentration were determined in laboratory batch ...
Development and prototype testing of MgCl 2 /graphite foam latent heat thermal energy storage system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Singh, Dileep; Yu, Wenhua; Zhao, Weihuan
Composites of graphite foam infiltrated with a magnesium chloride phase-change material have been developed as high-temperature thermal energy storage media for concentrated solar power applications. This storage medium provides a high thermal energy storage density, a narrow operating temperature range, and excellent heat transfer characteristics. In this study, experimental investigations were conducted on laboratory-scale prototypes with the magnesium chloride/graphite foam composite as the latent heat thermal energy storage system. Prototypes were designed and built to monitor the melt front movement during the charging/discharging tests. A test loop was built to ensure the charging/discharging of the prototypes at temperatures above 700 degrees C. Repeated thermal cycling experiments were carried out on the fabricated prototypes, and the experimental temperature profiles were compared to the predicted results from numerical simulations using COMSOL Multiphysics software. Experimental results were found to be in good agreement with the simulations, validating the thermal models.
Post-Cold War Science and Technology at Los Alamos
NASA Astrophysics Data System (ADS)
Browne, John C.
2002-04-01
Los Alamos National Laboratory serves the nation through the development and application of leading-edge science and technology in support of national security. Our mission supports national security by: ensuring the safety, security, and reliability of the U.S. nuclear stockpile; reducing the threat of weapons of mass destruction in support of counterterrorism and homeland defense; and solving national energy, environment, infrastructure, and health security problems. We require crosscutting fundamental and advanced science and technology research to accomplish our mission. The Stockpile Stewardship Program develops and applies advanced experimental science, computational simulation, and technology to ensure the safety and reliability of U.S. nuclear weapons in the absence of nuclear testing. This effort in itself is a grand challenge. However, the terrorist attack of September 11, 2001, reminded us of the importance of robust and vibrant research and development capabilities to meet new and evolving threats to our national security. Today, through rapid prototyping, we are applying new, innovative science and technology for homeland defense to address the threats of nuclear, chemical, and biological weapons globally. Synergistically with the capabilities that we require for our core mission, we contribute in many other areas of scientific endeavor. For example, our Laboratory has been part of the NASA effort on mapping water on the moon and NSF/DOE projects studying high-energy astrophysical phenomena, understanding fundamental scaling phenomena of life, exploring high-temperature superconductors, investigating quantum information systems, applying neutrons to condensed-matter and nuclear physics research, developing large-scale modeling and simulations to understand complex phenomena, and exploring nanoscience that bridges the atomic to macroscopic scales.
In this presentation, I will highlight some of these post-Cold War science and technology advances, including our national security contributions, and discuss some of the challenges for Los Alamos in the future.
NASA Technical Reports Server (NTRS)
Christoffersen, R.; Loeffler, M. J.; Rahman, Z.; Dukes, C.; IMPACT Team
2017-01-01
The space weathering of regoliths on airless bodies and the formation of their exospheres is driven to a large extent by hypervelocity impacts from the high relative flux of micron to sub-micron meteoroids that comprise approximately 90 percent of the solar system meteoroid population. Laboratory hypervelocity impact experiments are crucial for quantifying how these small impact events drive space weathering through target shock, melting and vaporization. Simulating these small scale impacts experimentally is challenging because the natural impactors are both very small and many have velocities above the approximately 8 kilometers-per-second limit attainable by conventional chemical/light gas accelerator technology. Electrostatic "dust" accelerators, such as the one recently developed at the Colorado Center for Lunar Dust and Atmospheric Studies (CCLDAS), allow the experimental velocity regime to be extended up to tens of kilometers-per-second. Even at these velocities the region of latent target damage created by each impact, in the form of microcraters or pits, is still only about 0.1 to 10 micrometers in size. Both field-emission analytical scanning electron microscopy (FE-SEM) and advanced field-emission scanning transmission electron microscopy (FE-STEM) are uniquely suited for characterizing the individual dust impact sites in these experiments. In this study, we have used both techniques, along with focused ion beam (FIB) sample preparation, to characterize the micrometer to nanometer scale effects created by accelerated dust impacts into olivine single crystals. To our knowledge this work presents the first TEM-scale characterization of dust impacts into a key solar system silicate mineral using the CCLDAS facility. Our overarching goal for this work is to establish a basis to compare with our previous results on natural dust-impacted lunar olivine and laser-irradiated olivine.
Large-Scale Optimization for Bayesian Inference in Complex Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Willcox, Karen; Marzouk, Youssef
2013-11-12
The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focused on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. The project was a collaborative effort among MIT, the University of Texas at Austin, Georgia Institute of Technology, and Sandia National Laboratories. The research was directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. The MIT-Sandia component of the SAGUARO Project addressed the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower-dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary and can be used in conjunction with each other.
Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their speedups.
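The "reduce then sample" strategy can be sketched in a few lines: a Metropolis sampler whose every likelihood evaluation calls a cheap surrogate instead of the expensive full forward model. Everything below is illustrative; the toy forward model, the cubic surrogate, and all parameter values are invented for the sketch and are not from the SAGUARO codebase:

```python
import math
import random

# Toy forward model: stands in for an expensive solve (e.g. a PDE) in a real inversion.
def full_model(theta):
    return math.sin(theta) + 0.5 * theta

# Cheap surrogate: a cubic Taylor expansion of sin about 0, standing in for a
# projection-based reduced-order model that is faithful over the prior range.
def reduced_model(theta):
    return theta - theta**3 / 6 + 0.5 * theta

# Synthetic observation generated at a known "true" parameter with known noise level.
theta_true, sigma = 0.8, 0.05
y_obs = full_model(theta_true)

def log_post(theta, forward):
    # Gaussian likelihood with a flat prior on [-2, 2].
    if abs(theta) > 2:
        return float("-inf")
    r = y_obs - forward(theta)
    return -0.5 * (r / sigma) ** 2

def metropolis(forward, n=20000, step=0.3, seed=1):
    rng = random.Random(seed)
    theta, lp = 0.0, log_post(0.0, forward)
    samples = []
    for _ in range(n):
        prop = theta + rng.gauss(0, step)
        lp_prop = log_post(prop, forward)
        if math.log(rng.random()) < lp_prop - lp:  # Metropolis accept/reject
            theta, lp = prop, lp_prop
        samples.append(theta)
    return samples

# "Reduce then sample": every likelihood evaluation uses the cheap surrogate.
samples = metropolis(reduced_model)
est = sum(samples[5000:]) / len(samples[5000:])  # posterior mean after burn-in
```

Because the surrogate tracks the full model closely over the prior range, the posterior mean lands near the true parameter, mirroring the abstract's finding that sampling the reduced model has little effect on the computed posterior.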
NASA Technical Reports Server (NTRS)
Jennings, Esther H.; Nguyen, Sam P.; Wang, Shin-Ywan; Woo, Simon S.
2008-01-01
NASA's planned Lunar missions will involve multiple NASA centers, with each participating center having a specific role and specialization. In this vision, the Constellation Program (CxP) Distributed System Integration Laboratories (DSIL) architecture consists of multiple System Integration Labs (SILs), with simulators, emulators, test labs, and control centers interacting with each other over a broadband network to perform test and verification for mission scenarios. To support the end-to-end simulation and emulation effort of NASA's exploration initiatives, different NASA centers are interconnected to participate in distributed simulations. Currently, the DSIL has interconnections among the following NASA centers: Johnson Space Center (JSC), Kennedy Space Center (KSC), Marshall Space Flight Center (MSFC), and the Jet Propulsion Laboratory (JPL). Through interconnections and interactions among different NASA centers, critical resources and data can be shared, while independent simulations can be performed simultaneously at different NASA locations, to effectively utilize the simulation and emulation capabilities at each center. Furthermore, the development of the DSIL can maximally leverage existing project simulation and testing plans. In this work, we describe the specific role and development activities at JPL for the Space Communications and Navigation Network (SCaN) simulator, using the Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE) tool to simulate communications effects among mission assets. Using MACHETE, different space network configurations among spacecraft and ground systems can be simulated for various parameter sets. Data necessary for tracking, navigation, and guidance of spacecraft such as the Crew Exploration Vehicle (CEV), Crew Launch Vehicle (CLV), and Lunar Relay Satellite (LRS), along with orbit calculation data, are disseminated to different NASA centers and updated periodically using the High Level Architecture (HLA).
In addition, the performance of the DSIL under different traffic loads, with different mixes of data types and priorities, is evaluated.
Laboratory-Scale Evidence for Lightning-Mediated Gene Transfer in Soil
Demanèche, Sandrine; Bertolla, Franck; Buret, François; Nalin, Renaud; Sailland, Alain; Auriol, Philippe; Vogel, Timothy M.; Simonet, Pascal
2001-01-01
Electrical fields and current can permeabilize bacterial membranes, allowing for the penetration of naked DNA. Given that the environment is subjected to regular thunderstorms and lightning discharges that induce enormous electrical perturbations, the possibility of natural electrotransformation of bacteria was investigated. We demonstrated with soil microcosm experiments that the transformation of added bacteria could be increased locally via lightning-mediated current injection. The incorporation of three genes coding for antibiotic resistance (plasmid pBR328) into the Escherichia coli strain DH10B recipient previously added to soil was observed only after the soil had been subjected to laboratory-scale lightning. Laboratory-scale lightning had an electrical field gradient (700 versus 600 kV/m) and current density (2.5 versus 12.6 kA/m^2) similar to those of full-scale lightning. Controls handled identically except for not being subjected to lightning produced no detectable antibiotic-resistant clones. In addition, simulated storm cloud electrical fields (in the absence of current) did not produce detectable clones (transformation detection limit, 10^-9). Natural electrotransformation might be a mechanism involved in bacterial evolution. PMID:11472916
Benchmark Modeling of the Near-Field and Far-Field Wave Effects of Wave Energy Arrays
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rhinefrank, Kenneth E; Haller, Merrick C; Ozkan-Haller, H Tuba
2013-01-26
This project is an industry-led partnership between Columbia Power Technologies and Oregon State University that will perform benchmark laboratory experiments and numerical modeling of the near-field and far-field impacts of wave scattering from an array of wave energy devices. These benchmark experimental observations will help to fill a gaping hole in our present knowledge of the near-field effects of multiple, floating wave energy converters and are a critical requirement for estimating the potential far-field environmental effects of wave energy arrays. The experiments will be performed at the Hinsdale Wave Research Laboratory (Oregon State University) and will utilize an array of newly developed Buoys that are realistic, lab-scale floating power converters. The array of Buoys will be subjected to realistic, directional wave forcing (1:33 scale) that will approximate the expected conditions (waves and water depths) to be found off the central Oregon coast. Experimental observations will include comprehensive in-situ wave and current measurements as well as a suite of novel optical measurements. These new optical capabilities will include imaging of the 3D wave scattering using a binocular stereo camera system, as well as 3D device motion tracking using a newly acquired LED system. These observing systems will capture the 3D motion history of individual Buoys as well as resolve the 3D scattered wave field, thus resolving at high resolution the constructive and destructive wave interference patterns produced by the array. These data, combined with the device motion tracking, will provide information necessary for array design in order to balance array performance with the mitigation of far-field impacts. As a benchmark data set, these data will be an important resource for the testing of models for wave/buoy interactions, buoy performance, and far-field effects on wave and current patterns due to the presence of arrays.
Under the proposed project we will initiate high-resolution (fine-scale, very near-field) fluid/structure interaction simulations of buoy motions, as well as array-scale, phase-resolving wave scattering simulations. These modeling efforts will utilize state-of-the-art research-quality models, which have not yet been brought to bear on this complex large-array wave/structure interaction problem.
Why build a virtual brain? Large-scale neural simulations as jump start for cognitive computing
NASA Astrophysics Data System (ADS)
Colombo, Matteo
2017-03-01
Despite the impressive amount of financial resources recently invested in carrying out large-scale brain simulations, the pay-offs of pursuing this project remain controversial. One idea is that from designing, building, and running a large-scale neural simulation, scientists acquire knowledge about the computational performance of the simulating system, rather than about the neurobiological system represented in the simulation. It has been claimed that this knowledge may usher in a new era of neuromorphic, cognitive computing systems. This study elucidates this claim and argues that the main challenge this era is facing is not the lack of biological realism. The challenge lies in identifying general neurocomputational principles for the design of artificial systems, which could display the robust flexibility characteristic of biological intelligence.
Nonlocal and collective relaxation in stellar systems
NASA Technical Reports Server (NTRS)
Weinberg, Martin D.
1993-01-01
The modal response of stellar systems to fluctuations at large scales is presently investigated by means of analytic theory and n-body simulation; the stochastic excitation of these modes is shown to increase the relaxation rate even for a system which is moderately far from instability. The n-body simulations, when designed to suppress relaxation at small scales, clearly show the effects of large-scale fluctuations. It is predicted that large-scale fluctuations will be largest for such marginally bound systems as forming star clusters and associations.
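For context on the small-scale relaxation these n-body simulations are designed to suppress, the classical two-body relaxation estimate is t_relax ≈ N / (8 ln N) · t_cross, where t_cross is the crossing time. This is a standard textbook result (e.g., Binney & Tremaine), not a formula from this paper:

```python
import math

def relaxation_time(N, t_cross):
    """Classical two-body relaxation estimate:
    t_relax ≈ N / (8 ln N) * t_cross."""
    return N / (8.0 * math.log(N)) * t_cross

# A forming star cluster with N ~ 1000 stars and a crossing time of ~1 Myr
# (illustrative numbers) has a two-body relaxation time of roughly 18 Myr.
t_relax = relaxation_time(1_000, 1.0)  # in Myr
```

The paper's point is that stochastically excited large-scale modes can drive relaxation faster than this two-body estimate suggests, especially in marginally bound systems.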
DOE Office of Scientific and Technical Information (OSTI.GOV)
SAMS TL; GUILLOT S
Scoping laboratory-scale tests were performed at the Chemical Engineering Department of the Georgia Institute of Technology (Georgia Tech) and at the Hanford 222-S Laboratory, involving double-shell tank (DST) and single-shell tank (SST) Hanford waste simulants. These tests established the viability of the Lithium Hydrotalcite precipitation process as a solution for removing aluminum and recycling sodium hydroxide from the Hanford tank waste, and established the basis for a validation test campaign to demonstrate a Technology Readiness Level of 3.
Radiation Induced Chemistry of Icy Surfaces: Laboratory Simulations
NASA Technical Reports Server (NTRS)
Gudipati, Murthy S.; Lignell, Antti; Li, Irene; Yang, Rui; Jacovi, Ronen
2011-01-01
We will discuss laboratory experiments designed to enhance our understanding of the chemical processes on icy solar system bodies, enable interpretation of in-situ and remote-sensing data, and help future missions to icy solar system bodies such as comets, Europa, Ganymede, and Enceladus.
Computing the universe: how large-scale simulations illuminate galaxies and dark energy
NASA Astrophysics Data System (ADS)
O'Shea, Brian
2015-04-01
High-performance and large-scale computing is absolutely essential to understanding astronomical objects such as stars, galaxies, and the cosmic web. This is because these structures operate on physical, temporal, and energy scales that cannot be reasonably approximated in the laboratory, and their complexity and nonlinearity often defy analytic modeling. In this talk, I show how the growth of computing platforms over time has facilitated our understanding of astrophysical and cosmological phenomena, focusing primarily on galaxies and large-scale structure in the Universe.
Current-Sheet Formation and Reconnection at a Magnetic X Line in Particle-in-Cell Simulations
NASA Technical Reports Server (NTRS)
Black, C.; Antiochos, S. K.; Hesse, M.; Karpen, J. T.; Kuznetsova, M. M.; Zenitani, S.
2011-01-01
The integration of kinetic effects into macroscopic numerical models is currently of great interest to the heliophysics community, particularly in the context of magnetic reconnection. Reconnection governs the large-scale energy release and topological rearrangement of magnetic fields in a wide variety of laboratory, heliophysical, and astrophysical systems. We are examining the formation and reconnection of current sheets in a simple, two-dimensional X-line configuration using high-resolution particle-in-cell (PIC) simulations. The initial minimum-energy, potential magnetic field is perturbed by excess thermal pressure introduced into the particle distribution function far from the X line. Subsequently, the relaxation of this added stress leads self-consistently to the development of a current sheet that reconnects for imposed stress of sufficient strength. We compare the time-dependent evolution and final state of our PIC simulations with macroscopic magnetohydrodynamic simulations assuming both uniform and localized electrical resistivities (C. R. DeVore et al., this meeting), as well as with force-free magnetic-field equilibria in which the amount of reconnection across the X line can be constrained to be zero (ideal evolution) or optimal (minimum final magnetic energy). We will discuss implications of our results for understanding magnetic-reconnection onset and cessation at kinetic scales in dynamically formed current sheets, such as those occurring in the solar corona and terrestrial magnetotail.
The WEBSIM FISHBANKS Simulation Laboratory: Analysis of Its Ripple Effects
ERIC Educational Resources Information Center
Arantes do Amaral, João Alberto; Hess, Aurélio
2018-01-01
In this article, we discuss the ripple effects of the WEBSIM FISHBANKS Simulation Laboratory held at the Federal University of Sao Paulo (UNIFESP) in 2014 as a result of a partnership between the Sloan School of Management of the Massachusetts Institute of Technology, UNIFESP, and the Brazilian Chapter of the System Dynamics Society of…
NASA Astrophysics Data System (ADS)
Sotiropoulos, Fotis; Khosronejad, Ali
2016-02-01
Sand waves arise in subaqueous and aeolian environments as the result of the complex interaction between turbulent flows and mobile sand beds. They occur across a wide range of spatial scales, evolve at temporal scales much slower than the integral scale of the transporting turbulent flow, dominate river morphodynamics, undermine streambank stability and infrastructure during flooding, and sculpt terrestrial and extraterrestrial landscapes. In this paper, we present an overview of our work over the last ten years, which has sought to develop computational tools capable of simulating the coupled interactions of sand waves with turbulence across the broad range of relevant scales: from small-scale ripples in laboratory flumes to mega-dunes in large rivers. We review the computational advances that have enabled us to simulate the genesis and long-term evolution of arbitrarily large and complex sand dunes in turbulent flows using large-eddy simulation and summarize numerous novel physical insights derived from our simulations. Our findings explain the role of turbulent sweeps in the near-bed region as the primary mechanism for destabilizing the sand bed, show that the seeds of the emergent structure in dune fields lie in the heterogeneity of the turbulence and bed shear stress fluctuations over the initially flat bed, and elucidate how large dunes at equilibrium give rise to energetic coherent structures and modify the spectra of turbulence. We also discuss future challenges and our vision for advancing a data-driven, simulation-based engineering science approach for site-specific simulations of river flooding.
NASA Technical Reports Server (NTRS)
Young, Gerald W.; Clemons, Curtis B.
2004-01-01
The focus of this Cooperative Agreement between the Computational Materials Laboratory (CML) of the Processing Science and Technology Branch of the NASA Glenn Research Center (GRC) and the Department of Theoretical and Applied Mathematics at The University of Akron was in the areas of system development of the CML workstation environment, modeling of microgravity and earth-based material processing systems, and joint activities in laboratory projects. These efforts complement each other as the majority of the modeling work involves numerical computations to support laboratory investigations. Coordination and interaction between the modelers, system analysts, and laboratory personnel are essential toward providing the most effective simulations and communication of the simulation results. Toward these means, The University of Akron personnel involved in the agreement worked at the Applied Mathematics Research Laboratory (AMRL) in the Department of Theoretical and Applied Mathematics while maintaining a close relationship with the personnel of the Computational Materials Laboratory at GRC. Network communication between both sites has been established. A summary of the projects we undertook during the time period 9/1/03 - 6/30/04 is included.
NASA Astrophysics Data System (ADS)
Elkhoury, J. E.; Detwiler, R. L.; Serajian, V.; Bruno, M. S.
2012-12-01
Geothermal energy resources are more widespread than previously thought and have the potential for providing a significant amount of sustainable clean energy worldwide. In particular, hot permeable sedimentary formations provide many advantages over traditional geothermal recovery and enhanced geothermal systems in low-permeability crystalline formations. These include: (1) eliminating the need for hydraulic fracturing, (2) significantly reducing the risk of induced seismicity, (3) reducing the need for surface wastewater disposal, (4) contributing to decreases in greenhouse gases, and (5) potential use for CO2 sequestration. Advances in horizontal drilling, completion, and production technology from the oil and gas industry can now be applied to unlock these geothermal resources. Here, we present experimental results from a laboratory-scale circulation system and numerical simulations aimed at quantifying the heat transfer capacity of sedimentary rocks. Our experiments consist of fluid flow through a saturated and pressurized sedimentary disc of 23-cm diameter and 3.8-cm thickness heated along its circumference at a constant temperature. Injection and production ports are 7.6 cm apart in the center of the disc. We used DI de-aired water and mineral oil as working fluids and explored temperatures from 20 to 150 °C and flow rates from 2 to 30 ml/min. We performed experiments on sandstone samples (Castlegate and Kirby) with different porosity, permeability, and thermal conductivity to evaluate the effect of hydraulic and thermal properties on the heat transfer capacity of sediments. The producing-fluid temperature followed an exponential form in time, with transients between 15 and 45 min. Steady-state outflow temperatures varied between 60% and 95% of the set boundary temperature; higher percentages were observed for lower temperatures and flow rates. We used the flow and heat transport simulator TOUGH2 to develop a numerical model of our laboratory setting.
Given the remarkable match between our observations and numerical results, we extended our model to explore a wider range of thermal and hydrological parameters beyond the experimental conditions. Our results demonstrate the capability of sedimentary formations to support heat transfer for geothermal energy production. [Figure: A) Sandstone sample with two thermally insulating Teflon caps (white discs); in and out arrows indicate the flow direction while the sample is heated along its circumference (heater not shown). B) Example of a 2D temperature distribution during injection; white x marks the flow ports, inlet (left) and outlet (right); red is the set boundary temperature and blue is the fluid temperature at the inlet.]
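The exponential transient reported for the producing-fluid temperature can be written T(t) = T_ss − (T_ss − T_0)·exp(−t/τ). The sketch below generates synthetic data from that form and recovers the time constant τ by linearizing; all numerical values are illustrative, not the paper's measurements:

```python
import math

# Exponential approach to steady state, as reported in the abstract:
#   T(t) = T_ss - (T_ss - T0) * exp(-t / tau)
def outflow_temperature(t, T0, T_ss, tau):
    return T_ss - (T_ss - T0) * math.exp(-t / tau)

# Illustrative values only: 20 °C inlet fluid, 80 °C steady-state outflow
# (80% of a 100 °C boundary), and a 30-minute time constant.
T0, T_ss, tau = 20.0, 80.0, 30.0
times = [0, 15, 30, 45, 60, 90, 120]  # minutes
temps = [outflow_temperature(t, T0, T_ss, tau) for t in times]

# Recover tau from two samples by linearizing:
#   ln(T_ss - T) = ln(T_ss - T0) - t / tau
t1, t2 = 15, 45
tau_est = (t2 - t1) / math.log((T_ss - temps[times.index(t1)]) /
                               (T_ss - temps[times.index(t2)]))
```

With noise-free synthetic data the recovered time constant matches the input exactly; with measured transients, a least-squares fit over all samples would be used instead.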
Potential release of fibers from burning carbon composites. [aircraft fires
NASA Technical Reports Server (NTRS)
Bell, V. L.
1980-01-01
A comprehensive experimental carbon fiber source program was conducted to determine the potential for the release of conductive carbon fibers from burning composites. Laboratory testing determined the relative importance of several parameters influencing the amounts of single fibers released, while large-scale aviation jet fuel pool fires provided realistic confirmation of the laboratory data. The dimensions and size distributions of fire-released carbon fibers were determined, not only for those of concern in an electrical sense, but also for those of potential interest from a health and environmental standpoint. Fire plume and chemistry studies were performed with large pool fires to provide an experimental input into an analytical modelling of simulated aircraft crash fires. A study of a high voltage spark system resulted in a promising device for the detection, counting, and sizing of electrically conductive fibers, for both active and passive modes of operation.
High-fidelity nursing simulation: impact on student self-confidence and clinical competence.
Blum, Cynthia A; Borglund, Susan; Parcells, Dax
2010-01-01
Development of safe nursing practice in entry-level nursing students requires special consideration from nurse educators. The paucity of data supporting the effectiveness of high-fidelity patient simulation in this population informed the development of a quasi-experimental, quantitative study of the relationship between simulation and student self-confidence and clinical competence. Moreover, the study reports a novel approach to measuring the self-confidence and competence of entry-level nursing students. Fifty-three baccalaureate students, enrolled in either a traditional or a simulation-enhanced laboratory, participated during their first clinical rotation. Student self-confidence and faculty perception of student clinical competence were measured using selected scale items of the Lasater Clinical Judgment Rubric. The results indicated an overall improvement in self-confidence and competence across the semester; however, simulation did not significantly enhance these caring attributes. The study highlights the need for further examination of teaching strategies developed to promote the transfer of self-confidence and competence from the laboratory to the clinical setting.
Properties important to mixing and simulant recommendations for WTP full-scale vessel testing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Poirier, M. R.; Martino, C. J.
2015-12-01
Full Scale Vessel Testing (FSVT) is being planned by Bechtel National, Inc., to demonstrate the ability of the standard high solids vessel design (SHSVD) to meet mixing requirements over the range of fluid properties planned for processing in the Pretreatment Facility (PTF) of the Hanford Waste Treatment and Immobilization Plant (WTP). Testing will use simulated waste rather than actual Hanford waste. Therefore, the use of suitable simulants is critical to achieving the goals of the test program. WTP personnel requested the Savannah River National Laboratory (SRNL) to assist with the development of simulants for use in FSVT. Among the tasks assigned to SRNL was to develop a list of waste properties that are important to pulse-jet mixer (PJM) performance in WTP vessels with elevated concentrations of solids.
Fritt-Rasmussen, Janne; Brandvik, Per Johan
2011-08-01
This paper compares the ignitability of Troll B crude oil weathered under simulated Arctic conditions (0%, 50%, and 90% ice cover). The experiments were performed at different scales at SINTEF's laboratories in Trondheim, at a field research station on Svalbard, and in broken ice (70-90% ice cover) in the Barents Sea. Samples from the weathering experiments were tested for ignitability using the same laboratory burning cell. The ignitability measured in these experiments at different scales showed good agreement for samples with similar weathering. The ice conditions clearly affected the weathering process, and 70% ice cover or more reduces weathering and allows a longer time window for in situ burning. The results from the Barents Sea revealed that weathering and ignitability can vary within an oil slick. This field use of the burning cell demonstrated that it can be used as an operational tool to monitor the ignitability of oil spills. Copyright © 2011 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thoma, C.; Welch, D. R.; Hsu, S. C.
2013-08-15
We describe numerical simulations, using the particle-in-cell (PIC) and hybrid-PIC code LSP [T. P. Hughes et al., Phys. Rev. ST Accel. Beams 2, 110401 (1999)], of the head-on merging of two laboratory supersonic plasma jets. The goals of these experiments are to form and study astrophysically relevant collisionless shocks in the laboratory. Using the plasma jet initial conditions (density ∼10^14-10^16 cm^-3, temperature ∼ a few eV, and propagation speed ∼20-150 km/s), large-scale simulations of jet propagation demonstrate that interactions between the two jets are essentially collisionless at the merge region. In highly resolved one- and two-dimensional simulations, we show that collisionless shocks are generated by the merging jets when immersed in applied magnetic fields (B ∼ 0.1-1 T). At expected plasma jet speeds of up to 150 km/s, our simulations do not give rise to unmagnetized collisionless shocks, which require much higher velocities. The orientation of the magnetic field and the axial and transverse density gradients of the jets have a strong effect on the nature of the interaction. We compare some of our simulation results with those of previously published PIC simulation studies of collisionless shock formation.
Simulation Framework for Intelligent Transportation Systems
DOT National Transportation Integrated Search
1996-10-01
A simulation framework has been developed for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System. The simulator is designed for running on parallel computers and distributed (networked) computer systems, but ca...
THE FORMATION OF INORGANIC PARTICLES DURING SUSPENSION HEATING OF SIMULATED WASTES
Measurements of metal partitioning between the fine condensation aerosol and the larger particles produced during rapid heating of aqueous and organic solutions containing metal additives with widely varying volatilities were made in a laboratory-scale furnace operated over a ran...
Xiangyang Zhou; Shankar Mahalingam; David Weise
2007-01-01
This paper presents a combined study of laboratory scale fire spread experiments and a three-dimensional large eddy simulation (LES) to analyze the effect of terrain slope on marginal burning behavior in live chaparral shrub fuel beds. Line fire was initiated in single species fuel beds of four common chaparral plants under various fuel bed configurations and ambient...
Findings from the Supersonic Qualification Program of the Mars Science Laboratory Parachute System
NASA Technical Reports Server (NTRS)
Sengupta, Anita; Steltzner, Adam; Witkowski, Allen; Candler, Graham; Pantano, Carlos
2009-01-01
In 2012, the Mars Science Laboratory (MSL) mission will deploy NASA's largest extraterrestrial parachute, a technology integral to the safe landing of its advanced robotic explorer on the surface. The supersonic parachute system is a mortar-deployed 21.5 m disk-gap-band (DGB) parachute, identical in geometric scaling to the Viking-era DGB parachutes of the 1970s. The MSL parachute deployment conditions are Mach 2.3 at a dynamic pressure of 750 Pa. The Viking Balloon Launched Decelerator Test (BLDT) successfully demonstrated a maximum of 700 Pa at Mach 2.2 for a 16.1 m DGB parachute in its AV4 flight. All previous Mars deployments have derived their supersonic qualification from the Viking BLDT test series, obviating the need for full-scale high-altitude supersonic testing. The qualification programs for the Mars Pathfinder, Mars Exploration Rover, and Phoenix Scout missions were all limited to subsonic structural qualification, with supersonic performance and survivability bounded by the BLDT qualification. The MSL parachute, at the edge of the supersonic heritage deployment space and 33% larger than the Viking parachute, accepts a certain degree of risk by not addressing the supersonic environment in which it will deploy. In addition, MSL will spend up to 10 seconds above Mach 1.5, an aerodynamic regime associated with a known parachute instability characterized by significant fluctuation of the canopy projected area and dynamic drag variation. This aerodynamic instability, referred to as "area oscillation" by the parachute community, has drag performance, inflation stability, and structural implications, introducing risk to mission success if not quantified for the MSL parachute system. To minimize this risk, and as an alternative to a prohibitively expensive high-altitude test program, a multi-phase qualification program using computational simulation validated by subscale testing was developed and implemented for MSL.
The first phase consisted of 2%-of-full-scale supersonic wind tunnel testing of a rigid DGB parachute with an entry vehicle to validate two high-fidelity computational fluid dynamics (CFD) tools. The computer codes utilized Large Eddy Simulation and Detached Eddy Simulation numerical approaches to accurately capture the turbulent wake of the entry vehicle and its coupling to the parachute bow shock. The second phase was the development of fluid-structure interaction (FSI) computational tools to predict parachute response to the supersonic flow field. The FSI development included the integration of the CFD from the first phase with a finite element structural model of the parachute membrane and cable elements. In this phase, a 4%-of-full-scale supersonic flexible parachute test program was conducted to provide validation data for the FSI code and an empirical dataset of the MSL parachute in a flight-like environment. The final phase is FSI simulations of the full-scale MSL parachute in a Mars-type deployment. Findings from this program will be presented in terms of code development and validation, empirical findings from the supersonic testing, and drag performance during supersonic operation.
The Reel Deal In 3D: The Spatio-Temporal Evolution of YSO Jets
NASA Astrophysics Data System (ADS)
Frank, Adam
2014-10-01
Jets are a ubiquitous phenomenon in astrophysics, though in most cases their central engines are unresolvable. Thus the structure of the jets often acts as a proxy for understanding the objects creating them. Jets are also of interest in their own right, serving as critical examples of rapidly evolving astrophysical magnetized plasma systems. And while millions of CPU hours (at least) have been spent simulating the kinds of astrophysical plasma dynamics that occur routinely in jets, we rarely have had the chance to study their real-time evolution. In this proposal we seek to use a unique multi-epoch HST dataset of protostellar jets to carry forward an innovative theoretical, numerical, and laboratory-based study of magnetized outflows and the plasma processes that determine their evolution. Our work will make direct and detailed contact with these HST data sets and will articulate newly observed features of jet dynamics that have not been possible to explore before. Using numerical simulations and laboratory plasma studies, we seek to articulate the full 3D nature of new behaviors seen in the HST data. Our collaboration includes the use of scaled laboratory plasma experiments with hypersonic magnetized radiative jets. The MHD experiments have explored how jets break up into clumps via kink-mode instabilities. Such experiments are therefore directly relevant to the initial conditions in our models.
Laboratory meter-scale seismic monitoring of varying water levels in granular media
NASA Astrophysics Data System (ADS)
Pasquet, S.; Bodet, L.; Bergamo, P.; Guérin, R.; Martin, R.; Mourgues, R.; Tournat, V.
2016-12-01
Laboratory physical modelling and non-contacting ultrasonic techniques are frequently proposed to tackle theoretical and methodological issues related to geophysical prospecting. Following recent developments illustrating the ability of seismic methods to image spatial and/or temporal variations of water content in the vadose zone, we developed laboratory experiments aimed at testing the sensitivity of seismic measurements (i.e., pressure-wave travel times and surface-wave phase velocities) to water saturation variations. Ultrasonic techniques were used to simulate typical seismic acquisitions on small-scale controlled granular media presenting different water levels. Travel time and phase velocity measurements obtained in the dry state were validated with both theoretical models and numerical simulations and serve as reference datasets. The increasing water level clearly affects the recorded wave field in both its phase and amplitude, but the collected data cannot yet be inverted in the absence of a comprehensive theoretical model for such partially saturated and unconsolidated granular media. The differences in travel time and phase velocity observed between the dry and wet models show patterns that coincide with the observed water level and the depth of the capillary fringe, thus offering attractive perspectives for studying soil water content variations in the field.
Role of Laboratory Plasma Experiments in exploring the Physics of Solar Eruptions
NASA Astrophysics Data System (ADS)
Tripathi, S.
2017-12-01
Solar eruptive events are triggered over a broad range of spatio-temporal scales by a variety of fundamental processes (e.g., force imbalance, magnetic reconnection, electrical-current-driven instabilities) associated with arched magnetoplasma structures in the solar atmosphere. Contemporary research on solar eruptive events is at the forefront of solar and heliospheric physics due to its relevance to space weather. Details on the formation of magnetized plasma structures on the Sun, the storage of magnetic energy in such structures over long periods (several Alfven transit times), and their impulsive eruptions have been recorded in numerous observations and simulated in computer models. Inherent limitations of space observations and the uncontrolled nature of solar eruptions pose significant challenges in testing theoretical models and developing predictive capability for space weather. The pace of scientific progress in this area can be significantly boosted by tapping the potential of appropriately scaled laboratory plasma experiments to complement solar observations, theoretical models, and computer simulations. As an example, recent results from a laboratory plasma experiment on arched magnetic flux ropes will be presented and future challenges will be discussed. (Work supported by the National Science Foundation, USA, under award number 1619551.)
Wang, Yongjiang; Pang, Li; Liu, Xinyu; Wang, Yuansheng; Zhou, Kexun; Luo, Fei
2016-04-01
A comprehensive model of thermal balance and degradation kinetics was developed to determine the optimal reactor volume and insulation material. Biological heat production and five channels of heat loss were considered in the thermal balance model for a representative reactor. Degradation kinetics were developed to make the model applicable to different types of substrates. Simulation of the model showed that the internal energy accumulation of the compost was the most significant heat-loss channel, followed by heat loss through the reactor wall and the latent heat of water evaporation. A lower proportion of heat loss occurred through the reactor wall when the reactor volume was larger. Insulating materials with low densities and low conductive coefficients were more desirable for building small reactor systems. The model developed could be used to determine the optimal reactor volume and insulation material needed before the fabrication of a lab-scale composting system. Copyright © 2016 Elsevier Ltd. All rights reserved.
Optimal output fast feedback in two-time scale control of flexible arms
NASA Technical Reports Server (NTRS)
Siciliano, B.; Calise, A. J.; Jonnalagadda, V. R. P.
1986-01-01
Control of lightweight flexible arms moving along predefined paths can be successfully synthesized on the basis of a two-time scale approach. A model following control can be designed for the reduced order slow subsystem. The fast subsystem is a linear system in which the slow variables act as parameters. The flexible fast variables which model the deflections of the arm along the trajectory can be sensed through strain gage measurements. For full state feedback design the derivatives of the deflections need to be estimated. The main contribution of this work is the design of an output feedback controller which includes a fixed order dynamic compensator, based on a recent convergent numerical algorithm for calculating LQ optimal gains. The design procedure is tested by means of simulation results for the one link flexible arm prototype in the laboratory.
Agaoglu, Berken; Scheytt, Traugott; Copty, Nadim K
2012-10-01
This study examines the mechanistic processes governing multiphase flow of a water-cosolvent-NAPL system in saturated porous media. Laboratory batch and column flushing experiments were conducted to determine the equilibrium properties of pure NAPL and synthetically prepared NAPL mixtures as well as NAPL recovery mechanisms for different water-ethanol contents. The effect of contact time was investigated by considering different steady and intermittent flow velocities. A modified version of the multiphase flow simulator UTCHEM was used to compare the multiphase model simulations with the column experiment results. The effect of employing different grid geometries (1D, 2D, 3D), heterogeneity and different initial NAPL saturation configurations was also examined in the model. It is shown that the change in velocity affects the mass transfer rate between phases as well as the ultimate NAPL recovery percentage. The experiments with low flow rate flushing of pure NAPL and the 3D UTCHEM simulations gave similar effluent concentrations and NAPL cumulative recoveries. Model simulations overestimated NAPL recovery for high specific discharges and rate-limited mass transfer, suggesting that a constant mass transfer coefficient for the entire flushing experiment may not be valid. When multi-component NAPLs are present, the dissolution rate of individual organic compounds (namely, toluene and benzene) into the ethanol-water flushing solution is found not to correlate with their equilibrium solubility values. Copyright © 2012 Elsevier B.V. All rights reserved.
Solar simulator for concentrator photovoltaic systems.
Domínguez, César; Antón, Ignacio; Sala, Gabriel
2008-09-15
A solar simulator for measuring the performance of large-area concentrator photovoltaic (CPV) modules is presented. Its illumination system is based on a xenon flash lamp and a large-area collimator mirror, which simulates natural sunlight. The quality requirements imposed by CPV systems have been characterized: irradiance level and uniformity at the receiver, light collimation and spectral distribution. The simulator allows fast and cost-effective indoor performance characterization and classification of CPV systems at the production line, as well as module rating carried out by laboratories.
Characteristics of coking coal burnout
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nakamura, M.; Bailey, J.G.
An attempt was made to clarify the characteristics of coking coal burnout by the morphological analysis of char and fly ash samples. Laboratory-scale combustion testing, simulating an ignition process, was carried out for three kinds of coal (two coking coals and one non-coking coal for reference), and sampled chars were analyzed for size, shape and type by image analysis. The full combustion process was examined in industrial-scale combustion testing for the same kinds of coal. Char sampled at the burner outlet and fly ash at the furnace exit were also analyzed. The differences in char type, swelling properties, agglomeration, anisotropy and carbon burnout were compared at laboratory scale and at industrial scale. As a result, it was found that coking coals produced chars with relatively thick walls, which mainly impeded char burnout, especially for low-volatile coals.
Building laboratory capacity to support HIV care in Nigeria: Harvard/APIN PEPFAR, 2004-2012.
Hamel, Donald J; Sankalé, Jean-Louis; Samuels, Jay Osi; Sarr, Abdoulaye D; Chaplin, Beth; Ofuche, Eke; Meloni, Seema T; Okonkwo, Prosper; Kanki, Phyllis J
From 2004-2012, the Harvard/AIDS Prevention Initiative in Nigeria, funded through the US President's Emergency Plan for AIDS Relief programme, scaled up HIV care and treatment services in Nigeria. We describe the methodologies and collaborative processes developed to significantly improve laboratory capacity in a resource-limited setting. These methods were implemented at 35 clinic and laboratory locations. Systems were established and modified to optimise numerous laboratory processes. These included strategies for clinic selection and management, equipment and reagent procurement, supply chains, laboratory renovations, equipment maintenance, electronic data management, quality development programmes and training. Over the eight-year programme, laboratories supported 160 000 patients receiving HIV care in Nigeria, delivering over 2.5 million test results, including regular viral load quantitation. External quality assurance systems were established for CD4+ cell count enumeration, blood chemistries and viral load monitoring. Laboratory equipment platforms were improved and standardised, and the use of point-of-care analysers was expanded. Laboratory training workshops increased staff skills and improved overall quality. Participation in a World Health Organisation-led African laboratory quality improvement system resulted in significant gains in quality measures at five laboratories. Targeted implementation of laboratory development processes, during simultaneous scale-up of HIV treatment programmes in a resource-limited setting, can elicit meaningful gains in laboratory quality and capacity. Systems to improve the physical laboratory environment, develop laboratory staff, reduce costs and increase quality are available for future health and laboratory strengthening programmes. We hope that the strategies employed may inform and encourage the development of other laboratories in resource-limited settings.
SURFACTANT - POLYMER INTERACTION FOR IMPROVED OIL RECOVERY
DOE Office of Scientific and Technical Information (OSTI.GOV)
Unknown
1998-10-01
The goal of this research is to use the interaction between a surfactant and a polymer for efficient displacement of tertiary oil by improving slug integrity, adsorption and mobility control. Surfactant-polymer flooding has been shown to be highly effective in laboratory-scale linear floods. The focus of this proposal is to design an inexpensive surfactant-polymer mixture that can efficiently recover tertiary oil by avoiding surfactant slug degradation, high adsorption, and viscous/heterogeneity fingering. A mixture comprising a 'pseudo oil' with appropriate surfactant and polymer has been selected to study micellar-polymer chemical flooding. The physical properties and phase behavior of this system have been determined. A surfactant-polymer slug has been designed to achieve high-efficiency recovery by improving phase behavior and mobility control. Recovery experiments have been performed on linear cores and a quarter 5-spot. The same recovery experiments have been simulated using a commercially available simulator (UTCHEM). Good agreement between experimental data and simulation results has been achieved.
Modelling Iron-Bentonite Interactions
NASA Astrophysics Data System (ADS)
Watson, C.; Savage, D.; Benbow, S.; Wilson, J.
2009-04-01
The presence of both iron canisters and bentonitic clay in some engineered barrier system (EBS) designs for the geological disposal of high-level radioactive wastes creates the potential for chemical interactions which may impact upon the long-term performance of the clay as a barrier to radionuclide migration. Flooding of potential radionuclide sorption sites on the clay by ferrous ions and conversion of clay to non-swelling sheet silicates (e.g. berthierine) are two possible outcomes deleterious to long-term performance. Laboratory experimental studies of the corrosion of iron in clay show that corrosion product layers are generally thin (< 1 µm), with magnetite, siderite, or 'green rust' occurring depending upon temperature and ambient partial pressure of carbon dioxide. In theory, incorporation of iron into clay alteration products could act as a 'pump' to accelerate corrosion. However, the results of laboratory experiments to characterise the products of iron-bentonite interaction are equivocal. The type and amounts of solid products appear to be strong functions of time, temperature, water/clay ratio, and clay and pore fluid compositions. For example, the products of high-temperature experiments (> 250 °C) are dominated by chlorite, whereas lower temperatures produce berthierine, odinite, cronstedtite, or Fe-rich smectite. Unfortunately, the inevitably short-term nature of laboratory experimental studies introduces issues of metastability and kinetics. The sequential formation in time of minerals in natural systems often produces phases not predicted by equilibrium thermodynamics. Evidence from analogous natural systems suggests that the sequence of alteration of clay by Fe-rich fluids will proceed via an Ostwald step sequence.
The computer code, QPAC, has been modified to incorporate processes of nucleation, growth, precursor cannibalisation, and Ostwald ripening to address the issues of the slow growth of bentonite alteration products. This, together with inclusion of processes of iron corrosion and diffusion, has enabled investigation of a representative model of the alteration of bentonite in a typical EBS environment. Simulations with fixed mineral surface areas show that berthierine dominates the solid product assemblage, with siderite replacing it at simulation times greater than 10 000 years. Simulations with time-dependent mineral surface areas show a sequence of solid alteration products, described by: magnetite -> cronstedtite -> berthierine -> chlorite. Using plausible estimates of mineral-fluid interfacial free energies, chlorite growth is not achieved until 5 000 years of simulation time. The results of this modelling work suggest that greater emphasis should be placed upon methods to up-scale the results of laboratory experiments to timescales of relevance to performance assessment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bucknor, Matthew; Hu, Rui; Lisowski, Darius
2016-04-17
The Reactor Cavity Cooling System (RCCS) is an important passive safety system being incorporated into the overall safety strategy for high-temperature advanced reactor concepts such as the High Temperature Gas-Cooled Reactor (HTGR). The Natural Convection Shutdown Heat Removal Test Facility (NSTF) at Argonne National Laboratory (Argonne) reflects a 1/2-scale model of the primary features of one conceptual air-cooled RCCS design. The project conducts ex-vessel, passive heat removal experiments in support of the Department of Energy Office of Nuclear Energy's Advanced Reactor Technology (ART) program, while also generating data for code validation purposes. While experiments are being conducted at the NSTF to evaluate the feasibility of the passive RCCS, parallel modeling and simulation efforts are ongoing to support the design, fabrication, and operation of these natural convection systems. Both system-level and high-fidelity computational fluid dynamics (CFD) analyses were performed to gain a complete understanding of the complex flow and heat transfer phenomena in natural convection systems. This paper provides a summary of the RELAP5-3D NSTF model development efforts and provides comparisons between simulation results and experimental data from the NSTF. Overall, the simulation results compared favorably to the experimental data; however, further analyses are needed to investigate the identified differences.
Principles of control for robotic excavation
NASA Astrophysics Data System (ADS)
Bernold, Leonhard E.
The issues of automatic planning and control systems for robotic excavation are addressed. Attention is given to an approach to understanding the principles of path and motion control which is based on scaled modeling and experimentation with different soil types and soil conditions. Control concepts for the independent control of a bucket are discussed, and ways in which force sensors could provide the necessary data are demonstrated. Results of experiments with lunar simulant showed that explosive loosening has a substantial impact on the energy needed during excavation. It is argued that through further laboratory and field research, 'pattern languages' for different excavators and soil conditions could be established and employed for robotic excavation.
Extremely Scalable Spiking Neuronal Network Simulation Code: From Laptops to Exascale Computers.
Jordan, Jakob; Ippen, Tammo; Helias, Moritz; Kitayama, Itaru; Sato, Mitsuhisa; Igarashi, Jun; Diesmann, Markus; Kunkel, Susanne
2018-01-01
State-of-the-art software tools for neuronal network simulations scale to the largest computing systems available today and enable investigations of large-scale networks of up to 10 % of the human cortex at a resolution of individual neurons and synapses. Due to an upper limit on the number of incoming connections of a single neuron, network connectivity becomes extremely sparse at this scale. To manage computational costs, simulation software ultimately targeting the brain scale needs to fully exploit this sparsity. Here we present a two-tier connection infrastructure and a framework for directed communication among compute nodes accounting for the sparsity of brain-scale networks. We demonstrate the feasibility of this approach by implementing the technology in the NEST simulation code and we investigate its performance in different scaling scenarios of typical network simulations. Our results show that the new data structures and communication scheme prepare the simulation kernel for post-petascale high-performance computing facilities without sacrificing performance in smaller systems.
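The memory argument above hinges on each compute node storing only the connections whose targets it owns. A toy sketch of that idea (illustrative only, not NEST's actual two-tier data structures; the class, method names, and round-robin ownership rule are hypothetical):

```python
from collections import defaultdict

class LocalConnectionTable:
    """Each rank keeps only connections whose target neuron it owns, so
    per-rank memory scales with local out-degree, not total network size."""
    def __init__(self, rank, n_ranks):
        self.rank = rank
        self.n_ranks = n_ranks
        self._targets = defaultdict(list)  # source gid -> local target gids

    def connect(self, source, target):
        # store the connection only on the rank that owns the target
        if target % self.n_ranks == self.rank:
            self._targets[source].append(target)

    def local_targets(self, source):
        return self._targets.get(source, [])

# two ranks sharing a tiny network; rank 0 owns even gids, rank 1 odd gids
tables = [LocalConnectionTable(r, 2) for r in range(2)]
for src, tgt in [(0, 1), (0, 2), (1, 3)]:
    for t in tables:
        t.connect(src, tgt)
```

In a real simulator a second tier would map synapse types and parameters per connection; the point here is only the ownership-based partitioning.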
Laboratory Scale Coal And Biomass To Drop-In Fuels (CBDF) Production And Assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lux, Kenneth; Imam, Tahmina; Chevanan, Nehru
This Final Technical Report describes the work and accomplishments of the project entitled, “Laboratory Scale Coal and Biomass to Drop-In Fuels (CBDF) Production and Assessment.” The main objective of the project was to fabricate and test a lab-scale liquid-fuel production system using coal containing different percentages of biomass such as corn stover and switchgrass at a rate of 2 liters per day. The system utilizes the patented Altex fuel-production technology, which incorporates advanced catalysts developed by Pennsylvania State University. The system was designed, fabricated, tested, and assessed for economic and environmental feasibility relative to competing technologies.
Experiments and simulation of a net closing mechanism for tether-net capture of space debris
NASA Astrophysics Data System (ADS)
Sharf, Inna; Thomsen, Benjamin; Botta, Eleonora M.; Misra, Arun K.
2017-10-01
This research addresses the design and testing of a debris containment system for use in a tether-net approach to space debris removal. Tether-net active debris removal involves the ejection of a net from a spacecraft by applying impulses to masses on the net, subsequent expansion of the net, the envelopment and capture of the debris target, and the de-orbiting of the debris via a tether to the chaser spacecraft. To ensure a debris removal mission's success, it is important that the debris be successfully captured and then secured within the net. To this end, we present a concept for a net closing mechanism, which we believe will permit consistently successful debris capture via a simple and unobtrusive design. This net closing system functions by extending the main tether connecting the chaser spacecraft and the net vertex to the perimeter and around the perimeter of the net, allowing the tether to actuate closure of the net in a manner similar to a cinch cord. A particular embodiment of the design in a laboratory test-bed is described: the test-bed comprises a scaled-down tether-net, a supporting frame and a mock-up debris. Experiments conducted with the facility demonstrate the practicality of the net closing system. A model of the net closure concept has been integrated into the previously developed dynamics simulator of the chaser/tether-net/debris system. Simulations under tether tensioning conditions demonstrate the effectiveness of the closure concept for debris containment, in the gravity-free environment of space, for a realistic debris target. The on-ground experimental test-bed is also used to showcase its utility for validating the dynamics simulation of the net deployment, and a full-scale automated setup would make possible a range of validation studies of other aspects of a tether-net debris capture mission.
Computer Modeling of the Earliest Cellular Structures and Functions
NASA Technical Reports Server (NTRS)
Pohorille, Andrew; Chipot, Christophe; Schweighofer, Karl
2000-01-01
In the absence of an extinct or extant record of protocells (the earliest ancestors of contemporary cells), the most direct way to test our understanding of the origin of cellular life is to construct laboratory models of protocells. Such efforts are currently underway in the NASA Astrobiology Program. They are accompanied by computational studies aimed at explaining the self-organization of simple molecules into ordered structures and developing designs for molecules that perform proto-cellular functions. Many of these functions, such as import of nutrients, capture and storage of energy, and response to changes in the environment, are carried out by proteins bound to membranes. We will discuss a series of large-scale, molecular-level computer simulations which demonstrate (a) how small proteins (peptides) organize themselves into ordered structures at water-membrane interfaces and insert into membranes, (b) how these peptides aggregate to form membrane-spanning structures (e.g., channels), and (c) by what mechanisms such aggregates perform essential proto-cellular functions, such as the transport of protons across cell walls, a key step in cellular bioenergetics. The simulations were performed using the molecular dynamics method, in which Newton's equations of motion for each particle in the system are solved iteratively. The problems of interest required simulations on multi-nanosecond time scales, which corresponded to 10(exp 6)-10(exp 8) time steps.
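The iterative solution of Newton's equations mentioned above can be sketched with a minimal velocity Verlet integrator, the standard scheme in molecular dynamics (a generic illustration; the harmonic force law and parameters are placeholders, not taken from the study):

```python
import math

def velocity_verlet(x, v, force, mass, dt, steps):
    """Iteratively solve Newton's equation m*a = F(x) (velocity Verlet)."""
    a = force(x) / mass
    for _ in range(steps):
        x += v * dt + 0.5 * a * dt * dt   # position update
        a_new = force(x) / mass           # force at the new position
        v += 0.5 * (a + a_new) * dt       # velocity update
        a = a_new
    return x, v

# Harmonic 'bond' with k = m = 1: the analytic period is 2*pi, so after
# one period the particle should return close to its starting state.
dt = 1e-3
x, v = velocity_verlet(1.0, 0.0, lambda x: -x, 1.0, dt, int(2 * math.pi / dt))
```

Production MD codes apply the same update to millions of particles per step, which is why nanosecond trajectories cost 10^6-10^8 such steps.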
NASA Technical Reports Server (NTRS)
Groleau, Nicolas; Frainier, Richard; Colombano, Silvano; Hazelton, Lyman; Szolovits, Peter
1993-01-01
This paper describes portions of a novel system called MARIKA (Model Analysis and Revision of Implicit Key Assumptions) to automatically revise a model of the normal human orientation system. The revision is based on analysis of discrepancies between experimental results and computer simulations. The discrepancies are calculated from qualitative analysis of quantitative simulations. The experimental and simulated time series are first discretized in time segments. Each segment is then approximated by linear combinations of simple shapes. The domain theory and knowledge are represented as a constraint network. Incompatibilities detected during constraint propagation within the network yield both parameter and structural model alterations. Interestingly, MARIKA diagnosed a data set from the Massachusetts Eye and Ear Infirmary Vestibular Laboratory as abnormal though the data was tagged as normal. Published results from other laboratories confirmed the finding. These encouraging results could lead to a useful clinical vestibular tool and to a scientific discovery system for space vestibular adaptation.
Performance evaluation of a kinesthetic-tactual display
NASA Technical Reports Server (NTRS)
Jagacinski, R. J.; Flach, J. M.; Gilson, R. D.; Dunn, R. S.
1982-01-01
Simulator studies demonstrated the feasibility of using kinesthetic-tactual (KT) displays for providing collective and cyclic command information, and suggested that KT displays may increase pilots' workload capacity. A dual-axis laboratory tracking task suggested that, beyond the reduction in visual scanning, there may be additional sensory or cognitive benefits to the use of multiple sensory modalities. Single-axis laboratory tracking tasks revealed performance with a quickened KT display to be equivalent to performance with a quickened visual display for a low-frequency sum-of-sinewaves input. In contrast, an unquickened KT display was inferior to an unquickened visual display. Full-scale simulator studies and/or in-flight testing are recommended to determine the generality of these results.
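Quickening, as used above, conventionally augments the displayed tracking error with a scaled rate term so the operator effectively sees a short-horizon prediction of the error. A minimal sketch (the gain value is hypothetical, for illustration only):

```python
def quickened_signal(error, error_rate, k=0.5):
    """Quickened display signal: the shown error includes a scaled
    derivative term, approximating where the error is heading.
    The gain k is an illustrative value, not one from the study."""
    return error + k * error_rate

# an error that is already shrinking (negative rate) is displayed as
# smaller than its current value, discouraging over-correction
shown = quickened_signal(1.0, -0.8)
```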
Assessment of the Mars Science Laboratory Entry, Descent, and Landing Simulation
NASA Technical Reports Server (NTRS)
Way, David W.; Davis, J. L.; Shidner, Jeremy D.
2013-01-01
On August 5, 2012, the Mars Science Laboratory rover, Curiosity, successfully landed inside Gale Crater. This landing was only the seventh successful landing and fourth rover to be delivered to Mars. Weighing nearly one metric ton, Curiosity is the largest and most complex rover ever sent to investigate another planet. Safely landing such a large payload required an innovative Entry, Descent, and Landing system, which included the first guided entry at Mars, the largest supersonic parachute ever flown at Mars, and a novel and untested Sky Crane landing system. A complete, end-to-end, six degree-of-freedom, multi-body computer simulation of the Mars Science Laboratory Entry, Descent, and Landing sequence was developed at the NASA Langley Research Center. In-flight data gathered during the successful landing is compared to pre-flight statistical distributions, predicted by the simulation. These comparisons provide insight into both the accuracy of the simulation and the overall performance of the vehicle.
Preliminary Assessment of the Mars Science Laboratory Entry, Descent, and Landing Simulation
NASA Technical Reports Server (NTRS)
Way, David W.
2013-01-01
On August 5, 2012, the Mars Science Laboratory rover, Curiosity, successfully landed inside Gale Crater. This landing was only the seventh successful landing and fourth rover to be delivered to Mars. Weighing nearly one metric ton, Curiosity is the largest and most complex rover ever sent to investigate another planet. Safely landing such a large payload required an innovative Entry, Descent, and Landing system, which included the first guided entry at Mars, the largest supersonic parachute ever flown at Mars, and a novel and untested Sky Crane landing system. A complete, end-to-end, six degree-of-freedom, multibody computer simulation of the Mars Science Laboratory Entry, Descent, and Landing sequence was developed at the NASA Langley Research Center. In-flight data gathered during the successful landing is compared to pre-flight statistical distributions, predicted by the simulation. These comparisons provide insight into both the accuracy of the simulation and the overall performance of the vehicle.
Methods to Prescribe Particle Motion to Minimize Quadrature Error in Meshfree Methods
NASA Astrophysics Data System (ADS)
Templeton, Jeremy; Erickson, Lindsay; Morris, Karla; Poliakoff, David
2015-11-01
Meshfree methods are an attractive approach for simulating material systems undergoing large-scale deformation, such as spray break-up, free-surface flows, and droplets. Particles, which can be easily moved, are used as nodes and/or quadrature points rather than relying on a fixed mesh. Most methods move particles according to the local fluid velocity, which allows the convection terms in the Navier-Stokes equations to be easily accounted for. However, this is a trade-off against numerical accuracy, as the flow can often move particles into configurations with high quadrature error, and artificial compressibility is often required to prevent particles from forming undesirable regions of high and low concentration. In this work, we consider the other side of the trade-off: moving particles based on reducing numerical error. Methods derived from molecular dynamics show that particles can be moved to minimize a surrogate for the solution error, resulting in substantially more accurate simulations at a fixed cost. Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the United States Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
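One crude 1D illustration of error-surrogate-driven particle motion (not the authors' method): relax interior quadrature nodes toward the midpoints of their neighbours, driving spacing non-uniformity, a simple proxy for quadrature error with equal-weight rules, toward zero:

```python
import statistics

def relax_nodes(x, iters=200, step=0.5):
    """Jacobi-style relaxation: move each interior node part-way toward
    the midpoint of its neighbours, with the endpoints held fixed.
    The spacing variance (an error surrogate) decreases toward zero."""
    x = sorted(x)
    for _ in range(iters):
        x = [x[0]] + [xi + step * (0.5 * (lo + hi) - xi)
                      for lo, xi, hi in zip(x[:-2], x[1:-1], x[2:])] + [x[-1]]
    return x

x0 = [0.0, 0.01, 0.02, 0.03, 1.0]   # badly clustered quadrature nodes
x1 = relax_nodes(x0)                # relaxed toward uniform spacing
```

Real meshfree schemes minimize a problem-dependent surrogate in 2D/3D while balancing against the convective motion, but the mechanism is the same: reposition particles to reduce a computable stand-in for the solution error.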
Fluid dynamics structures in a fire environment observed in laboratory-scale experiments
J. Lozano; W. Tachajapong; D.R. Weise; S. Mahalingam; M. Princevac
2010-01-01
Particle Image Velocimetry (PIV) measurements were performed in laboratory-scale experimental fires spreading across horizontal fuel beds composed of aspen (Populus tremuloides Michx) excelsior. The continuous flame, intermittent flame, and thermal plume regions of a fire were investigated. Utilizing a PIV system, instantaneous velocity fields for...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilke, Jeremiah J; Kenny, Joseph P.
2015-02-01
Discrete event simulation provides a powerful mechanism for designing and testing new extreme-scale programming models for high-performance computing. Rather than debug, run, and wait for results on an actual system, designs can first iterate through a simulator. This is particularly useful when test beds cannot be used, i.e. to explore hardware or scales that do not yet exist or are inaccessible. Here we detail the macroscale components of the structural simulation toolkit (SST). Instead of depending on trace replay or state machines, the simulator is architected to execute real code on real software stacks. Our particular user-space threading framework allows massive scales to be simulated even on small clusters. The link between the discrete event core and the threading framework allows interesting performance metrics like call graphs to be collected from a simulated run. Performance analysis via simulation can thus become an important phase in extreme-scale programming model and runtime system design via the SST macroscale components.
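The discrete event core such simulators build on can be sketched in a few lines (a generic illustration of the technique, not SST's API): a time-ordered priority queue of events, where executing an event advances the virtual clock and may schedule further events.

```python
import heapq

class Simulator:
    """Minimal discrete-event core: pop the earliest event, advance the
    virtual clock to its timestamp, run it (which may schedule more)."""
    def __init__(self):
        self.now = 0.0
        self._queue = []
        self._seq = 0  # tie-breaker so equal-time events keep FIFO order

    def schedule(self, delay, action):
        heapq.heappush(self._queue, (self.now + delay, self._seq, action))
        self._seq += 1

    def run(self):
        while self._queue:
            self.now, _, action = heapq.heappop(self._queue)
            action(self)

log = []
sim = Simulator()
sim.schedule(2.0, lambda s: log.append(("b", s.now)))
sim.schedule(1.0, lambda s: log.append(("a", s.now)))
sim.run()
# events execute in timestamp order regardless of scheduling order
```

SST's macroscale components layer user-space threads over such a core so that real application code, not a state machine, produces the events.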
Toward Exascale Earthquake Ground Motion Simulations for Near-Fault Engineering Analysis
Johansen, Hans; Rodgers, Arthur; Petersson, N. Anders; ...
2017-09-01
Modernizing SW4 for massively parallel time-domain simulations of earthquake ground motions in 3D earth models increases resolution and provides ground motion estimates for critical infrastructure risk evaluations. Simulations of ground motions from large (M ≥ 7.0) earthquakes require domains on the order of 100 to 500 km and spatial granularity on the order of 1 to 5 m, resulting in hundreds of billions of grid points. Surface-focused structured mesh refinement (SMR) allows for more nearly constant grid-points-per-wavelength scaling in typical Earth models, where wavespeeds increase with depth. In fact, SMR allows simulations to double the frequency content relative to a fixed-grid calculation on a given resource. The authors report improvements to the SW4 algorithm developed while porting the code to the Cori Phase 2 (Intel Xeon Phi) systems at the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory. Investigations of the performance of the innermost loop of the calculations found that reorganizing the order of operations can improve performance for massive problems.
Multi-scale modeling in cell biology
Meier-Schellersheim, Martin; Fraser, Iain D. C.; Klauschen, Frederick
2009-01-01
Biomedical research frequently involves performing experiments and developing hypotheses that link different scales of biological systems, for instance the scale of intracellular molecular interactions to the scale of cellular behavior and beyond to the behavior of cell populations. Computational modeling efforts that aim to explore such multi-scale systems quantitatively with the help of simulations have to incorporate several different simulation techniques because of the different time and space scales involved. Here, we provide a non-technical overview of how different scales of experimental research can be combined with the appropriate computational modeling techniques. We also show that current modeling software permits building and simulating multi-scale models without having to engage with the underlying technical details of computational modeling. PMID:20448808
Capabilities of the Large-Scale Sediment Transport Facility
2016-04-01
experiments in wave/current environments. INTRODUCTION: The LSTF (Figure 1) is a large-scale laboratory facility capable of simulating conditions... comparable to low-wave-energy coasts. The facility was constructed to address deficiencies in existing methods for calculating longshore sediment... transport. The LSTF consists of a 30 m wide, 50 m long, 1.4 m deep basin. Waves are generated by four digitally controlled wave makers capable of producing
Assessing sorbent injection mercury control effectiveness in flue gas streams
Carey, T.R.; Richardson, C.F.; Chang, R.; Meserole, F.B.; Rostam-Abadi, M.; Chen, S.
2000-01-01
One promising approach for removing mercury from coal-fired utility flue gas involves the direct injection of mercury sorbents. Although this method has been effective at removing mercury in municipal waste incinerators, tests conducted to date on utility coal-fired boilers show that mercury removal is much more difficult in utility flue gas. EPRI is conducting research to investigate mercury removal using sorbents in this application. Bench-scale, pilot-scale, and field tests have been conducted to determine the ability of different sorbents to remove mercury from simulated and actual flue gas streams. This paper focuses on recent bench-scale and field test results evaluating the adsorption characteristics of activated carbon and fly ash, and on the use of these results to develop a predictive mercury removal model. Field tests with activated carbon show that adsorption characteristics measured in the lab agree reasonably well with those measured in the field. However, more laboratory and field data will be needed to identify other gas-phase components that may affect performance. This will allow laboratory tests to better simulate field conditions and provide improved estimates of sorbent performance for specific sites. In addition to the activated carbon results, bench-scale and modeling results using fly ash are presented which suggest that certain fly ashes are capable of adsorbing mercury.
Hydrodynamic parameters estimation from self-potential data in a controlled full scale site
NASA Astrophysics Data System (ADS)
Chidichimo, Francesco; De Biase, Michele; Rizzo, Enzo; Masi, Salvatore; Straface, Salvatore
2015-03-01
A multi-physical approach developed for the hydrodynamic characterization of porous media using hydrogeophysical information is presented. Several pumping tests were performed in the Hydrogeosite Laboratory, a controlled full-scale site designed and constructed at the CNR-IMAA (Consiglio Nazionale delle Ricerche - Istituto di Metodologie per l'Analisi Ambientale) in Marsico Nuovo (Basilicata Region, Southern Italy), in order to obtain an intermediate stage between laboratory experiments and field surveys. The facility consists of a pool used to study water infiltration processes, to simulate the space and time dynamics of subsurface contamination phenomena, to improve and find new relationships between geophysical and hydrogeological parameters, and to test and calibrate new geophysical techniques and instruments. The Hydrogeosite Laboratory therefore offers the advantage of controlled experiments, as in a flow cell or sandbox, but at a field-comparable scale. The data collected during the experiments have been used to estimate the saturated hydraulic conductivity ks [m s-1] using a coupled inversion model working in transient conditions, made up of the modified Richards equation, describing water flow in a variably saturated porous medium, and the Poisson equation, providing the self-potential ϕ [V] that naturally occurs at points of the soil surface owing to the electric field produced by the motion of underground electrolytic fluids through porous systems. The result obtained by this multi-physical numerical approach, which removes the approximations adopted in previous works, provides a useful instrument for characterizing real heterogeneous aquifers and for predictive analysis of their behavior.
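For reference, the two governing equations named above are commonly written as follows (standard textbook forms; the paper's exact formulation, boundary conditions, and coupling coefficients may differ):

```latex
% Mixed-form Richards equation for water flow in a variably saturated medium,
% with water content \theta, pressure head h, hydraulic conductivity K(h)
\frac{\partial \theta(h)}{\partial t} = \nabla \cdot \left[ K(h)\,\nabla(h+z) \right]

% Poisson equation linking the self-potential \varphi to the Darcy flux \mathbf{u},
% with bulk electrical conductivity \sigma and excess charge density \bar{Q}_v
\nabla \cdot \left( \sigma \nabla \varphi \right) = \nabla \cdot \left( \bar{Q}_v\,\mathbf{u} \right)
```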
Evaluation of acoustic testing techniques for spacecraft systems
NASA Technical Reports Server (NTRS)
Cockburn, J. A.
1971-01-01
External acoustic environments, structural responses, noise reductions, and the internal acoustic environments have been predicted for a typical shroud/spacecraft system during lift-off and various critical stages of flight. Spacecraft responses caused by energy transmission from the shroud via mechanical and acoustic paths have been compared and the importance of the mechanical path has been evaluated. Theoretical predictions have been compared extensively with available laboratory and in-flight measurements. Equivalent laboratory acoustic fields for simulation of shroud response during the various phases of flight have been derived and compared in detail. Techniques for varying the time-space correlations of laboratory acoustic fields have been examined, together with methods for varying the time and spatial distribution of acoustic amplitudes. Possible acoustic testing configurations for shroud/spacecraft systems have been suggested and trade-off considerations have been reviewed. The problem of simulating the acoustic environments versus simulating the structural responses has been considered and techniques for testing without the shroud installed have been discussed.
Performance of the fusion code GYRO on four generations of Cray computers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fahey, Mark R
2014-01-01
GYRO is a code used for the direct numerical simulation of plasma microturbulence. It has been ported to a variety of modern MPP platforms, including several commodity clusters, IBM SPs, and Cray XC, XT, and XE series machines. We briefly describe the mathematical structure of the equations, the data layout, and the redistribution scheme. While the performance and scaling of GYRO on many of these systems has been shown before, here we show the comparative performance and scaling on four generations of Cray supercomputers, including the newest addition, the Cray XC30. The recently added hybrid OpenMP/MPI implementation also shows a great deal of promise on custom HPC systems that utilize fast CPUs and proprietary interconnects. Four machines of varying sizes were used in the experiment, all located at the National Institute for Computational Sciences at the University of Tennessee at Knoxville and Oak Ridge National Laboratory. The advantages, limitations, and performance of each system are discussed.
Rivard, C J; Duff, B W; Dickow, J H; Wiles, C C; Nagle, N J; Gaddy, J L; Clausen, E C
1998-01-01
Early evaluations of the bioconversion potential for combined wastes such as tuna sludge and sorted municipal solid waste (MSW) were conducted at laboratory scale and compared conventional low-solids, stirred-tank anaerobic systems with the novel high-solids anaerobic digester (HSAD) design. Enhanced feedstock conversion rates and yields were determined for the HSAD system. In addition, the HSAD system demonstrated superior resiliency to process failure. Utilizing relatively dry feedstocks, the HSAD system is approximately one-tenth the size of conventional low-solids systems. In addition, the HSAD system is capable of organic loading rates (OLRs) on the order of 20-25 g volatile solids per liter of digester volume per day (g VS/L/d), roughly 4-5 times those of conventional systems. Current efforts involve developing a demonstration-scale (pilot-scale) HSAD system. A two-ton/d plant has been constructed in Stanton, CA, and is currently in the commissioning/startup phase. The purposes of the project are to verify laboratory- and intermediate-scale process performance; test the performance of large-scale prototype mechanical systems; demonstrate the long-term reliability of the process; and generate the process and economic data required for the design, financing, and construction of full-scale commercial systems. This study presents confirmatory fermentation data obtained at intermediate scale and a snapshot of the pilot-scale project.
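The reported loading rates imply the volume savings directly; a quick back-of-the-envelope check (illustrative numbers only, assuming equal daily volatile-solids throughput and ignoring the additional savings from drier feedstock):

```python
# At a fixed daily volatile-solids (VS) throughput, required digester volume
# scales inversely with organic loading rate (OLR). Numbers are illustrative.

vs_load = 1000.0               # g VS per day (arbitrary throughput)
olr_hsad = 22.5                # g VS/L/d, midpoint of the reported 20-25 range
olr_conventional = 5.0         # g VS/L/d, ~1/4.5 of the HSAD value

vol_hsad = vs_load / olr_hsad            # required HSAD volume, liters
vol_conv = vs_load / olr_conventional    # required conventional volume, liters

ratio = vol_conv / vol_hsad
print(f"HSAD needs {ratio:.1f}x less volume at equal throughput")
```

The OLR difference alone accounts for a factor of 4-5; the further reduction to one-tenth size comes from the much lower water content of the dry feedstock.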
NASA Astrophysics Data System (ADS)
Vanderborght, J.; Javaux, M.; Couvreur, V.; Schröder, N.; Huber, K.; Abesha, B.; Schnepf, A.; Vereecken, H.
2013-12-01
Plant roots play a crucial role in several key processes in soils. Besides their impact on biogeochemical cycles and processes, they also have an important influence on physical processes such as water flow and transport of dissolved substances in soils. Interaction between plant roots and soil processes takes place at different scales and ranges from the scale of an individual root and its directly surrounding soil or rhizosphere over the scale of a root system of an individual plant in a soil profile to the scale of vegetation patterns in landscapes. Simulation models that are used to predict water flow and solute transport in soil-plant systems mainly focus on the individual plant root system scale, parameterize single-root scale phenomena, and aggregate the root system scale to the vegetation scale. In this presentation, we will focus on the transition from the single root to the root system scale. Using high resolution non-invasive imaging techniques and methods, gradients in soil properties and states around roots and their difference from the bulk soil properties could be demonstrated. Recent developments in plant sciences provide new insights in the mechanisms that control water fluxes in plants and in the adaptation of root properties or root plasticity to changing soil conditions. However, since currently used approaches to simulate root water uptake neither resolve these small scale processes nor represent processes and controls within the root system, transferring this information to the whole soil-plant system scale is a challenge. Using a simulation model that describes flow and transport processes in the soil, resolves flow and transport towards individual roots, and describes flow and transport within the root system, such a transfer could be achieved. 
We present a few examples that illustrate: (i) the impact of changed rhizosphere hydraulic properties, (ii) the effect of root hydraulic properties and root system architecture, (iii) the regulation of plant transpiration by root-zone-produced plant hormones, and (iv) the impact of salt accumulation at the soil-root interface on root water uptake. We further propose a framework for how this process knowledge could be implemented in root-zone simulation models that do not resolve small-scale processes.
Energy Evaluation of a New Construction Pilot Community: Fresno, California
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burdick, A.; Poerschke, A.; Rapport, A.
2014-06-01
A new construction pilot community was constructed by builder-partner Wathen-Castanos Hybrid Homes (WCHH) based on a single occupied test house that was designed to achieve greater than 30% energy savings with respect to the House Simulation Protocols (Hendron, Robert; Engebrecht, Cheryn (2010). Building America House Simulation Protocols. Golden, CO: National Renewable Energy Laboratory). Builders face several key problems when implementing a whole-house systems integrated measures package (SIMP) from a single test house into multiple houses. Although a technical solution may already have been evaluated and validated in an individual test house, the potential exists for constructability failures at the community scale. This report addresses factors of implementation and scalability at the community scale and proposes methodologies by which community-scale energy evaluations can be performed based on results at the occupied test house level. Research focused on the builder and trade implementation of a SIMP and the actual utility usage in the houses at the community scale of production. Five occupants participated in this community-scale research by providing utility bills and information on occupancy and miscellaneous gas and electric appliance use for their houses. IBACOS used these utility data and background information to analyze the actual energy performance of the houses. Verification with measured data is an important component of predictive energy modeling. The actual utility bill readings were compared to projected energy consumption using BEopt with actual weather and thermostat set points for normalization.
NASA Technical Reports Server (NTRS)
Jansen, B. J., Jr.
1998-01-01
The features of the data acquisition and control systems of the NASA Langley Research Center's Jet Noise Laboratory are presented. The Jet Noise Laboratory is a facility that simulates realistic mixed flow turbofan jet engine nozzle exhaust systems in simulated flight. The system is capable of acquiring data for a complete take-off assessment of noise and nozzle performance. This paper describes the development of an integrated system to control and measure the behavior of model jet nozzles featuring dual independent high pressure combusting air streams with wind tunnel flow. The acquisition and control system is capable of simultaneous measurement of forces, moments, static and dynamic model pressures and temperatures, and jet noise. The design concepts for the coordination of the control computers and multiple data acquisition computers and instruments are discussed. The control system design and implementation are explained, describing the features, equipment, and the experiences of using a primarily Personal Computer based system. Areas for future development are examined.
Large Eddy Simulations of Colorless Distributed Combustion Systems
NASA Astrophysics Data System (ADS)
Abdulrahman, Husam F.; Jaberi, Farhad; Gupta, Ashwani
2014-11-01
Development of efficient and low-emission colorless distributed combustion (CDC) systems for gas turbine applications requires careful examination of the role of various flow and combustion parameters. Numerical simulations of CDC in a laboratory-scale combustor have been conducted to carefully examine the effects of these parameters on the CDC. The computational model is based on a hybrid modeling approach combining large eddy simulation (LES) with the filtered mass density function (FMDF) equations, solved with high-order numerical methods and complex chemical kinetics. The simulated combustor operates on the principle of high-temperature air combustion (HiTAC) and has been shown to significantly reduce NOx and CO emissions while improving the reaction pattern factor and stability, without using any flame stabilizer and with low pressure drop and noise. The focus of the current work is to investigate the mixing of air and hydrocarbon fuels and the non-premixed and premixed reactions within the combustor via LES/FMDF with reduced chemical kinetic mechanisms, for the same flow conditions and configurations investigated experimentally. The main goal is to develop better CDC with higher mixing and efficiency, ultra-low emission levels, and optimum residence time. The computational results establish the consistency and reliability of LES/FMDF and its Lagrangian-Eulerian numerical methodology.
NASA Astrophysics Data System (ADS)
Li, Xin; Song, Weiying; Yang, Kai; Krishnan, N. M. Anoop; Wang, Bu; Smedskjaer, Morten M.; Mauro, John C.; Sant, Gaurav; Balonis, Magdalena; Bauchy, Mathieu
2017-08-01
Although molecular dynamics (MD) simulations are commonly used to predict the structure and properties of glasses, they are intrinsically limited to short time scales, necessitating the use of fast cooling rates. It is therefore challenging to compare results from MD simulations to experimental results for glasses cooled on typical laboratory time scales. Based on MD simulations of a sodium silicate glass with varying cooling rate (from 0.01 to 100 K/ps), here we show that thermal history primarily affects the medium-range order structure, while the short-range order is largely unaffected over the range of cooling rates simulated. This results in a decoupling between the enthalpy and volume relaxation functions, where the enthalpy quickly plateaus as the cooling rate decreases, whereas density exhibits a slower relaxation. Finally, we show that, using the proper extrapolation method, the outcomes of MD simulations can be meaningfully compared to experimental values when extrapolated to slower cooling rates.
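The extrapolation step can be sketched as a linear fit in the logarithm of cooling rate; the data below are synthetic stand-ins for MD results, and the slope is invented for illustration:

```python
import numpy as np

# Glass properties often vary roughly linearly with log10(cooling rate) near
# the simulated range, so a fit in log-rate can be extrapolated toward
# laboratory rates (~1 K/min). Densities below are fake, for illustration only.

rates = np.array([0.01, 0.1, 1.0, 10.0, 100.0])   # K/ps, as in the study
density = 2.30 - 0.01 * np.log10(rates)           # synthetic "MD" densities, g/cm^3

slope, intercept = np.polyfit(np.log10(rates), density, 1)

lab_rate = 1.0 / 60e12                            # ~1 K/min expressed in K/ps
predicted = slope * np.log10(lab_rate) + intercept
# slower cooling -> denser, better-relaxed glass under this (assumed) trend
```

Whether a linear-in-log extrapolation is valid over ~14 decades of rate is exactly the kind of question the paper's "proper extrapolation method" addresses; this sketch only shows the mechanics.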
Design of full-scale adsorption systems typically includes expensive and time-consuming pilot studies to simulate full-scale adsorber performance. Accordingly, the rapid small-scale column test (RSSCT) was developed and evaluated experimentally. The RSSCT can simulate months of f...
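One commonly cited RSSCT design relation (the constant-diffusivity case; shown here as background, not necessarily this report's exact method) scales the small column's empty-bed contact time by the square of the particle-diameter ratio:

```python
# Constant-diffusivity RSSCT scaling: the small column's empty-bed contact
# time (EBCT) shrinks with the square of the particle-diameter ratio, which
# is what compresses months of full-scale operation into days of lab testing.

def rssct_ebct(ebct_large_min, d_large_mm, d_small_mm):
    """EBCT for the small column given the full-scale EBCT and both
    adsorbent particle diameters (constant-diffusivity assumption)."""
    return ebct_large_min * (d_small_mm / d_large_mm) ** 2

# e.g. a 10-min full-scale EBCT with 1.0 mm carbon, scaled to 0.1 mm grains
print(f"{rssct_ebct(10.0, 1.0, 0.1):.2f} min")  # → 0.10 min
```

A proportional-diffusivity variant (ratio to the first power) is also used in practice; the choice depends on how intraparticle diffusivity varies with particle size for the system at hand.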
Using a Laboratory Simulator in the Teaching and Study of Chemical Processes in Estuarine Systems
ERIC Educational Resources Information Center
Garcia-Luque, E.; Ortega, T.; Forja, J. M.; Gomez-Parra, A.
2004-01-01
The teaching of Chemical Oceanography in the Faculty of Marine and Environmental Sciences of the University of Cadiz (Spain) has been improved since 1994 by the employment of a device for the laboratory simulation of estuarine mixing processes and the characterisation of the chemical behaviour of many substances that pass through an estuary. The…
Thermoelectric temperature control system for the pushbroom microwave radiometer (PBMR)
NASA Technical Reports Server (NTRS)
Dillon-Townes, L. A.; Averill, R. D.
1984-01-01
A closed-loop thermoelectric temperature control system is developed for stabilizing sensitive RF integrated circuits within a microwave radiometer to an accuracy of ±0.1 °C over a range of ambient conditions from -20 °C to +45 °C. The dual-mode (heating and cooling) control concept utilizes partial thermal isolation of the RF units from an instrument deck which is thermally controlled by thermoelectric coolers and thin film heaters. The temperature control concept is simulated with a thermal analyzer program (MITAS) which consists of 37 nodes and 61 conductors. A full-scale thermal mockup is tested in the laboratory at temperatures of 0 °C, 21 °C, and 45 °C to confirm the validity of the control concept. A flight radiometer and temperature control system is successfully flight tested on the NASA Skyvan aircraft.
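A minimal sketch of such a dual-mode loop, assuming an invented first-order thermal plant and PI gains (none of these numbers come from the paper):

```python
# Dual-mode (heat/cool) PI control sketch: a signed drive u selects TEC
# cooling (u < 0) or film heating (u > 0). Plant model and gains are invented.

def simulate(T0, T_set, T_amb, steps=500, dt=1.0,
             tau=50.0, gain=0.05, kp=2.0, ki=0.05):
    """Return the deck temperature after `steps` seconds of PI control of a
    first-order plant that drifts toward ambient with time constant tau."""
    T, integ = T0, 0.0
    for _ in range(steps):
        e = T_set - T
        integ += e * dt
        u = kp * e + ki * integ          # signed drive: >0 heat, <0 cool
        T += dt * ((T_amb - T) / tau + gain * u)
    return T

# the integral term removes the steady-state offset that a proportional-only
# loop would leave, allowing tight regulation from either side of ambient
```

The real system adds partial thermal isolation so the actuators only have to reject a fraction of the ambient swing; here that is folded into the plant gain.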
Mission and Objectives for the X-1 Advanced Radiation Source*
NASA Astrophysics Data System (ADS)
Rochau, Gary E.; Ramirez, Juan J.; Raglin, Paul S.
1998-11-01
The X-1 Advanced Radiation Source represents the next step in providing the U.S. Department of Energy's Stockpile Stewardship Program with a high-energy, large-volume laboratory x-ray source for the Radiation Effects Science and Simulation, Inertial Confinement Fusion, and Weapon Physics Programs. Advances in fast pulsed power technology and in z-pinch hohlraums on Sandia National Laboratories' Z Accelerator provide a sufficient basis for pursuing the development of X-1. The X-1 plan follows a strategy based on scaling the 2 MJ x-ray output on Z via a 3-fold increase in z-pinch load current. The large-volume (>5 cm3), high-temperature (>150 eV), temporally long (>10 ns) hohlraums are unique outside of underground nuclear weapon testing. Analytical scaling arguments and hydrodynamic simulations indicate that these hohlraums at temperatures of 230-300 eV will ignite thermonuclear fuel and drive the reaction to a yield of 200 to 1,200 MJ in the laboratory. Non-ignition sources will provide cold x-ray environments (<15 keV) and high-yield fusion burn sources will provide high-fidelity warm x-ray environments (15 keV-80 keV). This paper introduces the X-1 Advanced Radiation Source Facility Project and describes the project mission, objectives, and preliminary schedule.
A Unique Software System For Simulation-to-Flight Research
NASA Technical Reports Server (NTRS)
Chung, Victoria I.; Hutchinson, Brian K.
2001-01-01
"Simulation-to-Flight" is a research development concept to reduce costs and increase testing efficiency of future major aeronautical research efforts at NASA. The simulation-to-flight concept is achieved by using common software, hardware, procedures, and processes for both piloted simulation and flight testing. This concept was applied to the design and development of two full-size transport simulators, a research system installed on a NASA B-757 airplane, and two supporting laboratories. This paper describes the software system that supports the simulation-to-flight facilities. Examples of various simulation-to-flight experimental applications are also provided.
NASA Astrophysics Data System (ADS)
Tessier, Frederic
Microfluidic and nanofluidic technology is revolutionizing experimental practices in analytical chemistry, molecular biology and medicine. Indeed, the development of systems of small dimensions for the processing of fluids heralds the miniaturization of traditional, cumbersome laboratory equipment onto robust, portable and efficient microchip devices (similar to the electronic microchips found in computers). Moreover, the conjunction of scale between the smallest man-made device and the largest macromolecules evolved by Nature is fertile ground for the blooming of our knowledge about the key processes of life. In fact, the conjunction is threefold, because modern computational resources also allow us to contemplate a rather explicit modelling of physical systems between the nanoscale and the microscale. In the five articles comprising this thesis, we present the results of computer simulations that address specific questions concerning the operation of two different model systems relevant to the development of small-scale fluidic devices for the manipulation and analysis of biomolecules. First, we use a Bond-Fluctuation Monte Carlo approach to study the electrophoretic drift of macromolecules across an entropic trap array built for the length separation of long, double-stranded DNA molecules. We show that the motion of the molecules is consistent with a simple balance between electric and entropic forces, in terms of a single characteristic parameter. We also extract detailed information on polymer deformation during migration, predict the separation of topoisomers, and investigate innovative ratchet driving regimes. Secondly, we present theoretical derivations, numerical calculations and Molecular Dynamics simulation results for an electrolyte confined in a capillary of nanoscopic dimensions. 
In particular, we study the effectiveness of neutral grafted polymer chains in reducing the magnitude of electroosmotic flow (fluid flow induced by an external electric field). Our results constitute the first independent, quantitative verification of theoretical scaling predictions for the coupling between grafted macromolecules and electroosmotic flow. Such simulations will contribute to the rationalization of the existing empirical knowledge about flow control with polymer coatings.
A pilot study of surgical training using a virtual robotic surgery simulator.
Tergas, Ana I; Sheth, Sangini B; Green, Isabel C; Giuntoli, Robert L; Winder, Abigail D; Fader, Amanda N
2013-01-01
Our objectives were to compare the utility of learning a suturing task on the virtual reality da Vinci Skills Simulator versus the da Vinci Surgical System dry laboratory platform and to assess user satisfaction among novice robotic surgeons. Medical trainees were enrolled prospectively; one group trained on the virtual reality simulator, and the other group trained on the da Vinci dry laboratory platform. Trainees received pretesting and post-testing on the dry laboratory platform. Participants then completed an anonymous online user experience and satisfaction survey. We enrolled 20 participants. Mean pretest completion times did not significantly differ between the 2 groups. Training with either platform was associated with a similar decrease in mean time to completion (simulator platform group, 64.9 seconds [P = .04]; dry laboratory platform group, 63.9 seconds [P < .01]). Most participants (58%) preferred the virtual reality platform. The majority found the training "definitely useful" in improving robotic surgical skills (mean, 4.6) and would attend future training sessions (mean, 4.5). Training on the virtual reality robotic simulator or the dry laboratory robotic surgery platform resulted in significant improvements in time to completion and economy of motion for novice robotic surgeons. Although there was a perception that both simulators improved performance, there was a preference for the virtual reality simulator. Benefits unique to the simulator platform include autonomy of use, computerized performance feedback, and ease of setup. These features may facilitate more efficient and sophisticated simulation training above that of the conventional dry laboratory platform, without loss of efficacy.
Multiscale Modeling in the Clinic: Drug Design and Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clancy, Colleen E.; An, Gary; Cannon, William R.
A wide range of length and time scales are relevant to pharmacology, especially in drug development, drug design, and drug delivery. Therefore, multi-scale computational modeling and simulation methods and paradigms that advance the linkage of phenomena occurring at these multiple scales have become increasingly important. Multi-scale approaches present in silico opportunities to advance laboratory research to bedside clinical applications in pharmaceuticals research. This is achievable through the capability of modeling to reveal phenomena occurring across multiple spatial and temporal scales that are not otherwise readily accessible to experimentation. The resultant models, when validated, are capable of making testable predictions to guide drug design and delivery. In this review we describe the goals, methods, and opportunities of multi-scale modeling in drug design and development. We demonstrate the impact of multiple scales of modeling in this field. We indicate the common mathematical techniques employed for multi-scale modeling approaches used in pharmacology and present several examples illustrating the current state of the art regarding drug development for: excitable systems (heart); cancer (metastasis and differentiation); cancer (angiogenesis and drug targeting); metabolic disorders; and inflammation and sepsis. We conclude with a focus on barriers to successful clinical translation of drug development, drug design, and drug delivery multi-scale models.
1980-05-01
Engineering research laboratory report: COMPARISON OF BUILDING LOADS ANALYSIS AND SYSTEM THERMODYNAMICS (BLAST) COMPUTER PROGRAM... Building Loads Analysis and System Thermodynamics (BLAST) computer program. A dental clinic and a battalion headquarters and classroom building were... Building and HVAC System Data; Computer Simulation; Comparison of Actual and Simulated Results; ANALYSIS AND FINDINGS
NASA Technical Reports Server (NTRS)
Jefferson, David; Beckman, Brian
1986-01-01
This paper describes the concept of virtual time and its implementation in the Time Warp Operating System at the Jet Propulsion Laboratory. Virtual time is a distributed synchronization paradigm that is appropriate for distributed simulation, database concurrency control, real time systems, and coordination of replicated processes. The Time Warp Operating System is targeted toward the distributed simulation application and runs on a 32-node JPL Mark II Hypercube.
Fate of Salmonella Typhimurium in laboratory-scale drinking water biofilms.
Schaefer, L M; Brözel, V S; Venter, S N
2013-12-01
Investigations were carried out to evaluate and quantify colonization of laboratory-scale drinking water biofilms by a chromosomally green fluorescent protein (gfp)-tagged strain of Salmonella Typhimurium. Gfp encodes the green fluorescent protein, allowing in situ detection of undisturbed cells, and is thus ideally suited for monitoring Salmonella in biofilms. The fate and persistence of non-typhoidal Salmonella in simulated drinking water biofilms were investigated, examining both the ability of Salmonella to form biofilms in monoculture and its fate and persistence in a mixed aquatic biofilm. In monoculture, S. Typhimurium formed loosely structured biofilms. Salmonella colonized established multi-species drinking water biofilms within 24 hours, forming micro-colonies within the biofilm. S. Typhimurium was also released at high levels from the drinking-water-associated biofilm into the water passing through the system. This indicated that Salmonella could enter into, survive and grow within, and be released from a drinking water biofilm. The ability of Salmonella to survive and persist in a drinking water biofilm, and to be released at high levels into the flow for recolonization elsewhere, indicates a potential persistent health risk to consumers once a network becomes contaminated with this bacterium.
Nonlinear plasma wave models in 3D fluid simulations of laser-plasma interaction
NASA Astrophysics Data System (ADS)
Chapman, Thomas; Berger, Richard; Arrighi, Bill; Langer, Steve; Banks, Jeffrey; Brunner, Stephan
2017-10-01
Simulations of laser-plasma interaction (LPI) under inertial confinement fusion (ICF) conditions require multi-mm spatial scales, set by the typical laser beam size, and durations of order 100 ps for numerical laser reflectivities to converge. To be computationally achievable, these scales necessitate a fluid-like treatment of light and plasma waves with a spatial grid size on the order of the light wavelength. Plasma waves experience many nonlinear phenomena not naturally described by a fluid treatment, such as trapping-induced frequency shifts, a nonlinear (typically suppressed) Landau damping, and mode couplings leading to instabilities that can cause the plasma wave to decay rapidly. These processes affect the onset and saturation of stimulated Raman and Brillouin scattering, and are of direct interest to the modeling and prediction of deleterious LPI in ICF. It is not currently computationally feasible to simulate these Debye-length-scale phenomena in 3D across experimental scales. Analytically derived and/or numerically benchmarked models of processes occurring at scales finer than the fluid simulation grid offer a path forward. We demonstrate the impact of a range of kinetic processes on plasma reflectivity via models included in the LPI simulation code pF3D. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
Thosar, Archana; Patra, Amit; Bhattacharyya, Souvik
2008-07-01
Design of a nonlinear control system for a Variable Air Volume Air Conditioning (VAVAC) plant through feedback linearization is presented in this article. VAVAC systems attempt to reduce building energy consumption while maintaining the primary role of air conditioning. The temperature of the space is held constant by establishing a balance between the cooling load generated in the space and the air supply delivered to meet the load. The dynamic model of a VAVAC plant is derived and formulated as a MIMO bilinear system, and feedback linearization is applied to decouple and linearize the nonlinear model. Simulation results for a laboratory-scale plant are presented to demonstrate the potential of maintaining comfort while achieving energy-optimal performance with this methodology. Results obtained with a conventional PI controller and with the feedback linearizing controller are compared, and the superiority of the proposed approach is clearly established.
Atmospheric and oceanographic research review, 1979
NASA Technical Reports Server (NTRS)
1980-01-01
Papers generated by atmospheric, oceanographic, and climatological research performed during 1979 at the Goddard Laboratory for Atmospheric Sciences are presented. The GARP/global weather research is aimed at developing techniques for the utilization and analysis of the FGGE data sets. Observing system studies were aimed at developing a GLAS TIROS N sounding retrieval system and preparing for the joint NOAA/NASA AMTS simulation study. The climate research objective is to support the development and effective utilization of space acquired data systems by developing the GLAS GCM for short range climate predictions, studies of the sensitivity of climate to boundary conditions, and predictability studies. Ocean/air interaction studies concentrated on the development of models for the prediction of upper ocean currents, temperatures, sea state, mixed layer depths, and upwelling zones, and on studies of the interactions of the atmospheric and oceanic circulation systems on time scales of a month or more.
2017 GTO Project review Laboratory Evaluation of EGS Shear Stimulation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bauer, Stephen J.
The objective and purpose of this research have been to produce laboratory-based experimental and numerical analyses that provide a physics-based understanding of shear stimulation (hydroshearing) phenomena and their evolution during stimulation. Water was flowed along fractures in hot, stressed fractured rock to promote slip. The controlled laboratory experiments provide a high-resolution, high-quality data resource for evaluating analysis methods developed by DOE to assess EGS behavior during this stimulation process. Some segments of the experimental program provide data sets for model input parameters, i.e., material properties, while other segments represent small-scale physical models of an EGS system that may themselves be modeled. The coupled laboratory/analysis project has been a study of the response of a fracture in hot, water-saturated fractured rock experiencing fluid flow under shear stress. Under this condition, the fracture experiences a combination of potential pore-pressure changes and fracture-surface cooling, resulting in slip along the fracture. The laboratory work provides a means to assess the role of hydroshearing in permeability enhancement during reservoir stimulation. Using the laboratory experiments and results to define boundary and input/output conditions of pore pressure, thermal stress, fracture shear deformation, and fluid flow, models were developed and simulations completed by the University of Oklahoma team. The analysis methods are ones used on field-scale problems, and the sophisticated numerical models developed contain parameters present in the field. The analysis results provide insight into the role of fracture slip ("hydroshear") in permeability enhancement. The work will provide valuable input data for evaluating stimulation models, thus helping design effective EGS.
Feng, Sha; Vogelmann, Andrew M.; Li, Zhijin; ...
2015-01-20
Fine-resolution three-dimensional fields have been produced using the Community Gridpoint Statistical Interpolation (GSI) data assimilation system for the U.S. Department of Energy's Atmospheric Radiation Measurement Program (ARM) Southern Great Plains region. The GSI system is implemented in a multi-scale data assimilation framework using the Weather Research and Forecasting model at a cloud-resolving resolution of 2 km. From the fine-resolution three-dimensional fields, large-scale forcing is derived explicitly at grid-scale resolution; a subgrid-scale dynamic component is derived separately, representing subgrid-scale horizontal dynamic processes. Analyses show that the subgrid-scale dynamic component is often a major component of the large-scale forcing for grid scales larger than 200 km. The single-column model (SCM) of the Community Atmospheric Model version 5 (CAM5) is used to examine the impact of the grid-scale and subgrid-scale dynamic components on simulated precipitation and cloud fields associated with a mesoscale convective system. It is found that grid-scale size impacts simulated precipitation, resulting in an overestimation for grid scales of about 200 km but an underestimation for smaller grids. The subgrid-scale dynamic component has an appreciable impact on the simulations, suggesting that both grid-scale and subgrid-scale dynamic components should be considered in the interpretation of SCM simulations.
Using RSSCTs to predict field-scale GAC control of DBP formation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cummings, L.; Summers, R.S.
1994-06-01
The primary objective of this study was to evaluate the use of the rapid small-scale column test (RSSCT) for predicting the control of disinfection by-product (DBP) formation by granular activated carbon (GAC). DBP formation was assessed by using a simulated distribution system (SDS) test and measuring trihalomethanes and total organic halide in the influent and effluent of the laboratory- and field-scale columns. It was observed that, for the water studied, the RSSCTs effectively predicted the non-adsorbable fraction, the time to 50 percent breakthrough, and the shape of the breakthrough curve for DBP formation. The advantage of RSSCTs is that conclusions about the amenability of a GAC for DBP control can be reached in a short time period instead of at the end of a long-term pilot study. The authors recommend that similar studies be conducted with a range of source waters because the effectiveness of GAC is site-specific.
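The RSSCT's speed comes from similitude relations (after Crittenden et al.) that shrink the field-scale empty-bed contact time (EBCT) by a power of the carbon particle-size ratio. A minimal sketch of that scaling follows; the function name is ours, and the exponents assume the standard constant-diffusivity and proportional-diffusivity design cases:

```python
def rssct_ebct(ebct_large_min, d_large_mm, d_small_mm, diffusivity="constant"):
    """Scale a field-scale empty-bed contact time (EBCT, minutes) down to
    the rapid small-scale column test, using particle-size similitude:
      constant diffusivity:     EBCT_small/EBCT_large = (d_small/d_large)**2
      proportional diffusivity: EBCT_small/EBCT_large = (d_small/d_large)**1
    """
    ratio = d_small_mm / d_large_mm
    exponent = 2 if diffusivity == "constant" else 1
    return ebct_large_min * ratio ** exponent
```

For instance, crushing 1.0 mm field GAC to 0.1 mm shortens a 10-minute field EBCT to 0.1 minutes under the constant-diffusivity design, which is why breakthrough behavior can be observed in days rather than months.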
Anaerobic co-digestion of high-strength organic wastes pretreated by thermal hydrolysis.
Choi, Gyucheol; Kim, Jaai; Lee, Seungyong; Lee, Changsoo
2018-06-01
Thermal hydrolysis (TH) pretreatment was investigated for the anaerobic digestion (AD) of a mixture of high-strength organic wastes (i.e., dewatered human feces, dewatered sewage sludge, and food wastewater) at laboratory scale to simulate a full-scale plant and evaluate its feasibility. The reactors maintained efficient and stable performance at a hydraulic retention time of 20 days, which may not be sufficient for the mesophilic AD of high-suspended-solid wastes, despite temporal variations in organic load. The addition of FeCl3 was effective in controlling H2S and resulted in significant changes in the microbial community structure, particularly among the methanogens. A temporary interruption in feeding or temperature control led to immediate performance deterioration, but performance recovered rapidly when normal operation was resumed. The overall results suggest that an AD process coupled with TH pretreatment can provide an efficient, robust, and resilient system for managing high-suspended-solid wastes, supporting the feasibility of its full-scale implementation.
Groundwater dynamics in a two-dimensional aquifer
NASA Astrophysics Data System (ADS)
Jules, Valentin; Devauchelle, Olivier; Lajeunesse, Eric
2017-11-01
During a rain event, water infiltrates into the ground, where it flows slowly towards a river. The time scale and the geometry of this flow control the chemical composition and the discharge of the river. We use a tank filled with glass beads to simulate this process in a simplified laboratory experiment. A sprinkler pipe generates rain, which infiltrates into the porous material. Groundwater exits this laboratory aquifer through a side of the tank. Guérin et al. (2014) investigated the case of a quasi-horizontal flow. In nature, however, groundwater often follows non-horizontal flowlines. To create a vertical flow, we place the outlet of our experiment high above its bottom. We find that, during rainfall, the discharge Q increases as the rainfall rate R times the square root of time t (Q ∝ R t^(1/2)). This laboratory aquifer thus responds linearly to the forcing. However, long after the rain has stopped, the discharge decreases as the inverse square of time (Q ∝ t^(-2)), although linear systems of finite size typically relax exponentially. We investigate this surprising behavior using a combination of complex analysis and numerical methods.
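The two reported regimes, Q ∝ R t^(1/2) during rainfall and Q ∝ t^(-2) in recession, are pure power laws, so their exponents can be recovered by a straight-line fit in log-log space. The snippet below is purely illustrative: the prefactors are arbitrary, and the synthetic data simply encode the scalings quoted in the abstract.

```python
import numpy as np

def power_law_exponent(t, q):
    """Least-squares slope of log q versus log t (the power-law exponent)."""
    return np.polyfit(np.log(t), np.log(q), 1)[0]

# synthetic discharge obeying the two regimes reported in the experiment
t = np.linspace(1.0, 100.0, 200)
rising = 3.0 * np.sqrt(t)      # Q ~ R * t**(1/2) during rainfall
recession = 5.0 / t**2         # Q ~ t**(-2) long after the rain stops
```

Fitting `rising` returns an exponent of 0.5 and `recession` returns -2, matching the two regimes; applied to measured discharge series, the same fit distinguishes the linear rising response from the anomalous (non-exponential) recession.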
Mars Science Laboratory Rover System Thermal Test
NASA Technical Reports Server (NTRS)
Novak, Keith S.; Kempenaar, Joshua E.; Liu, Yuanming; Bhandari, Pradeep; Dudik, Brenda A.
2012-01-01
On November 26, 2011, NASA launched a large (900 kg) rover as part of the Mars Science Laboratory (MSL) mission to Mars. The MSL rover is scheduled to land on Mars on August 5, 2012. Prior to launch, the Rover was successfully operated in simulated mission extreme environments during a 16-day long Rover System Thermal Test (STT). This paper describes the MSL Rover STT, test planning, test execution, test results, thermal model correlation and flight predictions. The rover was tested in the JPL 25-Foot Diameter Space Simulator Facility at the Jet Propulsion Laboratory (JPL). The Rover operated in simulated Cruise (vacuum) and Mars Surface environments (8 Torr nitrogen gas) with mission extreme hot and cold boundary conditions. A Xenon lamp solar simulator was used to impose simulated solar loads on the rover during a bounding hot case and during a simulated Mars diurnal test case. All thermal hardware was exercised and performed nominally. The Rover Heat Rejection System, a liquid-phase fluid loop used to transport heat in and out of the electronics boxes inside the rover chassis, performed better than predicted. Steady state and transient data were collected to allow correlation of analytical thermal models. These thermal models were subsequently used to predict rover thermal performance for the MSL Gale Crater landing site. Models predict that critical hardware temperatures will be maintained within allowable flight limits over the entire 669 Sol surface mission.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chrzanowski, P; Walter, K
For the Laboratory and staff, 2006 was a year of outstanding achievements. As our many accomplishments in this annual report illustrate, the Laboratory's focus on important problems that affect our nation's security and our researchers' breakthroughs in science and technology have led to major successes. As a national laboratory that is part of the Department of Energy's National Nuclear Security Administration (DOE/NNSA), Livermore is a key contributor to the Stockpile Stewardship Program for maintaining the safety, security, and reliability of the nation's nuclear weapons stockpile. The program has been highly successful, and our annual report features some of the Laboratory's significant stockpile stewardship accomplishments in 2006. A notable example is a long-term study with Los Alamos National Laboratory, which found that weapon pit performance will not sharply degrade from the aging effects on plutonium. The conclusion was based on a wide range of nonnuclear experiments, detailed simulations, theoretical advances, and thorough analyses of the results of past nuclear tests. The study was a superb scientific effort. The continuing success of stockpile stewardship enabled NNSA in 2006 to lay out Complex 2030, a vision for a transformed nuclear weapons complex that is more responsive, cost efficient, and highly secure. One of the ways our Laboratory will help lead this transformation is through the design and development of reliable replacement warheads (RRWs). Compared to current designs, these warheads would have enhanced performance margins and security features and would be less costly to manufacture and maintain in a smaller, modernized production complex. In early 2007, NNSA selected Lawrence Livermore and Sandia National Laboratories-California to develop "RRW-1" for the U.S. Navy.
Design efforts for the RRW, the plutonium aging work, and many other stockpile stewardship accomplishments rely on computer simulations performed on NNSA's Advanced Simulation and Computing (ASC) Program supercomputers at Livermore. ASC Purple and BlueGene/L, the world's fastest computer, together provide nearly a half petaflop (500 trillion operations per second) of computer power for use by the three NNSA national laboratories. Livermore-led teams were awarded the Gordon Bell Prize for Peak Performance in both 2005 and 2006. The winning simulations, run on BlueGene/L, investigated the properties of materials at the length and time scales of atomic interactions. The computing power that makes possible such detailed simulations provides unprecedented opportunities for scientific discovery. Laboratory scientists are meeting the extraordinary challenge of creating experimental capabilities to match the resolution of supercomputer simulations. Working with a wide range of collaborators, we are developing experimental tools that gather better data at the nanometer and subnanosecond scales. Applications range from imaging biomolecules to studying matter at extreme conditions of pressure and temperature. The premier high-energy-density experimental physics facility in the world will be the National Ignition Facility (NIF) when construction is completed in 2009. We are leading the national effort to perform the first fusion ignition experiments using NIF's 192-beam laser and prepare to explore some of the remaining important issues in weapons physics. With scientific colleagues from throughout the nation, we are also designing revolutionary experiments on NIF to advance the fields of astrophysics, planetary physics, and materials science. Mission-directed, multidisciplinary science and technology at Livermore is also focused on reducing the threat posed by the proliferation of weapons of mass destruction as well as their acquisition and use by terrorists. 
The Laboratory helps this important national effort by providing its unique expertise, integration analyses, and operational support to the Department of Homeland Security. For this vital facet of the Laboratory's national security mission, we are developing advanced technologies, such as a pocket-size explosives detector and an airborne persistent surveillance system, both of which earned R&D 100 Awards. Altogether, Livermore won seven R&D 100 Awards in 2006, the most for any organization. Emerging threats to national and global security go beyond defense and homeland security. Livermore pursues major scientific and technical advances to meet the need for a clean environment; clean, abundant energy; better water management; and improved human health. Our annual report highlights the link between human activities and the warming of tropical oceans, as well as techniques for imaging biological molecules and detecting bone cancer in its earliest stages. In addition, we showcase many scientific discoveries: distant planets, the composition of comets, a new superheavy element.
Behavior of U3Si2 Fuel and FeCrAl Cladding under Normal Operating and Accident Reactor Conditions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gamble, Kyle Allan Lawrence; Hales, Jason Dean; Barani, Tommaso
2016-09-01
As part of the Department of Energy's Nuclear Energy Advanced Modeling and Simulation program, an Accident Tolerant Fuel High Impact Problem was initiated at the beginning of fiscal year 2015 to investigate the behavior of U3Si2 fuel and iron-chromium-aluminum (FeCrAl) claddings under normal operating and accident reactor conditions. The High Impact Problem was created in response to the United States Department of Energy's renewed interest in accident-tolerant materials after the events that occurred at the Fukushima Daiichi Nuclear Power Plant in 2011. The High Impact Problem is a multinational laboratory and university collaborative research effort between Idaho National Laboratory, Los Alamos National Laboratory, Argonne National Laboratory, and the University of Tennessee, Knoxville. This report primarily focuses on the engineering-scale research in fiscal year 2016, with brief summaries of the lower-length-scale developments in the areas of density functional theory, cluster dynamics, rate theory, and phase field.
NASA Astrophysics Data System (ADS)
Schlegel, Nicole-Jeanne; Boening, Carmen; Larour, Eric; Limonadi, Daniel; Schodlok, Michael; Seroussi, Helene; Watkins, Michael
2017-04-01
Research and development activities at the Jet Propulsion Laboratory (JPL) currently support the creation of a framework to formally evaluate the observational needs within earth system science. One of the pilot projects of this effort aims to quantify uncertainties in global mean sea level rise projections, due to contributions from the continental ice sheets. Here, we take advantage of established uncertainty quantification tools embedded within the JPL-University of California at Irvine Ice Sheet System Model (ISSM). We conduct sensitivity and Monte-Carlo style sampling experiments on forward simulations of the Greenland and Antarctic ice sheets. By varying internal parameters and boundary conditions of the system over both extreme and credible worst-case ranges, we assess the impact of the different parameter ranges on century-scale sea level rise projections. The results inform efforts to a) isolate the processes and inputs that are most responsible for determining ice sheet contribution to sea level; b) redefine uncertainty brackets for century-scale projections; and c) provide a prioritized list of measurements, along with quantitative information on spatial and temporal resolution, required for reducing uncertainty in future sea level rise projections. Results indicate that ice sheet mass loss is dependent on the spatial resolution of key boundary conditions - such as bedrock topography and melt rates at the ice-ocean interface. This work is performed at and supported by the California Institute of Technology's Jet Propulsion Laboratory. Supercomputing time is also supported through a contract with the National Aeronautics and Space Administration's Cryosphere program.
NASA Astrophysics Data System (ADS)
Yoon, H.; Dewers, T. A.; Valocchi, A. J.; Werth, C. J.
2011-12-01
Dissolved CO2 during geological CO2 storage may react with minerals in fractured rocks or confined aquifers and cause mineral precipitation. The overall rate of reaction can be affected by coupled processes among hydrodynamics, transport, and reactions at the pore scale. Pore-scale models of coupled fluid flow, reactive transport, and CaCO3 precipitation and dissolution are applied to account for transient experimental results of CaCO3 precipitation and dissolution under highly supersaturated conditions in a microfluidic pore network (i.e., micromodel). Pore-scale experiments in the micromodel are used as a basis for understanding the coupled physics of systems perturbed by geological CO2 injection. In the micromodel, precipitation is induced by transverse mixing along the centerline in pore bodies. Overall, the pore-scale model qualitatively captured the governing physics of the reactions, such as precipitate morphology, precipitation rate, and maximum precipitation area in the first few pore spaces. In particular, we found that proper estimation of the effective diffusion coefficient and the reactive surface area is necessary to adequately simulate precipitation and dissolution rates. As the model domain increases, the effect on the overall reaction rate of flow patterns altered by precipitation also increases. The model is also applied to account for the effect of different reaction rate laws on mineral precipitation and dissolution at the pore scale. The reaction rate laws tested include the linear rate law, a nonlinear power law, and a newly developed rate law based on in-situ measurements at the nano scale reported in the literature. Progress on novel methods for upscaling pore-scale models for reactive transport is discussed; these methods are being applied to mineral precipitation patterns observed in natural analogues. H.Y. and T.D. were supported as part of the Center for Frontiers of Subsurface Energy Security, an Energy Frontier Research Center funded by the U.S.
Department of Energy, Office of Science, Office of Basic Energy Sciences under Award Number DE-SC0001114. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
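The linear and power-law rate laws mentioned above are standard forms in geochemical kinetics, relating the precipitation/dissolution rate to the mineral saturation ratio Ω. The sketch below shows only these two generic forms with illustrative parameter values; the newly developed in-situ rate law from the literature is not reproduced here.

```python
def linear_rate(k, omega):
    """Linear (TST-type) rate law: r = k * (Omega - 1).
    omega is the saturation ratio; omega > 1 drives precipitation."""
    return k * (omega - 1.0)

def power_rate(k, omega, n):
    """Nonlinear power-law rate: r = k * (Omega - 1)**n, n > 1."""
    return k * (omega - 1.0) ** n
```

Near equilibrium (Ω slightly above 1) the power law with n > 1 predicts much slower precipitation than the linear law, while far from equilibrium it overtakes it, which is one reason the choice of rate law matters for highly supersaturated micromodel conditions.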
Shock Waves and Defects in Energetic Materials, a Match Made in MD Heaven
NASA Astrophysics Data System (ADS)
Wood, Mitchell; Kittell, David; Yarrington, Cole; Thompson, Aidan
2017-06-01
Shock wave interactions with defects, such as pores, are known to play a key role in the chemical initiation of energetic materials. In this talk, the shock response of Hexanitrostilbene (HNS) is studied through large-scale reactive molecular dynamics (RMD) simulations. These RMD simulations provide a unique opportunity to elucidate mechanisms of viscoplastic pore collapse, which are often neglected in larger-scale hydrodynamic models. A discussion of the macroscopic effects of this viscoplastic material response, such as its role in hot spot formation and eventual initiation, will be provided. Through this work we have been able to map a transition from purely viscoplastic to fluid-like pore collapse that is a function of shock strength, pore size, and material strength. In addition, these findings are important reference data for the validation of future multi-scale modeling efforts of the shock response of heterogeneous materials. Examples of how these RMD results are translated into mesoscale models will also be addressed. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the US DOE NNSA under Contract No. DE-AC04-94AL85000.
Hierarchical coarse-graining strategy for protein-membrane systems to access mesoscopic scales
Ayton, Gary S.; Lyman, Edward
2014-01-01
An overall multiscale simulation strategy for large scale coarse-grain simulations of membrane protein systems is presented. The protein is modeled as a heterogeneous elastic network, while the lipids are modeled using the hybrid analytic-systematic (HAS) methodology, where in both cases atomistic level information obtained from molecular dynamics simulation is used to parameterize the model. A feature of this approach is that from the outset liposome length scales are employed in the simulation (i.e., on the order of ½ a million lipids plus protein). A route to develop highly coarse-grained models from molecular-scale information is proposed and results for N-BAR domain protein remodeling of a liposome are presented. PMID:20158037
Wind turbine wake interactions at field scale: An LES study of the SWiFT facility
NASA Astrophysics Data System (ADS)
Yang, Xiaolei; Boomsma, Aaron; Barone, Matthew; Sotiropoulos, Fotis
2014-06-01
The University of Minnesota Virtual Wind Simulator (VWiS) code is employed to simulate turbine/atmosphere interactions in the Scaled Wind Farm Technology (SWiFT) facility developed by Sandia National Laboratories in Lubbock, TX, USA. The facility presently consists of three turbines, and the simulations consider the case of wind blowing from the south, such that two turbines are in the free stream and the third turbine is in the direct wake of one upstream turbine with a separation of 5 rotor diameters. Large-eddy simulation (LES) on two successively finer grids is carried out to examine the sensitivity of the computed solutions to grid refinement. It is found that the details of the break-up of the tip vortices into small-scale turbulence structures can only be resolved on the finer grid. It is also shown that the power coefficient CP of the downwind turbine predicted on the coarse grid is somewhat higher than that obtained on the fine mesh. On the other hand, the rms (root-mean-square) of the CP fluctuations is nearly the same on both grids, although more small-scale turbulence structures are resolved upwind of the downwind turbine on the finer grid.
Validation of the Fully-Coupled Air-Sea-Wave COAMPS System
NASA Astrophysics Data System (ADS)
Smith, T.; Campbell, T. J.; Chen, S.; Gabersek, S.; Tsu, J.; Allard, R. A.
2017-12-01
A fully-coupled, air-sea-wave numerical model, COAMPS®, has been developed by the Naval Research Laboratory to further enhance understanding of oceanic, atmospheric, and wave interactions. The fully-coupled air-sea-wave system consists of an atmospheric component with full physics parameterizations, an ocean model, NCOM (Navy Coastal Ocean Model), and two wave components, SWAN (Simulating Waves Nearshore) and WaveWatch III. Air-sea interactions between the atmosphere and ocean components are accomplished through bulk flux formulations of wind stress and sensible and latent heat fluxes. Wave interactions with the ocean include the Stokes' drift, surface radiation stresses, and enhancement of the bottom drag coefficient in shallow water due to the wave orbital velocities at the bottom. In addition, NCOM surface currents are provided to SWAN and WaveWatch III to simulate wave-current interaction. The fully-coupled COAMPS system was executed for several regions at both regional and coastal scales for the entire year of 2015, including the U.S. East Coast, Western Pacific, and Hawaii. Validation of COAMPS® includes observational data comparisons and evaluating operational performance on the High Performance Computing (HPC) system for each of these regions.
Preliminary SAGE Simulations of Volcanic Jets Into a Stratified Atmosphere
NASA Astrophysics Data System (ADS)
Peterson, A. H.; Wohletz, K. H.; Ogden, D. E.; Gisler, G. R.; Glatzmaier, G. A.
2007-12-01
The SAGE (SAIC Adaptive Grid Eulerian) code employs adaptive mesh refinement in solving the Eulerian equations of complex fluid flow, desirable for simulation of volcanic eruptions. The goal of modeling volcanic eruptions is to better develop a code's predictive capabilities in order to understand the dynamics that govern the overall behavior of real eruption columns. To achieve this goal, we focus on the dynamics of underexpanded jets, one of the fundamental physical processes important to explosive eruptions. Previous simulations of laboratory jets modeled in cylindrical coordinates were benchmarked with simulations in CFDLib (Los Alamos National Laboratory), which solves the full Navier-Stokes equations (including the viscous stress tensor), and showed close agreement, indicating that the adaptive mesh refinement used in SAGE may offset the need for explicit calculation of viscous dissipation. We compare gas density contours of these previous simulations, with the same initial conditions in cylindrical and Cartesian geometries, to laboratory experiments to determine both the validity of the model and the robustness of the code. The SAGE results in both geometries are within several percent of the experiments for the position and density of the incident (intercepting) and reflected shocks, slip lines, shear layers, and Mach disk. To extend our study into a volcanic regime, we simulate large-scale jets in a stratified atmosphere to establish the code's ability to model a sustained jet into a stable atmosphere.
Comparison Between Simulated and Experimentally Measured Performance of a Four Port Wave Rotor
NASA Technical Reports Server (NTRS)
Paxson, Daniel E.; Wilson, Jack; Welch, Gerard E.
2007-01-01
Performance and operability testing has been completed on a laboratory-scale, four-port wave rotor of the type suitable for use as a topping cycle on a gas turbine engine. Many design aspects and performance estimates for the wave rotor were determined using a time-accurate, one-dimensional, computational-fluid-dynamics-based simulation code developed specifically for wave rotors. The code follows a single rotor passage as it moves past the various ports, which in this reference frame become boundary conditions. This paper compares wave rotor performance predicted with the code to that measured during laboratory testing. Both on- and off-design operating conditions were examined. Overall, the match between code and rig was found to be quite good. At operating points where there were disparities, the assumption of larger-than-expected internal leakage rates successfully realigned code predictions and laboratory measurements. Possible mechanisms for such leakage rates are discussed.
Improved simulation of tropospheric ozone by a global-multi-regional two-way coupling model system
NASA Astrophysics Data System (ADS)
Yan, Yingying; Lin, Jintai; Chen, Jinxuan; Hu, Lu
2016-02-01
Small-scale nonlinear chemical and physical processes over pollution source regions affect the tropospheric ozone (O3), but these processes are not captured by current global chemical transport models (CTMs) and chemistry-climate models that are limited by coarse horizontal resolutions (100-500 km, typically 200 km). These models tend to contain large (and mostly positive) tropospheric O3 biases in the Northern Hemisphere. Here we use the recently built two-way coupling system of the GEOS-Chem CTM to simulate the regional and global tropospheric O3 in 2009. The system couples the global model (at 2.5° long. × 2° lat.) and its three nested models (at 0.667° long. × 0.5° lat.) covering Asia, North America and Europe, respectively. Specifically, the nested models take lateral boundary conditions (LBCs) from the global model, better capture small-scale processes and feed back to modify the global model simulation within the nested domains, with a subsequent effect on their LBCs. Compared to the global model alone, the two-way coupled system better simulates the tropospheric O3 both within and outside the nested domains, as found by evaluation against a suite of ground (1420 sites from the World Data Centre for Greenhouse Gases (WDCGG), the United States National Oceanic and Atmospheric Administration (NOAA) Earth System Research Laboratory Global Monitoring Division (GMD), the Chemical Coordination Centre of European Monitoring and Evaluation Programme (EMEP), and the United States Environmental Protection Agency Air Quality System (AQS)), aircraft (the High-performance Instrumented Airborne Platform for Environmental Research (HIAPER) Pole-to-Pole Observations (HIPPO) and Measurement of Ozone and Water Vapor by Airbus In-Service Aircraft (MOZAIC)) and satellite measurements (two Ozone Monitoring Instrument (OMI) products).
The two-way coupled simulation enhances the correlation in day-to-day variation of afternoon-mean surface O3 with the ground measurements from 0.53 to 0.68, and it reduces the mean model bias from 10.8 to 6.7 ppb. Regionally, the coupled system reduces the bias by 4.6 ppb over Europe, 3.9 ppb over North America and 3.1 ppb over other regions. The two-way coupling brings O3 vertical profiles much closer to the HIPPO (for remote areas) and MOZAIC (for polluted regions) data, reducing the tropospheric (0-9 km) mean bias by 3-10 ppb at most MOZAIC sites and by 5.3 ppb for HIPPO profiles. The two-way coupled simulation also reduces the global tropospheric column ozone by 3.0 DU (9.5 %, annual mean), bringing it closer to the OMI data in all seasons. It further reduces the global tropospheric mean hydroxyl radical by 5 %, with improved estimates of methyl chloroform and methane lifetimes. Simulation improvements are more significant in the Northern Hemisphere, and are mainly driven by improved representation of spatial inhomogeneity in chemistry and emissions. Within the nested domains, the two-way coupled simulation reduces surface ozone biases relative to typical GEOS-Chem one-way nested simulations, due to much improved LBCs. The bias reduction is 1-7 times the bias reduction from the global to the one-way nested simulation. Improving model representations of small-scale processes is important for understanding global and regional tropospheric chemistry.
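The mean-bias and day-to-day correlation statistics quoted above can be computed from paired model-observation series. A minimal sketch (all series values below are hypothetical, not the GEOS-Chem results):

```python
import numpy as np

def surface_o3_metrics(model, obs):
    """Mean bias (model - obs, ppb) and Pearson correlation of
    day-to-day variation in afternoon-mean surface O3."""
    model = np.asarray(model, dtype=float)
    obs = np.asarray(obs, dtype=float)
    bias = float(np.mean(model - obs))        # mean model bias, ppb
    r = float(np.corrcoef(model, obs)[0, 1])  # day-to-day correlation
    return bias, r

# Hypothetical daily series (ppb): the model overestimates but
# tracks the observed variability.
obs = np.array([30.0, 42.0, 38.0, 55.0, 47.0])
model = np.array([36.0, 50.0, 44.0, 63.0, 52.0])
bias, r = surface_o3_metrics(model, obs)
```

The same two numbers, aggregated over 1420 sites, are what the abstract reports improving from 10.8 to 6.7 ppb and 0.53 to 0.68.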
NASA Astrophysics Data System (ADS)
Zonta, Daniele; Pozzi, Matteo; Wu, Huayong; Inaudi, Daniele
2008-03-01
This paper introduces a concept of smart structural elements for the real-time condition monitoring of bridges. These are prefabricated reinforced concrete elements embedding a permanent sensing system and capable of self-diagnosis when in operation. The real-time assessment is automatically controlled by a numerical algorithm founded on Bayesian logic: the method assigns a probability to each possible damage scenario, and estimates the statistical distribution of the damage parameters involved (such as location and extent). To verify the effectiveness of the technology, we produced and tested in the laboratory a reduced-scale smart beam prototype. The specimen is 3.8 m long with a 0.3 m by 0.5 m cross-section, and has been prestressed using a Dywidag bar in such a way as to control the preload level. The sensor system includes a multiplexed version of SOFO interferometric sensors mounted on a composite bar, along with a number of traditional metal-foil strain gauges. The method allowed clear recognition of increasing fault states, simulated on the beam by gradually reducing the prestress level.
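The core of the Bayesian assignment of probabilities to damage scenarios is a posterior update. A minimal sketch, with a hypothetical scenario set and likelihood values (the paper's actual sensor-error model is not reproduced here):

```python
import numpy as np

def update_scenario_probs(prior, likelihood):
    """Bayes update over a discrete set of damage scenarios:
    posterior is proportional to likelihood times prior."""
    post = np.asarray(prior, dtype=float) * np.asarray(likelihood, dtype=float)
    return post / post.sum()

# Hypothetical scenarios: undamaged, 10% prestress loss, 30% prestress loss.
prior = [0.90, 0.07, 0.03]
# Likelihood of the observed strain reading under each scenario
# (e.g. from a Gaussian measurement-error model; values assumed here).
likelihood = [0.02, 0.45, 0.10]
posterior = update_scenario_probs(prior, likelihood)
```

Even with a strong prior on the undamaged state, a reading far more probable under the 10%-loss scenario shifts most posterior mass to that scenario, which is how increasing fault states become recognizable.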
Simulation and flavor compound analysis of dealcoholized beer via one-step vacuum distillation.
Andrés-Iglesias, Cristina; García-Serna, Juan; Montero, Olimpio; Blanco, Carlos A
2015-10-01
The coupled use of a laboratory-scale vacuum distillation process to produce alcohol-free beer and Aspen HYSYS simulation software was studied to define the chemical changes in the aroma profiles of 2 different lager beers during dealcoholization. In the lab-scale process, 2 different parameter sets were chosen to dealcoholize beer samples: 102 mbar at 50 °C and 200 mbar at 67 °C. Samples taken at different steps of the process were analyzed by HS-SPME-GC-MS, focusing on the concentration of 7 flavor compounds: 5 alcohols and 2 esters. For the simulation, the EoS parameters of the Wilson-2 property package were adjusted to the experimental data and one more pressure was tested (60 mbar). Simulation methods represent a viable alternative for predicting the volatile compound composition of a final dealcoholized beer.
Large-eddy simulation of a boundary layer with concave streamwise curvature
NASA Technical Reports Server (NTRS)
Lund, Thomas S.
1994-01-01
Turbulence modeling continues to be one of the most difficult problems in fluid mechanics. Existing prediction methods are well developed for certain classes of simple equilibrium flows, but are still not entirely satisfactory for a large category of complex non-equilibrium flows found in engineering practice. Direct and large-eddy simulation (LES) approaches have long been believed to have great potential for the accurate prediction of difficult turbulent flows, but the associated computational cost has been prohibitive for practical problems. This remains true for direct simulation but is no longer clear for large-eddy simulation. Advances in computer hardware, numerical methods, and subgrid-scale modeling have made it possible to conduct LES for flows of practical interest at Reynolds numbers in the range of laboratory experiments. The objective of this work is to apply LES and the dynamic subgrid-scale model to the flow of a boundary layer over a concave surface.
Judgements of relative noisiness of a supersonic transport and several commercial-service aircraft
NASA Technical Reports Server (NTRS)
Powell, C. A.
1977-01-01
Two laboratory experiments were conducted on the relative noisiness of takeoff and landing operations of a supersonic transport and several other aircraft in current commercial service. A total of 96 subjects made noisiness judgments on 120 tape-recorded flyover noises in the outdoor-acoustic-simulation experiment; 32 different subjects made judgments on the noises in the indoor-acoustic-simulation experiment. The judgments were made by using the method of numerical category scaling. The effective perceived noise level underestimated the noisiness of the supersonic transport by 3.5 dB. For takeoff operations, no difference was found between the noisiness of the supersonic transport and the group of other aircraft for the A-weighted rating scale; however, for landing operations, the noisiness of the supersonic transport was overestimated by 3.7 dB. Very high correlation was found between the outdoor-simulation experiment and the indoor-simulation experiment.
Frignani, M; Mostacci, D; Rocchi, F; Sumini, M
2005-01-01
Between 2001 and 2003 a 3.2 kJ dense plasma focus (DPF) device was built at the Montecuccolino Laboratory of the Department of Energy, Nuclear and Environmental Control Engineering (DIENCA) of the University of Bologna. A DPF is a pulsed device in which deuterium nuclear fusion reactions can be obtained through the pinching effect of electromagnetic fields upon a dense plasma. The empirical scaling law that governs the total D-D neutron yield from a single pulse of a DPF predicts for this machine a figure of approximately 10^7 fast neutrons per shot. The aim of the present work is to evaluate the role of backscattering of neutrons from the concrete walls surrounding the Montecuccolino DPF in total neutron yield measurements. The evaluation is performed by MCNP-5 simulations aimed at estimating the neutron spectra at a few points of interest in the laboratory, where neutron detectors will be placed during the experimental campaigns. Spectral information from the simulations is essential because the response of the detectors is influenced by neutron energy. Comparisons are made with the simple r^-2 law, which holds for a DPF in infinite vacuum. The results from the simulations will ultimately be used both in the design and optimisation of the neutron detectors and in their final calibration and placement inside the laboratory.
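The r^-2 vacuum baseline against which the MCNP backscatter results are compared is just isotropic spreading of the per-shot yield over a sphere. A minimal sketch (detector distance assumed for illustration):

```python
import math

def vacuum_fluence(yield_per_shot, r_cm):
    """Neutron fluence (n/cm^2) at distance r_cm from an isotropic
    point source in infinite vacuum: the simple r^-2 law."""
    return yield_per_shot / (4.0 * math.pi * r_cm**2)

# ~1e7 fast neutrons per shot (empirical scaling law),
# hypothetical detector position 1 m from the pinch.
phi = vacuum_fluence(1e7, 100.0)
```

Any excess of the simulated (or measured) detector response over this baseline is attributable to room-return neutrons from the concrete walls, which is why the spectral MCNP calculation is needed for calibration.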
Atmospheric Dispersion about a Heavy Gas Vapor Detention System.
NASA Astrophysics Data System (ADS)
Shin, Seong-Hee
Dispersion of liquefied natural gas (LNG) in the event of an accidental spill is a major concern in LNG storage and transport safety planning, hazard response, and facility siting. The Falcon Series large-scale LNG spill experiments were planned by Lawrence Livermore National Laboratory (LLNL) for the Department of Transportation (DOT) and the Gas Research Institute (GRI) as part of a joint government/industry study in 1987 to evaluate the effectiveness of vapor fences as a mitigating technique for accidental releases of LNG and to assist in validating wind tunnel and numerical methods for vapor dispersion simulation. Post-field-spill wind-tunnel experiments were performed in the Environmental Wind Tunnel (EWT) (1988, 1989) to augment the LNG Vapor Fence Program data obtained during the Falcon Test Series. The program included four different model length scales and two different simulant gases. The purpose of this program was to provide a basis for the analysis of physical modeling tests using proper physical modeling techniques and to assist in the development and verification of analytical models. Field data and model data were compared and analyzed by surface pattern comparisons and statistical methods. A layer-averaged slab model developed by Meroney et al. (1988) (FENC23) was expanded to evaluate an enhanced entrainment model proposed for dense gas dispersion including the effect of vapor barriers, and the numerical model was run for the Falcon tests without the fence and with the vapor fence to examine the effectiveness of the vapor detention system on heavy gas dispersion. Model data and field data were compared with the numerical model data, and the degree of similarity between the datasets was assessed.
Romm, H; Ainsbury, E; Bajinskis, A; Barnard, S; Barquinero, J F; Barrios, L; Beinke, C; Puig-Casanovas, R; Deperas-Kaminska, M; Gregoire, E; Oestreicher, U; Lindholm, C; Moquet, J; Rothkamm, K; Sommer, S; Thierens, H; Vral, A; Vandersickel, V; Wojcik, A
2014-05-01
In the case of a large-scale radiation accident, high-throughput methods of biological dosimetry for population triage are needed to identify individuals requiring clinical treatment. The dicentric assay performed in a web-based scoring mode may be a very suitable technique. Within the MULTIBIODOSE EU FP7 project, a network of 8 laboratories with expertise in dose estimation based on the dicentric assay is being established. Here, the manual dicentric assay was tested in a web-based scoring mode. More than 23,000 high-resolution images of metaphase spreads (only first mitosis) were captured by four laboratories and established as image galleries on the internet (cloud). The galleries included images of a complete dose-effect curve (0-5.0 Gy) and three types of irradiation scenarios simulating acute whole-body, partial-body and protracted exposure. The blood samples had been irradiated in vitro with gamma rays at the University of Ghent, Belgium. Two laboratories provided image galleries from Fluorescence plus Giemsa stained slides (3 h colcemid), and the image galleries from the other two laboratories contained images from Giemsa stained preparations (24 h colcemid). Each of the 8 participating laboratories analysed 3 dose points of the dose-effect curve (scoring 100 cells for each point) and 3 unknown dose points (50 cells) for each of the 3 simulated irradiation scenarios. First, all analyses were performed in a QuickScan mode without scoring individual chromosomes, followed by conventional scoring (only complete cells, 46 centromeres). The calibration curves obtained using these two scoring methods were very similar, with no significant difference in the linear-quadratic curve coefficients. Analysis of variance showed a significant effect of dose on the yield of dicentrics, but no significant effect of the laboratories, the different methods of slide preparation or the different colcemid incubation times.
The results obtained to date within the MULTIBIODOSE project by a network of 8 collaborating laboratories throughout Europe are very promising. The dicentric assay in the web based scoring mode as a high throughput scoring strategy is a useful application for biodosimetry in the case of a large scale radiation accident.
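The linear-quadratic calibration curve mentioned above relates dicentric yield per cell Y to dose D as Y = c + aD + bD^2, and its coefficients can be recovered by least squares. A sketch with hypothetical yields (the coefficients here are assumed for illustration, not the MULTIBIODOSE values):

```python
import numpy as np

# Hypothetical dicentric yields per cell at the gamma doses used (0-5 Gy),
# generated from an assumed linear-quadratic curve Y = c + a*D + b*D^2.
doses = np.array([0.0, 0.5, 1.0, 2.0, 3.0, 5.0])
yields = 0.001 + 0.02 * doses + 0.06 * doses**2

# Least-squares fit; np.polyfit returns highest-order coefficient first,
# so the degree-2 fit yields (b, a, c).
b, a, c = np.polyfit(doses, yields, 2)
```

Once the curve is calibrated, an unknown exposure is estimated by inverting it for the dicentric yield scored in the triage sample. (In practice the coefficients are fit by Poisson maximum likelihood rather than ordinary least squares; this sketch only illustrates the curve shape.)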
Lab Simulates Outdoor Algae Growth
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
Algae can be turned into renewable biofuel, which is why scientists want to discover an inexpensive, fast-growing strain of algae. Scientists at Pacific Northwest National Laboratory have developed a system to speed up this search. The unique climate-simulating system uses temperature controls and multi-colored LED lights to mimic the constantly changing conditions of an outdoor algae pond. By simulating outdoor climates inside the lab, the system saves researchers time and expense.
NASA Astrophysics Data System (ADS)
Swanson, Ryan David
The advection-dispersion equation (ADE) fails to describe non-Fickian solute transport breakthrough curves (BTCs) in saturated porous media in both laboratory and field experiments, necessitating the use of other models. The dual-domain mass transfer (DDMT) model partitions the total porosity into mobile and less-mobile domains with an exchange of mass between the two domains, and this model can reproduce better fits to BTCs in many systems than ADE-based models. However, direct experimental estimation of DDMT model parameters remains elusive and model parameters are often calculated a posteriori by an optimization procedure. Here, we investigate the use of geophysical tools (direct-current resistivity, nuclear magnetic resonance, and complex conductivity) to estimate these model parameters directly. We use two different samples of the zeolite clinoptilolite, a material shown to demonstrate solute mass transfer due to a significant internal porosity, and provide the first evidence that direct-current electrical methods can track solute movement into and out of a less-mobile pore space in controlled laboratory experiments. We quantify the effects of assuming single-rate DDMT for multirate mass transfer systems. We analyze pore structures using material characterization methods (mercury porosimetry, scanning electron microscopy, and X-ray computer tomography), and compare these observations to geophysical measurements. Nuclear magnetic resonance in conjunction with direct-current resistivity measurements can constrain mobile and less-mobile porosities, but complex conductivity may have little value in relation to mass transfer despite the hypothesis that mass transfer and complex conductivity length scales are related. Finally, we conduct a geoelectrically monitored tracer test at the Macrodispersion Experiment (MADE) site in Columbus, MS. 
We relate hydraulic and electrical conductivity measurements to generate a 3D hydraulic conductivity field, and compare to hydraulic conductivity fields estimated through ordinary kriging and sequential Gaussian simulation. Time-lapse electrical measurements are used to verify or dismiss aspects of breakthrough curves for different hydraulic conductivity fields. Our results quantify the potential for geophysical measurements to infer single-rate DDMT parameters, show site-specific relations between hydraulic and electrical conductivity, and track solute exchange into and out of less-mobile domains.
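The single-rate DDMT exchange between mobile and less-mobile domains can be sketched in isolation (transport omitted) as a pair of coupled ODEs. All parameter values below are assumed for illustration:

```python
import numpy as np

def ddmt_batch(c_m0, c_im0, theta_m, theta_im, alpha, dt, n_steps):
    """Single-rate dual-domain mass transfer in a closed (batch) system:
        theta_m  * dc_m/dt  = -alpha * (c_m - c_im)
        theta_im * dc_im/dt = +alpha * (c_m - c_im)
    Explicit Euler integration; returns the concentration histories."""
    c_m, c_im = c_m0, c_im0
    hist = [(c_m, c_im)]
    for _ in range(n_steps):
        flux = alpha * (c_m - c_im)   # mass flux mobile -> less-mobile
        c_m -= dt * flux / theta_m
        c_im += dt * flux / theta_im
        hist.append((c_m, c_im))
    return np.array(hist)

# Hypothetical porosities and rate coefficient; solute starts
# entirely in the mobile domain.
h = ddmt_batch(c_m0=1.0, c_im0=0.0, theta_m=0.25, theta_im=0.10,
               alpha=0.05, dt=0.1, n_steps=2000)
```

The slow equilibration of the less-mobile concentration is what produces the long BTC tails that the ADE cannot reproduce; the geophysical measurements in this work aim to sense c_im directly.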
Building laboratory capacity to support HIV care in Nigeria: Harvard/APIN PEPFAR, 2004–2012
Hamel, Donald J.; Sankalé, Jean-Louis; Samuels, Jay Osi; Sarr, Abdoulaye D.; Chaplin, Beth; Ofuche, Eke; Meloni, Seema T.; Okonkwo, Prosper; Kanki, Phyllis J.
2015-01-01
Introduction From 2004 to 2012, the Harvard/AIDS Prevention Initiative in Nigeria, funded through the US President’s Emergency Plan for AIDS Relief programme, scaled up HIV care and treatment services in Nigeria. We describe the methodologies and collaborative processes developed to significantly improve laboratory capacity in a resource-limited setting. These methods were implemented at 35 clinic and laboratory locations. Methods Systems were established and modified to optimise numerous laboratory processes. These included strategies for clinic selection and management, equipment and reagent procurement, supply chains, laboratory renovations, equipment maintenance, electronic data management, quality development programmes and training. Results Over the eight-year programme, laboratories supported 160 000 patients receiving HIV care in Nigeria, delivering over 2.5 million test results, including regular viral load quantitation. External quality assurance systems were established for CD4+ cell count enumeration, blood chemistries and viral load monitoring. Laboratory equipment platforms were improved and standardised, and use of point-of-care analysers was expanded. Laboratory training workshops helped laboratories increase staff skills and improve overall quality. Participation in a World Health Organisation-led African laboratory quality improvement system resulted in significant gains in quality measures at five laboratories. Conclusions Targeted implementation of laboratory development processes, during simultaneous scale-up of HIV treatment programmes in a resource-limited setting, can elicit meaningful gains in laboratory quality and capacity. Systems to improve the physical laboratory environment, develop laboratory staff, reduce costs and increase quality are available for future health and laboratory strengthening programmes. 
We hope that the strategies employed may inform and encourage the development of other laboratories in resource-limited settings. PMID:26900573
A compositional reservoir simulator on distributed memory parallel computers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rame, M.; Delshad, M.
1995-12-31
This paper presents the application of distributed memory parallel computers to field-scale reservoir simulations using a parallel version of UTCHEM, The University of Texas Chemical Flooding Simulator. The model is a general-purpose, highly vectorized chemical compositional simulator that can simulate a wide range of displacement processes at both field and laboratory scales. The original simulator was modified to run on both distributed memory parallel machines (Intel iPSC/860 and Delta, Connection Machine 5, Kendall Square 1 and 2, and CRAY T3D) and a cluster of workstations. A domain decomposition approach has been taken towards parallelization of the code. A portion of the discrete reservoir model is assigned to each processor by a set-up routine that attempts a data layout as even as possible from the load-balance standpoint. Each of these subdomains is extended so that data can be shared between adjacent processors for stencil computation. The added routines that make parallel execution possible are written in a modular fashion that makes porting to new parallel platforms straightforward. Results of the distributed memory computing performance of the parallel simulator are presented for field-scale applications such as tracer floods and polymer floods. A comparison of the wall-clock times for the same problems on a vector supercomputer is also presented.
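The block partition with halo extension described above can be sketched in one dimension (the actual UTCHEM set-up routine handles a full 3-D reservoir grid; this is only an illustrative analogue):

```python
def decompose_1d(n_cells, n_procs):
    """Even block partition of n_cells among n_procs, with each block
    extended by one ghost cell on interior faces for stencil exchange."""
    base, extra = divmod(n_cells, n_procs)
    blocks, start = [], 0
    for p in range(n_procs):
        size = base + (1 if p < extra else 0)  # spread the remainder
        lo, hi = start, start + size           # owned cells [lo, hi)
        glo = max(lo - 1, 0)                   # ghost-extended range,
        ghi = min(hi + 1, n_cells)             # clipped at domain edges
        blocks.append({"owned": (lo, hi), "with_ghosts": (glo, ghi)})
        start = hi
    return blocks

blocks = decompose_1d(n_cells=10, n_procs=3)
```

Each processor updates only its owned cells; before each stencil sweep, the ghost cells are refreshed with the neighbors' boundary values, which is the only inter-processor communication the scheme requires.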
MECHANISMS OF INORGANIC PARTICLE FORMATION DURING SUSPENSION HEATING OF SIMULATED AQUEOUS WASTES
The paper gives results of measurements of metal partitioning between the fine condensation aerosol and the larger particles produced during rapid heating of polydisperse droplet streams of aqueous solutions containing nitrates of Cd, Pb, and Ni in a laboratory-scale furnace. ...
System Design Considerations for Microcomputer Based Instructional Laboratories.
1986-04-01
when wrong procedures are tried as well as correct procedures. This is sometimes called "free play" simulation. While this form of simulation...steps are performed correctly. Unlike "free play" system simulations, the student must perform the operation in an approved manner. ...Supports free play exercises; typically does not tutor a student; used for skill development and performance measurement. Task Simulation: computer...
NASA Technical Reports Server (NTRS)
Svoboda, James S.; Kachmar, Brian A.
1993-01-01
The design and performance of a rain fade simulation/counteraction system on a laboratory-simulated 30/20 GHz, time division multiple access (TDMA) satellite communications testbed is evaluated. Severe rain attenuation of electromagnetic radiation at 30/20 GHz occurs due to the carrier wavelength approaching the water droplet size. Rain in the downlink path lowers the signal power present at the receiver, resulting in a higher number of bit errors induced in the digital ground terminal. The laboratory simulation performed at NASA Lewis Research Center uses a programmable PIN diode attenuator to simulate 20 GHz satellite downlink geographic rain fade profiles. A computer-based network control system monitors the downlink power and informs the network of any power threshold violations, which then prompts the network to issue commands that temporarily increase the gain of the satellite-based traveling wave tube (TWT) amplifier. After the rain subsides, the network returns the TWT to the normal energy-conserving power mode. Bit error rate (BER) data taken at the receiving ground terminal serves as a measure of the severity of rain degradation, and also evaluates the extent to which the network can improve the faded channel.
NASA Astrophysics Data System (ADS)
Gandhi, Rahul K.; Hopkins, Gary D.; Goltz, Mark N.; Gorelick, Steven M.; McCarty, Perry L.
2002-04-01
We present an analysis of an extensively monitored full-scale field demonstration of in situ treatment of trichloroethylene (TCE) contamination by aerobic cometabolic biodegradation. The demonstration was conducted at Edwards Air Force Base in southern California. There are two TCE-contaminated aquifers at the site, separated from one another by a clay aquitard. The treatment system consisted of two recirculating wells located 10 m apart. Each well was screened in both of the contaminated aquifers. Toluene, oxygen, and hydrogen peroxide were added to the water in both wells. At one well, water was pumped from the upper aquifer to the lower aquifer. In the other well, pumping was from the lower to the upper aquifer. This resulted in a "conveyor belt" flow system with recirculation between the two aquifers. The treatment system was successfully operated for a 410 day period. We explore how well a finite element reactive transport model can describe the key processes in an engineered field system. Our model simulates TCE, toluene, oxygen, hydrogen peroxide, and microbial growth/death. Simulated processes include advective-dispersive transport, biodegradation, the inhibitory effect of hydrogen peroxide on biomass growth, and oxygen degassing. Several parameter values were fixed to laboratory values or values from previous modeling studies. The remaining six parameter values were obtained by calibrating the model to 7213 TCE concentration data points and 6997 dissolved oxygen concentration data points collected during the demonstration using a simulation-regression procedure. In this complex flow field involving reactive transport, TCE and dissolved oxygen concentration histories are matched very well by the calibrated model. Both simulated and observed toluene concentrations display similar high-frequency oscillations due to pulsed toluene injection for approximately one half-hour during each 8-hour period. 
Simulation results indicate that over the course of the demonstration, 6.9 kg of TCE was degraded and that in the upper aquifer a region 40 m wide extending 25 m down gradient of the treatment system was cleaned up to less than 100 μg L-1 from initial concentrations of approximately 700 μg L-1. A smaller region was cleaned up to less than 30 μg L-1. Simulations indicate that the cleaned up area in the upper aquifer would continue to expand for as long as treatment was continued.
Reference Manual for the System Advisor Model's Wind Power Performance Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Freeman, J.; Jorgenson, J.; Gilman, P.
2014-08-01
This manual describes the National Renewable Energy Laboratory's System Advisor Model (SAM) wind power performance model. The model calculates the hourly electrical output of a single wind turbine or of a wind farm. The wind power performance model requires information about the wind resource, wind turbine specifications, wind farm layout (if applicable), and costs. In SAM, the performance model can be coupled to one of the financial models to calculate economic metrics for residential, commercial, or utility-scale wind projects. This manual describes the algorithms used by the wind power performance model, which is available in the SAM user interface and as part of the SAM Simulation Core (SSC) library, and is intended to supplement the user documentation that comes with the software.
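At the heart of turning a wind resource into hourly electrical output is interpolation of the turbine's power curve. The sketch below shows only that step with a hypothetical power curve; SAM's actual algorithm also applies air-density corrections, wake losses within a farm, and other adjustments described in the manual:

```python
import numpy as np

def turbine_output_kw(wind_speed, curve_speeds, curve_power_kw):
    """Hourly electrical output of one turbine by linear interpolation
    of its power curve; zero below cut-in and above cut-out."""
    return np.interp(wind_speed, curve_speeds, curve_power_kw,
                     left=0.0, right=0.0)

# Hypothetical power curve for a 1.5 MW class turbine (m/s -> kW).
speeds = np.array([3.0, 5.0, 8.0, 11.0, 13.0, 25.0])
power = np.array([0.0, 150.0, 750.0, 1400.0, 1500.0, 1500.0])

hourly_wind = np.array([2.0, 6.5, 9.0, 12.0, 26.0])  # m/s, one per hour
hourly_kw = turbine_output_kw(hourly_wind, speeds, power)
```

Summing `hourly_kw` over a year of resource data gives annual energy production, which the financial models then convert into economic metrics.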
Simulation studies for the evaluation of health information technologies: experiences and results.
Ammenwerth, Elske; Hackl, Werner O; Binzer, Kristine; Christoffersen, Tue E H; Jensen, Sanne; Lawton, Kitta; Skjoet, Peter; Nohr, Christian
It is essential for new health information technologies (IT) to undergo rigorous evaluations to ensure they are effective and safe for use in real-world situations. However, evaluation of new health IT is challenging, as field studies are often not feasible when the technology being evaluated is not sufficiently mature. Laboratory-based evaluations have also been shown to have insufficient external validity. Simulation studies seem to be a way to bridge this gap. The aim of this study was to evaluate, using a simulation methodology, the impact of a new prototype of an electronic medication management system on the appropriateness of prescriptions and drug-related activities, including laboratory test ordering and medication changes. This article presents the results of a controlled simulation study with 50 simulation runs, including ten doctors and five simulation patients, and discusses experiences and lessons learnt while conducting the study. Although the new electronic medication management system showed tendencies to improve medication safety when compared with the standard system, this tendency was not significant. Altogether, five distinct situations were identified where the new medication management system did help to improve medication safety. This simulation study provided a good compromise between internal validity and external validity. However, several challenges need to be addressed when undertaking simulation evaluations, including: preparation of adequate test cases; training of participants before using unfamiliar applications; consideration of the time, effort and costs of conducting the simulation; technical maturity of the evaluated system; and adequate preparation of simulation scenarios and settings. Simulation studies are an interesting but time-consuming approach, which can be used to evaluate newly developed health IT systems, particularly those systems that are not yet sufficiently mature to undergo field evaluation studies.
Realistic wave-optics simulation of X-ray phase-contrast imaging at a human scale
Sung, Yongjin; Segars, W. Paul; Pan, Adam; Ando, Masami; Sheppard, Colin J. R.; Gupta, Rajiv
2015-01-01
X-ray phase-contrast imaging (XPCI) can dramatically improve soft tissue contrast in X-ray medical imaging. Despite worldwide efforts to develop novel XPCI systems, a numerical framework to rigorously predict the performance of a clinical XPCI system at a human scale is not yet available. We have developed such a tool by combining a numerical anthropomorphic phantom defined with non-uniform rational B-splines (NURBS) and a wave optics-based simulator that can accurately capture the phase-contrast signal from a human-scaled numerical phantom. Using a synchrotron-based, high-performance XPCI system, we provide qualitative comparison between simulated and experimental images. Our tool can be used to simulate the performance of XPCI on various disease entities and compare proposed XPCI systems in an unbiased manner. PMID:26169570