Sample records for laboratory-scale system simulating

  1. Computational simulation of laboratory-scale volcanic jets

    NASA Astrophysics Data System (ADS)

    Solovitz, S.; Van Eaton, A. R.; Mastin, L. G.; Herzog, M.

    2017-12-01

Volcanic eruptions produce ash clouds that may travel great distances, significantly impacting aviation and communities downwind. Atmospheric hazard forecasting relies partly on numerical models of the flow physics, which incorporate data from eruption observations and analogue laboratory tests. As numerical tools continue to increase in complexity, they must be validated to fine-tune their effectiveness. Since eruptions are relatively infrequent and challenging to observe in great detail, analogue experiments can provide important insights into expected behavior over a wide range of input conditions. Unfortunately, laboratory-scale jets cannot easily attain the high Reynolds numbers (~10^9) of natural volcanic eruption columns. Comparisons between the computational models and analogue experiments can help bridge this gap. In this study, we investigate a 3-D volcanic plume model, the Active Tracer High-resolution Atmospheric Model (ATHAM), which has been used to simulate a variety of eruptions. However, it has not been previously validated using laboratory-scale data. We conducted numerical simulations of three flows that we have studied in the laboratory: a vertical jet in a quiescent environment, a vertical jet in horizontal cross flow, and a particle-laden jet. We considered Reynolds numbers from 10,000 to 50,000, jet-to-cross flow velocity ratios of 2 to 10, and particle mass loadings of up to 25% of the exit mass flow rate. Vertical jet simulations produce Gaussian velocity profiles in the near-exit region by 3 diameters downstream, matching the mean experimental profiles. Simulations of air entrainment are of the correct order of magnitude, but they show decreasing entrainment with vertical distance from the vent. Cross flow simulations reproduce experimental trajectories for the jet centerline initially, although confinement appears to impact the response later.
Particle-laden simulations display minimal variation in concentration profiles between cases with
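The Gaussian near-exit velocity profile mentioned in the abstract can be written down directly. A minimal sketch follows; the centerline velocity, half-width, and vent diameter are illustrative values, not the paper's data:

```python
import numpy as np

def gaussian_jet_profile(r, u_c, b):
    """Self-similar mean axial velocity of a round turbulent jet:
    u(r) = u_c * exp(-(r/b)**2), with centerline velocity u_c and
    Gaussian half-width b (both taken here as given, not fitted)."""
    return u_c * np.exp(-(r / b) ** 2)

# Illustrative profile a few diameters downstream of a D = 10 mm vent
r = np.linspace(0.0, 0.03, 7)                  # radial positions [m]
u = gaussian_jet_profile(r, u_c=10.0, b=0.006)
assert np.all(np.diff(u) < 0)                  # decays away from the axis
print(u.round(3))
```

Comparing simulated and measured profiles then reduces to fitting u_c and b at each downstream station and checking collapse onto this shape.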

  2. Note: Measurement system for the radiative forcing of greenhouse gases in a laboratory scale.

    PubMed

    Kawamura, Yoshiyuki

    2016-01-01

The radiative forcing of greenhouse gases has been studied on the basis of computational simulations or meteorological observation of the real atmosphere. To understand the greenhouse effect more deeply and to study it from various viewpoints, study at laboratory scale is important. We have developed a direct measurement system for the infrared back radiation from carbon dioxide (CO2) gas. The system configuration is similar to that of the practical earth-atmosphere-space system. Using this system, the back radiation from CO2 gas was directly measured at laboratory scale, and it roughly coincides with the meteorologically predicted value.

  3. Predicting the performance uncertainty of a 1-MW pilot-scale carbon capture system after hierarchical laboratory-scale calibration and validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Zhijie; Lai, Canhai; Marcy, Peter William

    2017-05-01

A challenging problem in designing pilot-scale carbon capture systems is to predict, with uncertainty, the adsorber performance and capture efficiency under various operating conditions where no direct experimental data exist. Motivated by this challenge, we previously proposed a hierarchical framework in which relevant parameters of physical models were sequentially calibrated from different laboratory-scale carbon capture unit (C2U) experiments. Specifically, three models of increasing complexity were identified based on the fundamental physical and chemical processes of the sorbent-based carbon capture technology. Results from the corresponding laboratory experiments were used to statistically calibrate the physical model parameters while quantifying some of their inherent uncertainty. The parameter distributions obtained from laboratory-scale C2U calibration runs are used in this study to facilitate prediction at a larger scale where no corresponding experimental results are available. In this paper, we first describe the multiphase reactive flow model for a sorbent-based 1-MW carbon capture system and then analyze results from an ensemble of simulations with the upscaled model. The simulation results are used to quantify uncertainty regarding the design's predicted efficiency in carbon capture. In particular, we determine the minimum gas flow rate necessary to achieve 90% capture efficiency with 95% confidence.
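The final confidence statement can be illustrated with a toy Monte Carlo: propagate an assumed calibration posterior through a deliberately simple stand-in efficiency model (not the paper's multiphase reactive flow model) and scan for the smallest gas flow that meets the 90% target with 95% confidence. Every number below is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def capture_efficiency(gas_flow, k):
    """Hypothetical stand-in for the upscaled adsorber model (NOT the
    paper's model): efficiency rises with gas flow, governed by a
    calibrated parameter k that carries residual uncertainty."""
    return 1.0 - np.exp(-k * gas_flow)

# Stand-in for the laboratory-scale C2U calibration posterior
k_samples = rng.normal(loc=3.0, scale=0.3, size=5000)

def confidence_of_target(gas_flow, target=0.90):
    """Fraction of posterior samples reaching the target efficiency."""
    return float(np.mean(capture_efficiency(gas_flow, k_samples) >= target))

# Scan candidate flows; keep those meeting 90% capture with >= 95%
# confidence, and report the minimum such flow
flows = np.linspace(0.5, 2.0, 151)
feasible = [q for q in flows if confidence_of_target(q) >= 0.95]
print(f"minimum admissible gas flow: {min(feasible):.3f} (arbitrary units)")
```

The same scan-and-threshold logic applies when the inner model is an expensive ensemble of CFD runs rather than a one-line formula.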

  4. Hydrodynamic Scalings: from Astrophysics to Laboratory

    NASA Astrophysics Data System (ADS)

    Ryutov, D. D.; Remington, B. A.

    2000-05-01

A surprisingly general hydrodynamic similarity has recently been described in Refs. [1,2]. One can call it the Euler similarity because it works for the Euler equations (with MHD effects included). Although the dissipation processes are assumed to be negligible, the presence of shocks is allowed. For a polytropic medium (i.e., a medium where the energy density is proportional to the pressure), the evolution of an arbitrarily chosen 3D initial state can be scaled to another system if a single dimensionless parameter (the Euler number) is the same for both initial states. The Euler similarity allows one to properly design laboratory experiments modeling astrophysical phenomena. We discuss several examples of such experiments related to the physics of supernovae [3]. For problems with a single spatial scale, the condition of the smallness of dissipative processes can be adequately described in terms of the Reynolds, Peclet, and magnetic Reynolds numbers related to this scale (all three numbers must be large). However, if the system develops small-scale turbulence, dissipation may become important at these smaller scales, thereby affecting the gross behavior of the system. We analyze the corresponding constraints. We also discuss constraints imposed by the presence of interfaces between substances with different polytropic indices. Another set of similarities governs the evolution of photoevaporation fronts in astrophysics. Convenient scaling laws exist in situations where the density of the ablated material is very low compared to the bulk density. We conclude that a number of hydrodynamical problems related to such objects as the Eagle Nebula can be adequately simulated in the laboratory. We also discuss possible scalings for radiative astrophysical jets (see Ref. [3] and references therein). This work was performed under the auspices of the U.S. Department of Energy by University of California Lawrence Livermore National Laboratory under contract W-7405-Eng-48.
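The Euler similarity can be stated compactly: two ideal (dissipation-free) polytropic flows evolve identically in scaled variables when they share the dimensionless Euler number Eu = v * sqrt(rho / p), and times then map as t proportional to L * sqrt(rho / p). A minimal sketch with purely illustrative numbers (not from any particular experiment):

```python
import math

def euler_number(v, rho, p):
    """Euler number Eu = v * sqrt(rho / p): two polytropic systems
    evolve identically in scaled variables when their initial Eu
    values match (the Euler similarity of Ryutov et al.)."""
    return v * math.sqrt(rho / p)

def scaled_time(t_lab, L_lab, rho_lab, p_lab, L_ast, rho_ast, p_ast):
    """Map a laboratory time to the astrophysical system,
    using t ~ L * sqrt(rho / p)."""
    return t_lab * (L_ast / L_lab) * math.sqrt(
        (rho_ast / rho_lab) * (p_lab / p_ast))

# Illustrative mapping: a 10 ns event on a 0.01 cm laser target
# mapped to a 1e13 cm supernova-remnant feature (cgs-like units)
t_ast = scaled_time(t_lab=10e-9, L_lab=0.01,  rho_lab=1.0,   p_lab=1e12,
                    L_ast=1e13,  rho_ast=1e-24, p_ast=1e-8)
print(f"equivalent astrophysical time: {t_ast:.2e} s")
```

With these made-up values a nanosecond-scale laser experiment maps to a timescale of order a day, which is the kind of bridge between nanoseconds and years that the abstract describes.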

  5. Development of space simulation / net-laboratory system

    NASA Astrophysics Data System (ADS)

    Usui, H.; Matsumoto, H.; Ogino, T.; Fujimoto, M.; Omura, Y.; Okada, M.; Ueda, H. O.; Murata, T.; Kamide, Y.; Shinagawa, H.; Watanabe, S.; Machida, S.; Hada, T.

A research project for the development of a space simulation / net-laboratory system was approved by the Japan Science and Technology Corporation (JST) in the category of Research and Development for Applying Advanced Computational Science and Technology (ACT-JST) in 2000. This research project, which continues for three years, is a collaboration with an astrophysical simulation group as well as other space simulation groups which use MHD and hybrid models. In this project, we are developing a prototype of a unique simulation system which enables us to perform simulation runs by providing or selecting plasma parameters through a Web-based interface on the internet. We are also developing an on-line database system for space simulation from which we will be able to search and extract various information such as simulation methods and programs, manuals, and typical simulation results in graphic or ASCII format. This unique system will help simulation beginners to start simulation studies without much difficulty or effort, and contribute to the promotion of simulation studies in the STP field. In this presentation, we will report the overview and the current status of the project.

  6. LASSIE: simulating large-scale models of biochemical systems on GPUs.

    PubMed

    Tangherloni, Andrea; Nobile, Marco S; Besozzi, Daniela; Mauri, Giancarlo; Cazzaniga, Paolo

    2017-05-10

Mathematical modeling and in silico analysis are widely acknowledged as complementary tools to biological laboratory methods, to achieve a thorough understanding of emergent behaviors of cellular processes in both physiological and perturbed conditions. However, the simulation of large-scale models, consisting of hundreds or thousands of reactions and molecular species, can rapidly overtake the capabilities of Central Processing Units (CPUs). The purpose of this work is to exploit alternative high-performance computing solutions, such as Graphics Processing Units (GPUs), to allow the investigation of these models at reduced computational costs. LASSIE is a "black-box" GPU-accelerated deterministic simulator, specifically designed for large-scale models and not requiring any expertise in mathematical modeling, simulation algorithms or GPU programming. Given a reaction-based model of a cellular process, LASSIE automatically generates the corresponding system of Ordinary Differential Equations (ODEs), assuming mass-action kinetics. The numerical solution of the ODEs is obtained by automatically switching between the Runge-Kutta-Fehlberg method in the absence of stiffness and the Backward Differentiation Formulae of first order in the presence of stiffness. The computational performance of LASSIE is assessed using a set of randomly generated synthetic reaction-based models of increasing size, ranging from 64 to 8192 reactions and species, and compared to a CPU implementation of the LSODA numerical integration algorithm. LASSIE adopts a novel fine-grained parallelization strategy to distribute across the GPU cores all the calculations required to solve the system of ODEs. By virtue of this implementation, LASSIE achieves up to 92× speed-up with respect to LSODA, reducing the running time from approximately 1 month down to 8 h to simulate models consisting of, for instance, four thousand reactions and species.
Notably, thanks to its smaller memory footprint, LASSIE
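The pipeline LASSIE automates can be sketched for a toy two-reaction network, using SciPy's LSODA (which, like LASSIE, switches automatically between non-stiff and stiff methods) as a small CPU reference; the network and rate constants here are invented for illustration:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy reaction-based model under mass-action kinetics, the form
# LASSIE assumes:
#   R1: A + B -> C        rate v1 = k1 * [A] * [B]
#   R2: C     -> A + B    rate v2 = k2 * [C]
k1, k2 = 1.0, 0.5

def odes(t, y):
    a, b, c = y
    v1 = k1 * a * b
    v2 = k2 * c
    return [-v1 + v2, -v1 + v2, v1 - v2]

# LSODA alternates between Adams (non-stiff) and BDF (stiff) steps,
# analogous to the RKF <-> BDF switching described in the abstract.
sol = solve_ivp(odes, (0.0, 50.0), [1.0, 1.0, 0.0], method="LSODA")
a, b, c = sol.y[:, -1]
# At equilibrium k1*[A]*[B] = k2*[C], and [A] + [C] is conserved
print(f"[A]={a:.3f} [B]={b:.3f} [C]={c:.3f}")
```

Scaling this construction to thousands of species is exactly where a fine-grained GPU parallelization of the right-hand-side evaluation pays off.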

  7. EPOS-WP16: A Platform for European Multi-scale Laboratories

    NASA Astrophysics Data System (ADS)

    Spiers, Chris; Drury, Martyn; Kan-Parker, Mirjam; Lange, Otto; Willingshofer, Ernst; Funiciello, Francesca; Rosenau, Matthias; Scarlato, Piergiorgio; Sagnotti, Leonardo; W16 Participants

    2016-04-01

The participant countries in EPOS embody a wide range of world-class laboratory infrastructures, ranging from high temperature and pressure experimental facilities to electron microscopy, micro-beam analysis, analogue modeling and paleomagnetic laboratories. Most data produced by the various laboratory centres and networks are presently available only in limited "final form" in publications. As such, many data remain inaccessible and/or poorly preserved. However, the data produced at the participating laboratories are crucial to serving society's need for geo-resources exploration and for protection against geo-hazards. Indeed, to model resource formation and system behaviour during exploitation, we need an understanding from the molecular to the continental scale, based on experimental data. This contribution will describe the work plans that the European laboratories community is developing in the context of EPOS. The main objectives are: - To collect and harmonize available and emerging laboratory data on the properties and processes controlling rock system behaviour at multiple scales, in order to generate products that are accessible and interoperable through services supporting research activities. - To co-ordinate the development, integration and trans-national usage of the major solid Earth Science laboratory centres and specialist networks. The length scales encompassed by the infrastructures range from the nano- and micrometer levels (electron microscopy and micro-beam analysis) to the scale of experiments on centimetre-sized samples, and to analogue model experiments simulating the reservoir scale, the basin scale and the plate scale. - To provide products and services supporting research into Geo-resources and Geo-storage, Geo-hazards and Earth System Evolution.

  8. Evaluation of Surface Runoff Generation Processes Using a Rainfall Simulator: A Small Scale Laboratory Experiment

    NASA Astrophysics Data System (ADS)

    Danáčová, Michaela; Valent, Peter; Výleta, Roman

    2017-12-01

of 5 mm/min was used to irrigate a disturbed soil sample. The experiment was undertaken for several different slopes, under the condition of no vegetation cover. The results of the rainfall simulation experiment confirmed the expected strong relationship between the slope gradient and the amount of surface runoff generated. The experiments with higher slope gradients were characterised by larger volumes of surface runoff generated, and by shorter times after which it occurred. Experiments with rainfall simulators in both laboratory and field conditions play an important role in better understanding runoff generation processes. The results of such small-scale experiments could be used to estimate some of the parameters of complex hydrological models, which are used to model rainfall-runoff and erosion processes at the catchment scale.

  9. Simulating Extraterrestrial Ices in the Laboratory

    NASA Astrophysics Data System (ADS)

    Berisford, D. F.; Carey, E. M.; Hand, K. P.; Choukroun, M.

    2017-12-01

Several ongoing experiments at JPL attempt to simulate the ice environment for various regimes associated with icy moons. The Europa Penitent Ice Experiment (EPIX) simulates the surface environment of an icy moon, to investigate the physics of ice surface morphology growth. This experiment features half-meter-scale cryogenic ice samples, a cryogenic radiative sink environment, vacuum conditions, and diurnally cycled solar simulation. The experiment also includes several smaller fixed-geometry vacuum chambers for ice simulation at Earth-like and intermediate temperature and vacuum conditions, for development of surface morphology growth scaling relations. Additionally, an ice cutting facility built on a similar platform provides qualitative data on the mechanical behavior of cryogenic ice with impurities under vacuum, and allows testing of ice cutting/sampling tools relevant for landing spacecraft. A larger cutting facility is under construction at JPL, which will provide more quantitative data and allow full-scale sampling tool tests. Another facility, the JPL Ice Physics Laboratory, features icy analog simulant preparation capabilities spanning icy solar system objects such as Mars, Ceres and the icy satellites of Saturn and Jupiter. In addition, the Ice Physics Lab has unique facilities for Icy Analog Tidal Simulation and Rheological Studies of Cryogenic Icy Slurries, as well as equipment to perform thermal and mechanical properties testing on icy analog materials and their response to sinusoidal tidal stresses.

  10. Reflectivity of the atmosphere-inhomogeneous surfaces system Laboratory simulation

    NASA Technical Reports Server (NTRS)

    Mekler, Y.; Kaufman, Y. J.; Fraser, R. S.

    1984-01-01

Theoretical two- and three-dimensional solutions of the radiative transfer equation have been applied to the earth-atmosphere system, but such solutions have not been verified experimentally. A laboratory experiment simulates such a system to test the theory. The atmosphere was simulated by latex spheres suspended in water, and the ground by a nonuniform surface, half white and half black. A stable radiation source provided uniform illumination over the hydrosol. The upward radiance along a line orthogonal to the boundary between the two half-fields was recorded for different amounts of the hydrosol. The simulation is a well-defined radiative transfer experiment to test radiative transfer models involving nonuniform surfaces. Good agreement is obtained between the measured and theoretical results.

  11. Simulating Laboratory Procedures.

    ERIC Educational Resources Information Center

    Baker, J. E.; And Others

    1986-01-01

    Describes the use of computer assisted instruction in a medical microbiology course. Presents examples of how computer assisted instruction can present case histories in which the laboratory procedures are simulated. Discusses an authoring system used to prepare computer simulations and provides one example of a case history dealing with fractured…

  12. Laboratory study of sonic booms and their scaling laws. [ballistic range simulation

    NASA Technical Reports Server (NTRS)

    Toong, T. Y.

    1974-01-01

This program sought a basic understanding of non-linear effects associated with caustics, through laboratory simulation experiments of sonic booms in a ballistic range and a coordinated theoretical study of scaling laws. Two cases of superbooms, or enhanced sonic booms at caustics, have been studied. The first case, referred to as acceleration superbooms, is related to the enhanced sonic booms generated during the acceleration maneuvers of supersonic aircraft. The second case, referred to as refraction superbooms, involves the superbooms that are generated as a result of atmospheric refraction. Important theoretical and experimental results are briefly reported.

  13. MHD scaling: from astrophysics to the laboratory

    NASA Astrophysics Data System (ADS)

    Ryutov, Dmitri

    2000-10-01

During the last few years, considerable progress has been made in simulating astrophysical phenomena in laboratory experiments with high power lasers [1]. Astrophysical phenomena that have drawn particular interest include supernovae explosions; young supernova remnants; galactic jets; the formation of fine structures in late supernova remnants by instabilities; and the ablation driven evolution of molecular clouds illuminated by nearby bright stars, which may affect star formation. A question may arise as to what extent the laser experiments, which deal with targets of a spatial scale of ~0.01 cm and occur on a time scale of a few nanoseconds, can reproduce phenomena occurring at spatial scales of a million or more kilometers and time scales from hours to many years. Quite remarkably, if dissipative processes (e.g., viscosity or Joule dissipation) are subdominant in both systems, and the matter behaves as a polytropic gas, there exists a broad hydrodynamic similarity (the "Euler similarity" of Ref. [2]) that allows a direct scaling of laboratory results to astrophysical phenomena. Following a review of relevant earlier work (in particular, [3]-[5]), discussion is presented of the details of the Euler similarity related to the presence of shocks and to a special case of a strong drive. After that, constraints stemming from possible development of small-scale turbulence are analyzed. Generalization of the Euler similarity to the case of a gas with spatially varying polytropic index is presented. A possibility of scaled simulations of ablation front dynamics is one more topic covered in this paper. It is shown that, with some additional constraints, a simple similarity exists. This, in particular, opens up the possibility of scaled laboratory simulation of the aforementioned ablation (photoevaporation) fronts. A nonlinear transformation [6] that establishes a duality between implosion and explosion processes is also discussed in the paper. 1. B.A.
Remington et

  14. Novel laboratory simulations of astrophysical jets

    NASA Astrophysics Data System (ADS)

    Brady, Parrish Clawson

This thesis was motivated by the promise that some physical aspects of astrophysical jets and collimation processes can be scaled to laboratory parameters through hydrodynamic scaling laws. The simulation of astrophysical jet phenomena with laser-produced plasmas was attractive because the laser-target interaction can inject energetic, repeatable plasma into an external environment. Novel laboratory simulations of astrophysical jets involved constructing and using the YOGA laser, which delivers 1064 nm, 8 ns pulses with energies up to 3.7 ± 0.2 J. Laser-produced plasmas were characterized using Schlieren imaging, interferometry and ICCD photography for their use in simulating jet and magnetosphere physics. The evolution of the laser-produced plasma in various conditions was compared with self-similar solutions and HYADES computer simulations. Millimeter-scale magnetized collimated outflows were produced by a centimeter-scale cylindrically symmetric electrode configuration triggered by a laser-produced plasma. A cavity with a flared nozzle surrounded the center electrode, and the electrode ablation created supersonic uncollimated flows. This flow became collimated when the center electrode changed from an anode to a cathode. The plasma jets were in axially directed permanent magnetic fields with strengths up to 5000 Gauss. The collimated magnetized jets were 0.1-0.3 cm wide, up to 2.0 cm long, and had velocities of ~4.0 × 10^6 cm/s. The dynamics of the evolution of the jet were compared qualitatively and quantitatively with fluxtube simulations from Bellan's formulation [6], giving a calculated estimate of ~2.6 × 10^6 cm/s for the jet evolution velocity and evidence for jet rotation. The density measured with interferometry was 1.9 ± 0.2 × 10^17 cm^-3, compared with 2.1 × 10^16 cm^-3 calculated with Bellan's pressure balance formulation. Kinks in the jet column were produced consistent with the Kruskal-Shafranov condition, which allowed stable and symmetric jets to form with

  15. Installation of Computerized Procedure System and Advanced Alarm System in the Human Systems Simulation Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Le Blanc, Katya Lee; Spielman, Zachary Alexander; Rice, Brandon Charles

    2016-04-01

    This report describes the installation of two advanced control room technologies, an advanced alarm system and a computerized procedure system, into the Human Systems Simulation Laboratory (HSSL). Installation of these technologies enables future phases of this research by providing a platform to systematically evaluate the effect of these technologies on operator and plant performance.

  16. Simulating flow in karst aquifers at laboratory and sub-regional scales using MODFLOW-CFP

    NASA Astrophysics Data System (ADS)

    Gallegos, Josue Jacob; Hu, Bill X.; Davis, Hal

    2013-12-01

    Groundwater flow in a well-developed karst aquifer dominantly occurs through bedding planes, fractures, conduits, and caves created by and/or enlarged by dissolution. Conventional groundwater modeling methods assume that groundwater flow is described by Darcian principles where primary porosity (i.e. matrix porosity) and laminar flow are dominant. However, in well-developed karst aquifers, the assumption of Darcian flow can be questionable. While Darcian flow generally occurs in the matrix portion of the karst aquifer, flow through conduits can be non-laminar where the relation between specific discharge and hydraulic gradient is non-linear. MODFLOW-CFP is a relatively new modeling program that accounts for non-laminar and laminar flow in pipes, like karst caves, within an aquifer. In this study, results from MODFLOW-CFP are compared to those from MODFLOW-2000/2005, a numerical code based on Darcy's law, to evaluate the accuracy that CFP can achieve when modeling flows in karst aquifers at laboratory and sub-regional (Woodville Karst Plain, Florida, USA) scales. In comparison with laboratory experiments, simulation results by MODFLOW-CFP are more accurate than MODFLOW 2005. At the sub-regional scale, MODFLOW-CFP was more accurate than MODFLOW-2000 for simulating field measurements of peak flow at one spring and total discharges at two springs for an observed storm event.
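The laminar/non-laminar distinction the abstract draws can be made concrete: Darcy's law gives a specific discharge linear in the hydraulic gradient, while a Darcy-Weisbach treatment of a conduit (the kind of pipe-flow relation a code like MODFLOW-CFP applies, sketched here with an assumed constant friction factor) gives a velocity scaling with the square root of the gradient. All parameter values are illustrative:

```python
import math

def darcy_flux(K, gradient):
    """Laminar matrix flow (Darcy's law): q = K * i, linear in i.
    K is hydraulic conductivity [m/s], i the hydraulic gradient [-]."""
    return K * gradient

def turbulent_conduit_velocity(d, gradient, f=0.03, g=9.81):
    """Turbulent conduit flow via Darcy-Weisbach: head loss per unit
    length i = f * v**2 / (2 * g * d) implies v ~ sqrt(i), i.e. a
    non-linear discharge-gradient relation (constant f assumed)."""
    return math.sqrt(2.0 * g * d * gradient / f)

# Doubling the hydraulic gradient doubles the Darcy flux ...
assert math.isclose(darcy_flux(1e-4, 0.02), 2 * darcy_flux(1e-4, 0.01))
# ... but raises the conduit velocity only by a factor of sqrt(2)
v1 = turbulent_conduit_velocity(d=1.0, gradient=0.01)
v2 = turbulent_conduit_velocity(d=1.0, gradient=0.02)
print(f"v2/v1 = {v2 / v1:.3f}")
```

This square-root response is why a purely Darcian model tends to misestimate peak spring discharge during storm events in conduit-dominated karst.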

  17. Simulations of Tornadoes, Tropical Cyclones, MJOs, and QBOs, using GFDL's multi-scale global climate modeling system

    NASA Astrophysics Data System (ADS)

    Lin, Shian-Jiann; Harris, Lucas; Chen, Jan-Huey; Zhao, Ming

    2014-05-01

A multi-scale High-Resolution Atmosphere Model (HiRAM) is being developed at the NOAA Geophysical Fluid Dynamics Laboratory. The model's dynamical framework is the non-hydrostatic extension of the vertically Lagrangian finite-volume dynamical core (Lin 2004, Monthly Wea. Rev.) constructed on a stretchable (via Schmidt transformation) cubed-sphere grid. Physical parameterizations originally designed for IPCC-type climate predictions are being modified and made more "scale-aware", in an effort to make the model suitable for multi-scale weather-climate applications, with horizontal resolution ranging from 1 km (near the target high-resolution region) to as coarse as 400 km (near the antipodal point). One of the main goals of this development is to enable simulation of high-impact weather phenomena (such as tornadoes, thunderstorms, and category-5 hurricanes) within an IPCC-class climate modeling system, something previously thought impossible. We will present preliminary results covering a very wide spectrum of temporal-spatial scales, ranging from simulation of tornado genesis (hours), Madden-Julian Oscillations (intra-seasonal), and tropical cyclones (seasonal), to Quasi-Biennial Oscillations (intra-decadal), using the same global multi-scale modeling system.

  18. Virtual Earth System Laboratory (VESL): Effective Visualization of Earth System Data and Process Simulations

    NASA Astrophysics Data System (ADS)

    Quinn, J. D.; Larour, E. Y.; Cheng, D. L. C.; Halkides, D. J.

    2016-12-01

The Virtual Earth System Laboratory (VESL) is a Web-based tool, under development at the Jet Propulsion Laboratory and UC Irvine, for the visualization of Earth System data and process simulations. It contains features geared toward a range of applications, spanning research and outreach. It offers an intuitive user interface, in which model inputs are changed using sliders and other interactive components. Current capabilities include simulation of polar ice sheet responses to climate forcing, based on NASA's Ice Sheet System Model (ISSM). We believe that the visualization of data is most effective when tailored to the target audience, and that many of the best practices for modern Web design/development can be applied directly to the visualization of data: use of negative space, color schemes, typography, accessibility standards, tooltips, et cetera. We present our prototype website, and invite input from potential users, including researchers, educators, and students.

  19. Design of a laboratory scale fluidized bed reactor

    NASA Astrophysics Data System (ADS)

    Wikström, E.; Andersson, P.; Marklund, S.

    1998-04-01

The aim of this project was to construct a laboratory-scale fluidized bed reactor that simulates the behavior of full-scale municipal solid waste combustors. The design of this reactor is thoroughly described. The size of the laboratory-scale fluidized bed reactor is 5 kW, which corresponds to a fuel-feeding rate of approximately 1 kg/h. The reactor system consists of four parts: a bed section, a freeboard section, a convector (post-combustion zone), and an air pollution control (APC) device system. The inside diameter of the reactor is 100 mm at the bed section, widening to 200 mm in the freeboard section; the total height of the reactor is 1760 mm. The convector part consists of five identical sections; each section is 2700 mm long and has an inside diameter of 44.3 mm. The reactor is flexible regarding the placement and number of sampling ports. At the beginning of the first convector unit and at the end of each unit there are sampling ports for organic micropollutants (OMP). This makes it possible to study the composition of the flue gases at various residence times. Sampling ports for inorganic compounds and particulate matter are also placed in the convector section. All operating parameters, reactor temperatures, and concentrations of CO, CO2, O2, SO2, NO, and NO2 are continuously measured and stored at selected intervals for further evaluation. These features enable full control over the fuel feed, air flows, and air distribution, as well as over the temperature profile. Detailed descriptions are provided of the configuration of the fuel-feeding systems, the fluidized bed, the convector section, and the APC device. This laboratory reactor enables detailed studies of the formation mechanisms of OMP, such as polychlorinated dibenzo-p-dioxins (PCDDs), polychlorinated dibenzofurans (PCDFs), polychlorinated biphenyls (PCBs), and polychlorinated benzenes (PCBzs). With this system, formation mechanisms of OMP occurring in both the combustion
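The convector geometry given above fixes the gas residence time at each sampling port. A small sketch of that arithmetic, assuming plug flow and an illustrative flue-gas flow rate (the flow rate is not stated in the abstract):

```python
import math

# Convector geometry from the description: five identical sections,
# each 2700 mm long with a 44.3 mm inside diameter
n_sections = 5
length = 2.700            # section length [m]
diameter = 0.0443         # inside diameter [m]
area = math.pi * (diameter / 2.0) ** 2

def residence_time(volumetric_flow, sections):
    """Plug-flow residence time up to the end of `sections` convector
    sections; volumetric_flow in m^3/s at duct conditions (assumed)."""
    return sections * length * area / volumetric_flow

# Assumed flue-gas flow of 10 m^3/h at duct conditions (illustrative)
q = 10.0 / 3600.0
for s in range(1, n_sections + 1):
    print(f"after section {s}: {residence_time(q, s):.2f} s")
```

Each port along the convector thus samples the flue gas roughly 1.5 s later than the previous one under these assumed conditions, which is what allows time-resolved OMP formation studies.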

  20. Simulation of large scale motions and small scale structures in planetary atmospheres and oceans: From laboratory to space experiments on ISS

    NASA Astrophysics Data System (ADS)

    Egbers, Christoph; Futterer, Birgit; Zaussinger, Florian; Harlander, Uwe

    2014-05-01

Baroclinic waves are responsible for the transport of heat and momentum in the oceans and in the Earth's atmosphere, as well as in other planetary atmospheres. The talk will give an overview of possibilities to simulate such large-scale as well as co-existing small-scale structures with the help of well-defined laboratory experiments like the baroclinic wave tank (annulus experiment). The analogy between the Earth's atmosphere and the rotating cylindrical annulus experiment, driven only by rotation and differential heating between polar and equatorial regions, is obvious. From the Gulf Stream, single vortices separate from time to time. The same dynamics, and the co-existence and separation of small- and large-scale structures, can also be observed in laboratory experiments such as the rotating cylindrical annulus experiment. This experiment represents the mid-latitude dynamics quite well and serves as a central reference experiment in the Germany-wide DFG priority research programme ("METSTRÖM", SPP 1276), acting as a benchmark for many different numerical methods. On the other hand, such laboratory experiments in cylindrical geometry are limited by the fact that the surface and the real interaction between polar and equatorial regions and their different dynamics cannot really be studied. Therefore, I demonstrate how to use the very successful Geoflow I and Geoflow II space experiment hardware on the ISS, with future modifications, for simulations of small- and large-scale planetary atmospheric motion in spherical geometry with differential heating between inner and outer spheres as well as between the polar and equatorial regions. References: Harlander, U., Wenzel, J., Wang, Y., Alexandrov, K. & Egbers, Ch., 2012, Simultaneous PIV- and thermography measurements of partially blocked flow in a heated rotating annulus, Exp. in Fluids, 52 (4), 1077-1087; Futterer, B., Krebs, A., Plesa, A.-C., Zaussinger, F., Hollerbach, R., Breuer, D. & Egbers, Ch., 2013, Sheet-like and

  1. Continuous microalgal cultivation in a laboratory-scale photobioreactor under seasonal day-night irradiation: experiments and simulation.

    PubMed

    Bertucco, Alberto; Beraldi, Mariaelena; Sforza, Eleonora

    2014-08-01

In this work, the production of Scenedesmus obliquus in a continuous flat-plate laboratory-scale photobioreactor (PBR) under alternated day-night cycles was tested both experimentally and theoretically. Variations of light intensity over the four seasons of the year were simulated experimentally by a tunable LED lamp, and the effects on microalgal growth and productivity were measured to evaluate the conversion efficiency of light energy into biomass during the different seasons. These results were used to validate a mathematical model for algal growth that can be applied to simulate a large-scale production unit, carried out in a flat-plate PBR of similar geometry. The cellular concentration in the PBR was calculated in both steady-state and transient conditions, and the value of the maintenance kinetic term was correlated to experimental profiles. The relevance of this parameter was finally outlined.
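A minimal continuous-PBR biomass balance with an explicit maintenance term (the kind of parameter the abstract correlates to experimental profiles) might look as follows; the rate values are illustrative, not the paper's calibrated ones, and light limitation is deliberately omitted:

```python
def pbr_biomass(t_end, X0, mu, D, m, dt=0.01):
    """Transient biomass in a continuous photobioreactor under the
    simplified balance dX/dt = (mu - D - m) * X, with specific growth
    rate mu, dilution rate D, and maintenance coefficient m (1/day).
    Explicit Euler integration; adequate for this scalar linear ODE."""
    X = X0
    for _ in range(int(t_end / dt)):
        X += dt * (mu - D - m) * X
    return X

# Illustrative values: growth exceeds dilution plus maintenance,
# so biomass accumulates exponentially in this linear sketch
X = pbr_biomass(t_end=10.0, X0=0.1, mu=0.8, D=0.5, m=0.1)
print(f"biomass after 10 days: {X:.3f} g/L")
```

A larger maintenance coefficient shifts the washout boundary (mu = D + m), which is why calibrating m against measured concentration profiles matters for scale-up predictions.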

  2. Simulation of General Physics laboratory exercise

    NASA Astrophysics Data System (ADS)

    Aceituno, P.; Hernández-Aceituno, J.; Hernández-Cabrera, A.

    2015-01-01

    Laboratory exercises are an important part of general Physics teaching, both during the last years of high school and the first year of college education. Given the cost of acquiring enough laboratory equipment for all students, and the widespread access to computer rooms in teaching, we propose the development of computer-simulated laboratory exercises. A representative exercise in general Physics is the calculation of the gravitational acceleration, through the free-fall motion of a metal ball. Using a model of the real exercise, we have developed an interactive system which allows students to alter the starting height of the ball to obtain different fall times. The simulation was programmed in ActionScript 3, so that it can be freely executed on any operating system; to ensure the accuracy of the calculations, all input parameters of the simulation were modelled using digital measurement units, and to allow statistical treatment of the resulting data, measurement errors are simulated through limited randomization.
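
    The record does not publish its code; as an illustration of the exercise it describes, the sketch below simulates fall-time measurements with "limited randomization" (a quantized timer plus bounded jitter — both the resolution and jitter model are assumptions) and estimates g statistically from them.

```python
import random
import statistics

def simulate_fall_time(height_m, g=9.81, timer_resolution=0.01):
    """One measured fall time for a ball dropped from height_m.

    The true time follows t = sqrt(2h/g); measurement error is modeled by a
    bounded random jitter, then quantization to the timer's resolution.
    """
    true_t = (2.0 * height_m / g) ** 0.5
    jitter = random.uniform(-timer_resolution, timer_resolution)
    return round((true_t + jitter) / timer_resolution) * timer_resolution

def estimate_g(heights, trials=50):
    """Estimate g by inverting h = g*t^2/2 over many simulated trials."""
    estimates = []
    for h in heights:
        for _ in range(trials):
            t = simulate_fall_time(h)
            estimates.append(2.0 * h / t ** 2)
    return statistics.mean(estimates), statistics.stdev(estimates)

g_est, g_err = estimate_g([0.5, 1.0, 1.5, 2.0])
print(f"g = {g_est:.2f} +/- {g_err:.2f} m/s^2")
```

    The spread of the estimates gives students a concrete handle on measurement uncertainty, which is the pedagogical point of the randomization.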

  3. NLS Flight Simulation Laboratory (FSL) documentation

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The Flight Simulation Laboratory (FSL) Electronic Documentation System design consists of modifying and utilizing the MSFC Integrated Engineering System (IES), translating the existing FSL documentation to an electronic format, and generating new drawings to represent the Engine Flight Simulation Laboratory design and implementation. The intent of the electronic documentation is to provide ease of access and local print/plot capabilities, as well as the ability for authorized network users to correct and/or modify the stored data.

  4. Slurry spray distribution within a simulated laboratory scale spray dryer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bertone, P.C.

    1979-12-20

    It was found that the distribution of liquid striking the sides of a simulated room-temperature spray dryer was not significantly altered by the choice of nozzles, nor by a variation in nozzle operating conditions. Instead, it was found to be a function of the spray dryer's configuration. A cocurrent flow of air down the drying cylinder, not possible with PNL's closed top, favorably altered the spray distribution both by decreasing the amount of liquid striking the interior of the cylinder from 72 to 26% of the feed supplied, and by shifting the zone of maximum impact from 1.0 to 1.7 feet from the nozzle. These findings led to the redesign of the laboratory-scale spray dryer to be tested at the Savannah River Plant. The diameter of the drying chamber was increased from 5 to 8 inches, and a cocurrent flow of air was established with a closed recycle. Finally, this investigation suggested a drying scheme which offers all the advantages of spray drying without many of its limitations.

  5. Construction of the Propulsion Systems Laboratory No. 1 and 2

    NASA Image and Video Library

    1951-01-21

    Construction of the Propulsion Systems Laboratory No. 1 and 2 at the National Advisory Committee for Aeronautics (NACA) Lewis Flight Propulsion Laboratory. When it began operation in late 1952, the Propulsion Systems Laboratory was the NACA's most powerful facility for testing full-scale engines at simulated flight altitudes. The facility contained two altitude-simulating test chambers, a technological combination of the static sea-level test stands and the complex Altitude Wind Tunnel, which recreated actual flight conditions on a larger scale. NACA Lewis began designing the new facility in 1947 as part of a comprehensive plan to improve the altitude testing capabilities across the lab. The exhaust, refrigeration, and combustion air systems of all the major test facilities were linked, so that different facilities could be used to complement one another's capabilities. Propulsion Systems Laboratory construction began in late summer 1949 with the installation of an overhead exhaust pipe connecting the facility to the Altitude Wind Tunnel and Engine Research Building. The large test section pieces arrived in early 1951, when this photograph was taken. The two primary coolers for the altitude exhaust are in place within the framework near the center of the photograph.

  6. SCALE Code System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T.; Jessee, Matthew Anderson

    The SCALE Code System is a widely used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor and lattice physics, radiation shielding, spent fuel and radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules, including three deterministic and three Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE's graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results.

  7. Laboratory simulation of space plasma phenomena*

    NASA Astrophysics Data System (ADS)

    Amatucci, B.; Tejero, E. M.; Ganguli, G.; Blackwell, D.; Enloe, C. L.; Gillman, E.; Walker, D.; Gatling, G.

    2017-12-01

    Laboratory devices, such as the Naval Research Laboratory's Space Physics Simulation Chamber, are large-scale experiments dedicated to the creation of large-volume plasmas with parameters realistically scaled to those found in various regions of the near-Earth space plasma environment. Such devices make valuable contributions to the understanding of space plasmas by investigating phenomena under carefully controlled, reproducible conditions, allowing for the validation of theoretical models being applied to space data. By working in collaboration with in situ experimentalists to create realistic conditions scaled to those found during the observations of interest, the microphysics responsible for the observed events can be investigated at a level of detail not possible in space. To date, numerous investigations of phenomena such as plasma waves, wave-particle interactions, and particle energization have been successfully performed in the laboratory. In addition to plasma wave and instability studies, the laboratory devices can also make valuable contributions to the development and testing of space plasma diagnostics. One example is the plasma impedance probe developed at NRL. Originally developed as a laboratory diagnostic, the sensor has now been flown on a sounding rocket, is included on a CubeSat experiment, and will be included on the DoD Space Test Program's STP-H6 experiment on the International Space Station. In this presentation, we will describe several examples of the laboratory investigation of space plasma waves and instabilities and of diagnostic development. *This work supported by the NRL Base Program.

  8. Cloud-enabled large-scale land surface model simulations with the NASA Land Information System

    NASA Astrophysics Data System (ADS)

    Duffy, D.; Vaughan, G.; Clark, M. P.; Peters-Lidard, C. D.; Nijssen, B.; Nearing, G. S.; Rheingrover, S.; Kumar, S.; Geiger, J. V.

    2017-12-01

    Developed by the Hydrological Sciences Laboratory at NASA Goddard Space Flight Center (GSFC), the Land Information System (LIS) is a high-performance software framework for terrestrial hydrology modeling and data assimilation. LIS provides the ability to integrate satellite and ground-based observational products and advanced modeling algorithms to extract land surface states and fluxes. Through a partnership with the National Center for Atmospheric Research (NCAR) and the University of Washington, the LIS model is currently being extended to include the Structure for Unifying Multiple Modeling Alternatives (SUMMA). With the addition of SUMMA, LIS will enable meaningful simulations with large multi-model ensembles and can provide advanced probabilistic continental-domain modeling capabilities at spatial scales relevant for water managers. The resulting LIS/SUMMA application framework is difficult for non-experts to install due to the large number of dependencies on specific versions of operating systems, libraries, and compilers. This has created a significant barrier to entry for domain scientists who are interested in using the software on their own systems or in the cloud. In addition, the requirement to support multiple runtime environments across the LIS community has created a significant burden on the NASA team. To overcome these challenges, LIS/SUMMA has been deployed using Linux containers, which allow an entire software package along with all dependencies to be installed within a working runtime environment, and Kubernetes, which orchestrates the deployment of a cluster of containers. Within a cloud environment, users can now easily create a cluster of virtual machines and run large-scale LIS/SUMMA simulations. Installations that previously took weeks or months can now be performed in minutes. This presentation will discuss the steps required to create a cloud-enabled large-scale simulation, present examples of its use, and

  9. Facilitating Co-Design for Extreme-Scale Systems Through Lightweight Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Engelmann, Christian; Lauer, Frank

    This work focuses on tools for investigating algorithm performance at extreme scale, with millions of concurrent threads, and for evaluating the impact of future architecture choices to facilitate the co-design of high-performance computing (HPC) architectures and applications. The approach focuses on lightweight simulation of extreme-scale HPC systems with the needed amount of accuracy. The prototype presented in this paper is able to provide this capability using parallel discrete event simulation (PDES), such that a Message Passing Interface (MPI) application can be executed at extreme scale and its performance properties can be evaluated. The results of an initial prototype are encouraging, as a simple 'hello world' MPI program could be scaled up to 1,048,576 virtual MPI processes on a four-node cluster, and the performance properties of two MPI programs could be evaluated at up to 16,384 virtual MPI processes on the same system.
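
    A full PDES like the prototype described is beyond a short sketch, but the core idea — executing a virtual-process message-passing program over an event queue in simulated time, so process counts far beyond the physical machine can be explored — can be illustrated as follows. The latency and compute-time figures are invented, and this sequential toy is not the paper's simulator.

```python
import heapq

def simulate_ring(n_procs, msg_latency=1e-6, compute_time=1e-7):
    """Discrete-event simulation of n virtual processes passing a token
    around a ring (a stand-in for a trivial MPI program).

    Each event is (simulated_time, rank); processing an event schedules the
    send to the next rank until every rank has held the token once.
    """
    events = [(0.0, 0)]  # token starts at rank 0 at t = 0
    end_time, hops = 0.0, 0
    while events:
        t, rank = heapq.heappop(events)
        end_time = t
        hops += 1
        if hops < n_procs:  # forward the token to the next virtual process
            heapq.heappush(events, (t + compute_time + msg_latency,
                                    (rank + 1) % n_procs))
    return end_time

t = simulate_ring(1_048_576)
print(f"simulated completion time for 1,048,576 virtual processes: {t:.4f} s")
```

    The virtual process count is limited only by memory for the event queue and per-process state, which is what makes million-process runs feasible on a small cluster.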

  10. Using the Human Systems Simulation Laboratory at Idaho National Laboratory for Safety Focused Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joe, Jeffrey C.; Boring, Ronald L.

    Under the United States (U.S.) Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) program, researchers at Idaho National Laboratory (INL) have been using the Human Systems Simulation Laboratory (HSSL) to conduct critical safety-focused Human Factors research and development (R&D) for the nuclear industry. The LWRS program has the overall objective of developing the scientific basis to extend existing nuclear power plant (NPP) operating life beyond the current 60-year licensing period and to ensure their long-term reliability, productivity, safety, and security. One focus area for LWRS is the NPP main control room (MCR), because many of the instrumentation and control (I&C) system technologies installed in the MCR, while highly reliable and safe, are now difficult to replace and are therefore limiting the operating life of the NPP. This paper describes how INL researchers use the HSSL to conduct Human Factors R&D on modernizing or upgrading these I&C systems in a step-wise manner, and how the HSSL has addressed a significant gap in how to upgrade systems and technologies that were built to last and therefore require careful integration of analog and new advanced digital technologies.

  11. Laboratory-Scale Simulation and Real-Time Tracking of a Microbial Contamination Event and Subsequent Shock-Chlorination in Drinking Water

    PubMed Central

    Besmer, Michael D.; Sigrist, Jürg A.; Props, Ruben; Buysschaert, Benjamin; Mao, Guannan; Boon, Nico; Hammes, Frederik

    2017-01-01

    Rapid contamination of drinking water in distribution and storage systems can occur due to pressure drops, backflow, cross-connections, accidents, and bio-terrorism. Small volumes of a concentrated contaminant (e.g., wastewater) can contaminate large volumes of water in a very short time, with potentially severe negative health impacts. The technical limitations of conventional, cultivation-based microbial detection methods allow neither timely detection of such contaminations nor real-time monitoring of subsequent emergency remediation measures (e.g., shock-chlorination). Here we applied a newly developed continuous, ultra-high-frequency flow cytometry approach to track a rapid pollution event and subsequent disinfection of drinking water in an 80-min laboratory-scale simulation. We quantified total (TCC) and intact (ICC) cell concentrations as well as flow cytometric fingerprints in parallel in real time with two different staining methods. The ingress of wastewater was detectable almost immediately (i.e., after a 0.6% volume change), significantly changing TCC, ICC, and the flow cytometric fingerprint. Shock chlorination was rapid and detected in real time, causing membrane damage in the vast majority of bacteria (i.e., a drop of ICC from more than 380 cells μl⁻¹ to less than 30 cells μl⁻¹ within 4 min). Both of these effects, as well as the final wash-in of fresh tap water, followed calculated predictions well. Detailed and highly quantitative tracking of microbial dynamics at very short time scales and for different characteristics (e.g., concentration, membrane integrity) is feasible. This opens up multiple possibilities for the targeted investigation of a myriad of bacterial short-term dynamics (e.g., disinfection, growth, detachment, operational changes), both in laboratory-scale research and in full-scale system investigations in practice. PMID:29085343
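
    The "calculated predictions" for the wash-in of fresh tap water are consistent with an ideally mixed (CSTR-type) washout model, in which cell concentration decays exponentially with the number of exchanged volumes. The sketch below uses hypothetical volume and flow-rate values, not figures from the study.

```python
import math

def washout(C0, Q, V, t):
    """Cell concentration in an ideally mixed vessel of volume V (L), flushed
    with clean water at flow rate Q (L/min), t minutes after flushing starts:
        C(t) = C0 * exp(-Q * t / V)
    """
    return C0 * math.exp(-Q * t / V)

# Hypothetical numbers: start at 380 cells/uL in a 10 L vessel, 1 L/min flush.
for t_min in (0, 4, 10, 26):
    print(f"t = {t_min:2d} min: {washout(380.0, 1.0, 10.0, t_min):6.1f} cells/uL")
```

    Comparing such a curve against the measured ICC time series is one way a flow-cytometric record can be checked against a mixing model.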

  12. The Analysis, Numerical Simulation, and Diagnosis of Extratropical Weather Systems

    DTIC Science & Technology

    1999-09-30

    The Analysis, Numerical Simulation, and Diagnosis of Extratropical Weather Systems. Dr. Melvyn A. Shapiro, NOAA/Environmental Technology Laboratory. … formulation, and numerical prediction of the life cycles of synoptic-scale and mesoscale extratropical weather systems, including the influence of planetary-scale inter-annual and intra-seasonal variability on their evolution. These weather systems include extratropical oceanic and land-falling cyclones

  13. Simulation of the 3-D Evolution of Electron Scale Magnetic Reconnection - Motivated by Laboratory Experiments Predictions for MMS

    NASA Astrophysics Data System (ADS)

    Buechner, J.; Jain, N.; Sharma, A.

    2013-12-01

    The four s/c of the Magnetospheric Multiscale (MMS) mission, to be launched in 2014, will use the Earth's magnetosphere as a laboratory to study the microphysics of three fundamental plasma processes. One of them is magnetic reconnection, an essentially multi-scale process. While laboratory experiments and past theoretical investigations have shown that important processes necessary to understand magnetic reconnection take place at electron scales, the MMS mission will for the first time be able to resolve these scales by in-space observations. For the measurement strategy of MMS it is important to make specific predictions of the behavior of current sheets with a thickness of the order of the electron skin depth, which play an important role in the evolution of collisionless magnetic reconnection. Since these processes are highly nonlinear and non-local, numerical simulation is needed to specify the current sheet evolution. Here we present new results on the nonlinear evolution of electron-scale current sheets, starting from the linear stage and using 3-D electron-magnetohydrodynamic (EMHD) simulations. The growth rates of the simulated instabilities compare well with the growth rates obtained from linear theory. Mechanisms and conditions of the formation of flux ropes and of current filamentation will be discussed in comparison with the results of fully kinetic simulations. In 3-D, the X- and O-point configurations of the magnetic field formed in reconnection planes alternate along the out-of-reconnection-plane direction with the wavelength of the unstable mode. In the presence of multiple reconnection sites, the out-of-plane magnetic field can develop a nested structure of quadrupoles in reconnection planes, similar to the 2-D case, but now with variations in the out-of-plane direction. The structures of the electron flow and magnetic field in 3-D simulations will be compared with those in 2-D simulations to discriminate the essentially 3-D features. We also discuss

  14. Interfacing Space Communications and Navigation Network Simulation with Distributed System Integration Laboratories (DSIL)

    NASA Technical Reports Server (NTRS)

    Jennings, Esther H.; Nguyen, Sam P.; Wang, Shin-Ywan; Woo, Simon S.

    2008-01-01

    NASA's planned Lunar missions will involve multiple NASA centers, where each participating center has a specific role and specialization. In this vision, the Constellation program (CxP)'s Distributed System Integration Laboratories (DSIL) architecture consists of multiple System Integration Labs (SILs), with simulators, emulators, test labs, and control centers interacting with each other over a broadband network to perform test and verification for mission scenarios. To support the end-to-end simulation and emulation effort of NASA's exploration initiatives, different NASA centers are interconnected to participate in distributed simulations. Currently, DSIL has interconnections among the following NASA centers: Johnson Space Center (JSC), Kennedy Space Center (KSC), Marshall Space Flight Center (MSFC) and Jet Propulsion Laboratory (JPL). Through interconnections and interactions among different NASA centers, critical resources and data can be shared, while independent simulations can be performed simultaneously at different NASA locations, to effectively utilize the simulation and emulation capabilities at each center. Furthermore, the development of DSIL can maximally leverage existing project simulation and testing plans. In this work, we describe the specific role and development activities at JPL for the Space Communications and Navigation Network (SCaN) simulator, which uses the Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE) tool to simulate communications effects among mission assets. Using MACHETE, different space network configurations among spacecraft and ground systems of various parameter sets can be simulated. Data necessary for tracking, navigation, and guidance of spacecraft such as the Crew Exploration Vehicle (CEV), Crew Launch Vehicle (CLV), and Lunar Relay Satellite (LRS), together with orbit calculation data, are disseminated to different NASA centers and updated periodically using the High Level Architecture (HLA). In

  15. EPOS-WP16: A coherent and collaborative network of Solid Earth Multi-scale laboratories

    NASA Astrophysics Data System (ADS)

    Calignano, Elisa; Rosenau, Matthias; Lange, Otto; Spiers, Chris; Willingshofer, Ernst; Drury, Martyn; van Kan-Parker, Mirjam; Elger, Kirsten; Ulbricht, Damian; Funiciello, Francesca; Trippanera, Daniele; Sagnotti, Leonardo; Scarlato, Piergiorgio; Tesei, Telemaco; Winkler, Aldo

    2017-04-01

    Laboratory facilities are an integral part of Earth Science research. The diversity of methods employed in such infrastructures reflects the multi-scale nature of the Earth system and is essential for the understanding of its evolution, for the assessment of geo-hazards and for the sustainable exploitation of geo-resources. In the frame of EPOS (European Plate Observing System), Work Package 16 represents a developing community of European Geoscience multi-scale laboratories. The participating and collaborating institutions (Utrecht University, GFZ, RomaTre University, INGV, NERC, CSIC-ICTJA, CNRS, LMU, C4G-UBI, ETH, CNR*) embody several types of laboratory infrastructure, engaged in different fields of interest of Earth Science: from high-temperature and high-pressure experimental facilities, to electron microscopy, micro-beam analysis, analogue tectonic and geodynamic modelling, and paleomagnetic laboratories. The length scales encompassed by these infrastructures range from the nano- and micrometre levels (electron microscopy and micro-beam analysis) to the scale of experiments on centimetre-sized samples, and to analogue model experiments simulating the reservoir scale, the basin scale and the plate scale. The aim of WP16 is to provide two services by the year 2019: first, virtual access to data from laboratories (data service) and, second, physical access to laboratories (transnational access, TNA). Regarding the development of a data service, the current status is that most data produced by the various laboratory centres and networks are available only in limited "final form" in publications, while many data remain inaccessible and/or poorly preserved. Within EPOS, the TCS Multi-scale laboratories is collecting and harmonizing available and emerging laboratory data on the properties and processes controlling rock system behaviour at all relevant scales, in order to generate products accessible and interoperable through services for supporting

  16. Kinetic turbulence simulations at extreme scale on leadership-class systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Bei; Ethier, Stephane; Tang, William

    2013-01-01

    Reliable predictive simulation capability addressing confinement properties in magnetically confined fusion plasmas is critically important for ITER, a 20 billion dollar international burning plasma device under construction in France. The complex study of kinetic turbulence, which can severely limit the energy confinement and impact the economic viability of fusion systems, requires simulations at extreme scale for such an unprecedented device size. Our newly optimized, global, ab initio particle-in-cell code solving the nonlinear equations underlying gyrokinetic theory achieves excellent performance with respect to "time to solution" at the full capacity of the IBM Blue Gene/Q, on the 786,432 cores of Mira at ALCF and recently on the 1,572,864 cores of Sequoia at LLNL. Recent multithreading and domain decomposition optimizations in the new GTC-P code represent critically important software advances for modern, low-memory-per-core systems by enabling routine simulations at unprecedented size (130 million grid points, ITER scale) and resolution (65 billion particles).

  17. Development of a parallel FE simulator for modeling the whole trans-scale failure process of rock from meso- to engineering-scale

    NASA Astrophysics Data System (ADS)

    Li, Gen; Tang, Chun-An; Liang, Zheng-Zhao

    2017-01-01

    Multi-scale, high-resolution modeling of the rock failure process is a powerful means in modern rock mechanics studies to reveal complex failure mechanisms and to evaluate engineering risks. However, multi-scale continuous modeling of rock, from deformation and damage to failure, places high requirements on the design, implementation scheme and computational capacity of the numerical software system. This study is aimed at developing a parallel finite element procedure: a parallel rock failure process analysis (RFPA) simulator capable of modeling the whole trans-scale failure process of rock. Based on the statistical meso-damage mechanical method, the RFPA simulator is able to construct heterogeneous rock models with multiple mechanical properties and to handle and represent the trans-scale propagation of cracks, in which the stress and strain fields are solved for the damage evolution analysis of the representative volume element by the parallel finite element method (FEM) solver. This paper describes the theoretical basis of the approach and provides the details of the parallel implementation on a Windows-Linux interactive platform. A numerical model is built to test the parallel performance of the FEM solver. Numerical simulations are then carried out on a laboratory-scale uniaxial compression test, and on field-scale net fracture spacing and engineering-scale rock slope examples, respectively. The simulation results indicate that relatively high speedup and computational efficiency can be achieved by the parallel FEM solver with a reasonable boot process. In the laboratory-scale simulation, well-known physical phenomena, such as the macroscopic fracture pattern and stress-strain responses, can be reproduced. In the field-scale simulation, the formation process of net fracture spacing from initiation, propagation to saturation can be revealed completely. In the engineering-scale simulation, the whole progressive failure process of the rock slope can be well modeled. It is

  18. Laboratory formation of a scaled protostellar jet by coaligned poloidal magnetic field.

    PubMed

    Albertazzi, B; Ciardi, A; Nakatsutsumi, M; Vinci, T; Béard, J; Bonito, R; Billette, J; Borghesi, M; Burkley, Z; Chen, S N; Cowan, T E; Herrmannsdörfer, T; Higginson, D P; Kroll, F; Pikuz, S A; Naughton, K; Romagnani, L; Riconda, C; Revet, G; Riquier, R; Schlenvoigt, H-P; Skobelev, I Yu; Faenov, A Ya; Soloviev, A; Huarte-Espinosa, M; Frank, A; Portugall, O; Pépin, H; Fuchs, J

    2014-10-17

    Although bipolar jets are seen emerging from a wide variety of astrophysical systems, the issue of their formation and morphology beyond their launching is still under study. Our scaled laboratory experiments, representative of young stellar object outflows, reveal that stable and narrow collimation of the entire flow can result from the presence of a poloidal magnetic field whose strength is consistent with observations. The laboratory plasma becomes focused with an interior cavity. This gives rise to a standing conical shock from which the jet emerges. Following simulations of the process at the full astrophysical scale, we conclude that it can also explain recently discovered x-ray emission features observed in low-density regions at the base of protostellar jets, such as the well-studied jet HH 154. Copyright © 2014, American Association for the Advancement of Science.

  19. Virtual Earth System Laboratory (VESL): A Virtual Research Environment for The Visualization of Earth System Data and Process Simulations

    NASA Astrophysics Data System (ADS)

    Cheng, D. L. C.; Quinn, J. D.; Larour, E. Y.; Halkides, D. J.

    2017-12-01

    The Virtual Earth System Laboratory (VESL) is a Web application, under continued development at the Jet Propulsion Laboratory and UC Irvine, for the visualization of Earth System data and process simulations. As with any project of its size, we have encountered both successes and challenges during the course of development. Our principal point of success is the fact that VESL users can interact seamlessly with our earth science simulations within their own Web browser. Some of the challenges we have faced include retrofitting the VESL Web application to respond to touch gestures, reducing page load time (especially as the application has grown), and accounting for the differences between the various Web browsers and computing platforms.

  20. A multi-scaled approach for simulating chemical reaction systems.

    PubMed

    Burrage, Kevin; Tian, Tianhai; Burrage, Pamela

    2004-01-01

    In this paper we give an overview of some very recent work, as well as presenting a new approach, on the stochastic simulation of multi-scaled systems involving chemical reactions. In many biological systems (such as genetic regulation and cellular dynamics) there is a mix between small numbers of key regulatory proteins, and medium and large numbers of molecules. In addition, it is important to be able to follow the trajectories of individual molecules by taking proper account of the randomness inherent in such a system. We describe different types of simulation techniques (including the stochastic simulation algorithm, Poisson Runge-Kutta methods and the balanced Euler method) for treating simulations in the three different reaction regimes: slow, medium and fast. We then review some recent techniques on the treatment of coupled slow and fast reactions for stochastic chemical kinetics and present a new approach which couples the three regimes mentioned above. We then apply this approach to a biologically inspired problem involving the expression and activity of LacZ and LacY proteins in E. coli, and conclude with a discussion on the significance of this work. Copyright 2004 Elsevier Ltd.
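
    The stochastic simulation algorithm (SSA) named above is the standard Gillespie direct method: draw an exponentially distributed waiting time from the total propensity, then pick which reaction fires in proportion to its rate. A minimal sketch on a toy birth-death system (the rate constants are invented, and this is not the paper's multi-scaled coupling scheme):

```python
import random

def gillespie(stoich, propensities, x0, t_end, seed=1):
    """Gillespie direct-method SSA.

    stoich: list of state-change vectors, one per reaction.
    propensities: function x -> list of reaction rates a_j(x).
    Returns the trajectory as a list of (time, state) pairs.
    """
    rng = random.Random(seed)
    t, x = 0.0, list(x0)
    traj = [(t, tuple(x))]
    while t < t_end:
        a = propensities(x)
        a0 = sum(a)
        if a0 <= 0.0:
            break                          # no reaction can fire
        t += rng.expovariate(a0)           # time to next reaction
        r, acc, target = 0, a[0], rng.random() * a0
        while acc < target:                # pick which reaction fires
            r += 1
            acc += a[r]
        for i, dx in enumerate(stoich[r]): # apply its state change
            x[i] += dx
        traj.append((t, tuple(x)))
    return traj

# Toy birth-death process: 0 -> X at rate 10, X -> 0 at rate 0.5 * [X];
# the stationary mean copy number is 10 / 0.5 = 20.
traj = gillespie(stoich=[(1,), (-1,)],
                 propensities=lambda x: [10.0, 0.5 * x[0]],
                 x0=(0,), t_end=50.0)
print("final copy number:", traj[-1][1][0])
```

    The direct method is exact but slow when fast reactions dominate, which is precisely why the paper couples it with Poisson Runge-Kutta and balanced Euler methods for the medium and fast regimes.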

  1. Computer simulation of thermal and fluid systems for MIUS integration and subsystems test /MIST/ laboratory. [Modular Integrated Utility System

    NASA Technical Reports Server (NTRS)

    Rochelle, W. C.; Liu, D. K.; Nunnery, W. J., Jr.; Brandli, A. E.

    1975-01-01

    This paper describes the application of the SINDA (Systems Improved Numerical Differencing Analyzer) computer program to simulate the operation of the NASA/JSC MIUS integration and subsystems test (MIST) laboratory. The MIST laboratory is designed to test the integration capability of the following subsystems of a modular integrated utility system (MIUS): (1) electric power generation, (2) space heating and cooling, (3) solid waste disposal, (4) potable water supply, and (5) waste water treatment. The SINDA/MIST computer model is designed to simulate the response of these subsystems to externally impressed loads. The computer model determines the amount of waste heat recovered from the prime mover exhaust, water jacket and oil/aftercooler and from the incinerator. This recovered waste heat is used in the model to heat potable water, to provide space heating, absorption air conditioning and waste water sterilization, and to supply thermal storage. The details of the thermal and fluid simulation of MIST, including the system configuration, the modes of operation modeled, the SINDA model characteristics, and the results of several analyses, are described.
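
    SINDA solves lumped-parameter thermal networks by numerical differencing: nodes with heat capacitances exchange heat through conductors. A minimal explicit-step sketch of such a network follows; the two-node example values are hypothetical and not taken from the MIST model.

```python
def step_thermal_network(T, C, conductors, Q, dt):
    """One explicit time step of a lumped-parameter thermal network.

    T: node temperatures (deg C), C: node capacitances (J/K),
    conductors: list of (i, j, G) links with conductance G (W/K),
    Q: external heat loads per node (W).
    """
    dT = [q / c for q, c in zip(Q, C)]          # external loads
    for i, j, G in conductors:
        flow = G * (T[i] - T[j])                # heat flow i -> j (W)
        dT[i] -= flow / C[i]
        dT[j] += flow / C[j]
    return [t + dt * d for t, d in zip(T, dT)]

# Hypothetical two-node example: a hot exhaust node heating a water node.
T = [500.0, 20.0]            # deg C
C = [1000.0, 4000.0]         # J/K
conductors = [(0, 1, 50.0)]  # W/K
Q = [0.0, 0.0]               # no external loads
for _ in range(1000):        # 1000 s at dt = 1 s
    T = step_thermal_network(T, C, conductors, Q, dt=1.0)
print([round(x, 1) for x in T])
```

    With no external loads the two nodes relax to the capacitance-weighted mean temperature, and the explicit step is stable as long as dt * G / C stays well below 1 for every node.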

  2. A comparison of traditional physical laboratory and computer-simulated laboratory experiences in relation to engineering undergraduate students' conceptual understandings of a communication systems topic

    NASA Astrophysics Data System (ADS)

    Javidi, Giti

    2005-07-01

    This study was designed to investigate an alternative to the use of traditional physical laboratory activities in a communication systems course. Specifically, this study examined whether, as an alternative, computer simulation is as effective as physical laboratory activities in teaching college-level electronics engineering education students about the concepts of signal transmission, modulation and demodulation. Eighty undergraduate engineering students participated in the study, which was conducted at a southeastern four-year university. The students were randomly assigned to two groups, which were compared on understanding of the concepts, retention of the concepts, completion time of the lab experiments, and perception of the laboratory experiments. The physical group's (n = 40) treatment was to conduct laboratory experiments in a physical laboratory; the students in this group used equipment in a controlled electronics laboratory. The simulation group's (n = 40) treatment was to conduct similar experiments in a PC laboratory; the students in this group used a simulation program in a controlled PC lab. Scores on a validated conceptual test were collected once after the treatment and again three weeks after the treatment. Attitude surveys and a qualitative study were administered at the completion of the treatment. The findings revealed significant differences, in favor of the simulation group, between the two groups on both the conceptual post-test and the follow-up test. The findings also revealed a significant correlation between the simulation group's attitude toward the simulation program and their post-test scores. Moreover, there was a significant difference between the two groups in their attitude toward their laboratory experience, in favor of the simulation group, and a significant difference in lab completion time, also in favor of the simulation group. At the same time, the

  3. Communication interval selection in distributed heterogeneous simulation of large-scale dynamical systems

    NASA Astrophysics Data System (ADS)

    Lucas, Charles E.; Walters, Eric A.; Jatskevich, Juri; Wasynczuk, Oleg; Lamm, Peter T.

    2003-09-01

    In this paper, a new technique useful for the numerical simulation of large-scale systems is presented. This approach enables the overall system simulation to be formed by the dynamic interconnection of various interdependent simulations, each representing a specific component or subsystem such as control, electrical, mechanical, hydraulic, or thermal. Each simulation may be developed separately, possibly using different commercial off-the-shelf simulation programs, thereby allowing the most suitable language or tool to be chosen based on the design and analysis needs. These subsystems communicate the required interface variables at specific time intervals. A discussion concerning the selection of appropriate communication intervals is presented herein. For the purpose of demonstration, this technique is applied to a detailed simulation of a representative aircraft power system, such as that found on the Joint Strike Fighter (JSF). This system comprises ten component models, each developed using MATLAB/Simulink, EASY5, or ACSL. When the ten component simulations were distributed across just four personal computers (PCs), a greater than 15-fold improvement in simulation speed (compared to the single-computer implementation) was achieved.
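    The core idea of exchanging interface variables only at discrete communication intervals can be sketched with a toy two-subsystem example (a minimal illustration, not the paper's aircraft power system; the linear dynamics and interval values are assumed):

    ```python
    import math

    # Sketch of distributed co-simulation with a fixed communication interval H.
    # Two subsystems,
    #   dx/dt = -2x + y,   dy/dt = x - 2y,
    # are each integrated with a small internal step h, but they exchange their
    # interface variables (x and y) only once per interval H.
    def cosimulate(H, t_end=2.0, h=1e-3):
        x, y, t = 1.0, 0.0, 0.0
        while t < t_end - 1e-12:
            x_held, y_held = x, y            # interface values frozen for one interval
            t_next = min(t + H, t_end)
            while t < t_next - 1e-12:
                step = min(h, t_next - t)
                x += step * (-2.0 * x + y_held)   # subsystem 1 (forward Euler)
                y += step * (x_held - 2.0 * y)    # subsystem 2 (forward Euler)
                t += step
        return x

    # Exact solution of the fully coupled system: x(t) = (e^-t + e^-3t) / 2.
    x_exact = 0.5 * (math.exp(-2.0) + math.exp(-6.0))
    err_small = abs(cosimulate(H=0.01) - x_exact)   # frequent communication
    err_large = abs(cosimulate(H=0.5) - x_exact)    # coarse communication
    ```

    Shrinking the communication interval drives the coupled error down, while a coarse interval leaves each subsystem running on stale interface data — the trade-off the paper's interval-selection discussion addresses.
    
    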

  4. Review of Dynamic Modeling and Simulation of Large Scale Belt Conveyor System

    NASA Astrophysics Data System (ADS)

    He, Qing; Li, Hong

    Belt conveyors are among the most important devices for transporting bulk solid material over long distances. Dynamic analysis is key to deciding whether a design is technically rational, safe and reliable in operation, and economically feasible. It is very important to study dynamic properties in order to improve efficiency and productivity and to guarantee safe, reliable, and stable conveyor running. The dynamic research on, and applications of, large-scale belt conveyors are discussed, and the main research topics and the state of the art of dynamic research on belt conveyors are analyzed. The main future work focuses on dynamic analysis, modeling, and simulation of the main components and the whole system, together with nonlinear modeling, simulation, and vibration analysis of large-scale conveyor systems.

  5. A tide prediction and tide height control system for laboratory mesocosms

    PubMed Central

    Long, Jeremy D.

    2015-01-01

    Experimental mesocosm studies of rocky shore and estuarine intertidal systems may benefit from the application of natural tide cycles to better replicate variation in immersion time, water depth, and attendant fluctuations in abiotic and edaphic conditions. Here we describe a stand-alone microcontroller tide prediction open-source software program, coupled with a mechanical tidal elevation control system, which allows continuous adjustment of aquarium water depths in synchrony with local tide cycles. We used this system to monitor the growth of Spartina foliosa marsh cordgrass and scale insect herbivores at three simulated shore elevations in laboratory mesocosms. Plant growth decreased with increasing shore elevation, while scale insect population growth on the plants was not strongly affected by immersion time. This system shows promise for a range of laboratory mesocosm studies where natural tide cycling could impact organism performance or behavior, while the tide prediction system could additionally be utilized in field experiments where treatments need to be applied at certain stages of the tide cycle. PMID:26623195
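    Tide prediction programs of this kind are typically built on harmonic constituents. A minimal sketch of that standard approach (the constituent amplitudes and phases below are hypothetical, not those of any real station, and the actual open-source program may be organized differently):

    ```python
    import math

    # Hypothetical harmonic constants -- illustrative only, not any real
    # station's values. Real predictions use published constants per site.
    CONSTITUENTS = [
        # (name, speed deg/hour, amplitude m, phase deg)
        ("M2", 28.9841042, 0.50, 10.0),   # principal lunar semidiurnal
        ("S2", 30.0000000, 0.15, 30.0),   # principal solar semidiurnal
        ("K1", 15.0410686, 0.20, 100.0),  # lunisolar diurnal
        ("O1", 13.9430356, 0.12, 80.0),   # principal lunar diurnal
    ]

    def tide_height(t_hours, mean_level=1.0):
        """Predicted water level (m) at t_hours after the reference epoch,
        as a mean level plus a sum of cosine constituents."""
        h = mean_level
        for _name, speed, amp, phase in CONSTITUENTS:
            h += amp * math.cos(math.radians(speed * t_hours - phase))
        return h
    ```

    A mesocosm controller would sample this prediction on the microcontroller and drive the aquarium water level toward `tide_height(now)`.
    
    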

  6. PRACTICAL SIMULATION OF COMPOSTING IN THE LABORATORY

    EPA Science Inventory

    A closed incubation system was developed for laboratory simulation of composting conditions at the interior of a large compost pile. A conductive heat flux control system (CHFC) was used to adjust the temperature of the internal wall to that of the compost center and compensate f...

  7. Validation of mathematical model for CZ process using small-scale laboratory crystal growth furnace

    NASA Astrophysics Data System (ADS)

    Bergfelds, Kristaps; Sabanskis, Andrejs; Virbulis, Janis

    2018-05-01

    The present work focuses on modelling a small-scale laboratory NaCl-RbCl crystal growth furnace. First steps towards fully transient simulations are taken in the form of stationary simulations that optimize material properties to match the model to experimental conditions. For this purpose, simulation software primarily used for modelling the industrial-scale silicon crystal growth process was successfully applied. Finally, transient simulations of the crystal growth are presented, showing sufficient agreement with experimental results.

  8. A new synoptic scale resolving global climate simulation using the Community Earth System Model

    NASA Astrophysics Data System (ADS)

    Small, R. Justin; Bacmeister, Julio; Bailey, David; Baker, Allison; Bishop, Stuart; Bryan, Frank; Caron, Julie; Dennis, John; Gent, Peter; Hsu, Hsiao-ming; Jochum, Markus; Lawrence, David; Muñoz, Ernesto; diNezio, Pedro; Scheitlin, Tim; Tomas, Robert; Tribbia, Joseph; Tseng, Yu-heng; Vertenstein, Mariana

    2014-12-01

    High-resolution global climate modeling holds the promise of capturing planetary-scale climate modes and small-scale (regional and sometimes extreme) features simultaneously, including their mutual interaction. This paper discusses a new state-of-the-art high-resolution Community Earth System Model (CESM) simulation that was performed with these goals in mind. The atmospheric component was at 0.25° grid spacing, and ocean component at 0.1°. One hundred years of "present-day" simulation were completed. Major results were that annual mean sea surface temperature (SST) in the equatorial Pacific and El-Niño Southern Oscillation variability were well simulated compared to standard resolution models. Tropical and southern Atlantic SST also had much reduced bias compared to previous versions of the model. In addition, the high resolution of the model enabled small-scale features of the climate system to be represented, such as air-sea interaction over ocean frontal zones, mesoscale systems generated by the Rockies, and Tropical Cyclones. Associated single component runs and standard resolution coupled runs are used to help attribute the strengths and weaknesses of the fully coupled run. The high-resolution run employed 23,404 cores, costing 250 thousand processor-hours per simulated year and made about two simulated years per day on the NCAR-Wyoming supercomputer "Yellowstone."

  9. Model Scaling of Hydrokinetic Ocean Renewable Energy Systems

    NASA Astrophysics Data System (ADS)

    von Ellenrieder, Karl; Valentine, William

    2013-11-01

    Numerical simulations are performed to validate a non-dimensional dynamic scaling procedure that can be applied to subsurface and deeply moored systems, such as hydrokinetic ocean renewable energy devices. The prototype systems are moored in water 400 m deep and include: subsurface spherical buoys moored in a shear current and excited by waves; an ocean current turbine excited by waves; and a deeply submerged spherical buoy in a shear current excited by strong current fluctuations. The corresponding model systems, which are scaled based on relative water depths of 10 m and 40 m, are also studied. For each case examined, the response of the model system closely matches the scaled response of the corresponding full-sized prototype system. The results suggest that laboratory-scale testing of complete ocean current renewable energy systems moored in a current is possible. This work was supported by the U.S. Southeast National Marine Renewable Energy Center (SNMREC).
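    As a concrete illustration of how such model scaling works, here is a minimal sketch of Froude similitude, a standard choice for wave- and current-excited marine systems; the paper's own non-dimensional dynamic scaling procedure is not spelled out in the abstract and may differ.

    ```python
    import math

    def froude_factors(lam):
        """Model-to-prototype scale factors under Froude similitude for a
        geometric scale ratio lam = L_model / L_prototype (equal fluid
        density assumed)."""
        return {
            "length":   lam,                # linear dimensions
            "time":     math.sqrt(lam),     # periods of waves and motions
            "velocity": math.sqrt(lam),     # current and wave-orbital speeds
            "force":    lam ** 3,           # mooring loads, buoyancy
        }

    # The abstract's 400-m prototype depth scaled to 10-m and 40-m model depths:
    f10 = froude_factors(10.0 / 400.0)   # lam = 1/40
    f40 = froude_factors(40.0 / 400.0)   # lam = 1/10
    ```

    Under this scaling, for example, a 60 s prototype wave period becomes roughly 60·√(1/40) ≈ 9.5 s in the 10-m-depth model.
    
    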

  10. Laboratory simulation of rocket-borne D-region blunt probe flows

    NASA Technical Reports Server (NTRS)

    Kaplan, L. B.

    1977-01-01

    The flow of weakly ionized plasma, similar to the flow that occurs over rocket-borne blunt probes as they pass through the lower ionosphere, has been simulated in a scaled laboratory environment, and electron-collection D-region blunt probe theories have been evaluated.

  11. Quantifying the role that laboratory experiment sample scale has on observed material properties and mechanistic behaviors that cause well systems to fail

    NASA Astrophysics Data System (ADS)

    Huerta, N. J.; Fahrman, B.; Rod, K. A.; Fernandez, C. A.; Crandall, D.; Moore, J.

    2017-12-01

    Laboratory experiments provide a robust method to analyze well integrity. Experiments are relatively cheap, controlled, and repeatable. However, simplifying assumptions, apparatus limitations, and scaling are ubiquitous obstacles for translating results from the bench to the field. We focus on advancing the correlation between laboratory results and field conditions by characterizing how failure varies with specimen geometry using two experimental approaches. The first approach is designed to measure the shear bond strength between steel and cement in a down-scaled (< 3" diameter) well geometry. We use several cylindrical casing-cement-casing geometries that either mimic the scaling ratios found in the field or maximize the amount of metal and cement in the sample. We subject the samples to thermal shock cycles to simulate damage to the interfaces from operations. The bond was then measured via a push-out test. We found that not only did expected parameters, e.g. curing time, play a role in shear-bond strength but also that scaling of the geometry was important. The second approach is designed to observe failure of the well system due to pressure applied on the inside of a lab-scale (1.5" diameter) cylindrical casing-cement-rock geometry. The loading apparatus and sample are housed within an industrial X-ray CT scanner capable of imaging the system while under pressure. Radial tension cracks were observed in the cement after an applied internal pressure of 3000 psi and propagated through the cement and into the rock as pressure was increased. Based on our current suite of tests we find that the relationship between sample diameters and thicknesses is an important consideration when observing the strength and failure of well systems. The test results contribute to our knowledge of well system failure, evaluation and optimization of new cements, as well as the applicability of using scaled-down tests as a proxy for understanding field-scale conditions.

  12. SCALE Code System 6.2.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T.; Jessee, Matthew Anderson

    The SCALE Code System is a widely-used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor and lattice physics, radiation shielding, spent fuel and radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including three deterministic and three Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE’s graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results.

  13. A comparison of refuse attenuation in laboratory and field scale lysimeters.

    PubMed

    Youcai, Zhao; Luochun, Wang; Renhua, Hua; Dimin, Xu; Guowei, Gu

    2002-01-01

    For this study, small and middle scale laboratory lysimeters, and a large scale field lysimeter in situ in Shanghai Refuse Landfill, with refuse weights of 187, 600, and 10,800,000 kg, respectively, were created. These lysimeters are compared in terms of leachate quality (pH; concentrations of COD, BOD and NH3-N), refuse composition (biodegradable matter and volatile solids) and surface settlement over a monitoring period of 0-300 days. The objectives of this study were to explore both the similarities and disparities between laboratory and field scale lysimeters, and to compare degradation behaviors of refuse in the intensive reaction phase in the different scale lysimeters. Quantitative relationships of leachate quality and refuse composition with placement time show that degradation behaviors of refuse depend heavily on the scale of the lysimeters and the parameters of concern, especially in the starting period of 0-6 months. However, some similarities exist between laboratory and field lysimeters after 4-6 months of placement, because COD and BOD concentrations in leachate in the field lysimeter decrease regularly in a parallel pattern with those in the laboratory lysimeters. NH3-N, volatile solids (VS) and biodegradable matter (BDM) also gradually decrease in parallel in this intensive reaction phase for all scale lysimeters as refuse ages. Though the specific data differ among the different scale lysimeters, laboratory lysimeters of sufficient scale are basically applicable for a rough simulation of a real landfill, especially for illustrating the degradation pattern and mechanism. Settlement of the refuse surface is roughly proportional to the initial refuse height.

  14. Scaling and pedotransfer in numerical simulations of flow and transport in soils

    USDA-ARS?s Scientific Manuscript database

    Flow and transport parameters of soils in numerical simulations need to be defined at the support scale of computational grid cells. Such support scale can substantially differ from the support scale in laboratory or field measurements of flow and transport parameters. The scale-dependence of flow a...

  15. Numerical simulation of small-scale thermal convection in the atmosphere

    NASA Technical Reports Server (NTRS)

    Somerville, R. C. J.

    1973-01-01

    A Boussinesq system is integrated numerically in three dimensions and time in a study of nonhydrostatic convection in the atmosphere. Simulation of cloud convection is achieved by including parametrized effects of latent heat and small-scale turbulence. The results are compared with the cell structure observed in Rayleigh-Benard laboratory convection experiments in air. At a Rayleigh number of 4000, the numerical model adequately simulates the experimentally observed evolution, including some prominent transients, of a flow from a randomly perturbed initial conductive state into the final state of steady large-amplitude two-dimensional rolls. At a Rayleigh number of 9000, the model reproduces the experimentally observed unsteady equilibrium of vertically coherent oscillatory waves superimposed on rolls.
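    The regimes cited above are governed by the Rayleigh number. A minimal sketch of the standard definition, using rough room-temperature property values for air (assumed for illustration; the paper's exact parameters are not reproduced here):

    ```python
    def rayleigh_number(delta_T, depth, g=9.81, alpha=3.4e-3,
                        nu=1.5e-5, kappa=2.1e-5):
        """Ra = g * alpha * delta_T * d^3 / (nu * kappa).

        Defaults are rough values for air near room temperature:
        thermal expansion alpha [1/K], kinematic viscosity nu [m^2/s],
        thermal diffusivity kappa [m^2/s].
        """
        return g * alpha * delta_T * depth ** 3 / (nu * kappa)

    # For a 1-cm air layer, a ~38 K temperature difference puts the flow near
    # the steady-roll regime (Ra ~ 4000) discussed in the abstract.
    Ra = rayleigh_number(delta_T=38.0, depth=0.01)
    ```

    Convection onsets near the critical value Ra_c ≈ 1708; steady rolls appear above it and time-dependent flow at higher Ra, consistent with the two cases simulated.
    
    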

  16. SCALE Code System 6.2.2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T.; Jessee, Matthew Anderson

    The SCALE Code System is a widely used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor physics, radiation shielding, radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including 3 deterministic and 3 Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE’s graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results. SCALE 6.2 represents one of the most comprehensive revisions in the history of SCALE, providing several new capabilities and significant improvements in many existing features.

  17. Laboratory observations and simulations of phase reddening

    NASA Astrophysics Data System (ADS)

    Schröder, S. E.; Grynko, Ye.; Pommerol, A.; Keller, H. U.; Thomas, N.; Roush, T. L.

    2014-09-01

    The visible reflectance spectrum of many Solar System bodies changes with changing viewing geometry for reasons not fully understood. It is often observed to redden (increasing spectral slope) with increasing solar phase angle, an effect known as phase reddening. Only once, in an observation of the martian surface by the Viking 1 lander, was reddening observed up to a certain phase angle with bluing beyond, making the reflectance ratio as a function of phase angle shaped like an arch. However, in laboratory experiments this arch-shape is frequently encountered. To investigate this, we measured the bidirectional reflectance of particulate samples of several common rock types in the 400-1000 nm wavelength range and performed ray-tracing simulations. We confirm the occurrence of the arch for surfaces that are forward scattering, i.e. are composed of semi-transparent particles and are smooth on the scale of the particles, and for which the reflectance increases from the lower to the higher wavelength in the reflectance ratio. The arch shape is reproduced by the simulations, which assume a smooth surface. However, surface roughness on the scale of the particles, such as the Hapke and van Horn (Hapke, B., van Horn, H. [1963]. J. Geophys. Res. 68, 4545-4570) fairy castles that can spontaneously form when sprinkling a fine powder, leads to monotonic reddening. A further consequence of this form of microscopic roughness (being indistinct without the use of a microscope) is a flattening of the disk function at visible wavelengths, i.e. Lommel-Seeliger-type scattering. The experiments further reveal monotonic reddening for reflectance ratios at near-IR wavelengths. The simulations fail to reproduce this particular reddening, and we suspect that it results from roughness on the surface of the particles. Given that the regolith of atmosphereless Solar System bodies is composed of small particles, our results indicate that the prevalence of monotonic reddening and Lommel

  18. Simulated Laboratory in Digital Logic.

    ERIC Educational Resources Information Center

    Cleaver, Thomas G.

    Design of computer circuits used to be a pencil and paper task followed by laboratory tests, but logic circuit design can now be done in half the time as the engineer accesses a program which simulates the behavior of real digital circuits, and does all the wiring and testing on his computer screen. A simulated laboratory in digital logic has been…

  19. RAM simulation model for SPH/RSV systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schryver, J.C.; Primm, A.H.; Nelson, S.C.

    1995-12-31

    The US Army's Project Manager, Crusader is sponsoring the development of technologies that apply to the Self-Propelled Howitzer (SPH), formerly the Advanced Field Artillery System (AFAS), and Resupply Vehicle (RSV), formerly the Future Armored Resupply Vehicle (FARV), weapon system. Oak Ridge National Laboratory (ORNL) is currently performing developmental work in support of the SPH/RSV Crusader system. Supportive analyses of reliability, availability, and maintainability (RAM) aspects were also performed for the SPH/RSV effort. During FY 1994 and FY 1995, ORNL conducted a feasibility study to demonstrate the application of simulation modeling for RAM analysis of the Crusader system. Following completion of the feasibility study, a full-scale RAM simulation model of the Crusader system was developed for both the SPH and RSV. This report provides documentation for the simulation model as well as instructions for the proper execution and utilization of the model in the conduct of RAM analyses.

  20. BlazeDEM3D-GPU A Large Scale DEM simulation code for GPUs

    NASA Astrophysics Data System (ADS)

    Govender, Nicolin; Wilke, Daniel; Pizette, Patrick; Khinast, Johannes

    2017-06-01

    Accurately predicting the dynamics of particulate materials is important to numerous scientific and industrial areas, with applications ranging across particle scales from powder flow to ore crushing. Computational discrete element simulation is a viable option to aid in the understanding of particulate dynamics and the design of devices such as mixers, silos and ball mills, as laboratory-scale tests come at a significant cost. However, the computational time required for an industrial-scale simulation consisting of tens of millions of particles can reach months on large CPU clusters, making the Discrete Element Method (DEM) infeasible for industrial applications. Simulations are therefore typically restricted to tens of thousands of particles with highly detailed particle shapes, or a few million particles with often oversimplified particle shapes. However, a number of applications require accurate representation of the particle shape to capture the macroscopic behaviour of the particulate system. In this paper we give an overview of recent extensions to the open-source GPU-based DEM code BlazeDEM3D-GPU, which can simulate millions of polyhedra and tens of millions of spheres on a desktop computer with a single GPU or multiple GPUs.

  1. Additional confirmation of the validity of laboratory simulation of cloud radiances

    NASA Technical Reports Server (NTRS)

    Davis, J. M.; Cox, S. K.

    1986-01-01

    The results of a laboratory experiment are presented that provide additional verification of the methodology adopted for simulation of the radiances reflected from fields of optically thick clouds using the Cloud Field Optical Simulator (CFOS) at Colorado State University. The comparison of these data with their theoretically derived counterparts indicates that the crucial mechanism of cloud-to-cloud radiance field interaction is accurately simulated in the CFOS experiments and adds confidence to the manner in which the optical depth is scaled.

  2. Recirculation System for Geothermal Energy Recovery in Sedimentary Formations: Laboratory Experiments and Numerical Simulations

    NASA Astrophysics Data System (ADS)

    Elkhoury, J. E.; Detwiler, R. L.; Serajian, V.; Bruno, M. S.

    2012-12-01

    Geothermal energy resources are more widespread than previously thought and have the potential for providing a significant amount of sustainable clean energy worldwide. In particular, hot permeable sedimentary formations provide many advantages over traditional geothermal recovery and enhanced geothermal systems in low-permeability crystalline formations. These include: (1) eliminating the need for hydraulic fracturing, (2) significant reduction in risk for induced seismicity, (3) reducing the need for surface wastewater disposal, (4) contributing to decreases in greenhouse gases, and (5) potential use for CO2 sequestration. Advances in horizontal drilling, completion, and production technology from the oil and gas industry can now be applied to unlock these geothermal resources. Here, we present experimental results from a laboratory-scale circulation system and numerical simulations aimed at quantifying the heat transfer capacity of sedimentary rocks. Our experiments consist of fluid flow through a saturated and pressurized sedimentary disc of 23-cm diameter and 3.8-cm thickness heated along its circumference at a constant temperature. Injection and production ports are 7.6 cm apart in the center of the disc. We used DI de-aired water and mineral oil as working fluids and explored temperatures from 20 to 150 °C and flow rates from 2 to 30 ml/min. We performed experiments on sandstone samples (Castlegate and Kirby) with different porosity, permeability and thermal conductivity to evaluate the effect of hydraulic and thermal properties on the heat transfer capacity of sediments. The producing fluid temperature followed an exponential form with time-scale transients between 15 and 45 min. Steady-state outflow temperatures varied between 60% and 95% of the set boundary temperature; higher percentages were observed for lower temperatures and flow rates. We used the flow and heat transport simulator TOUGH2 to develop a numerical model of our laboratory setting.
Given
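    The exponential transients reported above can be captured with a one-parameter relaxation model. A minimal sketch, with illustrative values for the initial temperature, steady-state outflow temperature, and time constant (none taken from the paper beyond the quoted 15-45 min range):

    ```python
    import math

    def outflow_temperature(t_min, T0=20.0, T_ss=90.0, tau=30.0):
        """T(t) = T_ss - (T_ss - T0) * exp(-t / tau), with t and tau in
        minutes. T0, T_ss, and tau are illustrative values only."""
        return T_ss - (T_ss - T0) * math.exp(-t_min / tau)
    ```

    After one time constant (tau = 30 min here, inside the reported 15-45 min range) the outflow has closed about 63% of the gap between T0 and T_ss, which is how such transients are commonly fitted to measured outflow data.
    
    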

  3. Phase Transitions and Scaling in Systems Far from Equilibrium

    NASA Astrophysics Data System (ADS)

    Täuber, Uwe C.

    2017-03-01

    Scaling ideas and renormalization group approaches proved crucial for a deep understanding and classification of critical phenomena in thermal equilibrium. Over the past decades, these powerful conceptual and mathematical tools were extended to continuous phase transitions separating distinct nonequilibrium stationary states in driven classical and quantum systems. In concordance with detailed numerical simulations and laboratory experiments, several prominent dynamical universality classes have emerged that govern large-scale, long-time scaling properties both near and far from thermal equilibrium. These pertain to genuine specific critical points as well as entire parameter space regions for steady states that display generic scale invariance. The exploration of nonstationary relaxation properties and associated physical aging scaling constitutes a complementary potent means to characterize cooperative dynamics in complex out-of-equilibrium systems. This review describes dynamic scaling features through paradigmatic examples that include near-equilibrium critical dynamics, driven lattice gases and growing interfaces, correlation-dominated reaction-diffusion systems, and basic epidemic models.

  4. Comparisons of Mixed-Phase Icing Cloud Simulations with Experiments Conducted at the NASA Propulsion Systems Laboratory

    NASA Technical Reports Server (NTRS)

    Bartkus, Tadas; Tsao, Jen-Ching; Struk, Peter

    2017-01-01

    This paper builds on previous work that compares numerical simulations of mixed-phase icing clouds with experimental data. The model couples the thermal interaction between ice particles and water droplets of the icing cloud with the flowing air of an icing wind tunnel for simulation of NASA Glenn Research Center's (GRC) Propulsion Systems Laboratory (PSL). Measurements were taken during the Fundamentals of Ice Crystal Icing Physics Tests at the PSL tunnel in March 2016. The tests simulated ice-crystal and mixed-phase icing conditions that relate to ice accretions within turbofan engines.

  5. Comparative performance of different scale-down simulators of substrate gradients in Penicillium chrysogenum cultures: the need of a biological systems response analysis.

    PubMed

    Wang, Guan; Zhao, Junfei; Haringa, Cees; Tang, Wenjun; Xia, Jianye; Chu, Ju; Zhuang, Yingping; Zhang, Siliang; Deshmukh, Amit T; van Gulik, Walter; Heijnen, Joseph J; Noorman, Henk J

    2018-05-01

    In a 54-m³ large-scale penicillin fermentor, the cells experience substrate gradient cycles at the timescale of the global mixing time, about 20-40 s. Here, we used an intermittent feeding regime (IFR) and a two-compartment reactor (TCR) to mimic these substrate gradients in laboratory-scale continuous cultures. The IFR was applied to simulate substrate dynamics experienced by the cells at full scale at timescales of tens of seconds to minutes (30 s, 3 min and 6 min), while the TCR was designed to simulate substrate gradients at an applied mean residence time (τc) of 6 min. A biological systems analysis of the response of an industrial high-yielding P. chrysogenum strain has been performed in these continuous cultures. Compared to an undisturbed continuous feeding regime in a single reactor, the penicillin productivity (qPenG) was reduced in all scale-down simulators. The dynamic metabolomics data indicated that in the IFRs, the cells accumulated high levels of the central metabolites during the feast phase to actively cope with external substrate deprivation during the famine phase. In contrast, in the TCR system, the storage pool (e.g. mannitol and arabitol) constituted a large contribution of the carbon supply in the non-feed compartment. Further, transcript analysis revealed that all scale-down simulators gave different expression levels of the glucose/hexose transporter genes and the penicillin gene clusters. The results showed that qPenG did not correlate well with exposure to the substrate regimes (excess, limitation and starvation), but there was a clear inverse relation between qPenG and the intracellular glucose level. © 2018 The Authors. Microbial Biotechnology published by John Wiley & Sons Ltd and Society for Applied Microbiology.

  6. A Laboratory Scale Vortex Generator for Simulation of Martian Dust Devils.

    NASA Astrophysics Data System (ADS)

    Balme, M.; Greeley, R.; Mickelson, B.; Iversen, J.; Beardmore, G.; Metzger, S.

    2001-12-01

    Martian dust particles are a few microns in diameter. Current Martian ambient wind speeds appear to be insufficient to lift such fine particles and are marginal even for the optimum particle sizes for threshold (100-160 μm diameter). Instead, dust devils have been suggested as a local source of airborne particles and have been observed on Mars both from orbit and in lander data. Dust devils lift particles through enhanced local wind speeds and through the pressure drop often associated with the vortex, which provides 'lift'. This study seeks to (1) quantify the relative importance of enhanced wind speed versus pressure-drop lift in the dust devil entrainment threshold; (2) measure the mass transport potential of dust devils; (3) investigate the effects of surface roughness and topography on dust devil morphology; and (4) quantify the overall effects of low atmospheric pressure on the formation, structure, and entrainment processes of dust devils. To investigate the particle-lifting properties of dust devils, a laboratory vortex generator was fabricated. It consists of a large vertical cylinder (45 and 75 cm in diameter) containing a motor-driven rotor comprised of four vertical blades. Beneath the cylinder is a 2.4 by 2.4 m tabletop containing 14 differential pressure transducer ports used to measure the surface pressure structure of the vortex. Both the distance between the cylinder and the tabletop and the height of the blades within the cylinder can be varied. By controlling these variables and the angular velocity of the blades, a wide range of geometries and intensities of atmospheric vortices can be achieved. The apparatus is portable for use both under terrestrial atmospheric conditions and in the NASA-Ames Research Center Mars Surface Wind Tunnel facility to simulate Martian atmospheric conditions. 
The laboratory simulation is preferable to a numerical model because direct measurements of dust lifting threshold can be made and holds several advantages over terrestrial field

  7. Manufacturing Laboratory | Energy Systems Integration Facility | NREL

    Science.gov Websites

    Researchers in the Energy Systems Integration Facility's Manufacturing Laboratory develop methods and technologies to scale up renewable energy technology manufacturing capabilities.

  8. Laboratory evaluation of the pointing stability of the ASPS Vernier System

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The annular suspension and pointing system (ASPS) is an end-mount experiment pointing system designed for use in the space shuttle. The results of the ASPS Vernier System (AVS) pointing stability tests conducted in a laboratory environment are documented. A simulated zero-G suspension was used to support the test payload in the laboratory. The AVS and the suspension were modelled and incorporated into a simulation of the laboratory test. Error sources were identified and pointing stability sensitivities were determined via simulation. Statistical predictions of laboratory test performance were derived and compared to actual laboratory test results. The predicted mean pointing stability during simulated shuttle disturbances was 1.22 arc seconds; the actual mean laboratory test pointing stability was 1.36 arc seconds. The successful prediction of laboratory test results provides increased confidence in the analytical understanding of the AVS magnetic bearing technology and allows confident prediction of in-flight performance. Computer simulations of ASPS, operating in the shuttle disturbance environment, predict in-flight pointing stability errors less than 0.01 arc seconds.

  9. Laboratory simulation of field-aligned currents

    NASA Technical Reports Server (NTRS)

    Wessel, Frank J.; Rostoker, Norman

    1993-01-01

    A summary of progress during the period Apr. 1992 to Mar. 1993 is provided. Objectives of the research are (1) to simulate, via laboratory experiments, the three terms of the field-aligned current equation; (2) to simulate auroral-arc formation processes by configuring the boundary conditions of the experimental chamber and plasma parameters to produce highly localized return currents at the end of a field-aligned current system; and (3) to extrapolate these results, using theoretical and computational techniques, to the problem of magnetospheric-ionospheric coupling and to compare them with published literature signatures of auroral-arc phenomena.

  10. Observations and laboratory simulations of tornadoes in complex topographical regions

    NASA Astrophysics Data System (ADS)

    Karstens, Christopher Daniel

    Aerial photos taken along the damage paths of the Joplin, MO, and Tuscaloosa-Birmingham, AL, tornadoes of 2011 captured and preserved several unique patterns of damage. In particular, a few distinct tree-fall patterns were noted along the Tuscaloosa-Birmingham tornado track that appeared highly influenced by the underlying topography. One such region was the focus of a damage survey and motivated laboratory vortex simulations with a 3-D foam representation of the underlying topography, in addition to simulations performed with idealized 2D topographic features, using Iowa State University's tornado simulator. The purpose of this dissertation is to explore various aspects related to the interaction of a tornado or a tornado-like vortex with its underlying topography. Three topics are examined: 1) Analysis of tornado-induced tree-fall using aerial photography from the Joplin, MO, and Tuscaloosa-Birmingham, AL, tornadoes of 2011, 2) Laboratory investigation of topographical influences on a simulated tornado-like vortex, and 3) On the use of non-standard EF-scale damage indicators to categorize tornadoes.

  11. A system dynamics approach to analyze laboratory test errors.

    PubMed

    Guo, Shijing; Roudsari, Abdul; Garcez, Artur d'Avila

    2015-01-01

    Although much research has been carried out to analyze laboratory test errors during the last decade, a systemic view is still lacking, especially one that traces errors through the test process and evaluates potential interventions. This study applies system dynamics modeling to laboratory test errors, tracing the laboratory error flows and simulating the system behaviors while changing internal variable values. The change of the variables may reflect a change in demand or a proposed intervention. A review of the literature on laboratory test errors is given and serves as the main data source for the system dynamics model. Three "what if" scenarios were selected for testing the model. System behaviors were observed and compared under different scenarios over a period of time. The results suggest that system dynamics modeling has the potential to help in understanding laboratory errors, observing model behaviours, and providing risk-free simulation experiments for possible strategies.
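The error-flow idea in this abstract can be illustrated with a minimal stock-and-flow sketch. The three stages and their error rates below are hypothetical placeholders, not values from the study:

```python
# Stock-and-flow sketch of test-error accumulation across three process
# stages (pre-analytical, analytical, post-analytical). Stage error rates
# and demand are hypothetical placeholders, not values from this study.

def simulate(days=30, dt=1.0, demand=1000.0,
             error_rates=(0.046, 0.007, 0.047)):
    errors = 0.0        # stock: accumulated erroneous tests
    completed = 0.0     # stock: accumulated error-free results
    for _ in range(int(days / dt)):
        flow = demand                     # tests entering per day
        daily_errors = 0.0
        for rate in error_rates:          # each stage diverts some flow
            stage_errors = flow * rate
            daily_errors += stage_errors
            flow -= stage_errors
        errors += daily_errors * dt
        completed += flow * dt
    return errors, completed

total_err, total_ok = simulate()
```

A "what if" scenario is then just a change to one rate or to the demand, followed by rerunning the simulation and comparing the stocks.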

  12. Quasi-real-time end-to-end simulations of ELT-scale adaptive optics systems on GPUs

    NASA Astrophysics Data System (ADS)

    Gratadour, Damien

    2011-09-01

    Our team has started the development of a code dedicated to GPUs for the simulation of AO systems at the E-ELT scale. It uses the CUDA toolkit and an original binding to Yorick (an open-source interpreted language) to provide the user with a comprehensive interface. In this paper we present the first performance analysis of our simulation code, showing its ability to provide Shack-Hartmann (SH) images and measurements at the kHz scale for a VLT-sized AO system and in quasi-real-time (up to 70 Hz) for ELT-sized systems on a single top-end GPU. The simulation code includes multi-layer atmospheric turbulence generation, ray tracing through these layers, image formation at the focal plane of every sub-aperture of a SH sensor using either natural or laser guide stars, and centroiding on these images using various algorithms. Turbulence is generated on the fly, giving the ability to simulate hours of observations without needing to load extremely large phase screens into global memory. Because of its performance, this code additionally provides the unique ability to test real-time controllers for future AO systems under nominal conditions.
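The centroiding step mentioned above can be sketched as a simple center-of-mass estimate on one synthetic sub-aperture spot. All parameters are illustrative; this is not the paper's CUDA implementation:

```python
import numpy as np

def center_of_mass(img):
    """Intensity-weighted centroid (x, y) of one sub-aperture image, in pixels."""
    img = np.asarray(img, dtype=float)
    total = img.sum()
    ys, xs = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    return (xs * img).sum() / total, (ys * img).sum() / total

# Synthetic SH spot: Gaussian centered at (6.3, 4.7) on a 16x16 sub-aperture.
ys, xs = np.mgrid[0:16, 0:16]
spot = np.exp(-((xs - 6.3) ** 2 + (ys - 4.7) ** 2) / (2 * 1.5 ** 2))
cx, cy = center_of_mass(spot)
```

Real sensors add thresholding or weighted variants of this estimate to handle noise, which is why the paper mentions "various algorithms".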

  13. Analysis and simulation of a magnetic bearing suspension system for a laboratory model annular momentum control device

    NASA Technical Reports Server (NTRS)

    Groom, N. J.; Woolley, C. T.; Joshi, S. M.

    1981-01-01

    A linear analysis and the results of a nonlinear simulation of a magnetic bearing suspension system which uses permanent magnet flux biasing are presented. The magnetic bearing suspension is part of a 4068 N-m-s (3000 lb-ft-sec) laboratory model annular momentum control device (AMCD). The simulation includes rigid body rim dynamics, linear and nonlinear axial actuators, linear radial actuators, axial and radial rim warp, and power supply and power driver current limits.

  14. A novel method of multi-scale simulation of macro-scale deformation and microstructure evolution on metal forming

    NASA Astrophysics Data System (ADS)

    Huang, Shiquan; Yi, Youping; Li, Pengchuan

    2011-05-01

    In recent years, the multi-scale simulation technique for metal forming has gained significant attention for predicting the whole deformation process and the microstructure evolution of the product. The advances of numerical simulation at the macro-scale level in metal forming are remarkable, and commercial FEM software, such as Deform2D/3D, has found wide application in the field. However, multi-scale simulation has found little application because of the non-linearity of microstructure evolution during forming and the difficulty of modeling at the micro-scale level. This work deals with the modeling of microstructure evolution and a new method of multi-scale simulation of the forging process. The aviation material 7050 aluminum alloy has been used as an example for the modeling of microstructure evolution. The corresponding thermal simulation experiments have been performed on a Gleeble 1500 machine. The tested specimens have been analyzed to model dislocation density and the nucleation and growth of dynamic recrystallization (DRX). A source program using the cellular automaton (CA) method has been developed to simulate grain nucleation and growth, in which the change of grain topology caused by the metal deformation was considered. The physical fields at the macro-scale level, such as the temperature, stress, and strain fields, which can be obtained with the commercial software Deform 3D, are coupled with the deformation storage energy at the micro-scale level through the dislocation model to realize the multi-scale simulation. The method is illustrated by a forging process simulation of an aircraft wheel hub forging. By coupling the Deform 3D results with the CA results, the forging deformation progress and the microstructure evolution at any point of the forging could be simulated. To verify the simulation, experiments on the aircraft wheel hub forging have been carried out in the laboratory, and the comparison of simulation and experimental results is discussed in detail.
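A toy version of the cellular-automaton step (nucleation plus synchronous neighbor growth) might look like the following. Grid size and nucleus count are arbitrary, and the coupling of growth to stored dislocation energy used in the actual work is omitted:

```python
import random

# Toy cellular automaton for grain nucleation and growth: seed a few nuclei
# on a grid, then let each grain claim un-recrystallized neighbor cells on
# every synchronous step until the grid is filled. Illustrative only; real
# DRX models drive nucleation and growth with stored dislocation energy.

def grow_grains(n=40, nuclei=5, seed=1):
    random.seed(seed)
    grid = [[0] * n for _ in range(n)]          # 0 = un-recrystallized cell
    for gid in range(1, nuclei + 1):            # place nuclei with unique ids
        grid[random.randrange(n)][random.randrange(n)] = gid
    changed = True
    while changed:                              # grow until no cell changes
        changed = False
        snapshot = [row[:] for row in grid]     # synchronous update
        for i in range(n):
            for j in range(n):
                if snapshot[i][j] == 0:
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < n and 0 <= nj < n and snapshot[ni][nj]:
                            grid[i][j] = snapshot[ni][nj]
                            changed = True
                            break
    return grid

grid = grow_grains()
```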

  15. Linking pore-scale and basin-scale effects on diffusive methane transport in hydrate bearing environments through multi-scale reservoir simulations

    NASA Astrophysics Data System (ADS)

    Nole, M.; Daigle, H.; Cook, A.; Malinverno, A.; Hillman, J. I. T.

    2016-12-01

    We explore the gas hydrate-generating capacity of diffusive methane transport induced by solubility gradients due to pore size contrasts in lithologically heterogeneous marine sediments. Through the use of 1D, 2D, and 3D reactive transport simulations, we investigate scale-dependent processes in diffusion-dominated gas hydrate systems. These simulations all track a sand body, or series of sands, surrounded by clays as they are buried through the gas hydrate stability zone. Methane is sourced by microbial methanogenesis in the clays surrounding the sand layers. In 1D, simulations performed in a Lagrangian reference frame demonstrate that gas hydrate in thin sands (3.6 m thick) can occur in high saturations (upward of 70%) at the edges of sand bodies within the upper 400 meters below the seafloor. Diffusion of methane toward the center of the sand layer depends on the concentration gradient within the sand: broader sand pore size distributions with smaller median pore sizes enhance diffusive action toward the sand's center. Incorporating downhole log- and laboratory-derived sand pore size distributions, gas hydrate saturations in the center of the sand can reach 20% of the hydrate saturations at the sand's edges. Furthermore, we show that hydrate-free zones exist immediately above and below the sand and are approximately 5 m thick, depending on the sand-clay solubility contrast. A moving reference frame is also adopted in 2D, and the angle of gravity is rotated relative to the grid system to simulate a dipping sand layer. This is important to minimize diffusive edge effects or numerical diffusion that might be associated with a dipping sand in an Eulerian grid system oriented orthogonal to gravity. Two-dimensional simulations demonstrate the tendency for gas hydrate to accumulate downdip in a sand body because of greater methane transport at depth due to larger sand-clay solubility contrasts. 
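The 1D diffusive transport toward a sand layer described above can be sketched with an explicit finite-difference scheme. Boundary concentration, diffusivity, and grid are illustrative, not the study's parameters:

```python
# Explicit finite-difference sketch of 1D diffusion into a sand layer:
# methane concentration is held fixed at the clay boundaries (the
# methanogenic source) and diffuses toward the sand's center. All values
# are illustrative placeholders, not the paper's parameters.

def diffuse(n=51, steps=200, D=1e-9, dx=0.1, c_edge=1.0):
    dt = 0.4 * dx * dx / D                    # keeps D*dt/dx^2 <= 0.5 (stable)
    c = [0.0] * n
    c[0] = c[-1] = c_edge                     # fixed-concentration boundaries
    for _ in range(steps):
        new = c[:]
        for i in range(1, n - 1):
            new[i] = c[i] + (D * dt / (dx * dx)) * (c[i-1] - 2*c[i] + c[i+1])
        c = new
        c[0] = c[-1] = c_edge
    return c

profile = diffuse()
```

The resulting profile is highest at the edges and lowest at the center, the same qualitative shape that produces high hydrate saturations at the sand edges in the simulations.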
In 3D, basin-scale simulations illuminate how convergent sand layers in a

  16. Linking pore-scale and basin-scale effects on diffusive methane transport in hydrate bearing environments through multi-scale reservoir simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nole, Michael; Daigle, Hugh; Cook, Ann

    We explore the gas hydrate-generating capacity of diffusive methane transport induced by solubility gradients due to pore size contrasts in lithologically heterogeneous marine sediments. Through the use of 1D, 2D, and 3D reactive transport simulations, we investigate scale-dependent processes in diffusion-dominated gas hydrate systems. These simulations all track a sand body, or series of sands, surrounded by clays as they are buried through the gas hydrate stability zone. Methane is sourced by microbial methanogenesis in the clays surrounding the sand layers. In 1D, simulations performed in a Lagrangian reference frame demonstrate that gas hydrate in thin sands (3.6 m thick) can occur in high saturations (upward of 70%) at the edges of sand bodies within the upper 400 meters below the seafloor. Diffusion of methane toward the center of the sand layer depends on the concentration gradient within the sand: broader sand pore size distributions with smaller median pore sizes enhance diffusive action toward the sand's center. Incorporating downhole log- and laboratory-derived sand pore size distributions, gas hydrate saturations in the center of the sand can reach 20% of the hydrate saturations at the sand's edges. Furthermore, we show that hydrate-free zones exist immediately above and below the sand and are approximately 5 m thick, depending on the sand-clay solubility contrast. A moving reference frame is also adopted in 2D, and the angle of gravity is rotated relative to the grid system to simulate a dipping sand layer. This is important to minimize diffusive edge effects or numerical diffusion that might be associated with a dipping sand in an Eulerian grid system oriented orthogonal to gravity. Two-dimensional simulations demonstrate the tendency for gas hydrate to accumulate downdip in a sand body because of greater methane transport at depth due to larger sand-clay solubility contrasts. In 3D, basin-scale simulations illuminate how convergent sand

  17. Experiences Integrating Transmission and Distribution Simulations for DERs with the Integrated Grid Modeling System (IGMS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palmintier, Bryan; Hale, Elaine; Hodge, Bri-Mathias

    2016-08-11

    This paper discusses the development of, approaches for, experiences with, and some results from a large-scale, high-performance-computer-based (HPC-based) co-simulation of electric power transmission and distribution systems using the Integrated Grid Modeling System (IGMS). IGMS was developed at the National Renewable Energy Laboratory (NREL) as a novel Independent System Operator (ISO)-to-appliance scale electric power system modeling platform that combines off-the-shelf tools to simultaneously model 100s to 1000s of distribution systems in co-simulation with detailed ISO markets, transmission power flows, and AGC-level reserve deployment. Lessons learned from the co-simulation architecture development are shared, along with a case study that explores the reactive power impacts of PV inverter voltage support on the bulk power system.

  18. Laboratory Modelling of Volcano Plumbing Systems: a review

    NASA Astrophysics Data System (ADS)

    Galland, Olivier; Holohan, Eoghan P.; van Wyk de Vries, Benjamin; Burchardt, Steffi

    2015-04-01

    Earth scientists have, since the 19th century, tried to replicate or model geological processes in controlled laboratory experiments. In particular, laboratory modelling has been used to study the development of volcanic plumbing systems, which set the stage for volcanic eruptions. Volcanic plumbing systems involve complex processes that act at length scales of microns to thousands of kilometres and at time scales from milliseconds to billions of years, and laboratory models appear very suitable to address them. This contribution reviews laboratory models dedicated to studying the dynamics of volcano plumbing systems (Galland et al., Accepted). The foundation of laboratory models is the choice of relevant model materials, both for rock and magma. We outline a broad range of suitable model materials used in the literature. These materials exhibit very diverse rheological behaviours, so their careful choice is a crucial first step in proper experiment design. The second step is model scaling, which successively calls upon: (1) the principle of dimensional analysis, and (2) the principle of similarity. The dimensional analysis aims to identify the dimensionless physical parameters that govern the underlying processes. The principle of similarity states that "a laboratory model is equivalent to its geological analogue if the dimensionless parameters identified in the dimensional analysis are identical, even if the values of the governing dimensional parameters differ greatly" (Barenblatt, 2003). The application of these two steps ensures a solid understanding and geological relevance of the laboratory models. In addition, this procedure shows that laboratory models are not designed to exactly mimic a given geological system, but to understand underlying generic processes, either individually or in combination, and to identify or demonstrate physical laws that govern these processes. From this perspective, we review the numerous applications of laboratory models to
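The two-step scaling procedure (dimensional analysis, then similarity) can be made concrete with a single dimensionless group. The group ρgL/C (gravitational stress over cohesion) is a standard choice in analogue modelling; all numbers below are illustrative, not taken from the review:

```python
# One dimensionless group compared between a laboratory model and its
# geological prototype: gravitational stress (rho*g*L) over cohesion C.
# Parameter values are illustrative placeholders.

def pi_group(rho, g, L, C):
    """Ratio of gravitational stress to material cohesion (dimensionless)."""
    return rho * g * L / C

# Hypothetical weak granular model material vs. upper-crustal rock.
lab = pi_group(rho=1400.0, g=9.81, L=0.1, C=100.0)
nature = pi_group(rho=2700.0, g=9.81, L=3000.0, C=2.0e7)

# Similarity requires the dimensionless groups to match (here: same order
# of magnitude), even though every dimensional parameter differs by orders
# of magnitude between model and prototype.
similar = 0.1 < lab / nature < 10.0
```

This is exactly the sense in which a decimetre-scale sandbox can be "equivalent" to a kilometre-scale volcanic edifice.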

  19. Fast parametric relationships for the large-scale reservoir simulation of mixed CH4-CO2 gas hydrate systems

    DOE PAGES

    Reagan, Matthew T.; Moridis, George J.; Seim, Katie S.

    2017-03-27

    A recent Department of Energy field test on the Alaska North Slope has increased interest in the ability to simulate systems of mixed CO2-CH4 hydrates. However, the physically realistic simulation of mixed-hydrate systems is not yet a fully solved problem. Limited quantitative laboratory data leads to the use of various ab initio, statistical mechanical, or other mathematical representations of mixed-hydrate phase behavior. Few of these methods are suitable for inclusion in reservoir simulations, particularly for systems with a large number of grid elements, 3D systems, or systems with complex geometric configurations. In this paper, we present a set of fast parametric relationships describing the thermodynamic properties and phase behavior of a mixed methane-carbon dioxide hydrate system. We use well-known, off-the-shelf hydrate physical properties packages to generate a sufficiently large dataset, select the most convenient and efficient mathematical forms, and fit the data to those forms to create a physical properties package suitable for inclusion in the TOUGH+ family of codes. Finally, the mapping of the phase and thermodynamic space reveals the complexity of the mixed-hydrate system and allows understanding of the thermodynamics at a level beyond what much of the existing laboratory data and literature currently offer.
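The fitting strategy described here can be sketched in miniature: generate equilibrium points from a reference model, pick a convenient closed form, and fit it for fast evaluation inside a simulator. The form ln P = a + b/T and the data below are illustrative inventions, not output of TOUGH+ or a real properties package:

```python
import math

# Fit ln(P) = a + b/T by least squares to synthetic hydrate-equilibrium
# points, giving a fast parametric lookup. Values are illustrative.
T = [274.0, 278.0, 282.0, 286.0, 290.0]   # temperature, K
P = [2.9, 4.3, 6.8, 10.6, 16.8]           # equilibrium pressure, MPa (synthetic)

x = [1.0 / t for t in T]
y = [math.log(p) for p in P]
m = len(x)
xbar, ybar = sum(x) / m, sum(y) / m
b = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / \
    sum((xi - xbar) ** 2 for xi in x)
a = ybar - b * xbar

def p_eq(temp_k):
    """Fast parametric equilibrium pressure (MPa): one exp per call."""
    return math.exp(a + b / temp_k)
```

The point of the parametric form is speed: a reservoir simulator can evaluate `p_eq` millions of times per step, which a full statistical-mechanical calculation cannot afford.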

  20. Fast parametric relationships for the large-scale reservoir simulation of mixed CH4-CO2 gas hydrate systems

    NASA Astrophysics Data System (ADS)

    Reagan, Matthew T.; Moridis, George J.; Seim, Katie S.

    2017-06-01

    A recent Department of Energy field test on the Alaska North Slope has increased interest in the ability to simulate systems of mixed CO2-CH4 hydrates. However, the physically realistic simulation of mixed-hydrate systems is not yet a fully solved problem. Limited quantitative laboratory data leads to the use of various ab initio, statistical mechanical, or other mathematical representations of mixed-hydrate phase behavior. Few of these methods are suitable for inclusion in reservoir simulations, particularly for systems with a large number of grid elements, 3D systems, or systems with complex geometric configurations. In this work, we present a set of fast parametric relationships describing the thermodynamic properties and phase behavior of a mixed methane-carbon dioxide hydrate system. We use well-known, off-the-shelf hydrate physical properties packages to generate a sufficiently large dataset, select the most convenient and efficient mathematical forms, and fit the data to those forms to create a physical properties package suitable for inclusion in the TOUGH+ family of codes. The mapping of the phase and thermodynamic space reveals the complexity of the mixed-hydrate system and allows understanding of the thermodynamics at a level beyond what much of the existing laboratory data and literature currently offer.

  1. Fast parametric relationships for the large-scale reservoir simulation of mixed CH4-CO2 gas hydrate systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reagan, Matthew T.; Moridis, George J.; Seim, Katie S.

    A recent Department of Energy field test on the Alaska North Slope has increased interest in the ability to simulate systems of mixed CO2-CH4 hydrates. However, the physically realistic simulation of mixed-hydrate systems is not yet a fully solved problem. Limited quantitative laboratory data leads to the use of various ab initio, statistical mechanical, or other mathematical representations of mixed-hydrate phase behavior. Few of these methods are suitable for inclusion in reservoir simulations, particularly for systems with a large number of grid elements, 3D systems, or systems with complex geometric configurations. In this paper, we present a set of fast parametric relationships describing the thermodynamic properties and phase behavior of a mixed methane-carbon dioxide hydrate system. We use well-known, off-the-shelf hydrate physical properties packages to generate a sufficiently large dataset, select the most convenient and efficient mathematical forms, and fit the data to those forms to create a physical properties package suitable for inclusion in the TOUGH+ family of codes. Finally, the mapping of the phase and thermodynamic space reveals the complexity of the mixed-hydrate system and allows understanding of the thermodynamics at a level beyond what much of the existing laboratory data and literature currently offer.

  2. Communication Simulations for Power System Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fuller, Jason C.; Ciraci, Selim; Daily, Jeffrey A.

    2013-05-29

    New smart grid technologies and concepts, such as dynamic pricing, demand response, dynamic state estimation, and wide area monitoring, protection, and control, are expected to require considerable communication resources. As the cost of retrofit can be high, future power grids will require the integration of high-speed, secure connections with legacy communication systems, while still providing adequate system control and security. While considerable work has been performed to create co-simulators for the power domain with load models and market operations, limited work has been performed on integrating communications directly into a power domain solver. The simulation of communication and power systems will become more important as the two systems become more interrelated. This paper will discuss ongoing work at Pacific Northwest National Laboratory to create a flexible, high-speed power and communication system co-simulator for smart grid applications. The framework for the software will be described, including architecture considerations for modular, high-performance computing and large-scale scalability (serialization, load balancing, partitioning, cross-platform support, etc.). The current simulator supports the ns-3 (telecommunications) and GridLAB-D (distribution systems) simulators. Ongoing and future work will be described, including planned future expansions for a traditional transmission solver. A test case using the co-simulator, utilizing a transactive demand response system created for the Olympic Peninsula and AEP gridSMART demonstrations, requiring two-way communication between distributed and centralized market devices, will be used to demonstrate the value and intended purpose of the co-simulation environment.
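The lock-step exchange at the heart of such a co-simulator can be sketched as two models advancing together and trading values at each synchronization boundary. Both toy models and the price signal below are illustrative inventions, not GridLAB-D or ns-3 behavior:

```python
# Lock-step co-simulation sketch: a toy "power" model and a toy
# "communication/market" model exchange values once per synchronization
# interval. Purely illustrative; real co-simulators use event-driven
# brokers and much richer domain solvers.

def power_step(load_kw, price):
    """Distribution load responds (downward) to the broadcast price."""
    return load_kw * (1.0 - 0.1 * price)

def comm_step(load_kw):
    """Market side observes load and returns a normalized price in [0, 1]."""
    return min(1.0, load_kw / 1000.0)

load, price = 900.0, 0.0
trace = []
for t in range(10):                     # one exchange per simulated interval
    price = comm_step(load)             # communication/market domain advances
    load = power_step(load, price)      # power domain reacts in this interval
    trace.append((t, load, price))
```

The hard engineering problems the paper addresses (serialization, load balancing, partitioning) arise when this loop spans thousands of distribution feeders on an HPC cluster rather than two functions in one process.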

  3. Characterization of seismic properties across scales: from the laboratory- to the field scale

    NASA Astrophysics Data System (ADS)

    Grab, Melchior; Quintal, Beatriz; Caspari, Eva; Maurer, Hansruedi; Greenhalgh, Stewart

    2016-04-01

    When exploring geothermal systems, the main interest is in factors controlling the efficiency of the heat exchanger. This includes the energy state of the pore fluids and the presence of permeable structures forming part of the fluid transport system. Seismic methods are amongst the most common exploration techniques to image the deep subsurface in order to evaluate such a geothermal heat exchanger. They make use of the fact that a seismic wave carries information on the properties of the rocks in the subsurface through which it passes. This enables the derivation of the stiffness and the density of the host rock from the seismic velocities. Moreover, it is well known that the seismic waveforms are modulated while propagating through the subsurface by visco-elastic effects due to wave-induced fluid flow, hence delivering information about the fluids in the rock's pore space. To constrain the interpretation of seismic data, that is, to link seismic properties with the fluid state and host rock permeability, it is common practice to measure the rock properties of small rock specimens in the laboratory under in-situ conditions. However, in magmatic geothermal systems or in systems situated in the crystalline basement, the host rock is often highly impermeable and fluid transport predominantly takes place in fracture networks consisting of fractures larger than the rock samples investigated in the laboratory. Therefore, laboratory experiments only provide the properties of relatively intact rock, and an up-scaling procedure is required to characterize the seismic properties of large rock volumes containing fractures and fracture networks and to study the effects of fluids in such fractured rock. We present a technique to parameterize fractured rock volumes as typically encountered in Icelandic magmatic geothermal systems, by combining laboratory experiments with effective medium calculations. The resulting models can be used to calculate the frequency-dependent bulk

  4. Accuracy of finite-difference modeling of seismic waves : Simulation versus laboratory measurements

    NASA Astrophysics Data System (ADS)

    Arntsen, B.

    2017-12-01

    The finite-difference technique for numerical modeling of seismic waves is still important and, for some areas, extensively used. For exploration purposes, finite-difference simulation is at the core of both traditional imaging techniques such as reverse-time migration and more elaborate full-waveform inversion techniques. The accuracy and fidelity of finite-difference simulation of seismic waves are hard to quantify, and meaningful error analysis is really only easily available for simplistic media. A possible alternative to theoretical error analysis is provided by comparing finite-difference simulated data with laboratory data created using a scale model. The advantage of this approach is the accurate knowledge of the model, within measurement precision, and of the location of sources and receivers. We use a model made of PVC immersed in water and containing horizontal and tilted interfaces together with several spherical objects to generate ultrasonic pressure reflection measurements. The physical dimensions of the model are of the order of a meter, which after scaling represents a model with dimensions of the order of 10 kilometers and frequencies in the range of one to thirty hertz. We find that for plane horizontal interfaces the laboratory data can be reproduced by the finite-difference scheme with relatively small error, but for steeply tilted interfaces the error increases. For spherical interfaces the discrepancy between laboratory data and simulated data is sometimes much more severe, to the extent that it is not possible to simulate reflections from parts of highly curved bodies. The results are important in view of the fact that finite-difference modeling is often at the core of imaging and inversion algorithms tackling complicated geological areas with highly curved interfaces.
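A minimal 1D member of the finite-difference family being benchmarked (second order in space and time, fixed ends, Gaussian source) might look like this. Grid, time step, velocity, and source parameters are illustrative, not the study's configuration:

```python
import math

# Second-order explicit finite-difference scheme for the 1D acoustic wave
# equation with clamped (reflecting) ends and a Gaussian source pulse.
# Illustrative parameters only.

def fd_wave(nx=300, nt=600, dx=5.0, dt=1e-3, c=1500.0, src=150):
    assert c * dt / dx <= 1.0                   # CFL stability condition
    r2 = (c * dt / dx) ** 2
    u_prev, u = [0.0] * nx, [0.0] * nx
    for n in range(nt):
        u_next = [0.0] * nx                     # ends stay clamped at zero
        for i in range(1, nx - 1):
            u_next[i] = 2*u[i] - u_prev[i] + r2 * (u[i+1] - 2*u[i] + u[i-1])
        u_next[src] += math.exp(-(((n * dt) - 0.05) / 0.01) ** 2)
        u_prev, u = u, u_next
    return u

wave = fd_wave()
```

The accuracy questions the study raises (numerical dispersion, stair-stepping of tilted and curved interfaces) appear when such a scheme is extended to 2D/3D heterogeneous media, not in this homogeneous 1D sketch.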

  5. A multi-scale experimental and simulation approach for fractured subsurface systems

    NASA Astrophysics Data System (ADS)

    Viswanathan, H. S.; Carey, J. W.; Frash, L.; Karra, S.; Hyman, J.; Kang, Q.; Rougier, E.; Srinivasan, G.

    2017-12-01

    Fractured systems play an important role in numerous subsurface applications, including hydraulic fracturing, carbon sequestration, geothermal energy, and underground nuclear test detection. Fractures that range in scale from microns to meters, and their structure, control the behavior of these systems, which provide over 85% of our energy and 50% of US drinking water. Determining the key mechanisms in subsurface fractured systems has been impeded by the lack of sophisticated experimental methods to measure fracture aperture and connectivity, multiphase permeability, and chemical exchange capacities at the high temperature, pressure, and stresses present in the subsurface. In this study, we developed and used the microfluidic and triaxial core flood experiments required to reveal the fundamental dynamics of fracture-fluid interactions. In addition, we have developed high-fidelity fracture propagation and discrete fracture network flow models to simulate these fractured systems. We have also developed reduced-order models of these fracture simulators in order to conduct uncertainty quantification for these systems. We demonstrate an integrated experimental/modeling approach that allows for a comprehensive characterization of fractured systems, and we develop models that can be used to optimize reservoir operating conditions over a range of subsurface conditions.

  6. Modeling, simulation, and analysis at Sandia National Laboratories for health care systems

    NASA Astrophysics Data System (ADS)

    Polito, Joseph

    1994-12-01

    Modeling, Simulation, and Analysis are special competencies of the Department of Energy (DOE) National Laboratories which have been developed and refined through years of national defense work. Today, many of these skills are being applied to the problem of understanding the performance of medical devices and treatments. At Sandia National Laboratories we are developing models at all three levels of health care delivery: (1) phenomenology models for Observation and Test, (2) model-based outcomes simulations for Diagnosis and Prescription, and (3) model-based design and control simulations for the Administration of Treatment. A sampling of specific applications includes non-invasive sensors for blood glucose, ultrasonic scanning for the development of prosthetics, automated breast cancer diagnosis, laser burn debridement, surgical staple deformation, minimally invasive control for administration of a photodynamic drug, and human-friendly decision support aids for computer-aided diagnosis. These and other projects are being performed at Sandia with support from the DOE and in cooperation with medical research centers and private companies. Our objective is to leverage government engineering, modeling, and simulation skills with the biotechnical expertise of the health care community to create a more knowledge-rich environment for decision making and treatment.

  7. Computer Based Simulation of Laboratory Experiments.

    ERIC Educational Resources Information Center

    Edward, Norrie S.

    1997-01-01

    Examines computer based simulations of practical laboratory experiments in engineering. Discusses the aims and achievements of lab work (cognitive, process, psychomotor, and affective); types of simulations (model building and behavioral); and the strengths and weaknesses of simulations. Describes the development of a centrifugal pump simulation,…

  8. Achieving across-laboratory replicability in psychophysical scaling

    PubMed Central

    Ward, Lawrence M.; Baumann, Michael; Moffat, Graeme; Roberts, Larry E.; Mori, Shuji; Rutledge-Taylor, Matthew; West, Robert L.

    2015-01-01

    It is well known that, although psychophysical scaling produces good qualitative agreement between experiments, precise quantitative agreement between experimental results, such as that routinely achieved in physics or biology, is rarely or never attained. A particularly galling example of this is the fact that power function exponents for the same psychological continuum, measured in different laboratories but ostensibly using the same scaling method, magnitude estimation, can vary by a factor of three. Constrained scaling (CS), in which observers first learn a standardized meaning for a set of numerical responses relative to a standard sensory continuum and then make magnitude judgments of other sensations using the learned response scale, has produced excellent quantitative agreement between individual observers’ psychophysical functions. Theoretically it could do the same for across-laboratory comparisons, although this needs to be tested directly. We compared nine different experiments from four different laboratories as an example of the level of across-experiment and across-laboratory agreement achievable using CS. In general, we found across-experiment and across-laboratory agreement using CS to be significantly superior to that typically obtained with conventional magnitude estimation techniques, although some of its potential remains to be realized. PMID:26191019
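The exponent at issue is that of Stevens' power law, R = k·S^n, conventionally estimated by a least-squares fit in log-log coordinates. The sketch below is purely illustrative (the data and "laboratory" labels are hypothetical, not from the studies compared in this record); it shows how two labs fitting the same continuum could report exponents differing by a factor of three:

```python
import math

def fit_power_exponent(stimulus, response):
    """Fit R = k * S**n by least squares in log-log space; return n (the slope)."""
    xs = [math.log(s) for s in stimulus]
    ys = [math.log(r) for r in response]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

# Hypothetical magnitude-estimation data from two labs on the same continuum:
stimuli = [10, 20, 40, 80, 160]
lab_a = [s ** 0.3 for s in stimuli]   # exponent 0.3
lab_b = [s ** 0.9 for s in stimuli]   # exponent 0.9 -- a factor of 3 larger
```

Constrained scaling aims to make independently estimated exponents like these agree quantitatively, not just in rank order.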

  9. GCR Simulator Reference Field and a Spectral Approach for Laboratory Simulation

    NASA Technical Reports Server (NTRS)

    Slaba, Tony C.; Blattnig, Steve R.; Norbury, John W.; Rusek, Adam; La Tessa, Chiara; Walker, Steven A.

    2015-01-01

    The galactic cosmic ray (GCR) simulator at the NASA Space Radiation Laboratory (NSRL) is intended to deliver the broad spectrum of particles and energies encountered in deep space to biological targets in a controlled laboratory setting. In this work, certain aspects of simulating the GCR environment in the laboratory are discussed. Reference field specification and beam selection strategies at NSRL are the main focus, but the analysis presented herein may be modified for other facilities. First, comparisons are made between direct simulation of the external, free space GCR field and simulation of the induced tissue field behind shielding. It is found that upper energy constraints at NSRL limit the ability to simulate the external, free space field directly (i.e. shielding placed in the beam line in front of a biological target and exposed to a free space spectrum). Second, variation in the induced tissue field associated with shielding configuration and solar activity is addressed. It is found that the observed variation is likely within the uncertainty associated with representing any GCR reference field with discrete ion beams in the laboratory, given current facility constraints. A single reference field for deep space missions is subsequently identified. Third, an approach for selecting beams at NSRL to simulate the designated reference field is presented. Drawbacks of the proposed methodology are discussed and weighed against alternative simulation strategies. The neutron component and track structure characteristics of the simulated field are discussed in this context.

  10. Laboratory-Scale Evidence for Lightning-Mediated Gene Transfer in Soil

    PubMed Central

    Demanèche, Sandrine; Bertolla, Franck; Buret, François; Nalin, Renaud; Sailland, Alain; Auriol, Philippe; Vogel, Timothy M.; Simonet, Pascal

    2001-01-01

    Electrical fields and current can permeabilize bacterial membranes, allowing for the penetration of naked DNA. Given that the environment is subjected to regular thunderstorms and lightning discharges that induce enormous electrical perturbations, the possibility of natural electrotransformation of bacteria was investigated. We demonstrated with soil microcosm experiments that the transformation of added bacteria could be increased locally via lightning-mediated current injection. The incorporation of three genes coding for antibiotic resistance (plasmid pBR328) into the Escherichia coli strain DH10B recipient previously added to soil was observed only after the soil had been subjected to laboratory-scale lightning. Laboratory-scale lightning had an electrical field gradient (700 versus 600 kV m−1) and current density (2.5 versus 12.6 kA m−2) similar to those of full-scale lightning. Controls handled identically except for not being subjected to lightning produced no detectable antibiotic-resistant clones. In addition, simulated storm cloud electrical fields (in the absence of current) did not produce detectable clones (transformation detection limit, 10−9). Natural electrotransformation might be a mechanism involved in bacterial evolution. PMID:11472916

  11. The Australian Computational Earth Systems Simulator

    NASA Astrophysics Data System (ADS)

    Mora, P.; Muhlhaus, H.; Lister, G.; Dyskin, A.; Place, D.; Appelbe, B.; Nimmervoll, N.; Abramson, D.

    2001-12-01

    Numerical simulation of the physics and dynamics of the entire earth system offers an outstanding opportunity for advancing earth system science and technology but represents a major challenge due to the range of scales and physical processes involved, as well as the magnitude of the software engineering effort required. However, new simulation and computer technologies are bringing this objective within reach. Under a special competitive national funding scheme to establish new Major National Research Facilities (MNRF), the Australian government, together with a consortium of universities and research institutions, has funded construction of the Australian Computational Earth Systems Simulator (ACcESS). The Simulator, or computational virtual earth, will provide the Australian earth systems science community with the research infrastructure required for simulations of dynamical earth processes at scales ranging from microscopic to global. It will consist of thematic supercomputer infrastructure and an earth systems simulation software system. The Simulator models and software will be constructed over a five year period by a multi-disciplinary team of computational scientists, mathematicians, earth scientists, civil engineers and software engineers. The construction team will integrate numerical simulation models (3D discrete elements/lattice solid model, particle-in-cell large deformation finite-element method, stress reconstruction models, multi-scale continuum models etc) with geophysical, geological and tectonic models, through advanced software engineering and visualization technologies. When fully constructed, the Simulator aims to provide the software and hardware infrastructure needed to model solid earth phenomena including global scale dynamics and mineralisation processes, crustal scale processes including plate tectonics, mountain building, interacting fault system dynamics, and micro-scale processes that control the geological, physical and dynamic…

  12. Enabling UAS Research at the NASA EAV Laboratory

    NASA Technical Reports Server (NTRS)

    Ippolito, Corey A.

    2015-01-01

    The Exploration Aerial Vehicles (EAV) Laboratory at NASA Ames Research Center leads research into intelligent autonomy and advanced control systems, bridging the gap between simulation and full-scale technology through flight test experimentation on unmanned sub-scale test vehicles.

  13. The effect of entrapped nonaqueous phase liquids on tracer transport in heterogeneous porous media: Laboratory experiments at the intermediate scale

    USGS Publications Warehouse

    Barth, Gilbert R.; Illangasekare, T.H.; Rajaram, H.

    2003-01-01

    This work considers the applicability of conservative tracers for detecting high-saturation nonaqueous-phase liquid (NAPL) entrapment in heterogeneous systems. For this purpose, a series of experiments and simulations was performed using a two-dimensional heterogeneous system (10 × 1.2 m), which represents an intermediate scale between laboratory and field scales. Tracer tests performed prior to injecting the NAPL provide the baseline response of the heterogeneous porous medium. Two NAPL spill experiments were performed and the entrapped-NAPL saturation distribution measured in detail using a gamma-ray attenuation system. Tracer tests following each of the NAPL spills produced breakthrough curves (BTCs) reflecting the impact of entrapped NAPL on conservative transport. To evaluate significance, the impact of NAPL entrapment on the conservative-tracer breakthrough curves was compared to simulated breakthrough curve variability for different realizations of the heterogeneous distribution. Analysis of the results reveals that the NAPL entrapment has a significant impact on the temporal moments of conservative-tracer breakthrough curves. © 2003 Elsevier B.V. All rights reserved.
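The temporal-moment analysis referred to above reduces each breakthrough curve to a few summary statistics, such as recovered mass and mean arrival time. A minimal sketch (hypothetical data; trapezoidal integration is one common choice, not necessarily the authors'):

```python
def temporal_moments(times, conc):
    """Zeroth moment (area under the BTC) and first normalized moment
    (mean arrival time) of a breakthrough curve, via the trapezoidal rule."""
    m0 = m1 = 0.0
    for (t0, c0), (t1, c1) in zip(zip(times, conc), zip(times[1:], conc[1:])):
        dt = t1 - t0
        m0 += 0.5 * (c0 + c1) * dt          # integral of C(t)
        m1 += 0.5 * (t0 * c0 + t1 * c1) * dt  # integral of t * C(t)
    return m0, m1 / m0

# A symmetric triangular pulse centered at t = 1 has mean arrival time 1:
m0, t_mean = temporal_moments([0.0, 1.0, 2.0], [0.0, 1.0, 0.0])
```

A shift in `t_mean` between pre- and post-spill tracer tests is the kind of signature the study attributes to entrapped NAPL.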

  14. A Wireless Communications Systems Laboratory Course

    ERIC Educational Resources Information Center

    Guzelgoz, Sabih; Arslan, Huseyin

    2010-01-01

    A novel wireless communications systems laboratory course is introduced. The course teaches students how to design, test, and simulate wireless systems using modern instrumentation and computer-aided design (CAD) software. One of the objectives of the course is to help students understand the theoretical concepts behind wireless communication…

  15. High performance computing in biology: multimillion atom simulations of nanoscale systems

    PubMed Central

    Sanbonmatsu, K. Y.; Tung, C.-S.

    2007-01-01

    Computational methods have been used in biology for sequence analysis (bioinformatics), all-atom simulation (molecular dynamics and quantum calculations), and more recently for modeling biological networks (systems biology). Of these three techniques, all-atom simulation is currently the most computationally demanding, in terms of compute load, communication speed, and memory load. Breakthroughs in electrostatic force calculation and dynamic load balancing have enabled molecular dynamics simulations of large biomolecular complexes. Here, we report simulation results for the ribosome, using approximately 2.64 million atoms, the largest all-atom biomolecular simulation published to date. Several other nanoscale systems with different numbers of atoms were studied to measure the performance of the NAMD molecular dynamics simulation program on the Los Alamos National Laboratory Q Machine. We demonstrate that multimillion atom systems represent a 'sweet spot' for the NAMD code on large supercomputers. NAMD displays an unprecedented 85% parallel scaling efficiency for the ribosome system on 1024 CPUs. We also review recent targeted molecular dynamics simulations of the ribosome that prove useful for studying conformational changes of this large biomolecular complex in atomic detail. PMID:17187988
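The quoted figure (85% parallel scaling efficiency on 1024 CPUs) is the measured speedup divided by the ideal speedup. A minimal sketch with hypothetical timings (not data from the paper):

```python
def parallel_efficiency(n_ref, t_ref, n, t):
    """Strong-scaling efficiency relative to a reference CPU count:
    achieved speedup (t_ref / t) divided by the ideal speedup (n / n_ref)."""
    speedup = t_ref / t
    ideal = n / n_ref
    return speedup / ideal

# Hypothetical: a run taking 100 s on 1 CPU and 25 s on 4 CPUs scales perfectly.
eff_perfect = parallel_efficiency(1, 100.0, 4, 25.0)
eff_half = parallel_efficiency(1, 100.0, 4, 50.0)
```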

  16. Small-scale multi-axial hybrid simulation of a shear-critical reinforced concrete frame

    NASA Astrophysics Data System (ADS)

    Sadeghian, Vahid; Kwon, Oh-Sung; Vecchio, Frank

    2017-10-01

    This study presents a numerical multi-scale simulation framework which is extended to accommodate hybrid simulation (numerical-experimental integration). The framework is enhanced with a standardized data exchange format and connected to a generalized controller interface program which facilitates communication with various types of laboratory equipment and testing configurations. A small-scale experimental program was conducted using six degree-of-freedom hydraulic testing equipment to verify the proposed framework and provide additional data for small-scale testing of shear-critical reinforced concrete structures. The specimens were tested in a multi-axial hybrid simulation manner under a reversed cyclic loading condition simulating earthquake forces. The physical models were 1/3.23-scale representations of a beam and two columns. A mixed-type modelling technique was employed to analyze the remainder of the structures. The hybrid simulation results were compared against those obtained from a large-scale test and finite element analyses. The study found that if precautions are taken in preparing model materials and if the shear-related mechanisms are accurately considered in the numerical model, small-scale hybrid simulations can adequately simulate the behaviour of shear-critical structures. Although the findings of the study are promising, to draw general conclusions additional test data are required.

  17. Evaluation of Variable Refrigerant Flow Systems Performance on Oak Ridge National Laboratory's Flexible Research Platform: Part 3, Simulation Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Im, Piljae; Cho, Heejin; Kim, Dongsu

    2016-08-01

    This report provides second-year project simulation results for the multi-year project titled “Evaluation of Variable Refrigerant Flow (VRF) system on Oak Ridge National Laboratory (ORNL)’s Flexible Research Platform (FRP).”

  18. Simulating Nationwide Pandemics: Applying the Multi-scale Epidemiologic Simulation and Analysis System to Human Infectious Diseases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dombroski, M; Melius, C; Edmunds, T

    2008-09-24

    This study uses the Multi-scale Epidemiologic Simulation and Analysis (MESA) system developed for foreign animal diseases to assess consequences of nationwide human infectious disease outbreaks. A literature review identified the state of the art in both small-scale regional models and large-scale nationwide models and characterized key aspects of a nationwide epidemiological model. The MESA system offers computational advantages over existing epidemiological models and enables a broader array of stochastic analyses of model runs to be conducted because of those computational advantages. However, it has only been demonstrated on foreign animal diseases. This paper applied the MESA modeling methodology to human epidemiology. The methodology divided 2000 US Census data at the census tract level into school-bound children, work-bound workers, elderly, and stay at home individuals. The model simulated mixing among these groups by incorporating schools, workplaces, households, and long-distance travel via airports. A baseline scenario with fixed input parameters was run for a nationwide influenza outbreak using relatively simple social distancing countermeasures. Analysis from the baseline scenario showed one of three possible results: (1) the outbreak burned itself out before it had a chance to spread regionally, (2) the outbreak spread regionally and lasted a relatively long time, although constrained geography enabled it to eventually be contained without affecting a disproportionately large number of people, or (3) the outbreak spread through air travel and lasted a long time with unconstrained geography, becoming a nationwide pandemic. These results are consistent with empirical influenza outbreak data. The results showed that simply scaling up a regional small-scale model is unlikely to account for all the complex variables and their interactions involved in a nationwide outbreak. There are several limitations of the methodology that should be explored in
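The die-out versus pandemic regimes described above emerge from even the simplest compartmental model once the reproduction number crosses 1. A deterministic SIR sketch for contrast only (MESA is agent- and census-based and far richer than this; all parameters here are hypothetical):

```python
def sir_outbreak(r0, days, pop=1_000_000, i0=10, recovery=0.2):
    """Discrete-time SIR model; returns the final attack rate (fraction infected).
    The transmission rate is beta = r0 * recovery."""
    beta = r0 * recovery
    s, i, r = pop - i0, i0, 0
    for _ in range(days):
        new_inf = beta * s * i / pop   # S -> I
        new_rec = recovery * i         # I -> R
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
    return r / pop

burnout = sir_outbreak(0.8, 365)   # R0 < 1: outbreak burns itself out
pandemic = sir_outbreak(2.5, 365)  # R0 > 1: nationwide-pandemic regime
```

The model has no geography, so it cannot reproduce the intermediate "regionally contained" regime; that is precisely what spatially explicit systems like MESA add.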

  20. Scaled laboratory experiments explain the kink behaviour of the Crab Nebula jet

    DOE PAGES

    Li, C. K.; Tzeferacos, P.; Lamb, D.; ...

    2016-10-07

    X-ray images from the Chandra X-ray Observatory show that the South-East jet in the Crab nebula changes direction every few years. This remarkable phenomenon is also observed in jets associated with pulsar wind nebulae and other astrophysical objects, and therefore is a fundamental feature of astrophysical jet evolution that needs to be understood. Theoretical modeling and numerical simulations have suggested that this phenomenon may be a consequence of magnetic fields (B) and current-driven magnetohydrodynamic (MHD) instabilities taking place in the jet, but until now there has been no verification of this process in a controlled laboratory environment. Here we report the first such experiments, using scaled laboratory plasma jets generated by high-power lasers to model the Crab jet and monoenergetic-proton radiography to provide direct visualization and measurement of magnetic fields and their behavior. The toroidal magnetic field embedded in the supersonic jet triggered plasma instabilities and resulted in considerable deflections throughout the jet propagation, mimicking the kinks in the Crab jet. We also demonstrated that these kinks are stabilized by high jet velocity, consistent with the observation that instabilities alter the jet orientation but do not disrupt the overall jet structure. We successfully modeled these laboratory experiments with a validated three-dimensional (3D) numerical simulation, which in conjunction with the experiments provide compelling evidence that we have an accurate model of the most important physics of magnetic fields and MHD instabilities in the observed, kinked jet in the Crab nebula. The experiments initiate a novel approach in the laboratory for visualizing fields and instabilities associated with jets observed in various astrophysical objects, ranging from stellar to extragalactic systems. We expect that future work along this line will have important impact on the study and understanding of such fundamental

  1. Cross-flow turbines: progress report on physical and numerical model studies at large laboratory scale

    NASA Astrophysics Data System (ADS)

    Wosnik, Martin; Bachant, Peter

    2016-11-01

    Cross-flow turbines show potential in marine hydrokinetic (MHK) applications. A research focus is on accurately predicting device performance and wake evolution to improve turbine array layouts for maximizing overall power output, i.e., minimizing wake interference, or taking advantage of constructive wake interaction. Experiments were carried out with large laboratory-scale cross-flow turbines, D = O(1 m), using a turbine test bed in a large cross-section tow tank, designed to achieve sufficiently high Reynolds numbers for the results to be Reynolds number independent with respect to turbine performance and wake statistics, such that they can be reliably extrapolated to full scale and used for model validation. Several turbines of varying solidity were employed, including the UNH Reference Vertical Axis Turbine (RVAT) and a 1:6 scale model of the DOE-Sandia Reference Model 2 (RM2) turbine. To improve parameterization in array simulations, an actuator line model (ALM) was developed to provide a computationally feasible method for simulating full turbine arrays inside Navier-Stokes models. Results are presented for the simulation of performance and wake dynamics of cross-flow turbines and compared with experiments and body-fitted mesh, blade-resolving CFD. Supported by NSF-CBET Grant 1150797, Sandia National Laboratories.

  2. Evaluation of malodor for automobile air conditioner evaporator by using laboratory-scale test cooling bench.

    PubMed

    Kim, Kyung Hwan; Kim, Sun Hwa; Jung, Young Rim; Kim, Man Goo

    2008-09-12

    As one of the measures to improve the environment in an automobile, malodor caused by the automobile air-conditioning system evaporator was evaluated and analyzed using a laboratory-scale test cooling bench. The odor was simulated with an evaporator test cooling bench equipped with controllers for airflow, air temperature, and relative humidity. To reproduce the odor characteristics that occur in automobiles, a previously used automobile air-conditioner evaporator associated with unpleasant odors was selected. The odor was evaluated by trained panels and collected in aluminum polyester bags. Collected samples were analyzed by thermal desorption into a cryotrap and subsequent gas chromatographic separation, followed by simultaneous olfactometry and flame ionization detection, with identification by atomic emission detection and mass spectrometry. Compounds such as alcohols, aldehydes, and organic acids were identified as the responsible odor-active compounds. Gas chromatography/flame ionization detection/olfactometry, which combines a sensory method with instrumental analysis, proved very effective for odor evaluation of an automobile air-conditioning system evaporator.

  3. Software Integration in Multi-scale Simulations: the PUPIL System

    NASA Astrophysics Data System (ADS)

    Torras, J.; Deumens, E.; Trickey, S. B.

    2006-10-01

    The state of the art for computational tools in both computational chemistry and computational materials physics includes many algorithms and functionalities which are implemented again and again. Several projects aim to reduce, eliminate, or avoid this problem. Most such efforts seem to be focused within a particular specialty, either quantum chemistry or materials physics. Multi-scale simulations, by their very nature however, cannot respect that specialization. In simulation of fracture, for example, the energy gradients that drive the molecular dynamics (MD) come from a quantum mechanical treatment that most often derives from quantum chemistry. That “QM” region is linked to a surrounding “CM” region in which potentials yield the forces. The approach therefore requires the integration or at least inter-operation of quantum chemistry and materials physics algorithms. The same problem occurs in “QM/MM” simulations in computational biology. The challenge grows if pattern recognition or other analysis codes of some kind must be used as well. The most common mode of inter-operation is user intervention: codes are modified as needed and data files are managed “by hand” by the user (interactively and via shell scripts). User intervention is, however, inefficient by nature, difficult to transfer to the community, and prone to error. Some progress (e.g. Sethna’s work at Cornell [C.R. Myers et al., Mat. Res. Soc. Symp. Proc., 538(1999) 509, C.-S. Chen et al., Poster presented at the Material Research Society Meeting (2000)]) has been made on using Python scripts to achieve a more efficient level of interoperation. In this communication we present an alternative approach to merging current working packages without the necessity of major recoding and with only a relatively light wrapper interface. The scheme supports communication among the different components required for a given multi-scale calculation and access to the functionalities of those components.

  4. Multi-Modal Transportation System Simulation

    DOT National Transportation Integrated Search

    1971-01-01

    The present status of a laboratory being developed for real-time simulation of command and control functions in transportation systems is discussed. Details are given on the simulation models and on programming techniques used in defining and evaluat...

  5. Process and Learning Outcomes from Remotely-Operated, Simulated, and Hands-on Student Laboratories

    ERIC Educational Resources Information Center

    Corter, James E.; Esche, Sven K.; Chassapis, Constantin; Ma, Jing; Nickerson, Jeffrey V.

    2011-01-01

    A large-scale, multi-year, randomized study compared learning activities and outcomes for hands-on, remotely-operated, and simulation-based educational laboratories in an undergraduate engineering course. Students (N = 458) worked in small-group lab teams to perform two experiments involving stress on a cantilever beam. Each team conducted the…

  6. Quantifying uncertainty and computational complexity for pore-scale simulations

    NASA Astrophysics Data System (ADS)

    Chen, C.; Yuan, Z.; Wang, P.; Yang, X.; Zhenyan, L.

    2016-12-01

    Pore-scale simulation is an essential tool to understand the complex physical process in many environmental problems, from multi-phase flow in the subsurface to fuel cells. However, in practice, factors such as sample heterogeneity, data sparsity and, in general, our insufficient knowledge of the underlying process render many simulation parameters and hence the prediction results uncertain. Meanwhile, most pore-scale simulations (in particular, direct numerical simulation) incur high computational cost due to finely-resolved spatio-temporal scales, which further limits our data/sample collection. To address those challenges, we propose a novel framework based on the general polynomial chaos (gPC) and build a surrogate model representing the essential features of the underlying system. To be specific, we apply the novel framework to analyze the uncertainties of the system behavior based on a series of pore-scale numerical experiments, such as flow and reactive transport in 2D heterogeneous porous media and 3D packed beds. Compared with recent pore-scale uncertainty quantification studies using Monte Carlo techniques, our new framework requires fewer realizations and hence considerably reduces the overall computational cost, while maintaining the desired accuracy.
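As a toy illustration of the gPC idea (one uniform random input, a Legendre basis, and 3-point Gauss-Legendre quadrature; the study's actual construction is higher-dimensional and more general), a degree-2 surrogate can be built from only three model runs, after which evaluating the surrogate is essentially free:

```python
import math

# 3-point Gauss-Legendre quadrature on [-1, 1]
G_NODES = [-math.sqrt(3 / 5), 0.0, math.sqrt(3 / 5)]
G_WEIGHTS = [5 / 9, 8 / 9, 5 / 9]

def legendre(k, x):
    """Legendre polynomials P0..P2, orthogonal for a uniform input on [-1, 1]."""
    return (1.0, x, 1.5 * x * x - 0.5)[k]

def pce_surrogate(model, degree=2):
    """Project `model` onto the Legendre basis by quadrature; the expensive
    model is evaluated only at the 3 quadrature nodes."""
    runs = [model(x) for x in G_NODES]
    coeffs = []
    for k in range(degree + 1):
        ck = (2 * k + 1) / 2 * sum(w * r * legendre(k, x)
                                   for w, r, x in zip(G_WEIGHTS, runs, G_NODES))
        coeffs.append(ck)
    return lambda x: sum(c * legendre(k, x) for k, c in enumerate(coeffs))

# A quadratic "model" is reproduced exactly by the degree-2 surrogate:
surrogate = pce_surrogate(lambda x: x * x)
```

The contrast with Monte Carlo is the point: thousands of model runs are replaced by a handful at carefully chosen inputs.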

  7. Direct numerical simulations of magmatic differentiation at the microscopic scale

    NASA Astrophysics Data System (ADS)

    Sethian, J.; Suckale, J.; Elkins-Tanton, L. T.

    2010-12-01

    A key question in the context of magmatic differentiation and fractional crystallization is the ability of crystals to decouple from the ambient fluid and sink or rise. Field data indicate a complex spectrum of behavior ranging from rapid sedimentation to continued entrainment. Theoretical and laboratory studies paint a similarly rich picture. The goal of this study is to provide a detailed numerical assessment of the competing effects of sedimentation and entrainment at the scale of individual crystals. The decision to simulate magmatic differentiation at the grain scale comes at the price of not being able to simultaneously solve for the convective velocity field at the macroscopic scale, but has the crucial advantage of enabling us to fully resolve the dynamics of the system from first principles without requiring any simplifying assumptions. The numerical approach used in this study is a customized computational methodology developed specifically for simulations of solid-fluid coupling in geophysical systems. The algorithm relies on a two-step projection scheme: In the first step, we solve the multiple-phase Navier-Stokes or Stokes equation in both domains. In the second step, we project the velocity field in the solid domain onto a rigid-body motion by enforcing that the deformation tensor in the respective domain is zero. This procedure is also used to enforce the no-slip boundary condition on the solid-fluid interface. We have extensively validated and benchmarked the method. Our preliminary results indicate that, not unexpectedly, the competing effects of sedimentation and entrainment depend sensitively on the size distribution of the crystals, the aspect ratio of individual crystals and the vigor of the ambient flow field. We provide a detailed scaling analysis and quantify our results in terms of the relevant non-dimensional numbers.
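The second, rigid-body projection step can be sketched in 2D: the least-squares rigid motion of the solid domain is its mean translation plus a single angular velocity about the centroid. This is a hedged illustration of the idea only; the published scheme operates on full 3D discretized velocity fields inside the projection framework.

```python
def rigid_body_projection(points, velocities):
    """Project velocities sampled at solid-domain points (2D) onto the
    closest rigid-body motion: mean translation u plus angular velocity
    omega about the centroid, chosen by least squares."""
    n = len(points)
    cx = sum(p[0] for p in points) / n          # centroid
    cy = sum(p[1] for p in points) / n
    ux = sum(v[0] for v in velocities) / n      # mean translation
    uy = sum(v[1] for v in velocities) / n
    num = den = 0.0
    for (px, py), (vx, vy) in zip(points, velocities):
        rx, ry = px - cx, py - cy
        num += rx * (vy - uy) - ry * (vx - ux)  # r x (v - u), scalar in 2D
        den += rx * rx + ry * ry
    omega = num / den if den else 0.0
    # rigid-body velocity at each point: u + omega x r
    return [(ux - omega * (py - cy), uy + omega * (px - cx))
            for px, py in points]

# A field that is already a pure rotation (omega = 2 about the origin)
# is returned unchanged by the projection:
pts = [(1.0, 0.0), (0.0, 1.0), (-1.0, 0.0), (0.0, -1.0)]
vels = [(0.0, 2.0), (-2.0, 0.0), (0.0, -2.0), (2.0, 0.0)]
projected = rigid_body_projection(pts, vels)
```

Any non-rigid (deforming) component of the fluid solution inside the solid domain is removed by this projection, which is what enforces zero deformation there.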

  8. Dislocation dynamics simulations of plasticity at small scales

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Caizhi

    2010-01-01

    As metallic structures and devices are being created on a dimension comparable to the length scales of the underlying dislocation microstructures, their mechanical properties change drastically. Since such small structures are increasingly common in modern technologies, there is an emergent need to understand the critical roles of elasticity, plasticity, and fracture in small structures. Dislocation dynamics (DD) simulations, in which the dislocations are the simulated entities, offer a way to extend length scales beyond those of atomistic simulations, and the results from DD simulations can be directly compared with micromechanical tests. The primary objective of this research is to use 3-D DD simulations to study the plastic deformation of nano- and micro-scale materials and understand the correlation between dislocation motion, interactions and the mechanical response. Specifically, we aim to identify which critical events (i.e., dislocation multiplication, cross-slip, storage, nucleation, junction and dipole formation, pinning, etc.) determine the deformation response and how these change from bulk behavior as the system decreases in size, and to correlate and improve our current knowledge of bulk plasticity with the knowledge gained from direct observations of small-scale plasticity. Our simulation results on single-crystal micropillars and polycrystalline thin films match the experimental results well and capture the essential features of small-scale plasticity. Furthermore, several simple and accurate models have been developed following our simulation results and can reasonably predict the plastic behavior of small-scale materials.

  9. Assessment of the Mars Science Laboratory Entry, Descent, and Landing Simulation

    NASA Technical Reports Server (NTRS)

    Way, David W.; Davis, J. L.; Shidner, Jeremy D.

    2013-01-01

    On August 5, 2012, the Mars Science Laboratory rover, Curiosity, successfully landed inside Gale Crater. This landing was only the seventh successful landing and fourth rover to be delivered to Mars. Weighing nearly one metric ton, Curiosity is the largest and most complex rover ever sent to investigate another planet. Safely landing such a large payload required an innovative Entry, Descent, and Landing system, which included the first guided entry at Mars, the largest supersonic parachute ever flown at Mars, and a novel and untested Sky Crane landing system. A complete, end-to-end, six degree-of-freedom, multi-body computer simulation of the Mars Science Laboratory Entry, Descent, and Landing sequence was developed at the NASA Langley Research Center. In-flight data gathered during the successful landing is compared to pre-flight statistical distributions, predicted by the simulation. These comparisons provide insight into both the accuracy of the simulation and the overall performance of the vehicle.

  10. Preliminary assessment of the Mars Science Laboratory entry, descent, and landing simulation

    NASA Astrophysics Data System (ADS)

    Way, David W.

    On August 5, 2012, the Mars Science Laboratory rover, Curiosity, successfully landed inside Gale Crater. This landing was the seventh successful landing and fourth rover to be delivered to Mars. Weighing nearly one metric ton, Curiosity is the largest and most complex rover ever sent to investigate another planet. Safely landing such a large payload required an innovative Entry, Descent, and Landing system, which included the first guided entry at Mars, the largest supersonic parachute ever flown at Mars, and the novel Sky Crane landing system. A complete, end-to-end, six degree-of-freedom, multi-body computer simulation of the Mars Science Laboratory Entry, Descent, and Landing sequence was developed at the NASA Langley Research Center. In-flight data gathered during the successful landing are compared to pre-flight statistical distributions predicted by the simulation. These comparisons provide insight into both the accuracy of the simulation and the overall performance of the Entry, Descent, and Landing system.

  11. Full-scale laboratory validation of a wireless MEMS-based technology for damage assessment of concrete structures

    NASA Astrophysics Data System (ADS)

    Trapani, Davide; Zonta, Daniele; Molinari, Marco; Amditis, Angelos; Bimpas, Matthaios; Bertsch, Nicolas; Spiering, Vincent; Santana, Juan; Sterken, Tom; Torfs, Tom; Bairaktaris, Dimitris; Bairaktaris, Manos; Camarinopulos, Stefanos; Frondistou-Yannas, Mata; Ulieru, Dumitru

    2012-04-01

    This paper illustrates an experimental campaign conducted under laboratory conditions on a full-scale reinforced concrete three-dimensional frame instrumented with wireless sensors developed within the Memscon project. In particular, it describes the assumptions on which the experimental campaign was based, the design of the structure, the laboratory setup, and the results of the tests. The aim of the campaign was to validate the performance of the Memscon sensing systems, consisting of wireless accelerometers and strain sensors, on a real concrete structure during construction and under an actual earthquake. Another aspect of interest was to assess the effectiveness of the full damage-recognition procedure based on the data recorded by the sensors, and the reliability of the Decision Support System (DSS) developed to provide stakeholders with recommendations for building rehabilitation and estimates of its cost. To these ends, a Eurocode 8 spectrum-compatible accelerogram with increasing amplitude was applied at the top of an instrumented concrete frame built in the laboratory. Memscon sensors were directly compared with wired instruments, based on devices available on the market and taken as references, during both construction and seismic simulation.

  12. Simulation of water-energy fluxes through small-scale reservoir systems under limited data availability

    NASA Astrophysics Data System (ADS)

    Papoulakos, Konstantinos; Pollakis, Giorgos; Moustakis, Yiannis; Markopoulos, Apostolis; Iliopoulou, Theano; Dimitriadis, Panayiotis; Koutsoyiannis, Demetris; Efstratiadis, Andreas

    2017-04-01

    Small islands are regarded as promising areas for developing hybrid water-energy systems that combine multiple sources of renewable energy with pumped-storage facilities. An essential element of such systems is the water storage component (reservoir), which implements both flow and energy regulation. Evidently, representing the overall water-energy management problem requires simulating the operation of the reservoir system, which in turn requires a faithful estimation of water inflows and of water and energy demands. Yet in small-scale reservoir systems this task is far from straightforward, since both the availability and the accuracy of the associated information are generally very poor. Indeed, in contrast to large-scale reservoir systems, for which it is quite easy to find systematic and reliable hydrological data, in small systems such data may be sparse or even entirely missing. The stochastic approach is then the only means to account for input-data uncertainties within the combined water-energy management problem. Using as an example the Livadi reservoir, the pumped-storage component of the small Aegean island of Astypalaia, Greece, we provide a simulation framework comprising: (a) a stochastic model for generating synthetic rainfall and temperature time series; (b) a stochastic rainfall-runoff model, whose parameters cannot be inferred through calibration and are thus represented as correlated random variables; (c) a stochastic model for estimating water supply and irrigation demands, based on simulated temperature and soil moisture; and (d) a daily operation model of the reservoir system, providing stochastic forecasts of water and energy outflows. Acknowledgement: This research was conducted within the frame of the undergraduate course "Stochastic Methods in Water Resources" of the National Technical University of Athens (NTUA). The School of Civil Engineering of NTUA provided moral support for the participation of the students.
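    A minimal sketch of a daily reservoir operation model of the kind in step (d), fed by a synthetic inflow generator as in steps (a)-(b). The lognormal AR(1) generator and all parameter values (capacity, demand, inflow statistics) are illustrative assumptions, not values from the Livadi system.

```python
import math
import random

def synthetic_inflows(n_days, mu=0.2, sigma=0.8, rho=0.7, seed=1):
    """Synthetic daily inflows (hm^3) from a lognormal AR(1) process."""
    rng = random.Random(seed)
    z, flows = 0.0, []
    for _ in range(n_days):
        z = rho * z + math.sqrt(1 - rho**2) * rng.gauss(0, 1)
        flows.append(math.exp(mu + sigma * z) * 0.01)  # scaled to small-island volumes
    return flows

def simulate_reservoir(inflows, capacity=1.0, demand=0.015, s0=0.5):
    """Daily water balance; returns the storage trace and count of deficit days."""
    storage, deficits, trace = s0, 0, []
    for q in inflows:
        storage = min(storage + q, capacity)  # inflow, spill above capacity
        supplied = min(demand, storage)       # release limited by storage
        storage -= supplied
        if supplied < demand:
            deficits += 1
        trace.append(storage)
    return trace, deficits

trace, deficits = simulate_reservoir(synthetic_inflows(365))
print(f"deficit days in one synthetic year: {deficits}")
```

    Repeating the simulation over many synthetic inflow series turns the deficit count into a probability distribution, which is the point of the stochastic approach.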

  13. Wellbore Completion Systems Containment Breach Solution Experiments at a Large Scale Underground Research Laboratory : Sealant placement & scale-up from Lab to Field

    NASA Astrophysics Data System (ADS)

    Goodman, H.

    2017-12-01

    This investigation seeks to develop sealant technology that can restore containment to completed wells that suffer CO2 gas leakages currently untreatable using conventional technologies. Experimentation is performed at the Mont Terri Underground Research Laboratory (MT-URL) located in NW Switzerland. The laboratory affords investigators an intermediate-scale test site that bridges the gap between the laboratory bench and full field-scale conditions. The project focus is the development of CO2 leakage remediation capability using sealant technology. The experimental concept includes design and installation of a field-scale completion package designed to mimic the heating-cooling conditions of well systems, which may result in the development of micro-annulus detachments between the casing-cement-formation boundaries (Figure 1). Of particular interest is testing novel sealants that can be injected into relatively narrow micro-annulus flow paths of less than 120 microns aperture. Per a special report on CO2 storage submitted to the IPCC[1], active injection wells, along with inactive wells that have been abandoned, are identified as one of the most probable sources of leakage pathways for CO2 escape to the surface. Pressure leakage common to injection well and completions architecture often arises from tensile cracking under temperature cycles, micro-annulus development from casing contraction (differential casing-to-cement-sheath movement), and cement sheath channel development. This discussion summarizes the experimental capability and sealant testing results. The experiment concludes with overcoring of the entire mock-completion test site to assess sealant performance in 2018. [1] IPCC Special Report on Carbon Dioxide Capture and Storage (September 2005), section 5.7.2 Processes and pathways for release of CO2 from geological storage sites, page 244

  14. Real-Time Rocket/Vehicle System Integrated Health Management Laboratory For Development and Testing of Health Monitoring/Management Systems

    NASA Technical Reports Server (NTRS)

    Aguilar, R.

    2006-01-01

    Pratt & Whitney Rocketdyne has developed a real-time engine/vehicle system integrated health management laboratory, or testbed, for developing and testing health management system concepts. This laboratory simulates components of an integrated system such as the rocket engine, rocket engine controller, vehicle or test controller, as well as a health management computer on separate general purpose computers. These general purpose computers can be replaced with more realistic components such as actual electronic controllers and valve actuators for hardware-in-the-loop simulation. Various engine configurations and propellant combinations are available. Fault or failure insertion capability on-the-fly using direct memory insertion from a user console is used to test system detection and response. The laboratory is currently capable of simulating the flow-path of a single rocket engine but work is underway to include structural and multiengine simulation capability as well as a dedicated data acquisition system. The ultimate goal is to simulate as accurately and realistically as possible the environment in which the health management system will operate including noise, dynamic response of the engine/engine controller, sensor time delays, and asynchronous operation of the various components. The rationale for the laboratory is also discussed including limited alternatives for demonstrating the effectiveness and safety of a flight system.

  15. Enabling parallel simulation of large-scale HPC network systems

    DOE PAGES

    Mubarak, Misbah; Carothers, Christopher D.; Ross, Robert B.; ...

    2016-04-07

    Here, with the increasing complexity of today’s high-performance computing (HPC) architectures, simulation has become an indispensable tool for exploring the design space of HPC systems—in particular, networks. In order to make effective design decisions, simulations of these systems must possess the following properties: (1) have high accuracy and fidelity, (2) produce results in a timely manner, and (3) be able to analyze a broad range of network workloads. Most state-of-the-art HPC network simulation frameworks, however, are constrained in one or more of these areas. In this work, we present a simulation framework for modeling two important classes of networks used in today’s IBM and Cray supercomputers: torus and dragonfly networks. We use the Co-Design of Multi-layer Exascale Storage Architecture (CODES) simulation framework to simulate these network topologies at a flit-level detail using the Rensselaer Optimistic Simulation System (ROSS) for parallel discrete-event simulation. Our simulation framework meets all the requirements of a practical network simulation and can assist network designers in design space exploration. First, it uses validated and detailed flit-level network models to provide an accurate and high-fidelity network simulation. Second, instead of relying on serial time-stepped or traditional conservative discrete-event simulations that limit simulation scalability and efficiency, we use the optimistic event-scheduling capability of ROSS to achieve efficient and scalable HPC network simulations on today’s high-performance cluster systems. Third, our models give network designers a choice in simulating a broad range of network workloads, including HPC application workloads using detailed network traces, an ability that is rarely offered in parallel with high-fidelity network simulations.

  17. Visual Data-Analytics of Large-Scale Parallel Discrete-Event Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ross, Caitlin; Carothers, Christopher D.; Mubarak, Misbah

    Parallel discrete-event simulation (PDES) is an important tool in the codesign of extreme-scale systems because PDES provides a cost-effective way to evaluate designs of high-performance computing systems. Optimistic synchronization algorithms for PDES, such as Time Warp, allow events to be processed without global synchronization among the processing elements. A rollback mechanism is provided when events are processed out of timestamp order. Although optimistic synchronization protocols enable the scalability of large-scale PDES, the performance of the simulations must be tuned to reduce the number of rollbacks and provide an improved simulation runtime. To enable efficient large-scale optimistic simulations, one has to gain insight into the factors that affect the rollback behavior and simulation performance. We developed a tool for ROSS model developers that gives them detailed metrics on the performance of their large-scale optimistic simulations at varying levels of simulation granularity. Model developers can use this information for parameter tuning of optimistic simulations in order to achieve better runtime and fewer rollbacks. In this work, we instrument the ROSS optimistic PDES framework to gather detailed statistics about the simulation engine. We have also developed an interactive visualization interface that uses the data collected by the ROSS instrumentation to understand the underlying behavior of the simulation engine. The interface connects real time to virtual time in the simulation and provides the ability to view simulation data at different granularities. We demonstrate the usefulness of our framework by performing a visual analysis of the dragonfly network topology model provided by the CODES simulation framework built on top of ROSS. The instrumentation needs to minimize overhead in order to accurately collect data about the simulation performance. To ensure that the instrumentation does not introduce unnecessary overhead, we perform
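    The rollback mechanism described above can be sketched with a toy logical process that saves state before each event and restores it when a straggler (an event arriving in its past) shows up. Real Time Warp engines such as ROSS also manage anti-messages and global virtual time, which this sketch omits.

```python
class LogicalProcess:
    """Toy Time Warp LP: state is saved before each event so a straggler
    can roll the LP back; undone events are then re-executed in order."""

    def __init__(self):
        self.clock = 0.0
        self.state = 0          # toy state: count of processed events
        self.processed = []     # history of (timestamp, state_before_event)
        self.rollbacks = 0

    def handle(self, ts):
        redo = []
        if ts < self.clock:     # straggler: roll back past it
            self.rollbacks += 1
            while self.processed and self.processed[-1][0] > ts:
                undone_ts, self.state = self.processed.pop()  # restore state
                redo.append(undone_ts)                        # must re-execute
            self.clock = self.processed[-1][0] if self.processed else 0.0
        self.processed.append((ts, self.state))  # save state, then process
        self.state += 1
        self.clock = ts
        for undone in sorted(redo):              # re-execute in timestamp order
            self.handle(undone)

lp = LogicalProcess()
for t in [1.0, 2.0, 4.0, 3.0, 5.0]:  # the 3.0 event arrives out of order
    lp.handle(t)
print(lp.rollbacks, lp.state)        # → 1 5
```

    After the rollback and re-execution, the event history is identical to an in-order run, which is exactly the correctness guarantee optimistic synchronization provides.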

  18. System reliability of randomly vibrating structures: Computational modeling and laboratory testing

    NASA Astrophysics Data System (ADS)

    Sundar, V. S.; Ammanagi, S.; Manohar, C. S.

    2015-09-01

    The problem of determination of system reliability of randomly vibrating structures arises in many application areas of engineering. We discuss in this paper approaches based on Monte Carlo simulations and laboratory testing to tackle problems of time variant system reliability estimation. The strategy we adopt is based on the application of Girsanov's transformation to the governing stochastic differential equations which enables estimation of probability of failure with significantly reduced number of samples than what is needed in a direct simulation study. Notably, we show that the ideas from Girsanov's transformation based Monte Carlo simulations can be extended to conduct laboratory testing to assess system reliability of engineering structures with reduced number of samples and hence with reduced testing times. Illustrative examples include computational studies on a 10-degree of freedom nonlinear system model and laboratory/computational investigations on road load response of an automotive system tested on a four-post test rig.
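    The core idea of the Girsanov-based approach, sampling under a transformed measure that makes failures frequent and correcting each hit with a likelihood ratio, can be illustrated in a static setting (a Gaussian tail probability rather than a stochastic differential equation). The threshold and shift below are illustrative.

```python
import math
import random

def failure_prob_is(threshold=4.0, shift=4.0, n=20_000, seed=7):
    """Estimate P(Z > threshold) for Z ~ N(0,1) by sampling under a measure
    shifted toward the failure region and reweighting by the likelihood
    ratio -- a discrete analogue of a Girsanov change of measure."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        z = rng.gauss(shift, 1.0)  # biased sample: failures are now common
        if z > threshold:
            # likelihood ratio phi(z) / phi(z - shift)
            total += math.exp(-shift * z + 0.5 * shift**2)
    return total / n

est = failure_prob_is()
exact = 0.5 * math.erfc(4.0 / math.sqrt(2))  # exact Gaussian tail, ~3.2e-5
print(f"IS estimate: {est:.3e}, exact: {exact:.3e}")
```

    A direct Monte Carlo run of the same size would see roughly zero failures at this probability level, which is why the change of measure reduces the required number of samples so dramatically.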

  19. HPC simulations of grain-scale spallation to improve thermal spallation drilling

    NASA Astrophysics Data System (ADS)

    Walsh, S. D.; Lomov, I.; Wideman, T. W.; Potter, J.

    2012-12-01

    Thermal spallation drilling and related hard-rock hole opening techniques are transformative technologies with the potential to dramatically reduce the costs associated with EGS well drilling and improve the productivity of new and existing wells. In contrast to conventional drilling methods that employ mechanical means to penetrate rock, thermal spallation methods fragment rock into small pieces ("spalls") without contact via the rapid transmission of heat to the rock surface. State-of-the-art constitutive models of thermal spallation employ Weibull statistical failure theory to represent the relationship between rock heterogeneity and its propensity to produce spalls when heat is applied to the rock surface. These models have been successfully used to predict such factors as penetration rate, spall-size distribution and borehole radius from drilling jet velocity and applied heat flux. A properly calibrated Weibull model would permit design optimization of thermal spallation drilling under geothermal field conditions. However, although useful for predicting system response in a given context, Weibull models are by their nature empirically derived. In the past, the parameters used in these models were carefully determined from laboratory tests, and thus model applicability was limited by experimental scope. This becomes problematic, for example, if simulating spall production at depths relevant for geothermal energy production, or modeling thermal spallation drilling in new rock types. Nevertheless, with sufficient computational resources, Weibull models could be validated in the absence of experimental data by explicit small-scale simulations that fully resolve rock grains. 
This presentation will discuss how high-fidelity simulations can be used to inform Weibull models of thermal spallation, and what these simulations reveal about the processes driving spallation at the grain-scale - in particular, the role that inter-grain boundaries and micro-pores play in the
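    A minimal sketch of the Weibull weakest-link failure probability that underlies such models. The modulus, characteristic strength, and stress values below are illustrative assumptions, not calibrated parameters for any rock type.

```python
import math

def weibull_failure_prob(stress, m=12.0, sigma0=150.0, v_ratio=1.0):
    """Weibull weakest-link failure probability for a rock volume element.
    m: Weibull modulus (lower = more heterogeneous rock),
    sigma0: characteristic strength (MPa), v_ratio: V / V_reference."""
    return 1.0 - math.exp(-v_ratio * (stress / sigma0) ** m)

# Failure probability rises sharply with thermally induced stress (MPa).
for s in (100.0, 150.0, 200.0):
    print(s, round(weibull_failure_prob(s), 3))
```

    Calibrating m and sigma0 is exactly the step that has historically required laboratory tests; grain-resolving simulations offer a way to fit these parameters where experiments are impractical.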

  20. WeaVR: a self-contained and wearable immersive virtual environment simulation system.

    PubMed

    Hodgson, Eric; Bachmann, Eric R; Vincent, David; Zmuda, Michael; Waller, David; Calusdian, James

    2015-03-01

    We describe WeaVR, a computer simulation system that takes virtual reality technology beyond specialized laboratories and research sites and makes it available in any open space, such as a gymnasium or a public park. Novel hardware and software systems enable HMD-based immersive virtual reality simulations to be conducted in any arbitrary location, with no external infrastructure and little-to-no setup or site preparation. The ability of the WeaVR system to provide realistic motion-tracked navigation for users, to improve the study of large-scale navigation, and to generate usable behavioral data is shown in three demonstrations. First, participants navigated through a full-scale virtual grocery store while physically situated in an open grass field. Trajectory data are presented for both normal tracking and for tracking during the use of redirected walking that constrained users to a predefined area. Second, users followed a straight path within a virtual world for distances of up to 2 km while walking naturally and being redirected to stay within the field, demonstrating the ability of the system to study large-scale navigation by simulating virtual worlds that are potentially unlimited in extent. Finally, the portability and pedagogical implications of this system were demonstrated by taking it to a regional high school for live use by a computer science class on their own school campus.

  1. Baccalaureate nursing students' perspectives of peer tutoring in simulation laboratory, a Q methodology study.

    PubMed

    Li, Ting; Petrini, Marcia A; Stone, Teresa E

    2018-02-01

    The study aim was to identify the perspectives of baccalaureate nursing students toward peer tutoring in the simulation laboratory. Insight into the nursing students' experiences and baseline data on their perception of peer tutoring will help improve nursing education. Q methodology was applied to explore the students' perspectives of peer tutoring in the simulation laboratory. A convenience P-sample of 40 baccalaureate nursing students was used. Fifty-eight selected Q statements from each participant were classified into the shape of a normal distribution using an 11-point bipolar scale ranging from -5 to +5. The PQ Method software was used to analyze the collected data. Three discrete factors emerged: Factor I ("Facilitate or empower" knowledge acquisition), Factor II ("Safety Net" Support environment), and Factor III ("Mentoring" learn how to learn). The findings of this study indicate that peer tutoring is an effective supplementary strategy to promote baccalaureate students' knowledge acquisition, establish a supportive safety net, and facilitate their ability to learn in the simulation laboratory. Copyright © 2017 Elsevier Ltd. All rights reserved.
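    The forced quasi-normal sorting described above can be sketched as a validity check on a single Q-sort. The column counts below are a hypothetical symmetric allocation of the 58 statements across the 11 score columns; the paper does not report its exact distribution.

```python
from collections import Counter

# Hypothetical forced quasi-normal allocation for 58 statements on a
# -5..+5 scale (assumed for illustration; column counts sum to 58).
FORCED = dict(zip(range(-5, 6), [3, 4, 5, 6, 7, 8, 7, 6, 5, 4, 3]))

def is_valid_q_sort(sort):
    """A Q-sort is valid if its scores fill the forced distribution exactly."""
    return len(sort) == 58 and Counter(sort.values()) == Counter(FORCED)

# Build a trivially valid sort: statements s1..s58 filled column by column.
sort, sid = {}, 1
for score, count in FORCED.items():
    for _ in range(count):
        sort[f"s{sid}"] = score
        sid += 1
print(is_valid_q_sort(sort))  # → True
```

    Factor extraction then proceeds by correlating valid sorts across participants, which is the step PQ Method automates.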

  2. Simulation Framework for Intelligent Transportation Systems

    DOT National Transportation Integrated Search

    1996-10-01

    A simulation framework has been developed for a large-scale, comprehensive, scaleable simulation of an Intelligent Transportation System. The simulator is designed for running on parallel computers and distributed (networked) computer systems, but ca...

  3. Efficacy of a novel educational curriculum using a simulation laboratory on resident performance of hysteroscopic sterilization.

    PubMed

    Chudnoff, Scott G; Liu, Connie S; Levie, Mark D; Bernstein, Peter; Banks, Erika H

    2010-09-01

    To assess whether a novel educational curriculum using a simulation teaching laboratory improves resident knowledge, comfort with, and surgical performance of hysteroscopic sterilization. A prospective educational pretest/posttest study. The Montefiore Institute of Minimally Invasive Surgery Laboratory. PATIENT(S)/SUBJECT(S): Thirty-four OB/GYN residents in an academic medical center. Hysteroscopic sterilization simulation laboratory and a brief didactic lecture. Differences in scores on validated skill assessment tools: a task-specific checklist, the Global Rating Scale (GRS), a pass/fail assessment, and a multiple-choice examination to evaluate knowledge and attitude. In the entire cohort, improvements were observed on all evaluation tools after the simulation laboratory: 31 percentage points (SD +/-11.5, 95% confidence interval [CI] 27.3-35.3) higher score on the written evaluation; 63 percentage points (SD +/-15.7, 95% CI 57.8-68.8) higher score on the task-specific checklist; and 54 percentage points (SD +/-13.6, 95% CI 48.8-58.3) higher score on the GRS. Higher PGY status was correlated with better pretest performance, but the difference was not statistically significant in posttest scores. Residents reported improved comfort performing the procedure after the laboratory. Simulation laboratory teaching significantly improved resident knowledge, comfort level, and technical skill performance of hysteroscopic sterilization. Copyright (c) 2010 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  4. Communication Systems Simulation Laboratory (CSSL): Simulation Planning Guide

    NASA Technical Reports Server (NTRS)

    Schlesinger, Adam

    2012-01-01

    The simulation process, milestones and inputs are unknowns to first-time users of the CSSL. The Simulation Planning Guide aids in establishing expectations for both NASA and non-NASA facility customers. The potential audience for this guide includes both internal and commercial spaceflight hardware/software developers. It is intended to assist their engineering personnel in simulation planning and execution. Material covered includes a roadmap of the simulation process, roles and responsibilities of facility and user, major milestones, facility capabilities, and inputs required by the facility. Samples of deliverables, facility interfaces, and inputs necessary to define scope, cost, and schedule are included as an appendix to the guide.

  5. Development of a large-scale isolation chamber system for the safe and humane care of medium-sized laboratory animals harboring infectious diseases*

    PubMed Central

    Pan, Xin; Qi, Jian-cheng; Long, Ming; Liang, Hao; Chen, Xiao; Li, Han; Li, Guang-bo; Zheng, Hao

    2010-01-01

    The close phylogenetic relationship between humans and non-human primates makes non-human primates an irreplaceable model for the study of human infectious diseases. In this study, we describe the development of a large-scale automatic multi-functional isolation chamber for use with medium-sized laboratory animals carrying infectious diseases. The isolation chamber, including the transfer chain, disinfection chain, negative air pressure isolation system, animal welfare system, and the automated system, is designed to meet all biological safety standards. To create an internal chamber environment that is completely isolated from the exterior, variable frequency drive blowers are used in the air-intake and air-exhaust system, precisely controlling the filtered air flow and providing an air-barrier protection. A double door transfer port is used to transfer material between the interior of the isolation chamber and the outside. A peracetic acid sterilizer and its associated pipeline allow for complete disinfection of the isolation chamber. All of the isolation chamber parameters can be automatically controlled by a programmable computerized menu, allowing for work with different animals in different-sized cages depending on the research project. The large-scale multi-functional isolation chamber provides a useful and safe system for working with infectious medium-sized laboratory animals in high-level bio-safety laboratories. PMID:20872984

  6. Primary Exhaust Cooler at the Propulsion Systems Laboratory

    NASA Image and Video Library

    1952-09-21

    One of the two primary coolers at the Propulsion Systems Laboratory at the National Advisory Committee for Aeronautics (NACA) Lewis Flight Propulsion Laboratory. Engines could be run in simulated altitude conditions inside the facility’s two 14-foot-diameter and 24-foot-long test chambers. The Propulsion Systems Laboratory was the nation’s only facility that could run large full-size engine systems in controlled altitude conditions. At the time of this photograph, construction of the facility had recently been completed. Although not a wind tunnel, the Propulsion Systems Laboratory generated high-speed airflow through the interior of the engine. The air flow was pushed through the system by large compressors, adjusted by heating or refrigerating equipment, and de-moisturized by air dryers. The exhaust system served two roles: reducing the density of the air in the test chambers to simulate high altitudes and removing hot gases exhausted by the engines being tested. It was necessary to reduce the temperature of the extremely hot engine exhaust before the air reached the exhauster equipment. As the air flow exited through exhaust section of the test chamber, it entered into the giant primary cooler seen in this photograph. Narrow fins or vanes inside the cooler were filled with water. As the air flow passed between the vanes, its heat was transferred to the cooling water. The cooling water was cycled out of the system, carrying with it much of the exhaust heat.
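    The heat exchange described above amounts to a sensible-heat balance: heat removed from the exhaust stream must be absorbed by the cooling water. The figures below are purely illustrative, not the facility's actual operating conditions.

```python
def heat_removed_kw(m_dot_air, cp_air, t_in, t_out):
    """Sensible heat removed from the exhaust stream (kW).
    m_dot_air in kg/s, cp_air in kJ/(kg*K), temperatures in deg C."""
    return m_dot_air * cp_air * (t_in - t_out)

def cooling_water_flow(q_kw, cp_water=4.186, dt_water=30.0):
    """Water mass flow (kg/s) needed to absorb q_kw for a given water temperature rise."""
    return q_kw / (cp_water * dt_water)

# Illustrative numbers only: cool 100 kg/s of exhaust from 700 to 100 deg C.
q = heat_removed_kw(m_dot_air=100.0, cp_air=1.005, t_in=700.0, t_out=100.0)
print(f"{q:.0f} kW removed, {cooling_water_flow(q):.1f} kg/s cooling water")
```

    The calculation makes clear why the vanes carried a continuously cycled water supply: tens of megawatts of exhaust heat imply hundreds of kilograms of water per second.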

  7. Computer Simulations Improve University Instructional Laboratories

    PubMed Central

    2004-01-01

    Laboratory classes are commonplace and essential in biology departments but can sometimes be cumbersome, unreliable, and a drain on time and resources. As university intakes increase, pressure on budgets and staff time can often lead to reduction in practical class provision. Frequently, the ability to use laboratory equipment, mix solutions, and manipulate test animals are essential learning outcomes, and “wet” laboratory classes are thus appropriate. In others, however, interpretation and manipulation of the data are the primary learning outcomes, and here, computer-based simulations can provide a cheaper, easier, and less time- and labor-intensive alternative. We report the evaluation of two computer-based simulations of practical exercises: the first in chromosome analysis, the second in bioinformatics. Simulations can provide significant time savings to students (by a factor of four in our first case study) without affecting learning, as measured by performance in assessment. Moreover, under certain circumstances, performance can be improved by the use of simulations (by 7% in our second case study). We concluded that the introduction of these simulations can significantly enhance student learning where consideration of the learning outcomes indicates that it might be appropriate. In addition, they can offer significant benefits to teaching staff. PMID:15592599

  8. Performance of a pilot-scale constructed wetland system for treating simulated ash basin water.

    PubMed

    Dorman, Lane; Castle, James W; Rodgers, John H

    2009-05-01

    A pilot-scale constructed wetland treatment system (CWTS) was designed and built to decrease the concentration and toxicity of constituents of concern in ash basin water from coal-burning power plants. The CWTS was designed to promote the following treatment processes for metals and metalloids: precipitation as non-bioavailable sulfides, co-precipitation with iron oxyhydroxides, and adsorption onto iron oxides. Concentrations of Zn, Cr, Hg, As, and Se in simulated ash basin water were reduced by the CWTS to less than USEPA-recommended water quality criteria. The removal efficiency (defined as the percent concentration decrease from influent to effluent) was dependent on the influent concentration of the constituent, while the extent of removal (defined as the concentration of a constituent of concern in the CWTS effluent) was independent of the influent concentration. Results from toxicity experiments illustrated that the CWTS eliminated influent toxicity with regard to survival and reduced influent toxicity with regard to reproduction. Reduction in potential for scale formation and biofouling was achieved through treatment of the simulated ash basin water by the pilot-scale CWTS.
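    The paper's definition of removal efficiency (percent concentration decrease from influent to effluent) reduces to a one-line calculation. The concentrations below are hypothetical; the abstract reports only that effluent values fell below USEPA criteria.

```python
def removal_efficiency(influent, effluent):
    """Percent concentration decrease from influent to effluent."""
    return 100.0 * (influent - effluent) / influent

# Hypothetical influent/effluent concentrations (mg/L) for illustration.
for metal, c_in, c_out in [("Zn", 0.50, 0.02), ("As", 0.20, 0.008)]:
    print(metal, f"{removal_efficiency(c_in, c_out):.0f}%")
```

    The distinction the paper draws is that this percentage tracks the influent level, whereas the extent of removal (the absolute effluent concentration) does not.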

  9. A virtual laboratory for the simulation of sustainable energy systems in a low energy building: A case study

    NASA Astrophysics Data System (ADS)

    Breen, M.; O'Donovan, A.; Murphy, M. D.; Delaney, F.; Hill, M.; Sullivan, P. D. O.

    2016-03-01

    The aim of this paper was to develop a virtual laboratory simulation platform of the National Building Retrofit Test-bed at the Cork Institute of Technology, Ireland. The building in question is a low-energy retrofit which is provided with electricity by renewable systems including photovoltaics and wind. It can be thought of as a living laboratory, as a number of internal and external building factors are recorded at regular intervals during human occupation. The analysis carried out in this paper demonstrated that, for the period from April to September 2015, the electricity provided by the renewable systems did not consistently match the building’s electricity requirements due to differing load profiles. It was concluded that the use of load shifting techniques may help to increase the percentage of renewable energy utilisation.
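    The load-profile mismatch noted above can be quantified with a simple hour-by-hour utilisation metric, assuming no storage. The demand and generation profiles below are hypothetical, chosen only to show an evening-peaking load against a midday-peaking PV output.

```python
def renewable_utilisation(load, generation):
    """Share of building load met directly by on-site renewables,
    computed interval by interval (no storage assumed)."""
    met = sum(min(l, g) for l, g in zip(load, generation))
    return met / sum(load)

# Hypothetical hourly profiles (kW): demand peaks in the evening,
# PV output peaks at midday -- the mismatch the paper observes.
load       = [2, 2, 2, 3, 4, 5, 6, 5, 4, 6, 8, 7]
generation = [0, 0, 1, 3, 6, 8, 8, 6, 3, 1, 0, 0]
print(f"{renewable_utilisation(load, generation):.0%} of load met directly")
```

    Load shifting raises this figure by moving flexible demand into the midday generation peak, which is the technique the paper concludes may help.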

  10. Using Discrete Event Simulation for Programming Model Exploration at Extreme-Scale: Macroscale Components for the Structural Simulation Toolkit (SST).

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilke, Jeremiah J; Kenny, Joseph P.

    2015-02-01

    Discrete event simulation provides a powerful mechanism for designing and testing new extreme-scale programming models for high-performance computing. Rather than debug, run, and wait for results on an actual system, designs can first iterate through a simulator. This is particularly useful when test beds cannot be used, i.e. to explore hardware or scales that do not yet exist or are inaccessible. Here we detail the macroscale components of the structural simulation toolkit (SST). Instead of depending on trace replay or state machines, the simulator is architected to execute real code on real software stacks. Our particular user-space threading framework allows massive scales to be simulated even on small clusters. The link between the discrete event core and the threading framework allows interesting performance metrics like call graphs to be collected from a simulated run. Performance analysis via simulation can thus become an important phase in extreme-scale programming model and runtime system design via the SST macroscale components.
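    The event-driven core that SST is built around can be illustrated in a few lines; a minimal discrete event engine sketch (illustrative only, not SST's actual API):

```python
import heapq

class Simulator:
    """Minimal discrete event core: callbacks fire in virtual-time order,
    decoupling simulated time from wall-clock time."""
    def __init__(self):
        self.now = 0.0
        self._queue = []
        self._seq = 0  # tie-breaker for events scheduled at the same time

    def schedule(self, delay, action):
        heapq.heappush(self._queue, (self.now + delay, self._seq, action))
        self._seq += 1

    def run(self):
        while self._queue:
            self.now, _, action = heapq.heappop(self._queue)
            action()

sim = Simulator()
log = []
sim.schedule(5.0, lambda: log.append(("recv", sim.now)))
sim.schedule(1.0, lambda: log.append(("send", sim.now)))
sim.run()
print(log)  # -> [('send', 1.0), ('recv', 5.0)]
```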

  11. Preliminary Assessment of the Mars Science Laboratory Entry, Descent, and Landing Simulation

    NASA Technical Reports Server (NTRS)

    Way, David W.

    2013-01-01

    On August 5, 2012, the Mars Science Laboratory rover, Curiosity, successfully landed inside Gale Crater. This landing was only the seventh successful landing and fourth rover to be delivered to Mars. Weighing nearly one metric ton, Curiosity is the largest and most complex rover ever sent to investigate another planet. Safely landing such a large payload required an innovative Entry, Descent, and Landing system, which included the first guided entry at Mars, the largest supersonic parachute ever flown at Mars, and a novel and untested Sky Crane landing system. A complete, end-to-end, six degree-of-freedom, multibody computer simulation of the Mars Science Laboratory Entry, Descent, and Landing sequence was developed at the NASA Langley Research Center. In-flight data gathered during the successful landing is compared to pre-flight statistical distributions, predicted by the simulation. These comparisons provide insight into both the accuracy of the simulation and the overall performance of the vehicle.

  12. Scaling of Sediment Dynamics in a Reach-Scale Laboratory Model of a Sand-Bed Stream with Riparian Vegetation

    NASA Astrophysics Data System (ADS)

    Gorrick, S.; Rodriguez, J. F.

    2011-12-01

    A movable bed physical model was designed in a laboratory flume to simulate both bed and suspended load transport in a mildly sinuous sand-bed stream. Model simulations investigated the impact of different vegetation arrangements along the outer bank to evaluate rehabilitation options. Preserving similitude in the 1:16 laboratory model was very important. In this presentation the scaling approach, as well as the successes and challenges of the strategy, are outlined. Firstly, a near-bankfull flow event was chosen for laboratory simulation. In nature, bankfull events at the field site deposit new in-channel features but cause only small amounts of bank erosion. Thus the fixed banks in the model were not a drastic simplification. Next, and as in other studies, the flow velocity and turbulence measurements were collected in separate fixed bed experiments. The scaling of flow in these experiments was maintained simply by matching the Froude number and roughness levels. The subsequent movable bed experiments were then conducted under similar hydrodynamic conditions. In nature, the sand-bed stream is fairly typical; in high flows most sediment transport occurs in suspension and migrating dunes cover the bed. To achieve similar dynamics in the model, equivalent values of the dimensionless bed shear stress and the particle Reynolds number were important. Close values of the two dimensionless numbers were achieved with lightweight sediments (R=0.3), including coal and apricot pips, with a particle size distribution similar to that of the field site. Overall, the movable bed experiments were able to replicate the dominant sediment dynamics present in the stream during a bankfull flow and yielded relevant information for the analysis of the effects of riparian vegetation. There was a potential conflict in the strategy, in that grain roughness was exaggerated with respect to nature. The advantage of this strategy is that although grain roughness is exaggerated, the similarity of
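    Matching the Froude number at a 1:16 length ratio fixes the remaining kinematic scale ratios; a small sketch of the standard undistorted-model relations (illustrative, not the authors' exact computation):

```python
import math

def froude_scale_ratios(length_ratio):
    """Undistorted Froude similitude: matching Fr = U / sqrt(g*h) between
    model and prototype fixes how velocity, time and discharge scale."""
    return {
        "velocity": math.sqrt(length_ratio),  # U_r = L_r^0.5
        "time": math.sqrt(length_ratio),      # T_r = L_r^0.5
        "discharge": length_ratio ** 2.5,     # Q_r = L_r^2.5
    }

r = froude_scale_ratios(16)
print(r["velocity"], r["discharge"])  # -> 4.0 1024.0
```

The lightweight sediment (R=0.3) then compensates for the reduced model shear stress so the dimensionless bed shear stress and particle Reynolds number stay close to field values.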

  13. Design, experimental analysis, and unsteady Reynolds-averaged Navier-Stokes simulation of laboratory-scale counter-rotating vertical-axis turbines in marine environment

    NASA Astrophysics Data System (ADS)

    Doan, Minh; Padricelli, Claudrio; Obi, Shinnosuke; Totsuka, Yoshitaka

    2017-11-01

    We present the torque and power measurement of laboratory-scale counter-rotating vertical-axis hydrokinetic turbines, built around a magnetic hysteresis brake as the speed controller and a Hall-effect sensor as the rotational speed transducer. A couple of straight-three-bladed turbines were linked through a transmission of spur gears and timing pulleys and coupled to the electronic instrumentation via flexible shaft couplers. A total of 8 experiments in 2 configurations were conducted in the water channel facility (4-m long, 0.3-m wide, and 0.15-m deep). Power generation of the turbines (0.06-m rotor diameter) was measured and compared with that of single turbines of the same size. The wakes generated by these experiments were also measured by particle image velocimetry (PIV) and numerically simulated by unsteady Reynolds-averaged Navier-Stokes (URANS) simulation using OpenFOAM. Preliminary results from wake measurement indicated the mechanism of enhanced power production behind the counter-rotating configuration of vertical-axis turbines. Current address: Politecnico di Milano.

  14. 30 CFR 14.21 - Laboratory-scale flame test apparatus.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Laboratory-scale flame test apparatus. 14.21 Section 14.21 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR TESTING... Technical Requirements § 14.21 Laboratory-scale flame test apparatus. The principal parts of the apparatus...

  15. 30 CFR 14.21 - Laboratory-scale flame test apparatus.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Laboratory-scale flame test apparatus. 14.21 Section 14.21 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR TESTING... Technical Requirements § 14.21 Laboratory-scale flame test apparatus. The principal parts of the apparatus...

  16. Modeling of Army Research Laboratory EMP simulators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miletta, J.R.; Chase, R.J.; Luu, B.B.

    1993-12-01

    Models are required that permit the estimation of emitted field signatures from EMP simulators to design the simulator antenna structure, to establish the usable test volumes, and to estimate human exposure risk. This paper presents the capabilities and limitations of a variety of EMP simulator models useful to the Army's EMP survivability programs. Comparisons among frequency and time-domain models are provided for two powerful US Army Research Laboratory EMP simulators: AESOP (Army EMP Simulator Operations) and VEMPS II (Vertical EMP Simulator II).

  17. Solar simulator for concentrator photovoltaic systems.

    PubMed

    Domínguez, César; Antón, Ignacio; Sala, Gabriel

    2008-09-15

    A solar simulator for measuring the performance of large-area concentrator photovoltaic (CPV) modules is presented. Its illumination system is based on a Xenon flash lamp and a large-area collimator mirror, which simulates natural sunlight. The quality requirements imposed by CPV systems have been characterized: irradiance level and uniformity at the receiver, light collimation, and spectral distribution. The simulator allows fast and cost-effective indoor performance characterization and classification of CPV systems at the production line, as well as module rating carried out by laboratories.
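    A common figure of merit for the irradiance uniformity requirement is (max - min)/(max + min) over the test plane; a hedged sketch with hypothetical samples (the authors' exact metric is not given in the abstract):

```python
def non_uniformity(irradiance_samples):
    """Spatial non-uniformity of irradiance, (max - min) / (max + min),
    over a set of samples across the receiver plane."""
    hi, lo = max(irradiance_samples), min(irradiance_samples)
    return (hi - lo) / (hi + lo)

# Hypothetical irradiance map values in W/m^2
samples = [980.0, 1000.0, 1020.0, 995.0]
print(round(100 * non_uniformity(samples), 2))  # -> 2.0 (percent)
```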

  18. A full scale hydrodynamic simulation of pyrotechnic combustion

    NASA Astrophysics Data System (ADS)

    Kim, Bohoon; Jang, Seung-Gyo; Yoh, Jack

    2017-06-01

    A full scale hydrodynamic simulation that requires an accurate reproduction of shock-induced detonation was conducted for the design of an energetic component system. A series of small scale gap tests and detailed hydrodynamic simulations were used to validate the reactive flow model for predicting the shock propagation in a train configuration and to quantify the shock sensitivity of the energetic materials. The energetic component system is composed of four main components, namely a donor unit (HNS + HMX), a bulkhead (STS), an acceptor explosive (RDX), and a propellant (BKNO3) for gas generation. The pressurized gases generated from the burning propellant were purged into a 10 cc release chamber for study of the inherent oscillatory flow induced by the interference between shock and rarefaction waves. The pressure fluctuations measured from experiment and calculation were investigated to further validate the distinct peak at the characteristic frequency (ωc = 8.3 kHz). In this paper, a step-by-step numerical description of the detonation of high explosive components, the deflagration of the propellant component, and the deformation of the metal component is given in order to facilitate the proper implementation of the outlined formulation into a shock physics code for a full scale hydrodynamic simulation of the energetic component system.

  19. Simulation System Fidelity Assessment at the Vertical Motion Simulator

    NASA Technical Reports Server (NTRS)

    Beard, Steven D.; Reardon, Scott E.; Tobias, Eric L.; Aponso, Bimal L.

    2013-01-01

    Fidelity is a word that is often used but rarely understood when talking about ground-based simulation. Assessing the cueing fidelity of a ground-based flight simulator requires a comparison to actual flight data, either directly or indirectly. Two experiments were conducted at the Vertical Motion Simulator using the GenHel UH-60A Black Hawk helicopter math model, which was directly compared to flight data. Prior to the experiment, the simulator's motion and visual system frequency responses were measured, the aircraft math model was adjusted to account for the simulator motion system delays, and the motion system gains and washouts were tuned for the individual tasks. The tuned motion system fidelity was then assessed against the modified Sinacori criteria. The first experiment showed handling qualities ratings (HQRs) similar to actual flight for bob-up and sidestep maneuvers. The second experiment showed equivalent HQRs between flight and simulation for the ADS33 slalom maneuver for the two pilot participants. The ADS33 vertical maneuver HQRs were mixed, with one pilot rating the flight and simulation the same while the second pilot rated the simulation worse. In addition to recording HQRs in the second experiment, an experimental Simulation Fidelity Rating (SFR) scale developed by the University of Liverpool was tested for applicability to engineering simulators. A discussion of the SFR scale for use on the Vertical Motion Simulator is included in this paper.

  20. SCALE Code System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jessee, Matthew Anderson

    The SCALE Code System is a widely used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor and lattice physics, radiation shielding, spent fuel and radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules, including three deterministic and three Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE’s graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results. SCALE 6.2 provides many new capabilities and significant improvements of existing features. New capabilities include:
    • ENDF/B-VII.1 nuclear data libraries, CE and MG, with enhanced group structures,
    • Neutron covariance data based on ENDF/B-VII.1 and supplemented with ORNL data,
    • Covariance data for fission product yields and decay constants,
    • Stochastic uncertainty and correlation quantification for any SCALE sequence with Sampler,
    • Parallel calculations with KENO,
    • Problem-dependent temperature corrections for CE calculations,
    • CE shielding and criticality accident alarm system analysis with

  1. Reducing Errors in Satellite Simulated Views of Clouds with an Improved Parameterization of Unresolved Scales

    NASA Astrophysics Data System (ADS)

    Hillman, B. R.; Marchand, R.; Ackerman, T. P.

    2016-12-01

    Satellite instrument simulators have emerged as a means to reduce errors in model evaluation by producing simulated or pseudo-retrievals from model fields, which account for limitations in the satellite retrieval process. Because of the mismatch in resolved scales between satellite retrievals and large-scale models, model cloud fields must first be downscaled to scales consistent with satellite retrievals. This downscaling is analogous to that required for model radiative transfer calculations. The assumption is often made in both model radiative transfer codes and satellite simulators that the unresolved clouds follow maximum-random overlap with horizontally homogeneous cloud condensate amounts. We examine errors in simulated MISR and CloudSat retrievals that arise due to these assumptions by applying the MISR and CloudSat simulators to cloud resolving model (CRM) output generated by the Super-parameterized Community Atmosphere Model (SP-CAM). Errors are quantified by comparing simulated retrievals performed directly on the CRM fields with those simulated by first averaging the CRM fields to approximately 2-degree resolution, applying a "subcolumn generator" to regenerate pseudo-resolved cloud and precipitation condensate fields, and then applying the MISR and CloudSat simulators on the regenerated condensate fields. We show that errors due to both assumptions of maximum-random overlap and homogeneous condensate are significant (relative to uncertainties in the observations and other simulator limitations). The treatment of precipitation is particularly problematic for CloudSat-simulated radar reflectivity. We introduce an improved subcolumn generator for use with the simulators, and show that these errors can be greatly reduced by replacing the maximum-random overlap assumption with the more realistic generalized overlap and incorporating a simple parameterization of subgrid-scale cloud and precipitation condensate heterogeneity.
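    The maximum-random overlap assumption examined above can be sketched as a toy subcolumn generator; a simplified illustration (homogeneous condensate, not the paper's actual code):

```python
import random

def subcolumns_max_random(cloud_frac, n_sub, seed=0):
    """Binary cloud subcolumns from layer cloud fractions under maximum-random
    overlap: contiguous cloudy layers share one uniform deviate (maximum
    overlap), while blocks separated by clear layers draw a fresh one
    (random overlap)."""
    rng = random.Random(seed)
    cols = []
    for _ in range(n_sub):
        col, x = [], rng.random()
        for k, f in enumerate(cloud_frac):
            if k > 0 and cloud_frac[k - 1] == 0.0:
                x = rng.random()  # new cloudy block: overlap randomly
            col.append(1 if x < f else 0)
        cols.append(col)
    return cols

# Two adjacent layers of fraction 0.5 overlap maximally: total cover ~0.5,
# not the ~0.75 that purely random overlap would give
cols = subcolumns_max_random([0.5, 0.5], 10000)
cover = sum(1 for c in cols if any(c)) / 10000
print(cover)  # near 0.5
```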

  2. Determining erosion relevant soil characteristics with a small-scale rainfall simulator

    NASA Astrophysics Data System (ADS)

    Schindewolf, M.; Schmidt, J.

    2009-04-01

    The use of soil erosion models is of great importance in soil and water conservation. Routine application of these models at the regional scale is limited, not least, by their high parameter demands. Although the EROSION 3D simulation model operates with a comparatively low number of parameters, some of its input variables can only be determined by rainfall simulation experiments. The existing EROSION 3D database was created in the mid-1990s from large-scale rainfall simulation experiments on 22 x 2 m experimental plots. To date, this database does not adequately cover all soil and field conditions. A new campaign of experiments is therefore essential to produce additional information, especially with respect to the effects of new soil management practices (e.g. long-term conservation tillage, no tillage). The rainfall simulator used in the current campaign consists of 30 identical modules equipped with oscillating rainfall nozzles. Veejet 80/100 nozzles (Spraying Systems Co., Wheaton, IL) are used to ensure the best possible comparability to natural rainfall with respect to raindrop size distribution and momentum transfer. The central objectives of the small-scale rainfall simulator are efficient application and the provision of results comparable to large-scale rainfall simulation experiments. A crucial problem in using the small-scale simulator is the restriction to rather small volume rates of surface runoff. Under these conditions, soil detachment is governed by raindrop impact, so the effect of surface runoff on particle detachment cannot be reproduced adequately by a small-scale rainfall simulator. With this problem in mind, this paper presents an enhanced small-scale simulator that allows a virtual multiplication of the plot length by feeding additional sediment-laden water to the plot from upstream. It is thus possible to overcome the plot length limitation of 3 m while reproducing nearly the same flow conditions as in rainfall experiments on

  3. Simulating neural systems with Xyce.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schiek, Richard Louis; Thornquist, Heidi K.; Mei, Ting

    2012-12-01

    Sandia's parallel circuit simulator, Xyce, can address large-scale neuron simulations in a new way, extending the range within which one can perform high-fidelity, multi-compartment neuron simulations. This report documents the implementation of neuron devices in Xyce and their use in the simulation and analysis of neuron systems.

  4. Multi-scale simulations of space problems with iPIC3D

    NASA Astrophysics Data System (ADS)

    Lapenta, Giovanni; Bettarini, Lapo; Markidis, Stefano

    The implicit Particle-in-Cell method for the computer simulation of space plasma, and its implementation in a three-dimensional parallel code, called iPIC3D, are presented. The implicit integration in time of the Vlasov-Maxwell system removes the numerical stability constraints and enables kinetic plasma simulations at magnetohydrodynamics scales. Simulations of magnetic reconnection in plasma are presented to show the effectiveness of the algorithm. In particular we will show a number of simulations done for large scale 3D systems using the physical mass ratio for Hydrogen. Most notably one simulation treats kinetically a box of tens of Earth radii in each direction and was conducted using about 16000 processors of the Pleiades NASA computer. The work is conducted in collaboration with the MMS-IDS theory team from University of Colorado (M. Goldman, D. Newman and L. Andersson). Reference: Stefano Markidis, Giovanni Lapenta, Rizwan-uddin, "Multi-scale simulations of plasma with iPIC3D," Mathematics and Computers in Simulation, available online 17 October 2009, http://dx.doi.org/10.1016/j.matcom.2009.08.038

  5. Moose: An Open-Source Framework to Enable Rapid Development of Collaborative, Multi-Scale, Multi-Physics Simulation Tools

    NASA Astrophysics Data System (ADS)

    Slaughter, A. E.; Permann, C.; Peterson, J. W.; Gaston, D.; Andrs, D.; Miller, J.

    2014-12-01

    The Idaho National Laboratory (INL)-developed Multiphysics Object Oriented Simulation Environment (MOOSE; www.mooseframework.org), is an open-source, parallel computational framework for enabling the solution of complex, fully implicit multiphysics systems. MOOSE provides a set of computational tools that scientists and engineers can use to create sophisticated multiphysics simulations. Applications built using MOOSE have computed solutions for chemical reaction and transport equations, computational fluid dynamics, solid mechanics, heat conduction, mesoscale materials modeling, geomechanics, and others. To facilitate the coupling of diverse and highly-coupled physical systems, MOOSE employs the Jacobian-free Newton-Krylov (JFNK) method when solving the coupled nonlinear systems of equations arising in multiphysics applications. The MOOSE framework is written in C++, and leverages other high-quality, open-source scientific software packages such as LibMesh, Hypre, and PETSc. MOOSE uses a "hybrid parallel" model which combines both shared memory (thread-based) and distributed memory (MPI-based) parallelism to ensure efficient resource utilization on a wide range of computational hardware. MOOSE-based applications are inherently modular, which allows for simulation expansion (via coupling of additional physics modules) and the creation of multi-scale simulations. Any application developed with MOOSE supports running (in parallel) any other MOOSE-based application. Each application can be developed independently, yet easily communicate with other applications (e.g., conductivity in a slope-scale model could be a constant input, or a complete phase-field micro-structure simulation) without additional code being written. This method of development has proven effective at INL and expedites the development of sophisticated, sustainable, and collaborative simulation tools.

  6. The Subsurface Flow and Transport Laboratory: A New Department of Energy User's Facility for Intermediate-Scale Experimentation

    NASA Astrophysics Data System (ADS)

    Wietsma, T. W.; Oostrom, M.; Foster, N. S.

    2003-12-01

    Intermediate-scale experiments (ISEs) for flow and transport are a valuable tool for simulating subsurface features and conditions encountered in the field at government and private sites. ISEs offer the ability to study, under controlled laboratory conditions, complicated processes characteristic of mixed wastes and heterogeneous subsurface environments, in multiple dimensions and at different scales. ISEs may, therefore, result in major cost savings if employed prior to field studies. A distinct advantage of ISEs is that researchers can design physical and/or chemical heterogeneities in the porous media matrix that better approximate natural field conditions and therefore address research questions that contain the additional complexity of processes often encountered in the natural environment. A new Subsurface Flow and Transport Laboratory (SFTL) has been developed for ISE users in the Environmental Spectroscopy & Biogeochemistry Facility in the Environmental Molecular Sciences Laboratory (EMSL) at Pacific Northwest National Laboratory (PNNL). The SFTL offers a variety of columns and flow cells, a new state-of-the-art dual-energy gamma system, a fully automated saturation-pressure apparatus, and analytical equipment for sample processing. The new facility, including qualified staff, is available for scientists interested in collaboration on conducting high-quality flow and transport experiments, including contaminant remediation. Close linkages exist between the SFTL and numerical modelers to aid in experimental design and interpretation. This presentation will discuss the facility and outline the procedures required to submit a proposal to use this unique facility for research purposes. The W. R. Wiley Environmental Molecular Sciences Laboratory, a national scientific user facility, is sponsored by the U.S. Department of Energy's Office of Biological and Environmental Research and located at Pacific Northwest National Laboratory.

  7. The WEBSIM FISHBANKS Simulation Laboratory: Analysis of Its Ripple Effects

    ERIC Educational Resources Information Center

    Arantes do Amaral, João Alberto; Hess, Aurélio

    2018-01-01

    In this article, we discuss the ripple effects of the WEBSIM FISHBANKS Simulation Laboratory held at the Federal University of Sao Paulo (UNIFESP) in 2014 as a result of a partnership between the Sloan School of Management of the Massachusetts Institute of Technology, UNIFESP, and the Brazilian Chapter of the System Dynamics Society of…

  8. Mars Science Laboratory Rover System Thermal Test

    NASA Technical Reports Server (NTRS)

    Novak, Keith S.; Kempenaar, Joshua E.; Liu, Yuanming; Bhandari, Pradeep; Dudik, Brenda A.

    2012-01-01

    On November 26, 2011, NASA launched a large (900 kg) rover as part of the Mars Science Laboratory (MSL) mission to Mars. The MSL rover is scheduled to land on Mars on August 5, 2012. Prior to launch, the Rover was successfully operated in simulated mission extreme environments during a 16-day long Rover System Thermal Test (STT). This paper describes the MSL Rover STT, test planning, test execution, test results, thermal model correlation and flight predictions. The rover was tested in the JPL 25-Foot Diameter Space Simulator Facility at the Jet Propulsion Laboratory (JPL). The Rover operated in simulated Cruise (vacuum) and Mars Surface environments (8 Torr nitrogen gas) with mission extreme hot and cold boundary conditions. A Xenon lamp solar simulator was used to impose simulated solar loads on the rover during a bounding hot case and during a simulated Mars diurnal test case. All thermal hardware was exercised and performed nominally. The Rover Heat Rejection System, a liquid-phase fluid loop used to transport heat in and out of the electronics boxes inside the rover chassis, performed better than predicted. Steady state and transient data were collected to allow correlation of analytical thermal models. These thermal models were subsequently used to predict rover thermal performance for the MSL Gale Crater landing site. Models predict that critical hardware temperatures will be maintained within allowable flight limits over the entire 669 Sol surface mission.

  9. Use of simulated evaporation to assess the potential for scale formation during reverse osmosis desalination

    USGS Publications Warehouse

    Huff, G.F.

    2004-01-01

    The tendency of solutes in input water to precipitate efficiency-lowering scale deposits on the membranes of reverse osmosis (RO) desalination systems is an important factor in determining the suitability of input water for desalination. Simulated input water evaporation can be used as a technique to quantitatively assess the potential for scale formation in RO desalination systems. The technique was demonstrated by simulating the increase in solute concentrations required to form calcite, gypsum, and amorphous silica scales at 25 °C and 40 °C for 23 desalination input waters taken from the literature. Simulation results could be used to quantitatively assess the potential of a given input water to form scale or to compare the potential of a number of input waters to form scale during RO desalination. Simulated evaporation of input waters cannot accurately predict the conditions under which scale will form owing to the effects of potentially stable supersaturated solutions, solution velocity, and residence time inside RO systems. However, the simulated scale-forming potential of proposed input waters could be compared with the simulated scale-forming potentials and actual scale-forming properties of input waters having documented operational histories in RO systems. This may provide a technique to estimate the actual performance and suitability of proposed input waters during RO desalination.
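    The evaporation technique reduces to concentrating solutes and checking a saturation index, SI = log10(IAP/Ksp), where positive SI indicates a thermodynamic tendency to precipitate; a deliberately crude sketch that ignores activity corrections and CO2 exchange (all values illustrative):

```python
import math

def saturation_index(iap, ksp):
    """SI = log10(IAP / Ksp); SI > 0 means supersaturated (scale-forming tendency)."""
    return math.log10(iap / ksp)

def concentrate(conc, factor):
    """Crude evaporation step: scale every solute concentration by one factor."""
    return {ion: c * factor for ion, c in conc.items()}

# Hypothetical gypsum check: IAP = [Ca][SO4] in (mol/L)^2, illustrative Ksp
conc = {"Ca": 2e-3, "SO4": 5e-3}
after = concentrate(conc, 10.0)  # simulate 10x evaporative concentration
iap = after["Ca"] * after["SO4"]
print(saturation_index(iap, 2.5e-5) > 0)  # -> True: supersaturated after 10x
```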

  10. Improving laboratory efficiencies to scale-up HIV viral load testing.

    PubMed

    Alemnji, George; Onyebujoh, Philip; Nkengasong, John N

    2017-03-01

    Viral load measurement is a key indicator that determines patients' response to treatment and risk for disease progression. Efforts are ongoing in different countries to scale up access to viral load testing to meet the Joint United Nations Programme on HIV and AIDS target of achieving 90% viral suppression among HIV-infected patients receiving antiretroviral therapy. However, the impact of these initiatives may be challenged by increased inefficiencies along the viral load testing spectrum. This will translate into increased costs and ineffectiveness of scale-up approaches. This review describes different parameters that could be addressed across the viral load testing spectrum aimed at improving efficiencies and utilizing test results for patient management. Though progress is being made in some countries to scale up viral load testing, many others still face numerous challenges that may affect scale-up efficiencies: weak demand creation; ineffective supply chain management systems; poor specimen referral systems; inadequate data and quality management systems; and a weak laboratory-clinical interface leading to diminished uptake of test results. In scaling up access to viral load testing, there should be a renewed focus on addressing efficiencies across the entire spectrum, including factors related to access, uptake, and impact of test results.

  11. Power and Scour: Laboratory simulations of tsunami-induced scour

    NASA Astrophysics Data System (ADS)

    Todd, David; McGovern, David; Whitehouse, Richard; Harris, John; Rossetto, Tiziana

    2017-04-01

    The world's coastal regions are becoming increasingly urbanised and densely populated. Recent major tsunami events in regions such as Samoa (2007), Indonesia (2004, 2006, 2010), and Japan (2011) have starkly highlighted the resulting exposure, causing catastrophic loss of both life and property, with much of the damage to buildings being reported in EEFIT mission reports following each of these events. The URBANWAVES project, led by UCL in collaboration with HR Wallingford, brings the power of the tsunami to the laboratory for the first time. The Pneumatic Tsunami Simulator is capable of simulating both idealised and real-world tsunami traces at a scale of 1:50. Experiments undertaken in the Fast Flow Facility at HR Wallingford, using square and rectangular buildings placed on a sediment bed, have allowed us to measure, for the first time under laboratory conditions, the variations in the flow field around buildings produced by tsunami waves as a result of the scour process. The results of these tests are presented, providing insight into the process of scour development under different types of tsunami, giving a glimpse into the power of tsunamis that have already occurred, and helping to inform the design of future buildings so that these failure modes can be better analysed and designed against. Additional supporting abstracts include Foster et al., on tsunami-induced building loads; Chandler et al., on the tsunami simulation concept; and McGovern et al., on the simulation of tsunami-driven scour and flow fields.

  12. High Fidelity Simulations of Large-Scale Wireless Networks (Plus-Up)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Onunkwo, Uzoma

    Sandia has built a strong reputation in scalable network simulation and emulation for cyber security studies to protect our nation’s critical information infrastructures. Georgia Tech has a preeminent reputation in academia for excellence in scalable discrete event simulations, with strong emphasis on simulating cyber networks. Many of the experts in this field, such as Dr. Richard Fujimoto, Dr. George Riley, and Dr. Chris Carothers, have strong affiliations with Georgia Tech. The collaborative relationship that we intend to immediately pursue is in high fidelity simulations of practical large-scale wireless networks using the ns-3 simulator via Dr. George Riley. This project will have mutual benefits in bolstering both institutions’ expertise and reputation in the field of scalable simulation for cyber-security studies. This project promises to address high fidelity simulations of large-scale wireless networks. This proposed collaboration is directly in line with Georgia Tech’s goals for developing and expanding the Communications Systems Center, the Georgia Tech Broadband Institute, and the Georgia Tech Information Security Center along with its yearly Emerging Cyber Threats Report. At Sandia, this work benefits the defense systems and assessment area with promise for large-scale assessment of cyber security needs and vulnerabilities of our nation’s critical cyber infrastructures exposed to wireless communications.

  13. Maestro: an orchestration framework for large-scale WSN simulations.

    PubMed

    Riliskis, Laurynas; Osipov, Evgeny

    2014-03-18

    Contemporary wireless sensor networks (WSNs) have evolved into large and complex systems and are one of the main technologies used in cyber-physical systems and the Internet of Things. Extensive research on WSNs has led to the development of diverse solutions at all levels of software architecture, including protocol stacks for communications. This multitude of solutions is due to the limited computational power and restrictions on energy consumption that must be accounted for when designing typical WSN systems. It is therefore challenging to develop, test and validate even small WSN applications, and this process can easily consume significant resources. Simulations are inexpensive tools for testing, verifying and generally experimenting with new technologies in a repeatable fashion. Consequently, as the size of the systems to be tested increases, so does the need for large-scale simulations. This article describes a tool called Maestro for the automation of large-scale simulation and investigates the feasibility of using cloud computing facilities for such a task. Using tools that are built into Maestro, we demonstrate a feasible approach for benchmarking cloud infrastructure in order to identify cloud Virtual Machine (VM) instances that provide an optimal balance of performance and cost for a given simulation.
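    The instance-selection step described above (balancing performance against cost) can be sketched with a toy cost-per-throughput criterion; the instance names and benchmark figures below are invented for illustration and are not Maestro's actual output:

```python
def best_vm(benchmarks):
    """Pick the VM type that minimizes cost per unit of simulation throughput.

    `benchmarks` maps an instance name to a tuple
    (simulated_events_per_second, dollars_per_hour), as would be measured by
    benchmark runs of the target simulation workload on each instance type.
    """
    return min(benchmarks, key=lambda name: benchmarks[name][1] / benchmarks[name][0])
```

    For example, with benchmarks = {"small": (100.0, 0.10), "large": (500.0, 0.60)}, the cheaper-per-event "small" instance is selected even though "large" is faster in absolute terms.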

  14. Entrainment in Laboratory Simulations of Cumulus Cloud Flows

    NASA Astrophysics Data System (ADS)

    Narasimha, R.; Diwan, S.; Subrahmanyam, D.; Sreenivas, K. R.; Bhat, G. S.

    2010-12-01

    A variety of cumulus cloud flows, including congestus (both shallow bubble and tall tower types), mediocris and fractus have been generated in a water tank by simulating the release of latent heat in real clouds. The simulation is achieved through ohmic heating, injected volumetrically into the flow by applying suitable voltages between diametral cross-sections of starting jets and plumes of electrically conducting fluid (acidified water). Dynamical similarity between atmospheric and laboratory cloud flows is achieved by duplicating values of an appropriate non-dimensional heat release number. Velocity measurements, made by laser instrumentation, show that the Taylor entrainment coefficient generally increases just above the level of commencement of heat injection (corresponding to the condensation level in the real cloud). Subsequently the coefficient reaches a maximum before declining to the very low values that characterize tall cumulus towers. The experiments also simulate the protected core of real clouds. Figure: cumulus congestus in the atmosphere (left) and the simulated laboratory cloud (right); the lower panels show, respectively, the total heat injected and the vertical profile of heating in the laboratory cloud.

  15. LABORATORY SCALE STEAM INJECTION TREATABILITY STUDIES

    EPA Science Inventory

    Laboratory scale steam injection treatability studies were first developed at The University of California-Berkeley. A comparable testing facility has been developed at USEPA's Robert S. Kerr Environmental Research Center. Experience has already shown that many volatile organic...

  16. Experimental and simulation studies of pore scale flow and reactive transport associated with supercritical CO2 injection into brine-filled reservoir rocks (Invited)

    NASA Astrophysics Data System (ADS)

    DePaolo, D. J.; Steefel, C. I.; Bourg, I. C.

    2013-12-01

    This talk will review recent research relating to pore scale reactive transport effects, done in the context of the Department of Energy-sponsored Energy Frontier Research Center led by Lawrence Berkeley National Laboratory with several other laboratory and university partners. This Center, called the Center for Nanoscale Controls on Geologic CO2 (NCGC), has focused effort on the behavior of supercritical CO2 being injected into, and/or residing as capillary-trapped bubbles in, sandstone and shale, with particular emphasis on the description of nanoscale to pore scale processes that could provide the basis for advanced simulations. In general, simulation of reservoir-scale behavior of CO2 sequestration assumes a number of mostly qualitative relationships that are defensible as nominal first-order descriptions of single-fluid systems, but neglect the many complications that are associated with a two-phase or three-phase reactive system. The contrasting properties and mixing behavior of scCO2 and brine provide unusual conditions for water-rock interaction, and the NCGC has investigated the underlying issues by a combination of approaches, including theoretical and experimental studies of mineral nucleation and growth; experimental studies of brine films, mineral wetting properties, dissolution-precipitation rates and infiltration patterns; molecular dynamics simulations and neutron scattering experiments of fluid properties for fluids confined in nanopores; and various approaches to numerical simulation of reactive transport processes. The work to date has placed new constraints on the thickness of brine films, and also on the wetting properties of CO2 versus brine, a property that varies between minerals and with salinity, and may also change with time as a result of the reactivity of CO2-saturated brine. Mineral dissolution is dependent on reactive surface area, which can be shown to vary by a large factor for various minerals, especially when correlated with

  17. Large-Scale Simulations of Plastic Neural Networks on Neuromorphic Hardware

    PubMed Central

    Knight, James C.; Tully, Philip J.; Kaplan, Bernhard A.; Lansner, Anders; Furber, Steve B.

    2016-01-01

    SpiNNaker is a digital, neuromorphic architecture designed for simulating large-scale spiking neural networks at speeds close to biological real-time. Rather than using bespoke analog or digital hardware, the basic computational unit of a SpiNNaker system is a general-purpose ARM processor, allowing it to be programmed to simulate a wide variety of neuron and synapse models. This flexibility is particularly valuable in the study of biological plasticity phenomena. A recently proposed learning rule based on the Bayesian Confidence Propagation Neural Network (BCPNN) paradigm offers a generic framework for modeling the interaction of different plasticity mechanisms using spiking neurons. However, it can be computationally expensive to simulate large networks with BCPNN learning, since it requires multiple state variables for each synapse, each of which needs to be updated every simulation time-step. We discuss the trade-offs in efficiency and accuracy involved in developing an event-based BCPNN implementation for SpiNNaker based on an analytical solution to the BCPNN equations, and detail the steps taken to fit this within the limited computational and memory resources of the SpiNNaker architecture. We demonstrate this learning rule by learning temporal sequences of neural activity within a recurrent attractor network, which we simulate at scales of up to 2.0 × 10^4 neurons and 5.1 × 10^7 plastic synapses: the largest plastic neural network ever to be simulated on neuromorphic hardware. We also run a comparable simulation on a Cray XC-30 supercomputer system and find that, if it is to match the run-time of our SpiNNaker simulation, the supercomputer uses approximately 45× more power. This suggests that cheaper, more power-efficient neuromorphic systems are becoming useful discovery tools in the study of plasticity in large-scale brain models. PMID:27092061

  18. Ensemble urban flood simulation in comparison with laboratory-scale experiments: Impact of interaction models for manhole, sewer pipe, and surface flow

    NASA Astrophysics Data System (ADS)

    Noh, Seong Jin; Lee, Seungsoo; An, Hyunuk; Kawaike, Kenji; Nakagawa, Hajime

    2016-11-01

    An urban flood is an integrated phenomenon that is affected by various uncertainty sources such as input forcing, model parameters, complex geometry, and exchanges of flow among different domains in surfaces and subsurfaces. Despite considerable advances in urban flood modeling techniques, limited knowledge is currently available with regard to the impact of dynamic interaction among different flow domains on urban floods. In this paper, an ensemble method for urban flood modeling is presented to consider the parameter uncertainty of interaction models among a manhole, a sewer pipe, and surface flow. Laboratory-scale experiments on urban flood and inundation are performed under various flow conditions to investigate the parameter uncertainty of interaction models. The results show that ensemble simulation using interaction models based on weir and orifice formulas reproduces experimental data with high accuracy and detects the identifiability of model parameters. Among interaction-related parameters, the parameters of the sewer-manhole interaction show lower uncertainty than those of the sewer-surface interaction. Experimental data obtained under unsteady-state conditions are more informative than those obtained under steady-state conditions to assess the parameter uncertainty of interaction models. Although the optimal parameters vary according to the flow conditions, the difference is marginal. Simulation results also confirm the capability of the interaction models and the potential of the ensemble-based approaches to facilitate urban flood simulation.
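    The weir- and orifice-type interaction formulas evaluated above can be illustrated with a minimal sketch of a manhole-surface exchange model; the discharge coefficients, geometry, and switching rule here are illustrative assumptions, not the calibrated values from the experiments:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def weir_discharge(c_w, crest_width, head):
    """Free-overfall weir formula: Q = c_w * w * h * sqrt(2 g h)."""
    if head <= 0.0:
        return 0.0
    return c_w * crest_width * head * math.sqrt(2.0 * G * head)

def orifice_discharge(c_o, area, head_diff):
    """Submerged-orifice formula: Q = c_o * A * sqrt(2 g |dh|), signed
    positive for flow from the manhole up to the surface."""
    sign = 1.0 if head_diff >= 0.0 else -1.0
    return sign * c_o * area * math.sqrt(2.0 * G * abs(head_diff))

def manhole_surface_exchange(surface_depth, manhole_head, crest,
                             c_w=0.6, c_o=0.62, width=0.5, area=0.2):
    """Switch between weir- and orifice-type exchange with submergence."""
    if surface_depth < crest:
        # shallow surface flow: the manhole surcharges over the rim as a weir
        return weir_discharge(c_w, width, manhole_head - crest)
    # rim submerged on both sides: orifice flow driven by the head difference
    return orifice_discharge(c_o, area, manhole_head - surface_depth)
```

    The sign convention makes surcharge (manhole to surface) positive and drainage back into the sewer negative, which is the behavior the ensemble simulations constrain through the interaction-related parameters.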

  19. Effect of nacelle on wake meandering in a laboratory scale wind turbine using LES

    NASA Astrophysics Data System (ADS)

    Foti, Daniel; Yang, Xiaolei; Guala, Michele; Sotiropoulos, Fotis

    2015-11-01

    Wake meandering, the large-scale motion in wind turbine wakes, has considerable effects on the velocity deficit and turbulence intensity in the turbine wake, from laboratory-scale to utility-scale wind turbines. In the dynamic wake meandering model, wake meandering is assumed to be caused by large-scale atmospheric turbulence. On the other hand, Kang et al. (J. Fluid Mech., 2014) demonstrated that the nacelle geometry has a significant effect on the wake meandering of a hydrokinetic turbine, through the interaction of the inner wake of the nacelle vortex with the outer wake of the tip vortices. In this work, the significance of the nacelle for the wake meandering of a miniature wind turbine previously used in experiments (Howard et al., Phys. Fluids, 2015) is demonstrated with large eddy simulations (LES) using an immersed boundary method, with grids fine enough to resolve the turbine's geometric characteristics. The three-dimensionality of the wake meandering is analyzed in detail through turbulent spectra and meander reconstruction. The computed flow fields exhibit wake dynamics similar to those observed in the wind tunnel experiments and are analyzed to shed new light on the role of the energetic nacelle vortex in wake meandering. This work was supported by the Department of Energy (DE-EE0002980, DE-EE0005482 and DE-AC04-94AL85000) and Sandia National Laboratories. Computational resources were provided by Sandia National Laboratories and the University of Minnesota Supercomputing Institute.

  20. If You've Got It, Use It (Simulation, That Is...)

    NASA Technical Reports Server (NTRS)

    Frost, Chad; Tucker, George

    2006-01-01

    This viewgraph presentation reviews the Rotorcraft Aircrew Systems Concept Airborne Laboratory (RASCAL) UH-60 in-flight simulator, the use of simulation in support of safety monitor design specification development, the development of a failure/recovery (F/R) rating scale, the use of F/R Rating Scale as a common element between simulation and flight evaluation, and the expansion of the flight envelope without benefit of simulation.

  1. Source Code Analysis Laboratory (SCALe) for Energy Delivery Systems

    DTIC Science & Technology

    2010-12-01

    the software for reevaluation. Once the reevaluation process is completed, CERT provides the client a report detailing the software's conformance... Flagged Nonconformities (FNC) by software system (TP/FNC ratio): Mozilla Firefox version 2.0, 6/12 (50%); Linux kernel version 2.6.15, 10/126 (8%); Wine... inappropriately tuned for analysis of the Linux kernel, which has anomalous results. Customizing SCALe to work with energy system software will help

  2. Maestro: An Orchestration Framework for Large-Scale WSN Simulations

    PubMed Central

    Riliskis, Laurynas; Osipov, Evgeny

    2014-01-01

    Contemporary wireless sensor networks (WSNs) have evolved into large and complex systems and are one of the main technologies used in cyber-physical systems and the Internet of Things. Extensive research on WSNs has led to the development of diverse solutions at all levels of software architecture, including protocol stacks for communications. This multitude of solutions is due to the limited computational power and restrictions on energy consumption that must be accounted for when designing typical WSN systems. It is therefore challenging to develop, test and validate even small WSN applications, and this process can easily consume significant resources. Simulations are inexpensive tools for testing, verifying and generally experimenting with new technologies in a repeatable fashion. Consequently, as the size of the systems to be tested increases, so does the need for large-scale simulations. This article describes a tool called Maestro for the automation of large-scale simulation and investigates the feasibility of using cloud computing facilities for such a task. Using tools that are built into Maestro, we demonstrate a feasible approach for benchmarking cloud infrastructure in order to identify cloud Virtual Machine (VM) instances that provide an optimal balance of performance and cost for a given simulation. PMID:24647123

  3. The Zombie Instability: Using Numerical Simulation to Design a Laboratory Experiment

    NASA Astrophysics Data System (ADS)

    Wang, Meng; Pei, Suyang; Jiang, Chung-Hsiang; Hassanzadeh, Pedram; Marcus, Philip

    2014-11-01

    A new type of finite-amplitude instability has been found in numerical simulations of stratified, rotating shear flows. The instability occurs via baroclinic critical layers that create linearly unstable vortex layers, which roll up into vortices. Under the right conditions, those vortices can form a new generation of vortices, resulting in "vortex self-replication" that fills the fluid with vortices. Creating this instability in a laboratory would provide further evidence for the existence of the instability, which we first found in numerical simulations of protoplanetary disks. To design a laboratory experiment we need to know how the flow parameters (shear, rotation, stratification, etc.) affect the instability. To build an experiment economically, we also need to know how the finite-amplitude trigger of the instability scales with viscosity and the size of the domain. In this talk, we summarize our findings. We present a map, in terms of the experimentally controllable parameters, that shows where the instability occurs and whether it creates a few isolated transient vortices, a few long-lived vortices, or long-lived, self-replicating vortices that fill the entire flow.

  4. A qualitative case study of instructional support for web-based simulated laboratory exercises in online college chemistry laboratory courses

    NASA Astrophysics Data System (ADS)

    Schulman, Kathleen M.

    This study fills a gap in the research literature regarding the types of instructional support provided by instructors in online introductory chemistry laboratory courses that employ chemistry simulations as laboratory exercises. It also provides information regarding students' perceptions of the effectiveness of that instructional support. A multiple case study methodology was used to carry out the research. Two online introductory chemistry courses were studied at two community colleges. Data for this study was collected using phone interviews with faculty and student participants, surveys completed by students, and direct observation of the instructional designs of instructional support in the online Blackboard web sites and the chemistry simulations used by the participating institutions. The results indicated that the instructors provided multiple types of instructional support that correlated with forms of effective instructional support identified in the research literature, such as timely detailed feedback, detailed instructions for the laboratory experiments, and consistency in the instructional design of lecture and laboratory course materials, including the chemistry lab simulation environment. The students in one of these courses identified the following as the most effective types of instructional support provided: the instructor's feedback, opportunities to apply chemistry knowledge in the chemistry lab exercises, detailed procedures for the simulated laboratory exercises, the organization of the course Blackboard sites and the chemistry lab simulation web sites, and the textbook homework web sites. Students also identified components of instructional support they felt were missing. These included a desire for more interaction with the instructor, more support for the simulated laboratory exercises from the instructor and the developer of the chemistry simulations, and faster help with questions about the laboratory exercises or experimental

  5. Geophysical monitoring of solute transport in dual-domain environments through laboratory experiments, field-scale solute tracer tests, and numerical simulation

    NASA Astrophysics Data System (ADS)

    Swanson, Ryan David

    The advection-dispersion equation (ADE) fails to describe non-Fickian solute transport breakthrough curves (BTCs) in saturated porous media in both laboratory and field experiments, necessitating the use of other models. The dual-domain mass transfer (DDMT) model partitions the total porosity into mobile and less-mobile domains with an exchange of mass between the two domains, and this model can reproduce better fits to BTCs in many systems than ADE-based models. However, direct experimental estimation of DDMT model parameters remains elusive, and model parameters are often calculated a posteriori by an optimization procedure. Here, we investigate the use of geophysical tools (direct-current resistivity, nuclear magnetic resonance, and complex conductivity) to estimate these model parameters directly. We use two different samples of the zeolite clinoptilolite, a material shown to demonstrate solute mass transfer due to a significant internal porosity, and provide the first evidence that direct-current electrical methods can track solute movement into and out of a less-mobile pore space in controlled laboratory experiments. We quantify the effects of assuming single-rate DDMT for multirate mass transfer systems. We analyze pore structures using material characterization methods (mercury porosimetry, scanning electron microscopy, and X-ray computer tomography), and compare these observations to geophysical measurements. Nuclear magnetic resonance in conjunction with direct-current resistivity measurements can constrain mobile and less-mobile porosities, but complex conductivity may have little value in relation to mass transfer, despite the hypothesis that mass transfer and complex conductivity length scales are related. Finally, we conduct a geoelectrically monitored tracer test at the Macrodispersion Experiment (MADE) site in Columbus, MS. We relate hydraulic and electrical conductivity measurements to generate a 3D hydraulic conductivity field, and compare to
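    The single-rate DDMT model referred to above couples mobile and less-mobile concentrations through a first-order exchange term; a minimal explicit-in-time sketch follows (all parameter values in the usage are illustrative, not fitted to any experiment):

```python
def ddmt_step(c_m, c_im, theta_m, theta_im, alpha, dt, c_in, q):
    """Advance single-rate dual-domain mass transfer one explicit time step.

    theta_m  * dC_m/dt  = q * (c_in - c_m) - alpha * (c_m - c_im)
    theta_im * dC_im/dt = alpha * (c_m - c_im)

    c_m / c_im are mobile / less-mobile concentrations, theta_* the two
    porosities, alpha the first-order exchange coefficient, and q a
    flushing rate driven by the inflow concentration c_in.
    """
    exchange = alpha * (c_m - c_im)
    c_m_new = c_m + dt * (q * (c_in - c_m) - exchange) / theta_m
    c_im_new = c_im + dt * exchange / theta_im
    return c_m_new, c_im_new
```

    Flushing an initially solute-free column with c_in = 1 makes the less-mobile concentration lag the mobile one, producing the long breakthrough tails that the ADE alone cannot capture.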

  6. EON: software for long time simulations of atomic scale systems

    NASA Astrophysics Data System (ADS)

    Chill, Samuel T.; Welborn, Matthew; Terrell, Rye; Zhang, Liang; Berthet, Jean-Claude; Pedersen, Andreas; Jónsson, Hannes; Henkelman, Graeme

    2014-07-01

    The EON software is designed for simulations of the state-to-state evolution of atomic scale systems over timescales greatly exceeding that of direct classical dynamics. States are defined as collections of atomic configurations from which a minimization of the potential energy gives the same inherent structure. The time evolution is assumed to be governed by rare events, where transitions between states are uncorrelated and infrequent compared with the timescale of atomic vibrations. Several methods for calculating the state-to-state evolution have been implemented in EON, including parallel replica dynamics, hyperdynamics and adaptive kinetic Monte Carlo. Global optimization methods, including simulated annealing, basin hopping and minima hopping are also implemented. The software has a client/server architecture where the computationally intensive evaluations of the interatomic interactions are calculated on the client-side and the state-to-state evolution is managed by the server. The client supports optimization for different computer architectures to maximize computational efficiency. The server is written in Python so that developers have access to the high-level functionality without delving into the computationally intensive components. Communication between the server and clients is abstracted so that calculations can be deployed on a single machine, clusters using a queuing system, large parallel computers using a message passing interface, or within a distributed computing environment. A generic interface to the evaluation of the interatomic interactions is defined so that empirical potentials, such as in LAMMPS, and density functional theory as implemented in VASP and GPAW can be used interchangeably. 
Examples are given to demonstrate the range of systems that can be modeled, including surface diffusion and island ripening of adsorbed atoms on metal surfaces, molecular diffusion on the surface of ice and global structural optimization of nanoparticles.
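    The state-to-state evolution managed by methods such as (adaptive) kinetic Monte Carlo rests on a simple selection rule: choose the next transition with probability proportional to its rate, and advance time by an exponentially distributed increment. A generic sketch of that core step (the rate table in the usage is invented, and this omits EON's saddle-point searches entirely):

```python
import math
import random

def kmc_step(rates, rng):
    """One kinetic Monte Carlo step: choose a transition with probability
    proportional to its rate, then draw the residence time in the current
    state from an exponential distribution with the total rate."""
    total = sum(rates.values())
    r = rng.random() * total
    acc = 0.0
    chosen = None
    for event, rate in rates.items():
        acc += rate
        if r < acc:
            chosen = event
            break
    # 1 - random() avoids log(0), since random() lies in [0, 1)
    dt = -math.log(1.0 - rng.random()) / total
    return chosen, dt
```

    For example, with rates = {"hop_left": 1e3, "hop_right": 1e3, "detach": 1.0}, hops are selected roughly a thousand times more often than detachment, and the average time increment is 1/total.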

  7. Fate of Salmonella Typhimurium in laboratory-scale drinking water biofilms.

    PubMed

    Schaefer, L M; Brözel, V S; Venter, S N

    2013-12-01

    Investigations were carried out to evaluate and quantify colonization of laboratory-scale drinking water biofilms by a chromosomally green fluorescent protein (gfp)-tagged strain of Salmonella Typhimurium. Gfp encodes the green fluorescent protein and thus allows in situ detection of undisturbed cells and is ideally suited for monitoring Salmonella in biofilms. The fate and persistence of non-typhoidal Salmonella in simulated drinking water biofilms was investigated. The ability of Salmonella to form biofilms in monoculture and the fate and persistence of Salmonella in a mixed aquatic biofilm was examined. In monoculture S. Typhimurium formed loosely structured biofilms. Salmonella colonized established multi-species drinking water biofilms within 24 hours, forming micro-colonies within the biofilm. S. Typhimurium was also released at high levels from the drinking water-associated biofilm into the water passing through the system. This indicated that Salmonella could enter into, survive and grow within, and be released from a drinking water biofilm. The ability of Salmonella to survive and persist in a drinking water biofilm, and be released at high levels into the flow for recolonization elsewhere, indicates the potential for a persistent health risk to consumers once a network becomes contaminated with this bacterium.

  8. Use of a PhET Interactive Simulation in General Chemistry Laboratory: Models of the Hydrogen Atom

    ERIC Educational Resources Information Center

    Clark, Ted M.; Chamberlain, Julia M.

    2014-01-01

    An activity supporting the PhET interactive simulation, Models of the Hydrogen Atom, has been designed and used in the laboratory portion of a general chemistry course. This article describes the framework used to successfully accomplish implementation on a large scale. The activity guides students through a comparison and analysis of the six…

  9. Numerical simulation on hydromechanical coupling in porous media adopting three-dimensional pore-scale model.

    PubMed

    Liu, Jianjun; Song, Rui; Cui, Mengmeng

    2014-01-01

    A novel approach to simulating hydromechanical coupling in pore-scale models of porous media is presented in this paper. Parameters of the sandstone samples, such as the stress-strain curve, Poisson's ratio, and permeability under different pore pressures and confining pressures, are tested at laboratory scale. A micro-CT scanner is employed to scan the samples for three-dimensional images, used as input to construct the model. Accordingly, four physical models possessing the same pore and rock matrix characteristics as the natural sandstones are developed. Based on the micro-CT images, three-dimensional finite element models of both the rock matrix and the pore space are established on the MIMICS and ICEM software platforms. The Navier-Stokes equation and an elastic constitutive equation are used as the mathematical model for simulation. A hydromechanical coupling analysis in the pore-scale finite element model of porous media is then simulated with the ANSYS and CFX software. Thereby, the permeability of the sandstone samples under different pore pressures and confining pressures is predicted. The simulation results agree well with the benchmark data. By reproducing the stress state underground, the prediction accuracy of porous rock permeability in pore-scale simulation is improved. Consequently, the effects of pore pressure and confining pressure on permeability are revealed from the microscopic view.

  10. Numerical Simulation on Hydromechanical Coupling in Porous Media Adopting Three-Dimensional Pore-Scale Model

    PubMed Central

    Liu, Jianjun; Song, Rui; Cui, Mengmeng

    2014-01-01

    A novel approach to simulating hydromechanical coupling in pore-scale models of porous media is presented in this paper. Parameters of the sandstone samples, such as the stress-strain curve, Poisson's ratio, and permeability under different pore pressures and confining pressures, are tested at laboratory scale. A micro-CT scanner is employed to scan the samples for three-dimensional images, used as input to construct the model. Accordingly, four physical models possessing the same pore and rock matrix characteristics as the natural sandstones are developed. Based on the micro-CT images, three-dimensional finite element models of both the rock matrix and the pore space are established on the MIMICS and ICEM software platforms. The Navier-Stokes equation and an elastic constitutive equation are used as the mathematical model for simulation. A hydromechanical coupling analysis in the pore-scale finite element model of porous media is then simulated with the ANSYS and CFX software. Thereby, the permeability of the sandstone samples under different pore pressures and confining pressures is predicted. The simulation results agree well with the benchmark data. By reproducing the stress state underground, the prediction accuracy of porous rock permeability in pore-scale simulation is improved. Consequently, the effects of pore pressure and confining pressure on permeability are revealed from the microscopic view. PMID:24955384

  11. Parachute Models Used in the Mars Science Laboratory Entry, Descent, and Landing Simulation

    NASA Technical Reports Server (NTRS)

    Cruz, Juan R.; Way, David W.; Shidner, Jeremy D.; Davis, Jody L.; Powell, Richard W.; Kipp, Devin M.; Adams, Douglas S.; Witkowski, Al; Kandis, Mike

    2013-01-01

    An end-to-end simulation of the Mars Science Laboratory (MSL) entry, descent, and landing (EDL) sequence was created at the NASA Langley Research Center using the Program to Optimize Simulated Trajectories II (POST2). This simulation is capable of providing numerous MSL system and flight software responses, including Monte Carlo-derived statistics of these responses. The MSL POST2 simulation includes models of EDL system elements, including those related to the parachute system. Among these there are models for the parachute geometry, mass properties, deployment, inflation, opening force, area oscillations, aerodynamic coefficients, apparent mass, interaction with the main landing engines, and off-loading. These models were kept as simple as possible, considering the overall objectives of the simulation. The main purpose of this paper is to describe these parachute system models to the extent necessary to understand how they work and some of their limitations. A list of lessons learned during the development of the models and simulation is provided. Future improvements to the parachute system models are proposed.

  12. Three-dimensional poor man's Navier-Stokes equation: a discrete dynamical system exhibiting k^(-5/3) inertial subrange energy scaling.

    PubMed

    McDonough, J M

    2009-06-01

    An outline of the derivation, together with mathematical and physical interpretations, is presented for a discrete dynamical system known as the "poor man's Navier-Stokes equation." Numerical studies demonstrate that velocity fields produced by this dynamical system are similar to those seen in laboratory experiments and in detailed simulations, and they lead to scaling for the turbulence kinetic energy spectrum in accord with Kolmogorov K41 theory.
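    Discrete dynamical systems of this kind reduce each velocity component to a logistic-map-like iteration with bilinear coupling between components. The two-component sketch below only mimics that structure; the coefficients are arbitrary, chosen to keep iterates bounded, and this is not the published derivation:

```python
def pmns_iterate(a, b, beta1=3.9, beta2=3.7, gamma=0.05, n_steps=1000):
    """Iterate two coupled logistic maps: each component follows
    x <- beta * x * (1 - x), perturbed by a bilinear cross term -gamma*a*b
    reminiscent of the advective coupling in the PMNS structure.
    Returns the trajectory as a list of (a, b) pairs."""
    traj = []
    for _ in range(n_steps):
        a_new = beta1 * a * (1.0 - a) - gamma * a * b
        b_new = beta2 * b * (1.0 - b) - gamma * a * b
        # clamp defensively to the unit interval
        a = max(0.0, min(1.0, a_new))
        b = max(0.0, min(1.0, b_new))
        traj.append((a, b))
    return traj
```

    With beta values in the chaotic range of the logistic map, the iterates fluctuate irregularly; in the actual model such fluctuating "velocity components" are what yield turbulence-like statistics.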

  13. Simulation of all-scale atmospheric dynamics on unstructured meshes

    NASA Astrophysics Data System (ADS)

    Smolarkiewicz, Piotr K.; Szmelter, Joanna; Xiao, Feng

    2016-10-01

    The advance of massively parallel computing in the nineteen nineties and beyond encouraged finer grid intervals in numerical weather-prediction models. This has improved resolution of weather systems and enhanced the accuracy of forecasts, while setting the trend for development of unified all-scale atmospheric models. This paper first outlines the historical background to a wide range of numerical methods advanced in the process. Next, the trend is illustrated with a technical review of a versatile nonoscillatory forward-in-time finite-volume (NFTFV) approach, proven effective in simulations of atmospheric flows from small-scale dynamics to global circulations and climate. The outlined approach exploits the synergy of two specific ingredients: the MPDATA methods for the simulation of fluid flows based on the sign-preserving properties of upstream differencing; and the flexible finite-volume median-dual unstructured-mesh discretisation of the spatial differential operators comprising PDEs of atmospheric dynamics. The paper consolidates the concepts leading to a family of generalised nonhydrostatic NFTFV flow solvers that include soundproof PDEs of incompressible Boussinesq, anelastic and pseudo-incompressible systems, common in large-eddy simulation of small- and meso-scale dynamics, as well as all-scale compressible Euler equations. Such a framework naturally extends predictive skills of large-eddy simulation to the global atmosphere, providing a bottom-up alternative to the reverse approach pursued in the weather-prediction models. Theoretical considerations are substantiated by calculations attesting to the versatility and efficacy of the NFTFV approach. Some prospective developments are also discussed.
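    The sign-preserving upstream (donor-cell) differencing on which MPDATA builds can be shown in one dimension; this is only the first-order upwind pass, without MPDATA's antidiffusive corrective iterations:

```python
def upwind_step(psi, courant):
    """First-order donor-cell (upstream) advection step on a periodic 1-D grid.

    `courant` = u*dt/dx, assumed constant with 0 <= courant <= 1 for
    stability. The scheme is conservative and sign-preserving: a
    non-negative field stays non-negative.
    """
    n = len(psi)
    # psi[i - 1] wraps around at i = 0, giving periodic boundaries
    return [psi[i] - courant * (psi[i] - psi[i - 1]) for i in range(n)]
```

    Advecting a unit spike with courant = 0.5 splits it between its own cell and the downstream neighbour while conserving the total, illustrating both the monotone behaviour and the numerical diffusion that MPDATA's corrective passes are designed to remove.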

  14. Computer and laboratory simulation of interactions between spacecraft surfaces and charged-particle environments

    NASA Technical Reports Server (NTRS)

    Stevens, N. J.

    1979-01-01

    Cases where the charged-particle environment acts on the spacecraft (e.g., spacecraft charging phenomena) and cases where a system on the spacecraft causes the interaction (e.g., high voltage space power systems) are considered. Both categories were studied in ground simulation facilities to understand the processes involved and to measure the pertinent parameters. Computer simulations are based on the NASA Charging Analyzer Program (NASCAP) code. Analytical models are developed in this code and verified against the experimental data. Extrapolation from the small test samples to space conditions are made with this code. Typical results from laboratory and computer simulations are presented for both types of interactions. Extrapolations from these simulations to performance in space environments are discussed.

  15. Laboratory Information Management System (LIMS): A case study

    NASA Technical Reports Server (NTRS)

    Crandall, Karen S.; Auping, Judith V.; Megargle, Robert G.

    1987-01-01

    In the late 1970s, a refurbishment of the analytical laboratories serving the Materials Division at NASA Lewis Research Center was undertaken. As part of the modernization efforts, a Laboratory Information Management System (LIMS) was to be included. Preliminary studies indicated a custom-designed system as the best choice to satisfy all of the requirements. A scaled-down version of the original design has been in operation since 1984. The LIMS, a combination of computer hardware and software, provides the chemical characterization laboratory with an information data base, a report generator, a user interface, and networking capabilities. This paper is an account of the processes involved in designing and implementing that LIMS.

  16. Fluid dynamics structures in a fire environment observed in laboratory-scale experiments

    Treesearch

    J. Lozano; W. Tachajapong; D.R. Weise; S. Mahalingam; M. Princevac

    2010-01-01

    Particle Image Velocimetry (PIV) measurements were performed in laboratory-scale experimental fires spreading across horizontal fuel beds composed of aspen (Populus tremuloides Michx) excelsior. The continuous flame, intermittent flame, and thermal plume regions of a fire were investigated. Utilizing a PIV system, instantaneous velocity fields for...

  17. Validating the Equilibrium Stage Model for an Azeotropic System in a Laboratorial Distillation Column

    ERIC Educational Resources Information Center

    Duarte, B. P. M.; Coelho Pinheiro, M. N.; Silva, D. C. M.; Moura, M. J.

    2006-01-01

    The experiment described is an excellent opportunity to apply theoretical concepts of distillation, thermodynamics of mixtures and process simulation at laboratory scale, and simultaneously enhance the ability of students to operate, control and monitor complex units.

  18. 3D printing application and numerical simulations in a fracture system

    NASA Astrophysics Data System (ADS)

    Yoon, H.; Martinez, M. J.

    2017-12-01

    The hydrogeological and mechanical properties of fractured and porous media are fundamental to predicting coupled multiphysics processes in the subsurface. Recent advances in experimental methods and multi-scale imaging capabilities have revolutionized our ability to quantitatively characterize geomaterials, and digital counterparts are now routinely used in numerical simulations to characterize petrophysical and mechanical properties across scales. 3D printing is a very effective and creative technique for reproducing digital images in a controlled way. For geoscience applications, 3D printing can be co-opted to print reproducible porous and fractured structures derived from CT imaging of actual rocks and from theoretical algorithms, for experimental testing. In this work we used a stereolithography (SLA) method to create a single fracture network. A fracture in shale was first scanned using a microCT system, and the digital fracture network was then printed in two parts and assembled. The aperture ranges from 0.3 to 1 mm. In particular, we discuss the design of the single fracture network and the progress of printing practices to reproduce the fracture network system. Printed samples at different scales are used to measure permeability and surface roughness. Various numerical simulations, including (non-)reactive transport and multiphase flow cases, are performed to characterize fluid flow. We also discuss innovative advances in 3D printing techniques applicable to coupled processes in the subsurface. Sandia National Laboratories is a multimission laboratory managed and operated by National Technology & Engineering Solutions of Sandia, LLC., a wholly owned subsidiary of Honeywell International, Inc., for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-NA0003525.
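As a rough guide to the single-fracture permeability implied by the quoted 0.3-1 mm aperture range, the generic parallel-plate (cubic-law) estimate k = a^2/12 can be applied. This is a standard idealization, not a measurement from the study:

```python
def cubic_law_permeability(aperture_m):
    """Parallel-plate (cubic-law) estimate of fracture permeability.

    k = a^2 / 12, with aperture a in metres and k in m^2; assumes
    smooth parallel walls, so it is an upper-bound idealization for
    a rough printed fracture.
    """
    return aperture_m ** 2 / 12.0

# Aperture range quoted in the abstract (0.3 to 1 mm):
for a_mm in (0.3, 1.0):
    k = cubic_law_permeability(a_mm * 1e-3)
    print(f"a = {a_mm} mm -> k ~ {k:.2e} m^2")
```

Surface roughness, which the printed samples are used to measure, reduces the effective aperture below the nominal one, so measured permeabilities would typically fall below these estimates.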

  19. Computer Laboratory for Multi-scale Simulations of Novel Nanomaterials

    DTIC Science & Technology

    2014-09-15

    schemes for multiscale modeling of polymers. Permselective ion-exchange membranes for protective clothing, fuel cells, and batteries are of special...polyelectrolyte membranes (PEM) with chemical warfare agents (CWA) and their simulants and (2) development of new simulation methods and computational...chemical potential using gauge cell method and calculation of density profiles. However, the code does not run in parallel environments. For mesoscale

  20. Replicating the microbial community and water quality performance of full-scale slow sand filters in laboratory-scale filters.

    PubMed

    Haig, Sarah-Jane; Quince, Christopher; Davies, Robert L; Dorea, Caetano C; Collins, Gavin

    2014-09-15

    Previous laboratory-scale studies to characterise the functional microbial ecology of slow sand filters have suffered from methodological limitations that could compromise their relevance to full-scale systems. Therefore, to ascertain if laboratory-scale slow sand filters (L-SSFs) can replicate the microbial community and water quality production of industrially operated full-scale slow sand filters (I-SSFs), eight cylindrical L-SSFs were constructed and used to treat water from the same source as the I-SSFs. Half of the L-SSF sand beds were composed of sterilized sand (sterile) from the industrial filters, and the other half of sand taken directly from the same industrial filter (non-sterile). All filters were operated for 10 weeks, with the microbial community and water quality parameters sampled and analysed weekly. To characterize the microbial community, phylum-specific qPCR assays and 454 pyrosequencing of the 16S rRNA gene were used in conjunction with an array of statistical techniques. The results demonstrate that it is possible to mimic both the water quality production and the structure of the microbial community of full-scale filters in the laboratory - at all levels of taxonomic classification except OTU - thus allowing comparison of L-SSF experiments with full-scale units. Further, it was found that the sand type composing the filter bed (non-sterile or sterile), the water quality produced, the age of the filters and the depth of sand samples were all significant factors in explaining observed differences in the structure of the microbial consortia. This study is, to the authors' knowledge, the first to demonstrate that scaled-down slow sand filters can accurately reproduce the water quality and microbial consortia of full-scale slow sand filters. Copyright © 2014 Elsevier Ltd. All rights reserved.

  1. Recent H- diagnostics, plasma simulations, and 2X scaled Penning ion source developments at the Rutherford Appleton Laboratory

    NASA Astrophysics Data System (ADS)

    Lawrie, S. R.; Faircloth, D. C.; Smith, J. D.; Sarmento, T. M.; Whitehead, M. O.; Wood, T.; Perkins, M.; Macgregor, J.; Abel, R.

    2018-05-01

    A vessel for extraction and source plasma analyses is being used for Penning H- ion source development at the Rutherford Appleton Laboratory. A new set of optical elements including an einzel lens has been installed, which transports over 80 mA of H- beam successfully. Simultaneously, a 2X scaled Penning source has been developed to reduce cathode power density. The 2X source is now delivering a 65 mA H- ion beam at 10% duty factor, meeting its design criteria. The long-term viability of the einzel lens and 2X source is now being evaluated, so new diagnostic devices have been installed. A pair of electrostatic deflector plates is used to correct beam misalignment and perform fast chopping, with a voltage rise time of 24 ns. A suite of four quartz crystal microbalances has shown that the cesium flux in the vacuum vessel is only increased by a factor of two, despite the absence of a dedicated cold trap. Finally, an infrared camera has demonstrated good agreement with thermal simulations but has indicated unexpected heating due to beam loss on the downstream electrode. These types of diagnostics are suitable for monitoring all operational ion sources. In addition to experimental campaigns and new diagnostic tools, the high-performance VSim and COMSOL software packages are being used for plasma simulations of two novel ion thrusters for space propulsion applications. In parallel, a VSim framework has been established to include arbitrary temperature and cesium fields to allow the modeling of surface physics in H- ion sources.

  2. The design of dapog rice seeder model for laboratory scale

    NASA Astrophysics Data System (ADS)

    Purba, UI; Rizaldi, T.; Sumono; Sigalingging, R.

    2018-02-01

    The dapog system is a method of seeding rice using a special nursery tray. Rice seeding with the dapog system can produce seedlings in the form of uniform, higher-quality seed rolls. This study aims to reduce the cost of building a large-scale apparatus by designing a small-scale model that can also be used for instruction in the laboratory. The parameters observed were the uniformity of soil, seed and fertilizer distribution; losses of soil, seed and fertilizer; the effective capacity of the apparatus; and the power requirement. The results showed high uniformity of soil, seed and fertilizer distribution: 92.8%, 1-3 seeds/cm2 and 82%, respectively. Material losses for soil, seed and fertilizer were 6.23%, 2.7% and 2.23%, respectively. The effective capacity of the apparatus was 360 boxes/hour with a required power of 237.5 kWh.

  3. Laboratory Scale Coal And Biomass To Drop-In Fuels (CBDF) Production And Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lux, Kenneth; Imam, Tahmina; Chevanan, Nehru

    This Final Technical Report describes the work and accomplishments of the project entitled, “Laboratory Scale Coal and Biomass to Drop-In Fuels (CBDF) Production and Assessment.” The main objective of the project was to fabricate and test a lab-scale liquid-fuel production system using coal containing different percentages of biomass such as corn stover and switchgrass at a rate of 2 liters per day. The system utilizes the patented Altex fuel-production technology, which incorporates advanced catalysts developed by Pennsylvania State University. The system was designed, fabricated, tested, and assessed for economic and environmental feasibility relative to competing technologies.

  4. NH4+ ad-/desorption in sequencing batch reactors: simulation, laboratory and full-scale studies.

    PubMed

    Schwitalla, P; Mennerich, A; Austermann-Haun, U; Müller, A; Dorninger, C; Daims, H; Holm, N C; Rönner-Holm, S G E

    2008-01-01

    Significant NH4-N balance deficits were found during the measurement campaigns that collected data for dynamic simulation studies at five full-scale sequencing batch reactor (SBR) waste water treatment plants (WWTPs), as well as during the subsequent calibrations for the investigated plants. Subsequent lab-scale investigations provided strong evidence that dynamic, cycle-specific NH4+ ad-/desorption at the activated sludge flocs is one reason for this balance deficit. This specific dynamic was investigated at five full-scale SBR plants to identify the general underlying mechanism. The mechanism found was NH4+ desorption from the activated flocs at the end of the nitrification phase, with subsequent nitrification, and chemical NH4+ adsorption at the flocs during the filling phases. This NH4+ ad-/desorption corresponds to an antiparallel K+ ad-/desorption. One promising full-scale application, a controlled filling phase at the beginning of the sedimentation phase, was investigated at three SBR plants. The results indicate that this kind of filling event must be specifically hydraulically controlled and optimised to prevent excessive waste water breakthrough into the clear water phase, which would subsequently be discarded. IWA Publishing 2008.

  5. A laboratory rainfall simulator to study the soil erosion and runoff water

    NASA Astrophysics Data System (ADS)

    Cancelo González, Javier; Rial, M. E.; Díaz-Fierros, Francisco

    2010-05-01

    Soil erosion and the composition of runoff water in areas affected by forest fires or subjected to intensive agriculture are important factors to take into account, particularly in sensitive areas such as estuaries and rias, which are highly important to the socioeconomic development of some regions. An understanding of runoff production reveals the processes by which pollutants reach streams and also suggests the management techniques that might be used to minimize the discharge of these materials into surface waters. One of the most widely used methodologies in soil erosion studies is rainfall simulation, which can reproduce natural soil degradation processes in field or laboratory experiments. With the aim of improving rainfall-runoff generation, a laboratory rainfall simulator incorporating a fan-like intermittent water-jet system for rainfall generation was modified. The major change made to the rainfall simulator is a system for coupling stainless steel boxes, 12 x 20 x 45 centimeters in size, which allows soil samples to be placed under the rainfall simulator. These boxes were previously used to take soil samples in the field to depths of more than 20 centimeters, causing minimal disturbance to their properties and structure. The new implementation also allows runoff water samples to be collected in two ways: the rain water that constitutes the overland flow or direct runoff, and the rain water that seeps into the soil by infiltration and contributes to the subsurface runoff. Among the main variables controlled in the rainfall simulations were the soil slope and the intensity and duration of rainfall. To test the prototype, six soil samples were collected at the same sampling point and subjected to laboratory rainfall simulations with the same intensity and duration. 
Two samples constituted the control test and were fully undisturbed, and four

  6. Numerical Investigation of Earthquake Nucleation on a Laboratory-Scale Heterogeneous Fault with Rate-and-State Friction

    NASA Astrophysics Data System (ADS)

    Higgins, N.; Lapusta, N.

    2014-12-01

    Many large earthquakes on natural faults are preceded by smaller events, often termed foreshocks, that occur close in time and space to the larger event that follows. Understanding the origin of such events is important for understanding earthquake physics. Unique laboratory experiments of earthquake nucleation in a meter-scale slab of granite (McLaskey and Kilgore, 2013; McLaskey et al., 2014) demonstrate that sample-scale nucleation processes are also accompanied by much smaller seismic events. One potential explanation for these foreshocks is that they occur on small asperities - or bumps - on the fault interface, which may also be the locations of smaller critical nucleation size. We explore this possibility through 3D numerical simulations of a heterogeneous 2D fault embedded in a homogeneous elastic half-space, in an attempt to qualitatively reproduce the laboratory observations of foreshocks. In our model, the simulated fault interface is governed by rate-and-state friction with laboratory-relevant frictional properties, fault loading, and fault size. To create favorable locations for foreshocks, the fault surface heterogeneity is represented as patches of increased normal stress, decreased characteristic slip distance L, or both. Our simulation results indicate that one can create a rate-and-state model of the experimental observations. Models with a combination of higher normal stress and lower L at the patches are closest to matching the laboratory observations of foreshocks in moment magnitude, source size, and stress drop. In particular, we find that, when the local compression is increased, foreshocks can occur on patches that are smaller than theoretical critical nucleation size estimates. The additional inclusion of lower L for these patches helps to keep stress drops within the range observed in experiments, and is compatible with the asperity model of foreshock sources, since one would expect more compressed spots to be smoother (and hence have
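The abstract's point that patches of increased normal stress and decreased characteristic slip distance L shrink the critical nucleation size can be illustrated with one standard rate-and-state estimate, the Rubin and Ampuero (2005) half-length for a/b near 1. The numerical inputs below are hypothetical laboratory-like values, not parameters from the cited experiments or simulations:

```python
import math

def nucleation_halflength(mu, L, a, b, sigma):
    """Rubin & Ampuero (2005) estimate of the nucleation half-length
    under rate-and-state friction (valid for a/b close to 1):

        h* = mu * L * b / (pi * (b - a)**2 * sigma)

    mu: shear modulus [Pa]; L: characteristic slip distance [m];
    a, b: rate-and-state parameters; sigma: effective normal stress [Pa].
    """
    return mu * L * b / (math.pi * (b - a) ** 2 * sigma)

# Hypothetical laboratory-like inputs (illustrative only):
mu, L, a, b, sigma = 30e9, 1e-6, 0.010, 0.015, 5e6

h_background = nucleation_halflength(mu, L, a, b, sigma)
# A patch with doubled normal stress and halved L, as in the model setup:
h_patch = nucleation_halflength(mu, L / 2, a, b, 2 * sigma)
print(h_background, h_patch)  # h* scales as L/sigma, so the patch is 4x smaller
```

Since h* is proportional to L and inversely proportional to sigma, combining higher compression with lower L on a patch sharply reduces the local nucleation size, consistent with foreshocks nucleating on small asperities.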

  7. Building a Laboratory-Scale Biogas Plant and Verifying its Functionality

    NASA Astrophysics Data System (ADS)

    Boleman, Tomáš; Fiala, Jozef; Blinová, Lenka; Gerulová, Kristína

    2011-01-01

    The paper deals with the process of building a laboratory-scale biogas plant and verifying its functionality. The laboratory-scale prototype was constructed in the Department of Safety and Environmental Engineering at the Faculty of Materials Science and Technology in Trnava, of the Slovak University of Technology. The Department had already built a solar laboratory to promote and utilise solar energy, and designed the SETUR hydro engine. The biogas laboratory is the next step in the Department's activities in the field of renewable energy sources and biomass. The Department is also involved in a European Union project whose goal is to upgrade all existing renewable energy sources used in the Department.

  8. Life sciences laboratory breadboard simulations for shuttle

    NASA Technical Reports Server (NTRS)

    Taketa, S. T.; Simmonds, R. C.; Callahan, P. X.

    1975-01-01

    Breadboard simulations of life sciences laboratory concepts for conducting bioresearch in space were undertaken as part of the concept verification testing program. Breadboard simulations were conducted to test concepts of and scope problems associated with bioresearch support equipment and facility requirements and their operational integration for conducting manned research in earth orbital missions. It emphasized requirements, functions, and procedures for candidate research on crew members (simulated) and subhuman primates and on typical radioisotope studies in rats, a rooster, and plants.

  9. ESTABLISHMENT OF AN ENVIRONMENTAL CONTROL TECHNOLOGY LABORATORY WITH A CIRCULATING FLUIDIZED-BED COMBUSTION SYSTEM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wei-Ping Pan; Andy Wu; John T. Riley

    This report presents the progress made on the project ''Establishment of an Environmental Control Technology Laboratory (ECTL) with a Circulating Fluidized-Bed Combustion (CFBC) System'' during the period October 1, 2004 through December 31, 2004. The following tasks have been completed. First, the renovation of the new Combustion Laboratory and the construction of the Circulating Fluidized-Bed (CFB) Combustor Building have proceeded well. Second, the detailed design of supporting and hanging structures for the CFBC was completed. Third, the laboratory-scale simulated fluidized-bed facility was modified after completing a series of pretests. The two problems identified during the pretests were solved. Fourth, the carbonization of chicken waste and coal was investigated in a tube furnace and a Thermogravimetric Analyzer (TGA). The experimental results from this study are presented in this report. Finally, the proposed work for the next quarter is outlined in this report.

  10. A Report on Simulation-Driven Reliability and Failure Analysis of Large-Scale Storage Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wan, Lipeng; Wang, Feiyi; Oral, H. Sarp

    High-performance computing (HPC) storage systems provide data availability and reliability using various hardware and software fault tolerance techniques. Usually, reliability and availability are calculated at the subsystem or component level using limited metrics such as mean time to failure (MTTF) or mean time to data loss (MTTDL). This often means settling on simple and disconnected failure models (such as an exponential failure rate) to achieve tractable, closed-form solutions. However, such models have been shown to be insufficient for assessing end-to-end storage system reliability and availability. We propose a generic simulation framework aimed at analyzing the reliability and availability of storage systems at scale, and investigating what-if scenarios. The framework is designed for an end-to-end storage system, accommodating the various components and subsystems, their interconnections, and failure patterns and propagation, and performs dependency analysis to capture a wide range of failure cases. We evaluate the framework against a large-scale storage system that is in production and analyze its failure projections toward and beyond the end of its lifecycle. We also examine the potential operational impact by studying how different types of components affect the overall system reliability and availability, and present the preliminary results.
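A minimal sketch of why a single MTTF number with an assumed exponential failure rate can mislead, which is part of what motivates simulation-driven analysis: two component populations with identical MTTF but different lifetime distributions show very different failure fractions over a deployment window. The MTTF, horizon, and Weibull shape below are illustrative assumptions, not values from the report:

```python
import math
import random

random.seed(0)

def sample_lifetime_exp(mttf):
    """Memoryless (exponential) lifetime with the given mean."""
    return random.expovariate(1.0 / mttf)

def sample_lifetime_weibull(mttf, shape):
    """Wear-out (Weibull) lifetime; scale chosen so the mean equals mttf."""
    scale = mttf / math.gamma(1.0 + 1.0 / shape)
    return random.weibullvariate(scale, shape)

def frac_failed_by(t, sampler, n=100_000):
    """Monte Carlo estimate of the fraction of components failed by time t."""
    return sum(sampler() < t for _ in range(n)) / n

MTTF = 1.0e6        # hours; a hypothetical disk-like MTTF
HORIZON = 5 * 8760  # a 5-year deployment window, in hours

exp_frac = frac_failed_by(HORIZON, lambda: sample_lifetime_exp(MTTF))
wear_frac = frac_failed_by(HORIZON, lambda: sample_lifetime_weibull(MTTF, 3.0))
print(exp_frac, wear_frac)  # same MTTF, very different 5-year failure fractions
```

A full framework like the one proposed would additionally model repair, redundancy groups, interconnections, and failure propagation; this sketch only shows the distributional sensitivity that closed-form MTTF arithmetic hides.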

  11. A Laboratory Glass-Cockpit Flight Simulator for Automation and Communications Research

    NASA Technical Reports Server (NTRS)

    Pisanich, Gregory M.; Heers, Susan T.; Shafto, Michael G. (Technical Monitor)

    1995-01-01

    A laboratory glass-cockpit flight simulator supporting research on advanced commercial flight deck and Air Traffic Control (ATC) automation and communication interfaces has been developed at the Aviation Operations Branch at the NASA Ames Research Center. This system provides independent and integrated flight and ATC simulator stations, party line voice and datalink communications, along with video and audio monitoring and recording capabilities. Over the last several years, it has been used to support the investigation of flight human factors research issues involving communication modality; message content and length; graphical versus textual presentation of information; and human accountability for automation. This paper updates the status of this simulator, describing new functionality in the areas of flight management system, EICAS display, and electronic checklist integration. It also provides an overview of several experiments performed using this simulator, including their application areas and results. Finally, future enhancements to its ATC (integration of CTAS software) and flight deck (full crew operations) functionality are described.

  12. Numerical Analysis of Mixed-Phase Icing Cloud Simulations in the NASA Propulsion Systems Laboratory

    NASA Technical Reports Server (NTRS)

    Bartkus, Tadas; Tsao, Jen-Ching; Struk, Peter; Van Zante, Judith

    2017-01-01

    This presentation describes the development of a numerical model that couples the thermal interaction between ice particles, water droplets, and the flowing gas of an icing wind tunnel for simulation of NASA Glenn Research Center's Propulsion Systems Laboratory (PSL). The ultimate goal of the model is to better understand the complex interactions between the test parameters and have greater confidence in the conditions at the test section of the PSL tunnel. The model attempts to explain the observed changes in test conditions by coupling the conservation of mass and energy equations for both the cloud particles and the flowing gas mass. Model predictions were compared to measurements taken during May 2015 testing at PSL, where test conditions varied gas temperature, pressure, velocity and humidity levels, as well as the cloud total water content, particle initial temperature, and particle size distribution.

  13. Numerical Analysis of Mixed-Phase Icing Cloud Simulations in the NASA Propulsion Systems Laboratory

    NASA Technical Reports Server (NTRS)

    Bartkus, Tadas P.; Tsao, Jen-Ching; Struk, Peter M.; Van Zante, Judith F.

    2017-01-01

    This paper describes the development of a numerical model that couples the thermal interaction between ice particles, water droplets, and the flowing gas of an icing wind tunnel for simulation of NASA Glenn Research Center's Propulsion Systems Laboratory (PSL). The ultimate goal of the model is to better understand the complex interactions between the test parameters and have greater confidence in the conditions at the test section of the PSL tunnel. The model attempts to explain the observed changes in test conditions by coupling the conservation of mass and energy equations for both the cloud particles and the flowing gas mass. Model predictions were compared to measurements taken during May 2015 testing at PSL, where test conditions varied gas temperature, pressure, velocity and humidity levels, as well as the cloud total water content, particle initial temperature, and particle size distribution.

  14. In-Flight Validation of a Pilot Rating Scale for Evaluating Failure Transients in Electronic Flight Control Systems

    NASA Technical Reports Server (NTRS)

    Kalinowski, Kevin F.; Tucker, George E.; Moralez, Ernesto, III

    2006-01-01

    Engineering development and qualification of a Research Flight Control System (RFCS) for the Rotorcraft Aircrew Systems Concepts Airborne Laboratory (RASCAL) JUH-60A has motivated the development of a pilot rating scale for evaluating failure transients in fly-by-wire flight control systems. The RASCAL RFCS includes a highly-reliable, dual-channel Servo Control Unit (SCU) to command and monitor the performance of the fly-by-wire actuators and protect against the effects of erroneous commands from the flexible, but single-thread Flight Control Computer. During the design phase of the RFCS, two piloted simulations were conducted on the Ames Research Center Vertical Motion Simulator (VMS) to help define the required performance characteristics of the safety monitoring algorithms in the SCU. Simulated failures, including hard-over and slow-over commands, were injected into the command path, and the aircraft response and safety monitor performance were evaluated. A subjective Failure/Recovery Rating (F/RR) scale was developed as a means of quantifying the effects of the injected failures on the aircraft state and the degree of pilot effort required to safely recover the aircraft. A brief evaluation of the rating scale was also conducted on the Army/NASA CH-47B variable stability helicopter to confirm that the rating scale was likely to be equally applicable to in-flight evaluations. Following the initial research flight qualification of the RFCS in 2002, a flight test effort was begun to validate the performance of the safety monitors and to validate their design for the safe conduct of research flight testing. Simulated failures were injected into the SCU, and the F/RR scale was applied to assess the results. The results validate the performance of the monitors, and indicate that the Failure/Recovery Rating scale is a very useful tool for evaluating failure transients in fly-by-wire flight control systems.

  15. Simulations and Evaluation of Mesoscale Convective Systems in a Multi-scale Modeling Framework (MMF)

    NASA Astrophysics Data System (ADS)

    Chern, J. D.; Tao, W. K.

    2017-12-01

    It is well known that mesoscale convective systems (MCSs) produce more than 50% of rainfall in most tropical regions and play important roles in regional and global water cycles. Simulation of MCSs in global and climate models is a very challenging problem. Typical MCSs have a horizontal scale of a few hundred kilometers. Models with a domain of several hundred kilometers and fine enough resolution to properly simulate individual clouds are required to realistically simulate MCSs. The multiscale modeling framework (MMF), which replaces traditional cloud parameterizations with cloud-resolving models (CRMs) within a host atmospheric general circulation model (GCM), has shown some capability of simulating organized MCS-like storm signals and propagation. However, its embedded CRMs typically have small domains (less than 128 km) and coarse resolution (4 km) that cannot realistically simulate MCSs and individual clouds. In this study, a series of simulations was performed using the Goddard MMF. The impacts of the domain size and grid resolution of the embedded CRMs on simulating MCSs are examined. The changes in cloud structure, occurrence, and properties such as cloud types, updraft and downdraft, latent heating profile, and cold pool strength in the embedded CRMs are examined in detail. The simulated MCS characteristics are evaluated against satellite measurements using the Goddard Satellite Data Simulator Unit. The results indicate that embedded CRMs with large domains and fine resolution tend to produce better simulations compared to those with the typical MMF configuration (128 km domain size and 4 km model grid spacing).

  16. Microphysics in Multi-scale Modeling System with Unified Physics

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo

    2012-01-01

    Recently, a multi-scale modeling system with unified physics was developed at NASA Goddard. It consists of (1) a cloud-resolving model (Goddard Cumulus Ensemble model, GCE model), (2) a regional scale model (a NASA unified weather research and forecast, WRF), (3) a coupled CRM and global model (Goddard Multi-scale Modeling Framework, MMF), and (4) a land modeling system. The same microphysical processes, long and short wave radiative transfer and land processes and the explicit cloud-radiation, and cloud-land surface interactive processes are applied in this multi-scale modeling system. This modeling system has been coupled with a multi-satellite simulator to use NASA high-resolution satellite data to identify the strengths and weaknesses of cloud and precipitation processes simulated by the model. In this talk, a review of developments and applications of the multi-scale modeling system will be presented. In particular, the microphysics development and its performance for the multi-scale modeling system will be presented.

  17. Participatory ergonomics simulation of hospital work systems: The influence of simulation media on simulation outcome.

    PubMed

    Andersen, Simone Nyholm; Broberg, Ole

    2015-11-01

    Current application of work system simulation in participatory ergonomics (PE) design includes a variety of different simulation media. However, the actual influence of the media attributes on the simulation outcome has received less attention. This study investigates two simulation media: full-scale mock-ups and table-top models. The aim is to compare how the media attributes of fidelity and affordance influence ergonomics identification and evaluation in PE design of hospital work systems. The results illustrate how the full-scale mock-ups' high fidelity of room layout and affordance of tool operation support ergonomics identification and evaluation related to the work system entities of space and technologies & tools. The table-top models' high fidelity of function relations and affordance of a helicopter view support ergonomics identification and evaluation related to the entity of organization. Furthermore, the study addresses the form of the identified and evaluated conditions, which are either identified challenges or tangible design criteria. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  18. Argonne Simulation Framework for Intelligent Transportation Systems

    DOT National Transportation Integrated Search

    1996-01-01

    A simulation framework has been developed which defines a high-level architecture for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS). The simulator is designed to run on parallel computers and distribu...

  19. Computational Electromagnetics (CEM) Laboratory: Simulation Planning Guide

    NASA Technical Reports Server (NTRS)

    Khayat, Michael A.

    2011-01-01

    The simulation process, milestones and inputs are unknowns to first-time users of the CEM Laboratory. The Simulation Planning Guide aids in establishing expectations for both NASA and non-NASA facility customers. The potential audience for this guide includes both internal and commercial spaceflight hardware/software developers. It is intended to assist their engineering personnel in simulation planning and execution. Material covered includes a roadmap of the simulation process, roles and responsibilities of facility and user, major milestones, facility capabilities, and inputs required by the facility. Samples of deliverables, facility interfaces, and inputs necessary to define scope, cost, and schedule are included as an appendix to the guide.

  20. Computer Simulation of Laboratory Experiments: An Unrealized Potential.

    ERIC Educational Resources Information Center

    Magin, D. J.; Reizes, J. A.

    1990-01-01

    Discussion of the use of computer simulation for laboratory experiments in undergraduate engineering education focuses on work at the University of New South Wales in the instructional design and software development of a package simulating a heat exchange device. The importance of integrating theory, design, and experimentation is also discussed.…

  1. Large-scale derived flood frequency analysis based on continuous simulation

    NASA Astrophysics Data System (ADS)

    Dung Nguyen, Viet; Hundecha, Yeshewatesfa; Guse, Björn; Vorogushyn, Sergiy; Merz, Bruno

    2016-04-01

    There is an increasing need for spatially consistent flood risk assessments at the regional scale (several 100,000 km²), in particular in the insurance industry and for national risk reduction strategies. However, most large-scale flood risk assessments are composed of smaller-scale assessments and show spatial inconsistencies. To overcome this deficit, a large-scale flood model composed of a weather generator and catchment models was developed, reflecting the inherent spatial heterogeneity. The weather generator is a multisite, multivariate stochastic model capable of generating synthetic meteorological fields (precipitation, temperature, etc.) at daily resolution for the regional scale. These fields respect the observed autocorrelation, spatial correlation and covariance between the variables. They are used as input to the catchment models. A long-term simulation of this combined system enables the derivation of very long discharge series at many catchment locations, serving as a basis for spatially consistent flood risk estimates at the regional scale. This combined model was set up and validated for major river catchments in Germany. The weather generator was trained on 53 years of observational data from 528 stations covering not only all of Germany but also parts of France, Switzerland, the Czech Republic and Austria, for an aggregated spatial scale of 443,931 km². 10,000 years of daily meteorological fields for the study area were generated. Likewise, rainfall-runoff simulations with SWIM were performed for the entire Elbe, Rhine, Weser, Donau and Ems catchments. The validation results illustrate a good performance of the combined system, as the simulated flood magnitudes and frequencies agree well with the observed flood data. Based on continuous simulation, this model chain is then used to estimate flood quantiles for the whole of Germany, including upstream headwater catchments in neighbouring countries. This continuous large-scale approach overcomes the several
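The multisite weather-generator component described above can be sketched as a lag-1 autoregressive Gaussian process with a prescribed spatial correlation, transformed to a non-negative, skewed precipitation scale. This is a minimal illustration of the idea only; the station count, correlation matrix, autocorrelation coefficient, and dry-day threshold below are invented, not the calibrated model used in the study.

```python
import numpy as np

def generate_precip(n_days, spatial_corr, phi=0.5, seed=0):
    """Return an (n_days, n_sites) array of synthetic daily precipitation.

    spatial_corr: target correlation matrix between sites (assumed here).
    phi: lag-1 autocorrelation of the underlying Gaussian process.
    """
    rng = np.random.default_rng(seed)
    n_sites = spatial_corr.shape[0]
    L = np.linalg.cholesky(spatial_corr)      # imposes spatial correlation
    x = np.zeros(n_sites)
    out = np.empty((n_days, n_sites))
    for t in range(n_days):
        eps = L @ rng.standard_normal(n_sites)
        x = phi * x + np.sqrt(1.0 - phi**2) * eps   # AR(1) in time
        # map the Gaussian field to a skewed, non-negative precipitation
        # scale; values below a threshold become dry days (illustrative rule)
        out[t] = np.where(x > 0.3, np.expm1(x), 0.0)
    return out

corr = np.array([[1.0, 0.6], [0.6, 1.0]])     # two hypothetical stations
p = generate_precip(3650, corr)               # ten synthetic years
```

A production generator would additionally model temperature jointly with precipitation and respect seasonality, as the abstract notes.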

  2. Laboratory meter-scale seismic monitoring of varying water levels in granular media

    NASA Astrophysics Data System (ADS)

    Pasquet, S.; Bodet, L.; Bergamo, P.; Guérin, R.; Martin, R.; Mourgues, R.; Tournat, V.

    2016-12-01

    Laboratory physical modelling and non-contacting ultrasonic techniques are frequently proposed to tackle theoretical and methodological issues related to geophysical prospecting. Following recent developments illustrating the ability of seismic methods to image spatial and/or temporal variations of water content in the vadose zone, we developed laboratory experiments aimed at testing the sensitivity of seismic measurements (i.e., pressure-wave travel times and surface-wave phase velocities) to water saturation variations. Ultrasonic techniques were used to simulate typical seismic acquisitions on small-scale controlled granular media presenting different water levels. Travel times and phase velocity measurements obtained in the dry state were validated against both theoretical models and numerical simulations and serve as reference datasets. The increasing water level clearly affects the recorded wave field in both its phase and amplitude, but the collected data cannot yet be inverted in the absence of a comprehensive theoretical model for such partially saturated and unconsolidated granular media. The differences in travel time and phase velocity observed between the dry and wet models show patterns that coincide with the observed water level and depth of the capillary fringe, thus offering attractive perspectives for studying soil water content variations in the field.

  3. Multiscale Laboratory Infrastructure and Services to users: Plans within EPOS

    NASA Astrophysics Data System (ADS)

    Spiers, Chris; Willingshofer, Ernst; Drury, Martyn; Funiciello, Francesca; Rosenau, Matthias; Scarlato, Piergiorgio; Sagnotti, Leonardo; EPOS WG6, Corrado Cimarelli

    2015-04-01

    The participant countries in EPOS embody a wide range of world-class laboratory infrastructures ranging from high temperature and pressure experimental facilities, to electron microscopy, micro-beam analysis, analogue modeling and paleomagnetic laboratories. Most data produced by the various laboratory centres and networks are presently available only in limited "final form" in publications. Many data remain inaccessible and/or poorly preserved. However, the data produced at the participating laboratories are crucial to serving society's need for geo-resources exploration and for protection against geo-hazards. Indeed, to model resource formation and system behaviour during exploitation, we need an understanding from the molecular to the continental scale, based on experimental data. This contribution will describe the plans that the laboratories community in Europe is making, in the context of EPOS. The main objectives are: • To collect and harmonize available and emerging laboratory data on the properties and processes controlling rock system behaviour at multiple scales, in order to generate products accessible and interoperable through services for supporting research activities. • To co-ordinate the development, integration and trans-national usage of the major solid Earth Science laboratory centres and specialist networks. The length scales encompassed by the infrastructures included range from the nano- and micrometer levels (electron microscopy and micro-beam analysis) to the scale of experiments on centimetre sized samples, and to analogue model experiments simulating the reservoir scale, the basin scale and the plate scale. • To provide products and services supporting research into Geo-resources and Geo-storage, Geo-hazards and Earth System Evolution. If the EPOS Implementation Phase proposal presently under construction is successful, then a range of services and transnational activities will be put in place to realize these objectives.

  4. A new scaling approach for the mesoscale simulation of magnetic domain structures using Monte Carlo simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Radhakrishnan, B.; Eisenbach, M.; Burress, Timothy A.

    2017-01-24

    A new scaling approach has been proposed for the spin exchange and dipole–dipole interaction energies as a function of system size. The computed scaling laws are used in atomistic Monte Carlo simulations of magnetic moment evolution to predict the transition from a single domain to a vortex structure as the system size increases. The width of a 180° domain wall extracted from the simulated structures is in close agreement with experimentally measured values for an Fe–Si alloy. In conclusion, the transition size from a single domain to a vortex structure is also in close agreement with theoretically predicted and experimentally measured values for Fe.
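The atomistic Monte Carlo machinery the record relies on can be illustrated with a minimal Metropolis update for planar spins with nearest-neighbour exchange only. This sketch omits the dipole–dipole term and the scaling laws that are the paper's actual contribution; the exchange constant, temperature, and lattice size are illustrative placeholders.

```python
import numpy as np

def metropolis_sweep(theta, J=1.0, kT=0.5, rng=None):
    """One Metropolis sweep over an n x n lattice of planar spin angles."""
    rng = rng if rng is not None else np.random.default_rng(0)
    n = theta.shape[0]
    for _ in range(n * n):
        i, j = rng.integers(0, n, size=2)
        trial = theta[i, j] + rng.uniform(-0.5, 0.5)   # small angle perturbation
        nbrs = (theta[(i + 1) % n, j], theta[(i - 1) % n, j],
                theta[i, (j + 1) % n], theta[i, (j - 1) % n])
        # change in exchange energy, E = -J * sum cos(theta_i - theta_j)
        dE = -J * sum(np.cos(trial - t) - np.cos(theta[i, j] - t) for t in nbrs)
        if dE <= 0 or rng.random() < np.exp(-dE / kT):
            theta[i, j] = trial                        # accept the move
    return theta

spins = np.zeros((16, 16))        # start fully aligned: a single domain
spins = metropolis_sweep(spins)
```

In the study, the interaction energies are additionally rescaled with system size so that mesoscale domain structures become reachable; that rescaling is not reproduced here.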

  5. Impacts of different characterizations of large-scale background on simulated regional-scale ozone over the continental United States

    NASA Astrophysics Data System (ADS)

    Hogrefe, Christian; Liu, Peng; Pouliot, George; Mathur, Rohit; Roselle, Shawn; Flemming, Johannes; Lin, Meiyun; Park, Rokjin J.

    2018-03-01

    This study analyzes simulated regional-scale ozone burdens both near the surface and aloft, estimates process contributions to these burdens, and calculates the sensitivity of the simulated regional-scale ozone burden to several key model inputs with a particular emphasis on boundary conditions derived from hemispheric or global-scale models. The Community Multiscale Air Quality (CMAQ) model simulations supporting this analysis were performed over the continental US for the year 2010 within the context of the Air Quality Model Evaluation International Initiative (AQMEII) and Task Force on Hemispheric Transport of Air Pollution (TF-HTAP) activities. CMAQ process analysis (PA) results highlight the dominant role of horizontal and vertical advection on the ozone burden in the mid-to-upper troposphere and lower stratosphere. Vertical mixing, including mixing by convective clouds, couples fluctuations in free-tropospheric ozone to ozone in lower layers. Hypothetical bounding scenarios were performed to quantify the effects of emissions, boundary conditions, and ozone dry deposition on the simulated ozone burden. Analysis of these simulations confirms that the characterization of ozone outside the regional-scale modeling domain can have a profound impact on simulated regional-scale ozone. This was further investigated by using data from four hemispheric or global modeling systems (Chemistry - Integrated Forecasting Model (C-IFS), CMAQ extended for hemispheric applications (H-CMAQ), the Goddard Earth Observing System model coupled to chemistry (GEOS-Chem), and AM3) to derive alternate boundary conditions for the regional-scale CMAQ simulations. The regional-scale CMAQ simulations using these four different boundary conditions showed that the largest ozone abundance in the upper layers was simulated when using boundary conditions from GEOS-Chem, followed by the simulations using C-IFS, AM3, and H-CMAQ boundary conditions, consistent with the analysis of the ozone fields

  6. Assessment of released heavy metals from electrical and electronic equipment (EEE) existing in shipwrecks through laboratory-scale simulation reactor.

    PubMed

    Hahladakis, John N; Stylianos, Michailakis; Gidarakos, Evangelos

    2013-04-15

    In a passenger ship, the presence of EEE is a given. Over time, under shipwreck conditions, all these materials undergo accelerated, severe corrosion due to salt water, consequently releasing heavy metals and other hazardous substances into the aquatic environment. In this study, a laboratory-scale reactor was manufactured to simulate the conditions under which the "Sea Diamond" shipwreck lies (a pressure of 14 bar and a temperature of 16°C) and to remotely observe and assess any heavy metal release from part of the EEE present in the ship into the sea. Ten metals were examined, and the results showed that zinc, mercury and copper were abundant in the water samples taken from the reactor, at concentrations significantly higher than the US EPA CMC (criterion maximum concentration). Moreover, nickel and lead were found at concentrations higher than the CCC (criterion continuous concentration) set by the US EPA for clean seawater. The remaining elements were measured at concentrations within the permissible limits. It is therefore of environmental benefit to salvage the wreck and recycle all the WEEE found in it. Copyright © 2013 Elsevier B.V. All rights reserved.

  7. Numerical simulations of impacts involving porous bodies. II. Comparison with laboratory experiments

    NASA Astrophysics Data System (ADS)

    Jutzi, Martin; Michel, Patrick; Hiraoka, Kensuke; Nakamura, Akiko M.; Benz, Willy

    2009-06-01

    In this paper, we compare the outcome of high-velocity impact experiments on porous targets, composed of pumice, with the results of simulations by a 3D SPH hydrocode in which a porosity model has been implemented. The different populations of small bodies of our Solar System are believed to be composed, at least partially, of objects with a high degree of porosity. To describe the fragmentation of such porous objects, a different model is needed than that used for non-porous bodies. In the case of porous bodies, the impact process is not only driven by the presence of cracks which propagate when a stress threshold is reached, it is also influenced by the crushing of pores and compaction. Such processes can greatly affect the whole body's response to an impact. Therefore, another physical model is necessary to improve our understanding of the collisional process involving porous bodies. Such a model has been developed recently and introduced successfully in a 3D SPH hydrocode [Jutzi, M., Benz, W., Michel, P., 2008. Icarus 198, 242-255]. Basic tests have been performed which already showed that it is implemented in a consistent way and that theoretical solutions are well reproduced. However, its full validation requires that it is also capable of reproducing the results of real laboratory impact experiments. Here we present simulations of laboratory experiments on pumice targets for which several of the main material properties have been measured. We show that using the measured material properties and keeping the remaining free parameters fixed, our numerical model is able to reproduce the outcome of these experiments carried out under different impact conditions. 
This first complete validation of our model, which will be tested for other porous materials in the future, allows us to start addressing problems at larger scale related to small bodies of our Solar System, such as collisions in the Kuiper Belt or the formation of a family by the disruption of a porous

  8. The optimization of total laboratory automation by simulation of a pull-strategy.

    PubMed

    Yang, Taho; Wang, Teng-Kuan; Li, Vincent C; Su, Chia-Lo

    2015-01-01

    Laboratory results are essential for physicians to diagnose medical conditions. Because of the critical role of medical laboratories, an increasing number of hospitals use total laboratory automation (TLA) to improve laboratory performance. Although the benefits of TLA are well documented, systems occasionally become congested, particularly when hospitals face peak demand. This study optimizes TLA operations. Firstly, value stream mapping (VSM) is used to identify the non-value-added time. Subsequently, batch processing control and parallel scheduling rules are devised and a pull mechanism that comprises a constant work-in-process (CONWIP) is proposed. Simulation optimization is then used to optimize the design parameters and to ensure a small inventory and a shorter average cycle time (CT). For empirical illustration, this approach is applied to a real case. The proposed methodology significantly improves the efficiency of laboratory work and leads to a reduction in patient waiting times and increased service level.
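The pull mechanism named above can be illustrated with a toy discrete-event model of a CONWIP release rule: a new specimen enters the line only while work-in-process is below a fixed cap, so WIP never exceeds that cap. The single-server layout, deterministic processing time, and cap value are invented for illustration, not the study's calibrated TLA parameters.

```python
import collections

def conwip_line(num_jobs, wip_cap, service_time):
    """Single-server line fed by a CONWIP release rule.

    All jobs are available at time 0; a job is released into the line only
    while fewer than `wip_cap` jobs are in process. Returns completion times.
    """
    backlog = collections.deque(range(num_jobs))
    in_line = collections.deque()
    clock = 0.0
    completions = []
    while backlog or in_line:
        # pull: release work only while under the WIP cap
        while backlog and len(in_line) < wip_cap:
            in_line.append(backlog.popleft())
        clock += service_time          # the server finishes one job
        in_line.popleft()
        completions.append(clock)
    return completions

ct = conwip_line(num_jobs=10, wip_cap=3, service_time=2.0)
```

The point of the cap is that throughput is preserved while inventory (and hence average cycle time through the line) is bounded, which is the congestion behavior the study optimizes via simulation.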

  9. IAPSA 2 small-scale system specification

    NASA Technical Reports Server (NTRS)

    Cohen, Gerald C.; Torkelson, Thomas C.

    1990-01-01

    The details of a hardware implementation of a representative small-scale flight-critical system are described, using Advanced Information Processing System (AIPS) building block components and simulated sensor/actuator interfaces. The system was used to study application performance and reliability issues during both normal and faulted operation.

  10. Laboratory simulation of organic geochemical processes.

    NASA Technical Reports Server (NTRS)

    Eglinton, G.

    1972-01-01

    Discussion of laboratory simulations that are important to organic geochemistry in that they provide direct evidence relating to geochemical cycles involving carbon. Reviewed processes and experiments include reactions occurring in the geosphere, particularly, short-term diagenesis of biolipids and organochlorine pesticides in estuarine muds, as well as maturation of organic matter in ancient sediments.

  11. NASA Langley Research Center's Simulation-To-Flight Concept Accomplished through the Integration Laboratories of the Transport Research Facility

    NASA Technical Reports Server (NTRS)

    Martinez, Debbie; Davidson, Paul C.; Kenney, P. Sean; Hutchinson, Brian K.

    2004-01-01

    The Flight Simulation and Software Branch (FSSB) at NASA Langley Research Center (LaRC) maintains the unique national asset identified as the Transport Research Facility (TRF). The TRF is a group of facilities and integration laboratories utilized to support LaRC's simulation-to-flight concept. This concept incorporates common software, hardware, and processes for both ground-based flight simulators and LaRC's B-757-200 flying laboratory, identified as the Airborne Research Integrated Experiments System (ARIES). These assets provide Government, industry, and academia with an efficient way to develop and test new technology concepts to enhance the capacity, safety, and operational needs of the ever-changing national airspace system. The integration of the TRF enables a smooth, continuous flow of research from simulation to actual flight test.

  12. Scaling of Multimillion-Atom Biological Molecular Dynamics Simulation on a Petascale Supercomputer.

    PubMed

    Schulz, Roland; Lindner, Benjamin; Petridis, Loukas; Smith, Jeremy C

    2009-10-13

    A strategy is described for a fast all-atom molecular dynamics simulation of multimillion-atom biological systems on massively parallel supercomputers. The strategy is developed using benchmark systems of particular interest to bioenergy research, comprising models of cellulose and lignocellulosic biomass in an aqueous solution. The approach involves using the reaction field (RF) method for the computation of long-range electrostatic interactions, which permits efficient scaling on many thousands of cores. Although the range of applicability of the RF method for biomolecular systems remains to be demonstrated, for the benchmark systems the use of the RF produces molecular dipole moments, Kirkwood G factors, other structural properties, and mean-square fluctuations in excellent agreement with those obtained with the commonly used Particle Mesh Ewald method. With RF, three million- and five million-atom biological systems scale well up to ∼30k cores, producing ∼30 ns/day. Atomistic simulations of very large systems for time scales approaching the microsecond would, therefore, appear now to be within reach.
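The reaction-field (RF) method mentioned above replaces the long-range part of the Coulomb interaction with a dielectric-continuum correction beyond a cutoff, which makes the interaction strictly short-ranged and therefore easy to parallelize. A sketch of the standard truncated-and-shifted RF pair energy, in reduced units with 4*pi*eps0 = 1, follows; the cutoff and permittivity values are illustrative, not those of the benchmark systems.

```python
def reaction_field_energy(qi, qj, r, r_c=1.2, eps_rf=78.0):
    """Pairwise RF electrostatic energy (reduced units); zero beyond cutoff.

    Beyond r_c the medium is treated as a continuum of permittivity eps_rf;
    the constant c_rf shifts the energy so it vanishes continuously at r_c.
    """
    if r >= r_c:
        return 0.0
    k_rf = (eps_rf - 1.0) / ((2.0 * eps_rf + 1.0) * r_c**3)
    c_rf = 1.0 / r_c + k_rf * r_c**2
    return qi * qj * (1.0 / r + k_rf * r**2 - c_rf)
```

Because every pair interaction vanishes outside r_c, the force loop needs only neighbour lists rather than the global FFT communication that Particle Mesh Ewald requires, which is the scaling advantage the abstract reports.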

  13. Laboratory-Scale Bismuth Phosphate Extraction Process Simulation To Track Fate of Fission Products

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Serne, R. JEFFREY; Lindberg, Michael J.; Jones, Thomas E.

    2007-02-28

    Recent field investigations that collected and characterized vadose zone sediments from beneath inactive liquid disposal facilities at the Hanford 200 Areas show lower than expected concentrations of a long-term risk driver, Tc-99. Therefore, laboratory studies were performed to re-create one of the three processes that were used to separate plutonium from spent fuel and that created most of the wastes disposed of or currently stored in tanks at Hanford. The laboratory simulations were used for comparison with current estimates based mainly on flow sheet estimates and spotty historical data. Three simulations of the bismuth phosphate precipitation process show that less than 1% of the Tc-99, Cs-135/137, Sr-90 and I-129 carry down with the Pu product; thus these isotopes should have remained within the metals waste streams that, after neutralization, were sent to single-shell tanks. Conversely, these isotopes should not be expected in the first and subsequent cycle waste streams that went to cribs. Measurable quantities (~20 to 30%) of the lanthanides, yttrium, and trivalent actinides (Am and Cm) do precipitate with the Pu product, which is higher than the 10% estimate made for current inventory projections. Surprisingly, Se (added in selenate form) also shows about 10% association with the Pu/bismuth phosphate solids. We speculate that the incorporation of some Se into the bismuth phosphate precipitate is caused by selenate substitution into crystal lattice sites for the phosphate. The bulk of the U daughter product Th-234 and the Np-237 daughter product Pa-233 also associate with the solids. We suspect that the Pa daughter products of U (Pa-234 and Pa-231) would also co-precipitate with the bismuth phosphate induced solids. No more than 1% of the Sr-90 and Sb-125 should carry down with the Pu product that ultimately was purified. Thus the current scheme used to estimate where fission products end up being disposed overestimates by one order of magnitude

  14. Experimental methods for the simulation of supercritical CO2 injection at laboratory scale aimed to investigate capillary trapping

    NASA Astrophysics Data System (ADS)

    Trevisan, L.; Illangasekare, T. H.; Rodriguez, D.; Sakaki, T.; Cihan, A.; Birkholzer, J. T.; Zhou, Q.

    2011-12-01

    Geological storage of carbon dioxide in deep geologic formations is being considered as a technical option to reduce greenhouse gas loading to the atmosphere. The processes associated with movement and stable trapping are complex in deep, naturally heterogeneous formations. Three primary mechanisms contribute to trapping: capillary entrapment due to immobilization of the supercritical CO2 within soil pores, dissolution of CO2 in the formation water, and mineralization. Natural heterogeneity in the formation is expected to affect all three mechanisms. A research project is in progress with the primary goal of improving our understanding of capillary and dissolution trapping during the injection and post-injection processes, focusing on formation heterogeneity. It is expected that this improved knowledge will help to develop site characterization methods aimed at obtaining the most critical parameters that capture the heterogeneity, in order to design strategies and schemes that maximize trapping. This research combines experiments at the laboratory scale with multiphase modeling to upscale relevant trapping processes to the field scale. This paper presents the results from a set of experiments conducted in intermediate-scale test tanks. Intermediate-scale testing provides an attractive alternative for investigating these processes under controlled conditions in the laboratory. Conducting these types of experiments is highly challenging, as methods have to be developed to extrapolate data from experiments conducted under ambient laboratory conditions to the high-temperature, high-pressure settings of deep geologic formations. We explored the use of a combination of surrogate fluids that have similar density and viscosity contrasts and analogous solubility and interfacial tension to supercritical CO2-brine in deep formations. The extrapolation approach involves the use of dimensionless numbers such as the capillary number (Ca) and the Bond number (Bo).
A set of
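The dimensionless groups named above compare viscous and gravitational forces against capillary forces, which is what lets ambient-condition surrogate fluids stand in for supercritical CO2-brine. A minimal sketch of the two definitions follows; the fluid property values are generic illustrations, not the surrogate pair actually selected in the study.

```python
def capillary_number(mu, v, sigma):
    """Ca = mu * v / sigma: viscous forces relative to capillary forces.

    mu: dynamic viscosity [Pa s], v: characteristic velocity [m/s],
    sigma: interfacial tension [N/m].
    """
    return mu * v / sigma

def bond_number(delta_rho, L, sigma, g=9.81):
    """Bo = delta_rho * g * L**2 / sigma: gravity relative to capillarity.

    delta_rho: density contrast [kg/m^3], L: characteristic length [m].
    """
    return delta_rho * g * L**2 / sigma

# Illustrative (made-up) values for a laboratory fluid pair:
Ca = capillary_number(mu=5.0e-5, v=1.0e-5, sigma=0.03)
Bo = bond_number(delta_rho=300.0, L=1.0e-4, sigma=0.03)
```

Matching Ca and Bo between the laboratory system and the reservoir is the essence of the extrapolation approach the abstract describes.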

  15. Experiment-scale molecular simulation study of liquid crystal thin films

    NASA Astrophysics Data System (ADS)

    Nguyen, Trung Dac; Carrillo, Jan-Michael Y.; Matheson, Michael A.; Brown, W. Michael

    2014-03-01

    Supercomputers have now reached a performance level adequate for studying thin films with molecular detail at the relevant scales. By exploiting the power of GPU accelerators on Titan, we have been able to perform simulations of characteristic liquid crystal films that provide remarkable qualitative agreement with experimental images. We have demonstrated that key features of spinodal instability can only be observed with sufficiently large system sizes, which were not accessible with previous simulation studies. Our study emphasizes the capability and significance of petascale simulations in providing molecular-level insights in thin film systems as well as other interfacial phenomena.

  16. Embedding measurement within existing computerized data systems: scaling clinical laboratory and medical records heart failure data to predict ICU admission.

    PubMed

    Fisher, William P; Burton, Elizabeth C

    2010-01-01

    This study employs existing data sources to develop a new measure of intensive care unit (ICU) admission risk for heart failure patients. Outcome measures were constructed from laboratory, accounting, and medical record data for 973 adult inpatients with primary or secondary heart failure. Several scoring interpretations of the laboratory indicators were evaluated relative to their measurement and predictive properties. Cases were restricted to tests within the first lab draw that included at least 15 indicators. After optimizing the original clinical observations, a satisfactory heart failure severity scale was calibrated on a 0-1000 continuum. Patients with unadjusted CHF severity measures of 550 or less were 2.7 times more likely to be admitted to the ICU than those with higher measures. Patients with low HF severity measures (550 or less) adjusted for demographic and diagnostic risk factors are about six times more likely to be admitted to the ICU than those with higher adjusted measures. A nomogram facilitates routine clinical application. Existing computerized data systems could be programmed to automatically structure clinical laboratory reports using the results of studies like this one to reduce data volume with no loss of information, make laboratory results more meaningful to clinical end users, improve the quality of care, reduce errors and unneeded tests, prevent unnecessary ICU admissions, lower costs, and improve patient satisfaction. Existing data, typically examined piecemeal, form a coherent scale measuring heart failure severity that is sensitive to an increased likelihood of ICU admission. Marked improvements in ROC curves were found for the aggregate measures relative to the individual clinical indicators.

  17. Hierarchical Engine for Large-scale Infrastructure Co-Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2017-04-24

    HELICS is designed to support very-large-scale (100,000+ federates) cosimulations with off-the-shelf power-system, communication, market, and end-use tools. Other key features include cross platform operating system support, the integration of both event driven (e.g., packetized communication) and time-series (e.g., power flow) simulations, and the ability to co-iterate among federates to ensure physical model convergence at each time step.

  18. Reconstructing a Large-Scale Population for Social Simulation

    NASA Astrophysics Data System (ADS)

    Fan, Zongchen; Meng, Rongqing; Ge, Yuanzheng; Qiu, Xiaogang

    The advent of social simulation has provided an opportunity for research on social systems. More and more researchers tend to describe the components of social systems at a more detailed level. Any simulation needs the support of population data to initialize and implement the simulation systems. However, it is impossible to obtain data that provide full information about individuals and households. We propose a two-step method to reconstruct a large-scale population for a Chinese city in accordance with Chinese culture. Firstly, a baseline population is generated by gathering individuals into households one by one; secondly, social relationships such as friendship are assigned to the baseline population. Through a case study, a population of 3,112,559 individuals gathered into 1,133,835 households is reconstructed for Urumqi city, and the results show that the generated data respect the real data quite well. The generated data can be applied to support the modeling of social phenomena.
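The two-step reconstruction described above can be sketched as: (1) gather individuals into households by drawing household sizes from an assumed distribution, then (2) overlay a social relationship network on the baseline population. The size distribution and the random-friendship rule below are placeholders, far simpler than the culture-aware generation used in the paper.

```python
import random

def build_population(n_people, size_weights=(0.2, 0.3, 0.3, 0.2), seed=1):
    """Step 1: group person IDs 0..n_people-1 into households one by one."""
    rng = random.Random(seed)
    households, person = [], 0
    while person < n_people:
        size = rng.choices(range(1, len(size_weights) + 1), size_weights)[0]
        size = min(size, n_people - person)     # don't overshoot the total
        households.append(list(range(person, person + size)))
        person += size
    return households

def add_friendships(households, k=2, seed=2):
    """Step 2: assign each person k random friends (illustrative rule)."""
    rng = random.Random(seed)
    people = [p for h in households for p in h]
    return {p: rng.sample([q for q in people if q != p], k) for p in people}

hh = build_population(100)
friends = add_friendships(hh)
```

A realistic generator would condition household composition on census marginals (age, marital status, household role) rather than drawing sizes independently.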

  19. Open Simulation Laboratories [Guest editors' introduction]

    DOE PAGES

    Alexander, Francis J.; Meneveau, Charles

    2015-09-01

    In this introduction to the special issue on open simulation laboratories (OSLs), the guest editors describe how OSLs will become more common as their potential is better understood and as they begin providing access to valuable datasets to much larger segments of the scientific community. Moreover, new analysis tools and new ways to do science will inevitably develop as a result.

  20. Exact-Differential Large-Scale Traffic Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hanai, Masatoshi; Suzumura, Toyotaro; Theodoropoulos, Georgios

    2015-01-01

    Analyzing large-scale traffic by simulation requires repeated execution with many different scenarios or parameter settings. Such repeated execution involves substantial redundancy, because the change from one scenario to the next is usually minor, for example, blocking only one road or changing the speed limit on several roads. In this paper, we propose a new redundancy reduction technique, called exact-differential simulation, which simulates only the changed parts of a scenario in later executions while producing exactly the same results as a complete re-simulation. The paper consists of two main efforts: (i) the key idea and algorithm of exact-differential simulation, and (ii) a method to build a large-scale traffic simulation on top of exact-differential simulation. In experiments with a Tokyo traffic simulation, exact-differential simulation achieves an average elapsed-time improvement of 7.26 times over whole simulation, and an improvement of 2.26 times even in the worst case.
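The core idea can be illustrated with a toy differential re-execution: baseline results are cached per road, and a modified scenario recomputes only roads whose parameters changed, while the rest are reused, yielding results identical to a full re-run. This sketch treats roads as independent, so it only shows the caching principle; a real exact-differential traffic simulator must also propagate interactions between changed and unchanged parts, which is the hard problem the paper addresses.

```python
def travel_time(length_km, speed_kmh):
    """Placeholder per-road model: travel time in minutes."""
    return length_km / speed_kmh * 60.0

def simulate(scenario, cache=None, baseline=None):
    """Return road -> travel time; recompute only roads that differ."""
    results = {}
    for road, params in scenario.items():
        if cache is not None and baseline is not None \
                and baseline.get(road) == params:
            results[road] = cache[road]          # reuse unchanged result
        else:
            results[road] = travel_time(*params) # recompute changed road
    return results

base = {"A": (10.0, 50.0), "B": (5.0, 30.0)}
full = simulate(base)                            # whole simulation once
changed = {"A": (10.0, 50.0), "B": (5.0, 20.0)}  # speed-limit change on B
diff = simulate(changed, cache=full, baseline=base)
```

The "exact" in exact-differential means `diff` must equal a full re-run of the changed scenario, as the assertion below checks for this toy model.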

  1. RANS Simulation (Virtual Blade Model [VBM]) of Single Lab Scaled DOE RM1 MHK Turbine

    DOE Data Explorer

    Javaherchi, Teymour; Stelzenmuller, Nick; Aliseda, Alberto; Seydel, Joseph

    2014-04-15

    Attached are the .cas and .dat files for the Reynolds-Averaged Navier-Stokes (RANS) simulation of a single lab-scaled DOE RM1 turbine implemented in the ANSYS FLUENT CFD package. The lab-scaled DOE RM1 is a re-designed geometry, based on the full-scale DOE RM1 design, that produces the same power output as the full-scale model while operating at matched Tip Speed Ratio values at laboratory-reachable Reynolds numbers (see attached paper). In this case study the flow field around and in the wake of the lab-scaled DOE RM1 turbine is simulated using the Blade Element Model (a.k.a. Virtual Blade Model) by solving the RANS equations coupled with the k-\omega turbulence closure model. It should be highlighted that in this simulation the actual geometry of the rotor blade is not modeled; the effect of the rotating turbine blades is modeled using Blade Element Theory. This simulation provides an accurate estimate of the performance of the device and the structure of its turbulent far wake. Due to the simplifications implemented for modeling the rotating blades, VBM is limited in capturing details of the flow field in the near-wake region of the device. The required User Defined Functions (UDFs) and the look-up table of lift and drag coefficients are included along with the .cas and .dat files.
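The blade-element idea behind the Virtual Blade Model can be sketched as follows: each radial blade section contributes forces computed from tabulated lift/drag coefficients at its local angle of attack, and the rotor is represented by these body forces rather than by resolved blade geometry. The coefficient table, operating point, and the omission of induced velocities below are all simplifying assumptions for illustration.

```python
import math

def section_loads(r, chord, omega, U_inf, twist_deg, cl_cd_table, rho=1000.0):
    """Axial (thrust) and tangential force per unit span at radius r.

    Ignores induction; cl_cd_table maps angle of attack [deg] -> (Cl, Cd).
    """
    U_rel = math.hypot(U_inf, omega * r)        # local relative flow speed
    phi = math.atan2(U_inf, omega * r)          # inflow angle [rad]
    alpha = math.degrees(phi) - twist_deg       # local angle of attack
    cl, cd = cl_cd_table(alpha)
    q = 0.5 * rho * U_rel**2 * chord            # dynamic pressure * chord
    lift, drag = q * cl, q * cd
    thrust = lift * math.cos(phi) + drag * math.sin(phi)
    tangential = lift * math.sin(phi) - drag * math.cos(phi)
    return thrust, tangential

# Toy coefficient table: linear Cl, quadratic Cd (not the RM1 look-up table)
table = lambda a: (0.1 * a, 0.01 + 1e-4 * a * a)
t, f = section_loads(r=0.2, chord=0.05, omega=10.0, U_inf=1.0,
                     twist_deg=5.0, cl_cd_table=table)
```

In the actual VBM these sectional forces are distributed as momentum sources over the rotor disk cells in the RANS solver, which is why the near-wake detail is lost but the far wake and power are captured.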

  2. Knowledge Based Simulation: An Artificial Intelligence Approach to System Modeling and Automating the Simulation Life Cycle.

    DTIC Science & Technology

    1988-04-13

    Knowledge Based Simulation: An Artificial Intelligence Approach to System Modeling and Automating the Simulation Life Cycle. Mark S. Fox, Nizwer Husain, Malcolm McRoberts and Y. V. Reddy. CMU-RI-TR-88-5, Intelligent Systems Laboratory, The Robotics Institute, Carnegie Mellon University, Pittsburgh, Pennsylvania. The report describes years of research in the application of Artificial Intelligence to Simulation. Our focus has been in two areas: the use of AI knowledge representation

  3. RANS Simulation (Rotating Reference Frame Model [RRF]) of Single Lab-Scaled DOE RM1 MHK Turbine

    DOE Data Explorer

    Javaherchi, Teymour; Stelzenmuller, Nick; Aliseda, Alberto; Seydel, Joseph

    2014-04-15

    Attached are the .cas and .dat files for the Reynolds-Averaged Navier-Stokes (RANS) simulation of a single lab-scaled DOE RM1 turbine implemented in the ANSYS FLUENT CFD package. The lab-scaled DOE RM1 is a re-designed geometry, based on the full-scale DOE RM1 design, that produces the same power output as the full-scale model while operating at matched Tip Speed Ratio values at laboratory-reachable Reynolds numbers (see attached paper). In this case study, taking advantage of the symmetry of the lab-scaled DOE RM1 geometry, only half of the geometry is modeled using the (Single) Rotating Reference Frame model [RRF]. In this model the RANS equations, coupled with the k-\omega turbulence closure model, are solved in the rotating reference frame. The actual geometry of the turbine blade is included, and the turbulent boundary layer along the blade span is simulated using a wall-function approach. The rotation of the blade is modeled by applying periodic boundary conditions to the planes of symmetry. This case study simulates the performance and flow field in the near and far wake of the device at the desired operating conditions. The results of these simulations were validated against in-house experimental data. Please see the attached paper.

  4. Stochastic simulation of power systems with integrated renewable and utility-scale storage resources

    NASA Astrophysics Data System (ADS)

    Degeilh, Yannick

The push for a more sustainable electric supply has led various countries to adopt policies advocating the integration of renewable yet variable energy resources, such as wind and solar, into the grid. The challenges of integrating such time-varying, intermittent resources have in turn sparked a growing interest in the implementation of utility-scale energy storage resources (ESRs) with MW-week storage capability. Indeed, storage devices provide flexibility to facilitate the management of power system operations in the presence of uncertain, highly time-varying and intermittent renewable resources. The ability to exploit the potential synergies between renewable resources and ESRs hinges on developing appropriate models, methodologies, tools and policy initiatives. We report on the development of a comprehensive simulation methodology that provides the capability to quantify the impacts of integrated renewable and energy storage resources on the economics, reliability and variable emission effects of power systems operating in a market environment. We model the uncertainty in the demands, the available capacity of conventional generation resources and the time-varying, intermittent renewable resources, with their temporal and spatial correlations, as discrete-time random processes. We deploy models of the ESRs to emulate their scheduling and operations in the transmission-constrained hourly day-ahead markets. To this end, we formulate a scheduling optimization problem (SOP) whose solutions determine the operational schedule of the controllable ESRs in coordination with the demands and the conventional/renewable resources. As such, the SOP serves the dual purpose of emulating the clearing of the transmission-constrained day-ahead markets (DAMs) and scheduling the energy storage resource operations. We also represent the need for system operators to impose stricter ramping requirements on the conventional generating units so as to maintain the system capability to perform "load following", i
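The core idea of the scheduling optimization problem (SOP) described above can be sketched on a toy case: co-optimize one storage unit against a two-hour price and demand profile so that it charges when energy is cheap and discharges when it is expensive. All numbers (prices, demand, capacities) are illustrative, and an exhaustive integer search stands in for the LP/MIP a real market-clearing model would solve.

```python
import itertools

PRICES = [10, 50]   # $/MWh energy price in each hour
DEMAND = [4, 4]     # MW load in each hour
E_CAP = 5           # MWh storage energy capacity
G_MAX = 10          # MW conventional generation limit

def best_schedule():
    best = None
    # action[t] > 0 charges the storage by that many MWh; < 0 discharges it
    for action in itertools.product(range(-E_CAP, E_CAP + 1), repeat=2):
        soc, feasible, cost = 0, True, 0
        for t, a in enumerate(action):
            soc += a
            gen = DEMAND[t] + a  # generation must cover load plus charging
            if not (0 <= soc <= E_CAP and 0 <= gen <= G_MAX):
                feasible = False
                break
            cost += PRICES[t] * gen
        if feasible and (best is None or cost < best[0]):
            best = (cost, action)
    return best

cost, action = best_schedule()
```

The optimum charges 4 MWh in the cheap hour and discharges it in the expensive one, displacing all $50/MWh generation; real SOPs add network constraints, ramping limits, and stochastic scenarios on top of this balance logic.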

  5. Fracture induced electromagnetic emissions: extending laboratory findings by observations at the geophysical scale

    NASA Astrophysics Data System (ADS)

    Potirakis, Stelios M.; Contoyiannis, Yiannis; Kopanas, John; Kalimeris, Anastasios; Antonopoulos, George; Peratzakis, Athanasios; Eftaxias, Konstantinos; Nomicos, Constantinos

    2014-05-01

Under natural conditions, it is practically impossible to install an experimental network at the geophysical scale using the same instrumentation as in laboratory experiments, whether to understand, through the states of stress and strain and their time variation, the laws that govern friction during the last stages of earthquake (EQ) generation, or to monitor (much less control) the principal characteristics of a fracture process. Fracture-induced electromagnetic emissions (EME) in a wide range of frequency bands are sensitive to micro-structural changes. Thus, their study constitutes a nondestructive method for monitoring the evolution of the damage process at the laboratory scale. It has been suggested that fracture-induced MHz-kHz electromagnetic (EM) emissions, which emerge from a few days up to a few hours before the main seismic shock occurrence, permit real-time monitoring of the damage process during the last stages of earthquake preparation, as happens at the laboratory scale. Since EME are produced both in laboratory-scale fracture and in the EQ preparation process (geophysical-scale fracture), they should present similar characteristics at these two scales. Therefore, both laboratory experimenters and the experimental scientists studying pre-earthquake EME could benefit from each other's results. Importantly, when studying the fracture process by means of laboratory experiments, the fault growth process normally occurs violently in a fraction of a second. However, a major difference between the laboratory and natural processes is the order-of-magnitude difference in scale (in space and time), allowing the possibility of experimental observation at the geophysical scale of a range of physical processes which are not observable at the laboratory scale.
Therefore, the study of fracture-induced EME is expected to reveal more information, especially for the last stages of the fracture process, when it

  6. Systems engineering and integration: Advanced avionics laboratories

    NASA Technical Reports Server (NTRS)

    1990-01-01

    In order to develop the new generation of avionics which will be necessary for upcoming programs such as the Lunar/Mars Initiative, Advanced Launch System, and the National Aerospace Plane, new Advanced Avionics Laboratories are required. To minimize costs and maximize benefits, these laboratories should be capable of supporting multiple avionics development efforts at a single location, and should be of a common design to support and encourage data sharing. Recent technological advances provide the capability of letting the designer or analyst perform simulations and testing in an environment similar to his engineering environment and these features should be incorporated into the new laboratories. Existing and emerging hardware and software standards must be incorporated wherever possible to provide additional cost savings and compatibility. Special care must be taken to design the laboratories such that real-time hardware-in-the-loop performance is not sacrificed in the pursuit of these goals. A special program-independent funding source should be identified for the development of Advanced Avionics Laboratories as resources supporting a wide range of upcoming NASA programs.

  7. A Goddard Multi-Scale Modeling System with Unified Physics

    NASA Technical Reports Server (NTRS)

    Tao, W.K.; Anderson, D.; Atlas, R.; Chern, J.; Houser, P.; Hou, A.; Lang, S.; Lau, W.; Peters-Lidard, C.; Kakar, R.; hide

    2008-01-01

Numerical cloud-resolving models (CRMs), which are based on the non-hydrostatic equations of motion, have been extensively applied to cloud-scale and mesoscale processes during the past four decades. Recent GEWEX Cloud System Study (GCSS) model comparison projects have indicated that CRMs agree with observations in simulating various types of clouds and cloud systems from different geographic locations. Cloud-resolving models now provide statistical information useful for developing more realistic, physically based parameterizations for climate models and numerical weather prediction models. It is also expected that Numerical Weather Prediction (NWP) and regional-scale models can be run at grid sizes similar to cloud-resolving models through nesting techniques. Current and future NASA satellite programs can provide cloud, precipitation, aerosol and other data at very fine spatial and temporal scales. Using these satellite data to improve the understanding of the physical processes that are responsible for the variation in global and regional climate and hydrological systems requires a coupled global circulation model (GCM) and cloud-scale model (termed a super-parameterization or multi-scale modeling framework, MMF). The use of a GCM will enable global coverage, and the use of a CRM will allow for better and more sophisticated physical parameterization. NASA satellites and field campaigns can provide initial conditions as well as validation through utilizing Earth satellite simulators. At Goddard, we have developed a multi-scale modeling system with unified physics. The modeling system consists of a coupled GCM-CRM (or MMF), a state-of-the-art weather research and forecasting model (WRF), and a cloud-resolving model (the Goddard Cumulus Ensemble model). In these models, the same microphysical schemes (2ICE, several 3ICE), radiation (including explicitly calculated cloud optical properties), and surface models are applied. In addition, a comprehensive unified Earth Satellite

  8. Performance of nanoscale zero-valent iron in nitrate reduction from water using a laboratory-scale continuous-flow system.

    PubMed

    Khalil, Ahmed M E; Eljamal, Osama; Saha, Bidyut Baran; Matsunaga, Nobuhiro

    2018-04-01

Nanoscale zero-valent iron (nZVI) is a versatile treatment reagent that should be utilized effectively for nitrate remediation in water. For this purpose, a laboratory-scale continuous-flow system (LSCFS) was developed to evaluate nZVI performance in the removal of nitrate from different contaminated-water bodies. The equipment design (reactor, settler, and polisher) and operational parameters of the LSCFS were determined based on nZVI characterization and nitrate reduction kinetics. Ten experimental runs were conducted at different dosages (6, 10 and 20 g) of nZVI-based reagents (nZVI, bimetallic nZVI-Cu, CuCl2-added nZVI). Effluent concentrations of nitrogen and iron compounds were measured, and pH and ORP values were monitored. The major role played by the recirculation of unreacted nZVI from the settler to the reactor succeeded in achieving an overall nitrate removal efficiency (RE) of >90%. The similar performance of both nZVI and copper-ion-modified nZVI in contaminated distilled water was an indication of the LSCFS's reliability in completely utilizing the iron nanoparticles. In the case of treating contaminated river water and simulated groundwater, the nitrate reduction process was sensitive to the presence of interfering substances, which dropped the overall RE drastically. However, the addition of copper ions during the treatment counteracted the retardation effect and greatly enhanced the nitrate RE. Copyright © 2018 Elsevier Ltd. All rights reserved.
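The reactor-design logic implied by this record (sizing a continuous-flow unit from reduction kinetics) can be sketched with a textbook steady-state balance: if the nZVI-driven reduction is approximated as first-order in a continuously stirred reactor, outlet concentration follows from residence time alone. The rate constant and flow numbers below are illustrative, not the paper's values.

```python
def cstr_outlet(c_in, k, volume, flow):
    """Steady-state outlet concentration of a first-order CSTR:
    c_out = c_in / (1 + k * tau), with tau the hydraulic residence time."""
    tau = volume / flow              # residence time (min)
    return c_in / (1.0 + k * tau)

# illustrative numbers: 50 mg/L nitrate in, k = 0.5 1/min, 20 L reactor, 1 L/min
c_out = cstr_outlet(c_in=50.0, k=0.5, volume=20.0, flow=1.0)
removal = 1.0 - c_out / 50.0         # fraction of nitrate removed
```

With these placeholder values the single-pass removal already exceeds 90%, the threshold the record reports the LSCFS achieving with recirculation.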

  9. Improving quality management systems of laboratories in developing countries: an innovative training approach to accelerate laboratory accreditation.

    PubMed

    Yao, Katy; McKinney, Barbara; Murphy, Anna; Rotz, Phil; Wafula, Winnie; Sendagire, Hakim; Okui, Scolastica; Nkengasong, John N

    2010-09-01

    The Strengthening Laboratory Management Toward Accreditation (SLMTA) program was developed to promote immediate, measurable improvement in laboratories of developing countries. The laboratory management framework, a tool that prescribes managerial job tasks, forms the basis of the hands-on, activity-based curriculum. SLMTA is implemented through multiple workshops with intervening site visits to support improvement projects. To evaluate the effectiveness of SLMTA, the laboratory accreditation checklist was developed and subsequently adopted by the World Health Organization Regional Office for Africa (WHO AFRO). The SLMTA program and the implementation model were validated through a pilot in Uganda. SLMTA yielded observable, measurable results in the laboratories and improved patient flow and turnaround time in a laboratory simulation. The laboratory staff members were empowered to improve their own laboratories by using existing resources, communicate with clinicians and hospital administrators, and advocate for system strengthening. The SLMTA program supports laboratories by improving management and building preparedness for accreditation.

  10. The Numerical Propulsion System Simulation: An Overview

    NASA Technical Reports Server (NTRS)

    Lytle, John K.

    2000-01-01

    Advances in computational technology and in physics-based modeling are making large-scale, detailed simulations of complex systems possible within the design environment. For example, the integration of computing, communications, and aerodynamics has reduced the time required to analyze major propulsion system components from days and weeks to minutes and hours. This breakthrough has enabled the detailed simulation of major propulsion system components to become a routine part of designing systems, providing the designer with critical information about the components early in the design process. This paper describes the development of the numerical propulsion system simulation (NPSS), a modular and extensible framework for the integration of multicomponent and multidisciplinary analysis tools using geographically distributed resources such as computing platforms, data bases, and people. The analysis is currently focused on large-scale modeling of complete aircraft engines. This will provide the product developer with a "virtual wind tunnel" that will reduce the number of hardware builds and tests required during the development of advanced aerospace propulsion systems.

  11. A Unique Software System For Simulation-to-Flight Research

    NASA Technical Reports Server (NTRS)

    Chung, Victoria I.; Hutchinson, Brian K.

    2001-01-01

    "Simulation-to-Flight" is a research development concept to reduce costs and increase testing efficiency of future major aeronautical research efforts at NASA. The simulation-to-flight concept is achieved by using common software and hardware, procedures, and processes for both piloted-simulation and flight testing. This concept was applied to the design and development of two full-size transport simulators, a research system installed on a NASA B-757 airplane, and two supporting laboratories. This paper describes the software system that supports the simulation-to-flight facilities. Examples of various simulation-to-flight experimental applications were also provided.

  12. Tsunami Simulators in Physical Modelling Laboratories - From Concept to Proven Technique

    NASA Astrophysics Data System (ADS)

    Allsop, W.; Chandler, I.; Rossetto, T.; McGovern, D.; Petrone, C.; Robinson, D.

    2016-12-01

Before 2004, there was little public awareness around Indian Ocean coasts of the potential size and effects of tsunamis. Even in 2011, the scale and extent of the devastation caused by the Japan East Coast Tsunami was unexpected. There were very few engineering tools to assess the onshore impacts of tsunamis, so no agreement on robust methods to predict forces on coastal defences, buildings or related infrastructure. Modelling generally used substantial simplifications: either solitary waves (far too short in duration) or dam breaks (unrealistic and/or uncontrolled wave forms). This presentation will describe research from the EPI-centre, HYDRALAB IV, URBANWAVES and CRUST projects over the last 10 years that has developed and refined pneumatic Tsunami Simulators for the hydraulic laboratory. These unique devices have been used to model generic elevated and N-wave tsunamis up to and over simple shorelines, and at example defences. They have reproduced full-duration tsunamis, including the Mercator trace from 2004, at 1:50 scale. Engineering-scale models subjected to those tsunamis have measured wave run-up on simple slopes, forces on idealised sea defences, and pressures/forces on buildings. This presentation will describe how these pneumatic Tsunami Simulators work, demonstrate how they have generated tsunami waves longer than the facility within which they operate, and highlight research results from the three generations of Tsunami Simulator. Of direct relevance to engineers and modellers will be measurements of wave run-up levels and comparison with theoretical predictions. Recent measurements of forces on individual buildings have been generalized by separate experiments on buildings (up to 4 rows), which show that the greatest forces can act on the landward (not seaward) buildings. Continuing research in the 70 m long, 4 m wide Fast Flow Facility on tsunami defence structures has also measured forces on buildings in the lee of a failed defence wall.

  13. Design of a simulation environment for laboratory management by robot organizations

    NASA Technical Reports Server (NTRS)

    Zeigler, Bernard P.; Cellier, Francois E.; Rozenblit, Jerzy W.

    1988-01-01

    This paper describes the basic concepts needed for a simulation environment capable of supporting the design of robot organizations for managing chemical, or similar, laboratories on the planned U.S. Space Station. The environment should facilitate a thorough study of the problems to be encountered in assigning the responsibility of managing a non-life-critical, but mission valuable, process to an organized group of robots. In the first phase of the work, we seek to employ the simulation environment to develop robot cognitive systems and strategies for effective multi-robot management of chemical experiments. Later phases will explore human-robot interaction and development of robot autonomy.

  14. Using a Laboratory Simulator in the Teaching and Study of Chemical Processes in Estuarine Systems

    ERIC Educational Resources Information Center

    Garcia-Luque, E.; Ortega, T.; Forja, J. M.; Gomez-Parra, A.

    2004-01-01

    The teaching of Chemical Oceanography in the Faculty of Marine and Environmental Sciences of the University of Cadiz (Spain) has been improved since 1994 by the employment of a device for the laboratory simulation of estuarine mixing processes and the characterisation of the chemical behaviour of many substances that pass through an estuary. The…

  15. Scale-dependent performances of CMIP5 earth system models in simulating terrestrial vegetation carbon

    NASA Astrophysics Data System (ADS)

    Jiang, L.; Luo, Y.; Yan, Y.; Hararuk, O.

    2013-12-01

Mitigation of global changes will depend on reliable projections of the future situation. As the major tools for predicting future climate, the Earth System Models (ESMs) used in the Coupled Model Intercomparison Project Phase 5 (CMIP5) for the IPCC Fifth Assessment Report have incorporated carbon cycle components, which account for the important fluxes of carbon between the ocean, atmosphere, and terrestrial biosphere carbon reservoirs, and are therefore expected to provide more detailed and more certain projections. However, ESMs are never perfect; evaluating them can help us to identify uncertainties in prediction and set priorities for model development. In this study, we benchmarked the carbon in live vegetation in the terrestrial ecosystems simulated by 19 ESMs from CMIP5 against an observationally estimated data set of the global vegetation carbon pool, 'Olson's Major World Ecosystem Complexes Ranked by Carbon in Live Vegetation: An Updated Database Using the GLC2000 Land Cover Product' by Gibbs (2006). Our aim is to evaluate the ability of ESMs to reproduce the global vegetation carbon pool at different scales and to identify possible causes of the bias. We found that the performance of CMIP5 ESMs is very scale-dependent. CESM1-BGC, CESM1-CAM5, CESM1-FASTCHEM and CESM1-WACCM, and NorESM1-M and NorESM1-ME (which share the same model structure) have global sums very similar to the observational data, but they usually perform poorly at the grid-cell and biome scales. In contrast, MIROC-ESM and MIROC-ESM-CHEM simulate best at the grid-cell and biome scales but have larger differences in global sums than the others. Our results will help improve CMIP5 ESMs for more reliable prediction.
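The scale dependence described in this record (right global total, wrong spatial pattern, or vice versa) can be made concrete with two evaluation metrics on toy 2x2 "maps"; all numbers are made up for illustration.

```python
import numpy as np

obs     = np.array([[10.0, 20.0], [30.0, 40.0]])  # observed carbon per cell
model_a = np.array([[40.0, 30.0], [20.0, 10.0]])  # right total, wrong pattern
model_b = np.array([[12.0, 22.0], [32.0, 42.0]])  # biased total, right pattern

def global_sum_bias(model, obs):
    """Bias of the globally integrated carbon pool."""
    return float(model.sum() - obs.sum())

def grid_rmse(model, obs):
    """Cell-by-cell root-mean-square error."""
    return float(np.sqrt(np.mean((model - obs) ** 2)))
```

Model A scores perfectly on the global sum yet has the worse grid-cell RMSE, which is exactly the pattern the study reports for the CESM1/NorESM1 family versus the MIROC models.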

  16. Galactic cosmic ray simulation at the NASA Space Radiation Laboratory

    PubMed Central

    Norbury, John W.; Schimmerling, Walter; Slaba, Tony C.; Azzam, Edouard I.; Badavi, Francis F.; Baiocco, Giorgio; Benton, Eric; Bindi, Veronica; Blakely, Eleanor A.; Blattnig, Steve R.; Boothman, David A.; Borak, Thomas B.; Britten, Richard A.; Curtis, Stan; Dingfelder, Michael; Durante, Marco; Dynan, William S.; Eisch, Amelia J.; Elgart, S. Robin; Goodhead, Dudley T.; Guida, Peter M.; Heilbronn, Lawrence H.; Hellweg, Christine E.; Huff, Janice L.; Kronenberg, Amy; La Tessa, Chiara; Lowenstein, Derek I.; Miller, Jack; Morita, Takashi; Narici, Livio; Nelson, Gregory A.; Norman, Ryan B.; Ottolenghi, Andrea; Patel, Zarana S.; Reitz, Guenther; Rusek, Adam; Schreurs, Ann-Sofie; Scott-Carnell, Lisa A.; Semones, Edward; Shay, Jerry W.; Shurshakov, Vyacheslav A.; Sihver, Lembit; Simonsen, Lisa C.; Story, Michael D.; Turker, Mitchell S.; Uchihori, Yukio; Williams, Jacqueline; Zeitlin, Cary J.

    2017-01-01

    Most accelerator-based space radiation experiments have been performed with single ion beams at fixed energies. However, the space radiation environment consists of a wide variety of ion species with a continuous range of energies. Due to recent developments in beam switching technology implemented at the NASA Space Radiation Laboratory (NSRL) at Brookhaven National Laboratory (BNL), it is now possible to rapidly switch ion species and energies, allowing for the possibility to more realistically simulate the actual radiation environment found in space. The present paper discusses a variety of issues related to implementation of galactic cosmic ray (GCR) simulation at NSRL, especially for experiments in radiobiology. Advantages and disadvantages of different approaches to developing a GCR simulator are presented. In addition, issues common to both GCR simulation and single beam experiments are compared to issues unique to GCR simulation studies. A set of conclusions is presented as well as a discussion of the technical implementation of GCR simulation. PMID:26948012

  17. Galactic cosmic ray simulation at the NASA Space Radiation Laboratory.

    PubMed

    Norbury, John W; Schimmerling, Walter; Slaba, Tony C; Azzam, Edouard I; Badavi, Francis F; Baiocco, Giorgio; Benton, Eric; Bindi, Veronica; Blakely, Eleanor A; Blattnig, Steve R; Boothman, David A; Borak, Thomas B; Britten, Richard A; Curtis, Stan; Dingfelder, Michael; Durante, Marco; Dynan, William S; Eisch, Amelia J; Robin Elgart, S; Goodhead, Dudley T; Guida, Peter M; Heilbronn, Lawrence H; Hellweg, Christine E; Huff, Janice L; Kronenberg, Amy; La Tessa, Chiara; Lowenstein, Derek I; Miller, Jack; Morita, Takashi; Narici, Livio; Nelson, Gregory A; Norman, Ryan B; Ottolenghi, Andrea; Patel, Zarana S; Reitz, Guenther; Rusek, Adam; Schreurs, Ann-Sofie; Scott-Carnell, Lisa A; Semones, Edward; Shay, Jerry W; Shurshakov, Vyacheslav A; Sihver, Lembit; Simonsen, Lisa C; Story, Michael D; Turker, Mitchell S; Uchihori, Yukio; Williams, Jacqueline; Zeitlin, Cary J

    2016-02-01

    Most accelerator-based space radiation experiments have been performed with single ion beams at fixed energies. However, the space radiation environment consists of a wide variety of ion species with a continuous range of energies. Due to recent developments in beam switching technology implemented at the NASA Space Radiation Laboratory (NSRL) at Brookhaven National Laboratory (BNL), it is now possible to rapidly switch ion species and energies, allowing for the possibility to more realistically simulate the actual radiation environment found in space. The present paper discusses a variety of issues related to implementation of galactic cosmic ray (GCR) simulation at NSRL, especially for experiments in radiobiology. Advantages and disadvantages of different approaches to developing a GCR simulator are presented. In addition, issues common to both GCR simulation and single beam experiments are compared to issues unique to GCR simulation studies. A set of conclusions is presented as well as a discussion of the technical implementation of GCR simulation. Published by Elsevier Ltd.

  18. PhysiCell: An open source physics-based cell simulator for 3-D multicellular systems.

    PubMed

    Ghaffarizadeh, Ahmadreza; Heiland, Randy; Friedman, Samuel H; Mumenthaler, Shannon M; Macklin, Paul

    2018-02-01

Many multicellular systems problems can only be understood by studying how cells move, grow, divide, interact, and die. Tissue-scale dynamics emerge from systems of many interacting cells as they respond to and influence their microenvironment. The ideal "virtual laboratory" for such multicellular systems simulates both the biochemical microenvironment (the "stage") and many mechanically and biochemically interacting cells (the "players" upon the stage). PhysiCell, a physics-based multicellular simulator, is an open source agent-based simulator that provides both the stage and the players for studying many interacting cells in dynamic tissue microenvironments. It builds upon a multi-substrate biotransport solver to link cell phenotype to multiple diffusing substrates and signaling factors. It includes biologically-driven sub-models for cell cycling, apoptosis, necrosis, solid and fluid volume changes, mechanics, and motility "out of the box." The C++ code has minimal dependencies, making it simple to maintain and deploy across platforms. PhysiCell has been parallelized with OpenMP, and its performance scales linearly with the number of cells. Simulations of up to 10^5-10^6 cells are feasible on quad-core desktop workstations; larger simulations are attainable on single HPC compute nodes. We demonstrate PhysiCell by simulating the impact of necrotic core biomechanics, 3-D geometry, and stochasticity on the dynamics of hanging drop tumor spheroids and ductal carcinoma in situ (DCIS) of the breast. We demonstrate stochastic motility, chemical and contact-based interaction of multiple cell types, and the extensibility of PhysiCell with examples in synthetic multicellular systems (a "cellular cargo delivery" system, with application to anti-cancer treatments), cancer heterogeneity, and cancer immunology. PhysiCell is a powerful multicellular systems simulator that will be continually improved with new capabilities and performance improvements. It also represents a significant
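The agent-based style this record describes can be sketched in a few lines: each cell is an agent with its own state that grows and divides stochastically. This is NOT the PhysiCell API, just a minimal stand-in with illustrative rates (5% volume growth per step, 30% division probability once volume doubles).

```python
import random

random.seed(1)  # fixed seed so the toy run is reproducible

class Cell:
    def __init__(self, volume=1.0):
        self.volume = volume

    def step(self, dt, growth_rate=0.05, division_prob=0.3):
        """Grow exponentially; once volume doubles, maybe divide."""
        self.volume += growth_rate * dt * self.volume
        if self.volume >= 2.0 and random.random() < division_prob:
            self.volume /= 2.0
            return Cell(self.volume)   # return the daughter cell
        return None

cells = [Cell()]
for _ in range(200):                   # 200 steps of dt = 1
    daughters = [d for c in cells if (d := c.step(1.0))]
    cells.extend(daughters)
```

Real simulators like the one described add the "stage": diffusing substrates, mechanics, and death sub-models coupled to each agent's phenotype.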

  19. Why build a virtual brain? Large-scale neural simulations as jump start for cognitive computing

    NASA Astrophysics Data System (ADS)

    Colombo, Matteo

    2017-03-01

    Despite the impressive amount of financial resources recently invested in carrying out large-scale brain simulations, it is controversial what the pay-offs are of pursuing this project. One idea is that from designing, building, and running a large-scale neural simulation, scientists acquire knowledge about the computational performance of the simulating system, rather than about the neurobiological system represented in the simulation. It has been claimed that this knowledge may usher in a new era of neuromorphic, cognitive computing systems. This study elucidates this claim and argues that the main challenge this era is facing is not the lack of biological realism. The challenge lies in identifying general neurocomputational principles for the design of artificial systems, which could display the robust flexibility characteristic of biological intelligence.

  20. Laboratory challenges in the scaling up of HIV, TB, and malaria programs: The interaction of health and laboratory systems, clinical research, and service delivery.

    PubMed

    Birx, Deborah; de Souza, Mark; Nkengasong, John N

    2009-06-01

    Strengthening national health laboratory systems in resource-poor countries is critical to meeting the United Nations Millennium Development Goals. Despite strong commitment from the international community to fight major infectious diseases, weak laboratory infrastructure remains a huge rate-limiting step. Some major challenges facing laboratory systems in resource-poor settings include dilapidated infrastructure; lack of human capacity, laboratory policies, and strategic plans; and limited synergies between clinical and research laboratories. Together, these factors compromise the quality of test results and impact patient management. With increased funding, the target of laboratory strengthening efforts in resource-poor countries should be the integrating of laboratory services across major diseases to leverage resources with respect to physical infrastructure; types of assays; supply chain management of reagents and equipment; and maintenance of equipment.

  1. The Virtual Radiopharmacy Laboratory: A 3-D Simulation for Distance Learning

    ERIC Educational Resources Information Center

    Alexiou, Antonios; Bouras, Christos; Giannaka, Eri; Kapoulas, Vaggelis; Nani, Maria; Tsiatsos, Thrasivoulos

    2004-01-01

    This article presents Virtual Radiopharmacy Laboratory (VR LAB), a virtual laboratory accessible through the Internet. VR LAB is designed and implemented in the framework of the VirRAD European project. This laboratory represents a 3D simulation of a radio-pharmacy laboratory, where learners, represented by 3D avatars, can experiment on…

  2. Scaling of Multimillion-Atom Biological Molecular Dynamics Simulation on a Petascale Supercomputer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schulz, Roland; Lindner, Benjamin; Petridis, Loukas

    2009-01-01

A strategy is described for a fast all-atom molecular dynamics simulation of multimillion-atom biological systems on massively parallel supercomputers. The strategy is developed using benchmark systems of particular interest to bioenergy research, comprising models of cellulose and lignocellulosic biomass in an aqueous solution. The approach involves using the reaction field (RF) method for the computation of long-range electrostatic interactions, which permits efficient scaling on many thousands of cores. Although the range of applicability of the RF method for biomolecular systems remains to be demonstrated, for the benchmark systems the use of the RF produces molecular dipole moments, Kirkwood G factors, other structural properties, and mean-square fluctuations in excellent agreement with those obtained with the commonly used Particle Mesh Ewald method. With RF, three-million- and five-million-atom biological systems scale well up to 30k cores, producing 30 ns/day. Atomistic simulations of very large systems for time scales approaching the microsecond would, therefore, appear now to be within reach.
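As a rough illustration of the reaction-field electrostatics this record relies on, the sketch below implements the common shifted-RF pair potential in reduced units (q_i q_j / 4*pi*eps0 = 1): beyond the cutoff r_c the medium is treated as a dielectric continuum eps_rf, and the shift constant c_rf makes the potential vanish at the cutoff. The cutoff and dielectric values are illustrative, and the functional form is the standard textbook convention, not necessarily the paper's exact parameterization.

```python
def rf_potential(r, r_c=1.2, eps_rf=78.0):
    """Shifted reaction-field pair potential in reduced units.
    V(r) = 1/r + k_rf*r^2 - c_rf for r < r_c, and 0 beyond the cutoff."""
    if r >= r_c:
        return 0.0
    k_rf = (eps_rf - 1.0) / (2.0 * eps_rf + 1.0) / r_c ** 3
    c_rf = 1.0 / r_c + k_rf * r_c ** 2   # chosen so V(r_c) = 0
    return 1.0 / r + k_rf * r * r - c_rf
```

Because each pair interaction is strictly local (zero beyond r_c), no global FFT communication is needed, which is why RF scales to many thousands of cores where Particle Mesh Ewald struggles.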

  3. Galactic Cosmic Ray Simulator at the NASA Space Radiation Laboratory

    NASA Technical Reports Server (NTRS)

    Norbury, John W.; Slaba, Tony C.; Rusek, Adam

    2015-01-01

    The external Galactic Cosmic Ray (GCR) spectrum is significantly modified when it passes through spacecraft shielding and astronauts. One approach for simulating the GCR space radiation environment is to attempt to reproduce the unmodified, external GCR spectrum at a ground based accelerator. A possibly better approach would use the modified, shielded tissue spectrum, to select accelerator beams impinging on biological targets. NASA plans for implementation of a GCR simulator at the NASA Space Radiation Laboratory (NSRL) at Brookhaven National Laboratory will be discussed.

  4. Simulation Of Combat With An Expert System

    NASA Technical Reports Server (NTRS)

    Provenzano, J. P.

    1989-01-01

A proposed expert system predicts the outcomes of combat situations. Called COBRA (combat outcome based on rules for attrition), the system selects rules for mathematical modeling of losses and discrete events in combat according to previous experiences. It is used with another software module known as the "Game". The Game/COBRA software system, consisting of the Game and COBRA modules, provides for both quantitative and qualitative aspects in simulations of battles. While COBRA is intended for the simulation of large-scale military exercises, the concepts embodied in it have much broader applicability. In industrial research, a knowledge-based system enables qualitative as well as quantitative simulations.
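The record's idea of rules selecting an attrition model can be sketched with a toy rule that picks between two classical Lanchester loss models and then plays the engagement out. The rule condition, coefficients, and force sizes are all invented for illustration; COBRA's actual rule base is not described here.

```python
def pick_rule(engagement):
    """A stand-in for the rule base: match conditions to an attrition model."""
    return "aimed_fire" if engagement["visibility"] == "good" else "area_fire"

def step(red, blue, rule, a=0.02, b=0.03, dt=1.0):
    """One time step of the selected Lanchester attrition model."""
    if rule == "aimed_fire":   # square law: losses scale with enemy strength
        red_loss, blue_loss = a * blue, b * red
    else:                      # area fire: losses scale with both strengths
        red_loss, blue_loss = a * red * blue / 1000, b * red * blue / 1000
    return max(red - red_loss * dt, 0.0), max(blue - blue_loss * dt, 0.0)

red, blue = 100.0, 80.0
rule = pick_rule({"visibility": "good"})
for _ in range(50):
    red, blue = step(red, blue, rule)
```

The split between a rule-matching layer and a numerical attrition layer mirrors the Game/COBRA division the record describes: qualitative rule selection on top, quantitative loss modeling underneath.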

  5. Comparisons of Mixed-Phase Icing Cloud Simulations with Experiments Conducted at the NASA Propulsion Systems Laboratory

    NASA Technical Reports Server (NTRS)

    Bartkus, Tadas P.; Struk, Peter M.; Tsao, Jen-Ching

    2017-01-01

This paper builds on previous work comparing numerical simulations of mixed-phase icing clouds with experimental data. The model couples the thermal interaction between the ice particles and water droplets of the icing cloud and the flowing air of an icing wind tunnel for simulation of NASA Glenn Research Center's (GRC) Propulsion Systems Laboratory (PSL). Measurements were taken during the Fundamentals of Ice Crystal Icing Physics Tests at the PSL tunnel in March 2016. The tests simulated ice-crystal and mixed-phase icing conditions that relate to ice accretions within turbofan engines. Experimentally measured air temperature, humidity, total water content, liquid and ice water content, as well as cloud particle size, are compared with model predictions. The model showed good trend agreement with experimentally measured values, but often over-predicted aero-thermodynamic changes. This discrepancy is likely attributable to radial variations that this one-dimensional model does not address. One of the key findings of this work is that greater aero-thermodynamic changes occur when humidity conditions are low. In addition, a range of mixed-phase clouds can be achieved by varying only the tunnel humidity conditions, but the range of humidities that generates a mixed-phase cloud becomes smaller when clouds are composed of smaller particles. In general, the model predicted melt fraction well, in particular for clouds composed of larger particle sizes.

  6. Realistic wave-optics simulation of X-ray phase-contrast imaging at a human scale

    NASA Astrophysics Data System (ADS)

    Sung, Yongjin; Segars, W. Paul; Pan, Adam; Ando, Masami; Sheppard, Colin J. R.; Gupta, Rajiv

    2015-07-01

    X-ray phase-contrast imaging (XPCI) can dramatically improve soft tissue contrast in X-ray medical imaging. Despite worldwide efforts to develop novel XPCI systems, a numerical framework to rigorously predict the performance of a clinical XPCI system at a human scale is not yet available. We have developed such a tool by combining a numerical anthropomorphic phantom defined with non-uniform rational B-splines (NURBS) and a wave optics-based simulator that can accurately capture the phase-contrast signal from a human-scaled numerical phantom. Using a synchrotron-based, high-performance XPCI system, we provide qualitative comparison between simulated and experimental images. Our tool can be used to simulate the performance of XPCI on various disease entities and compare proposed XPCI systems in an unbiased manner.

  7. Simulation framework for intelligent transportation systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ewing, T.; Doss, E.; Hanebutte, U.

    1996-10-01

    A simulation framework has been developed for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS). The simulator is designed to run on parallel computers and distributed (networked) computer systems, but can run on standalone workstations for smaller simulations. The simulator currently models instrumented smart vehicles with in-vehicle navigation units capable of optimal route planning, as well as Traffic Management Centers (TMC). The TMC has probe vehicle tracking capabilities (displaying the position and attributes of instrumented vehicles) and can provide two-way interaction with traffic to deliver advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces to support human-factors studies. Realistic modeling of variations in the posted driving speed is based on human-factors studies that take into consideration weather, road conditions, driver personality and behavior, and vehicle type. The prototype has been developed on a distributed system of networked UNIX computers but is designed to run on parallel computers, such as ANL's IBM SP-2, for large-scale problems. A novel feature of the approach is that vehicles are represented by autonomous computer processes which exchange messages with other processes. The vehicles have a behavior model which governs route selection and driving behavior, and can react to external traffic events much like real vehicles. With this approach, the simulation is scalable to take advantage of emerging massively parallel processor (MPP) systems.
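The vehicle-as-process design can be illustrated with a toy, single-process sketch (class names and the advisory message are hypothetical; the actual framework runs vehicles as separate communicating processes on parallel hardware):

```python
from collections import deque

class Vehicle:
    """Toy analogue of the simulator's autonomous vehicle processes:
    each vehicle holds an inbox of messages and reacts to traffic
    events, much as the described behavior model does."""
    def __init__(self, vid, speed):
        self.vid, self.speed, self.inbox = vid, speed, deque()

    def send(self, other, message):
        # Message passing between processes, here a simple queue append.
        other.inbox.append((self.vid, message))

    def step(self):
        # Drain the inbox and react to advisories like a real driver.
        while self.inbox:
            sender, msg = self.inbox.popleft()
            if msg == "congestion_ahead":
                self.speed *= 0.5  # slow down on the advisory

tmc = Vehicle("TMC", 0.0)       # Traffic Management Center as a node
car = Vehicle("car-1", 30.0)    # cruising at 30 m/s
tmc.send(car, "congestion_ahead")
car.step()
print(car.speed)  # 15.0
```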

  8. Laboratory evaluation and application of microwave absorption properties under simulated conditions for planetary atmospheres

    NASA Technical Reports Server (NTRS)

    Steffes, Paul G.

    1988-01-01

    In the first half of this grant year, laboratory measurements were conducted on the millimeter-wave properties of atmospheric gases under simulated conditions for the outer planets. Significant improvements to the current system have made it possible to accurately characterize the opacity of gaseous NH3 at longer millimeter wavelengths (7 to 10 mm) under simulated Jovian conditions. In the second half of the grant year, it is hoped to extend such measurements to even shorter millimeter wavelengths. Further analysis and application of the laboratory results to microwave and millimeter-wave absorption data for the outer planets, such as results from Voyager Radio Occultation experiments and earth-based radio astronomical observations, will be continued. The analysis of available multispectral microwave opacity data from Venus, including data from the most recent radio astronomical observations in the 1.3 to 3.6 cm wavelength range and newly obtained Pioneer-Venus Radio Occultation measurements at 13 cm, will be pursued using the laboratory measurements as an interpretive tool.

  9. Large Scale Traffic Simulations

    DOT National Transportation Integrated Search

    1997-01-01

    Large scale microscopic (i.e. vehicle-based) traffic simulations pose high demands on computation speed in at least two application areas: (i) real-time traffic forecasting, and (ii) long-term planning applications (where repeated "looping" between t...

  10. Multi-scale imaging and elastic simulation of carbonates

    NASA Astrophysics Data System (ADS)

    Faisal, Titly Farhana; Awedalkarim, Ahmed; Jouini, Mohamed Soufiane; Jouiad, Mustapha; Chevalier, Sylvie; Sassi, Mohamed

    2016-05-01

    Digital Rock Physics (DRP) is an emerging technology that can generate high-quality, fast, and cost-effective special core analysis (SCAL) properties compared with conventional experimental and modeling techniques. The primary workflow of DRP consists of three elements: 1) image the rock sample using high-resolution 3D scanning techniques (e.g. micro-CT, FIB/SEM), 2) process and digitize the images by segmenting the pore and matrix phases, and 3) simulate the desired physical properties of the rocks, such as elastic moduli and velocities of wave propagation. A Finite Element Method based algorithm developed by Garboczi and Day [1], which discretizes the basic Hooke's Law equation of linear elasticity and solves it numerically using a fast conjugate gradient solver, is used for mechanical and elastic property simulations. This elastic algorithm works directly on the digital images by treating each pixel as an element. The images are assumed to have periodic constant-strain boundary conditions. The bulk and shear moduli of the different phases are required inputs. For standard 1.5" diameter cores, however, the micro-CT scanning resolution (around 40 μm) does not reveal the smaller micro- and nano-pores beyond that resolution. This results in an unresolved "microporous" phase, the moduli of which are uncertain. Knackstedt et al. [2] assigned effective elastic moduli to the microporous phase based on self-consistent theory (which gives good estimates of velocities for well-cemented granular media). Jouini et al. [3] segmented the core plug CT scan image into three phases and assumed that the microporous phase is represented by a sub-extracted micro plug (which was also scanned using micro-CT). Currently, elastic numerical simulations based on CT images alone largely overpredict the bulk, shear, and Young's moduli when compared to laboratory acoustic tests of the same rocks.
For greater accuracy of numerical simulation prediction, better estimates of moduli inputs
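One way to see how sensitive the prediction is to the uncertain microporous-phase moduli is to bracket the effective modulus with Voigt (arithmetic) and Reuss (harmonic) averages. This is a bracketing sanity check, not the Garboczi-Day finite element solver itself, and the phase values below are assumed for illustration:

```python
def voigt_reuss_bounds(fractions, moduli):
    """Voigt (upper) and Reuss (lower) bounds on the effective modulus
    of a multi-phase mixture; volume fractions must sum to 1."""
    assert abs(sum(fractions) - 1.0) < 1e-9
    voigt = sum(f * m for f, m in zip(fractions, moduli))
    reuss = 1.0 / sum(f / m for f, m in zip(fractions, moduli))
    return reuss, voigt

# Hypothetical carbonate: 70% solid calcite (K ~ 77 GPa) and 30%
# microporous phase with an assumed bulk modulus of 20 GPa.
lo, hi = voigt_reuss_bounds([0.7, 0.3], [77.0, 20.0])
print(lo, hi)  # Reuss ~41.5 GPa, Voigt ~59.9 GPa
```

The wide gap between the two bounds shows why a poor estimate of the microporous-phase modulus can move the simulated stiffness substantially.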

  11. Laboratory Simulations of Mars Evaporite Geochemistry

    NASA Technical Reports Server (NTRS)

    Moore, Jeffrey M.; Bullock, Mark A.; Newsom, Horton; Nelson, Melissa

    2010-01-01

    Evaporite-rich sedimentary deposits on Mars were formed under chemical conditions quite different from those on Earth. Their unique chemistries record the chemical and aqueous conditions under which they were formed, and possibly subsequent conditions to which they were subjected. We have produced evaporite salt mineral suites in the laboratory under two simulated Martian atmospheres: (1) the present-day atmosphere and (2) a model of an ancient Martian atmosphere rich in volcanic gases. The composition of these synthetic Mars evaporites depends on the atmospheres under which they were desiccated as well as the chemistries of their precursor brines. In this report, we describe a Mars analog evaporite laboratory apparatus and the experimental methods we used to produce and analyze the evaporite mineral suites.

  12. Laboratory Information Systems.

    PubMed

    Henricks, Walter H

    2015-06-01

    Laboratory information systems (LISs) supply mission-critical capabilities for the vast array of information-processing needs of modern laboratories. LIS architectures include mainframe, client-server, and thin client configurations. The LIS database software manages a laboratory's data. LIS dictionaries are database tables that a laboratory uses to tailor an LIS to the unique needs of that laboratory. Anatomic pathology LIS (APLIS) functions play key roles throughout the pathology workflow, and laboratories rely on LIS management reports to monitor operations. This article describes the structure and functions of APLISs, with emphasis on their roles in laboratory operations and their relevance to pathologists.

  13. FY10 Report on Multi-scale Simulation of Solvent Extraction Processes: Molecular-scale and Continuum-scale Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wardle, Kent E.; Frey, Kurt; Pereira, Candido

    2014-02-02

    This task is aimed at predictive modeling of solvent extraction processes in typical extraction equipment through multiple simulation methods at various scales of resolution. We have conducted detailed continuum fluid dynamics simulations at the process unit level as well as simulations of the molecular-level physical interactions which govern extraction chemistry. By combining information gained through simulations at each of these two tiers with advanced techniques such as the Lattice Boltzmann Method (LBM), which can bridge the two scales, we can develop the tools to work towards predictive simulation for solvent extraction on the equipment scale (Figure 1). The goal of such a tool, along with enabling optimized design and operation of extraction units, would be to allow prediction of stage extraction efficiency under specified conditions. Simulation efforts on each of the two scales are described below. As the initial application of FELBM in the work performed during FY10 was on annular mixing, it is discussed in the context of the continuum scale. In the future, however, it is anticipated that the real value of FELBM will be as a tool for sub-grid model development through highly refined DNS-like multiphase simulations, facilitating the exploration and development of droplet models, including breakup and coalescence, which will be needed for large-scale simulations where droplet-level physics cannot be resolved. In this area, it has a significant advantage over traditional CFD methods, as its high computational efficiency allows exploration of significantly greater physical detail, especially as computational resources increase in the future.

  14. Beyond-laboratory-scale prediction for channeling flows through subsurface rock fractures with heterogeneous aperture distributions revealed by laboratory evaluation

    NASA Astrophysics Data System (ADS)

    Ishibashi, Takuya; Watanabe, Noriaki; Hirano, Nobuo; Okamoto, Atsushi; Tsuchiya, Noriyoshi

    2015-01-01

    The present study evaluates aperture distributions and fluid flow characteristics for variously sized laboratory-scale granite fractures under confining stress. As a significant result of the laboratory investigation, the contact area in the fracture plane was found to be virtually independent of scale. By combining this characteristic with the self-affine fractal nature of fracture surfaces, a novel method for predicting fracture aperture distributions beyond laboratory scale is developed. The validity of this method is demonstrated through reproduction of the results of the laboratory investigation and of the maximum aperture-fracture length relations reported in the literature for natural fractures. The present study finally predicts conceivable scale dependencies of fluid flows through joints (fractures without shear displacement) and faults (fractures with shear displacement). Both joint and fault aperture distributions are characterized by a scale-independent contact area, a scale-dependent geometric mean, and a scale-independent geometric standard deviation of aperture. The contact areas for joints and faults are approximately 60% and 40%, respectively. Changes in the geometric means of the joint and fault apertures (µm), e_m,joint and e_m,fault, with fracture length (m), l, are approximated by e_m,joint = 1 × 10^2 l^0.1 and e_m,fault = 1 × 10^3 l^0.7, whereas the geometric standard deviations of both joint and fault apertures are approximately 3. Fluid flows through both joints and faults are characterized by the formation of preferential flow paths (i.e., channeling flows) with scale-independent flow areas of approximately 10%, whereas the joint and fault permeabilities (m^2), k_joint and k_fault, are scale dependent and are approximated by k_joint = 1 × 10^-12 l^0.2 and k_fault = 1 × 10^-8 l^1.1.
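The fitted scaling relations above can be evaluated directly for any fracture length; a small sketch (function names are mine):

```python
def joint_aperture_mean(length_m):
    """Geometric mean aperture (micrometres) of a joint,
    e_m,joint = 1e2 * l^0.1 with l in metres."""
    return 1e2 * length_m ** 0.1

def fault_aperture_mean(length_m):
    """Geometric mean aperture (micrometres) of a fault,
    e_m,fault = 1e3 * l^0.7."""
    return 1e3 * length_m ** 0.7

def joint_permeability(length_m):
    """Joint permeability (m^2), k_joint = 1e-12 * l^0.2."""
    return 1e-12 * length_m ** 0.2

def fault_permeability(length_m):
    """Fault permeability (m^2), k_fault = 1e-8 * l^1.1."""
    return 1e-8 * length_m ** 1.1

# At l = 100 m, the sheared fracture is orders of magnitude more
# permeable than the unsheared one:
k_j, k_f = joint_permeability(100.0), fault_permeability(100.0)
print(k_j, k_f)  # ~2.5e-12 vs ~1.6e-6 m^2
```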

  15. Large-Scale Simulation of Multi-Asset Ising Financial Markets

    NASA Astrophysics Data System (ADS)

    Takaishi, Tetsuya

    2017-03-01

    We perform a large-scale simulation of an Ising-based financial market model that includes 300 asset time series. The financial system simulated by the model shows a fat-tailed return distribution and volatility clustering, and exhibits unstable periods indicated by the volatility index, measured as the average of absolute returns. Moreover, we determine that the cumulative risk fraction, which measures the system risk, changes during high-volatility periods. We also calculate the inverse participation ratio (IPR) and its higher-power version, IPR6, from the absolute-return cross-correlation matrix. Finally, we show that the IPR and IPR6 also change during high-volatility periods.
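The IPR and IPR6 quantities can be illustrated on synthetic eigenvectors; this sketch shows only the definition, not the paper's cross-correlation analysis:

```python
import numpy as np

def ipr(eigvec, power=4):
    """Inverse participation ratio of an eigenvector: sum of
    |v_i|^power after normalization (power=4 is the usual IPR,
    power=6 gives IPR6).  A fully localized vector yields 1;
    a uniformly spread vector over N components yields N^(1-power/2)
    (1/N for the usual IPR)."""
    v = np.asarray(eigvec, dtype=float)
    v = v / np.linalg.norm(v)
    return float(np.sum(np.abs(v) ** power))

# Fully localized vs. uniformly spread over N = 300 assets:
localized = np.zeros(300); localized[0] = 1.0
uniform = np.ones(300)
print(ipr(localized))  # 1.0
print(ipr(uniform))    # 1/300
```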

  16. Properties important to mixing and simulant recommendations for WTP full-scale vessel testing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poirier, M. R.; Martino, C. J.

    2015-12-01

    Full Scale Vessel Testing (FSVT) is being planned by Bechtel National, Inc., to demonstrate the ability of the standard high solids vessel design (SHSVD) to meet mixing requirements over the range of fluid properties planned for processing in the Pretreatment Facility (PTF) of the Hanford Waste Treatment and Immobilization Plant (WTP). Testing will use simulated waste rather than actual Hanford waste. Therefore, the use of suitable simulants is critical to achieving the goals of the test program. WTP personnel requested the Savannah River National Laboratory (SRNL) to assist with development of simulants for use in FSVT. Among the tasks assigned to SRNL was to develop a list of waste properties that are important to pulse-jet mixer (PJM) performance in WTP vessels with elevated concentrations of solids.

  17. Numerical propulsion system simulation

    NASA Technical Reports Server (NTRS)

    Lytle, John K.; Remaklus, David A.; Nichols, Lester D.

    1990-01-01

    The cost of implementing new technology in aerospace propulsion systems is becoming prohibitively expensive. One of the major contributors to the high cost is the need to perform many large-scale system tests. Extensive testing is used to capture the complex interactions among the multiple disciplines and multiple components inherent in complex systems. The objective of the Numerical Propulsion System Simulation (NPSS) is to provide insight into these complex interactions through computational simulations. This will allow comprehensive evaluation of new concepts early in the design phase, before a commitment to hardware is made. It will also allow rapid assessment of field-related problems, particularly in cases where operational problems were encountered during conditions that would be difficult to simulate experimentally. The tremendous progress taking place in computational engineering and the rapid increase in computing power expected through parallel processing make this concept feasible within the near future. However, it is critical that the framework for such simulations be put in place now, to serve as a focal point for continued developments in computational engineering and in computing hardware and software. The NPSS concept described here will provide that framework.

  18. Particle release and control of worker exposure during laboratory-scale synthesis, handling and simulated spills of manufactured nanomaterials in fume hoods

    NASA Astrophysics Data System (ADS)

    Fonseca, Ana S.; Kuijpers, Eelco; Kling, Kirsten I.; Levin, Marcus; Koivisto, Antti J.; Nielsen, Signe H.; Fransman, W.; Fedutik, Yijri; Jensen, Keld A.; Koponen, Ismo K.

    2018-02-01

    Fume hoods are one of the most common types of equipment used to reduce the potential for particle exposure in laboratory environments. A number of previous studies have shown particle release during work with nanomaterials under fume hoods. Here, we assessed laboratory workers' inhalation exposure during synthesis and handling of CuO, TiO2 and ZnO in a fume hood. In addition, we tested the capacity of a fume hood to prevent particle release into laboratory air during simulated spillage of different powders (silica fume, zirconia TZ-3Y and TiO2). Airborne particle concentrations were measured in the near field, in the far field, and in the breathing zone of the worker. Handling CuO nanoparticles increased the concentration of small particles (< 58 nm) inside the fume hood (up to 1 × 10^5 cm^-3). Synthesis, handling and packaging of ZnO and TiO2 nanoparticles did not result in detectable particle release into the laboratory air. Simulated powder spills showed a systematic increase in particle concentrations inside the fume hood with increasing amount of material and drop height. Although powder spills were sometimes observed to eject material into the laboratory room, the spill events were rarely associated with notable release of particles from the fume hood. Overall, this study shows that a fume hood generally offers sufficient exposure control during synthesis and handling of nanomaterials: an appropriate fume hood with adequate sash height and face velocity prevented 98.3% of particle release into the surrounding environment. Care should still be taken to consider spills and to maintain high cleanliness, to prevent exposure via resuspension and inadvertent exposure by secondary routes.

  19. EFFECTS OF LARVAL STOCKING DENSITY ON LABORATORY-SCALE AND COMMERCIAL-SCALE PRODUCTION OF SUMMER FLOUNDER, PARALICHTHYS DENTATUS

    EPA Science Inventory

    Three experiments investigating larval stocking densities of summer flounder from hatch to metamorphosis, Paralichthys dentatus, were conducted at laboratory-scale (75-L aquaria) and at commercial scale (1,000-L tanks). Experiments 1 and 2 at commercial scale tested the densities...

  20. Realistic wave-optics simulation of X-ray phase-contrast imaging at a human scale

    PubMed Central

    Sung, Yongjin; Segars, W. Paul; Pan, Adam; Ando, Masami; Sheppard, Colin J. R.; Gupta, Rajiv

    2015-01-01

    X-ray phase-contrast imaging (XPCI) can dramatically improve soft tissue contrast in X-ray medical imaging. Despite worldwide efforts to develop novel XPCI systems, a numerical framework to rigorously predict the performance of a clinical XPCI system at a human scale is not yet available. We have developed such a tool by combining a numerical anthropomorphic phantom defined with non-uniform rational B-splines (NURBS) and a wave optics-based simulator that can accurately capture the phase-contrast signal from a human-scaled numerical phantom. Using a synchrotron-based, high-performance XPCI system, we provide qualitative comparison between simulated and experimental images. Our tool can be used to simulate the performance of XPCI on various disease entities and compare proposed XPCI systems in an unbiased manner. PMID:26169570

  1. Comparison of sub-scale to full-scale aircraft in a simulation environment for air traffic management

    NASA Astrophysics Data System (ADS)

    Elbakary, Mohamed I.; Iftekharuddin, Khan M.; Papelis, Yiannis; Newman, Brett

    2017-05-01

    Air Traffic Management (ATM) concepts are commonly tested in simulation to obtain preliminary results and validate the concepts before adoption. Recently, researchers have found that simulation alone is not enough because of the complexity associated with ATM concepts; full-scale tests must eventually take place to provide compelling performance evidence before full implementation is adopted. Testing with full-scale aircraft is a high-cost approach that yields high-confidence results, whereas simulation provides a low-risk, low-cost approach with reduced confidence in the results. One possible approach to increase the confidence of the results while simultaneously reducing risk and cost is using unmanned sub-scale aircraft to test new ATM concepts. This paper presents simulation results of using unmanned sub-scale aircraft to implement ATM concepts, compared with full-scale aircraft. The simulation results show that the performance of the sub-scale aircraft is quite comparable to that of the full-scale aircraft, which validates the use of sub-scale aircraft in testing new ATM concepts.

  2. Seasonal-Scale Optimization of Conventional Hydropower Operations in the Upper Colorado System

    NASA Astrophysics Data System (ADS)

    Bier, A.; Villa, D.; Sun, A.; Lowry, T. S.; Barco, J.

    2011-12-01

    Sandia National Laboratories is developing the Hydropower Seasonal Concurrent Optimization for Power and the Environment (Hydro-SCOPE) tool to examine basin-wide conventional hydropower operations at seasonal time scales. This tool is part of an integrated, multi-laboratory project designed to explore different aspects of optimizing conventional hydropower operations. The Hydro-SCOPE tool couples a one-dimensional reservoir model with a river routing model to simulate hydrology and water quality. An optimization engine wraps around this model framework to solve for long-term operational strategies that best meet the specific objectives of the hydrologic system while honoring operational and environmental constraints. The optimization routines are provided by Sandia's open source DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) software. Hydro-SCOPE allows for multi-objective optimization, which can be used to gain insight into the trade-offs that must be made between objectives. The Hydro-SCOPE tool is being applied to the Upper Colorado Basin hydrologic system. This system contains six reservoirs, each with its own set of objectives (such as maximizing revenue, optimizing environmental indicators, meeting water use needs, or other objectives) and constraints. This leads to a large optimization problem with strong connectedness between objectives. The systems-level approach used by the Hydro-SCOPE tool allows simultaneous analysis of these objectives, as well as understanding of potential trade-offs related to different objectives and operating strategies. The seasonal-scale tool will be tightly integrated with the other components of this project, which examine day-ahead and real-time planning, environmental performance, hydrologic forecasting, and plant efficiency.

  3. Bootstrapping Methods Applied for Simulating Laboratory Works

    ERIC Educational Resources Information Center

    Prodan, Augustin; Campean, Remus

    2005-01-01

    Purpose: The aim of this work is to implement bootstrapping methods into software tools, based on Java. Design/methodology/approach: This paper presents a category of software e-tools aimed at simulating laboratory works and experiments. Findings: Both students and teaching staff use traditional statistical methods to infer the truth from sample…

  4. Integrated simulation of continuous-scale and discrete-scale radiative transfer in metal foams

    NASA Astrophysics Data System (ADS)

    Xia, Xin-Lin; Li, Yang; Sun, Chuang; Ai, Qing; Tan, He-Ping

    2018-06-01

    A novel integrated simulation of radiative transfer in metal foams is presented. It integrates the continuous-scale simulation with the direct discrete-scale simulation in a single computational domain, relying on the coupling of the real discrete-scale foam geometry with the equivalent continuous-scale medium through a specially defined scale-coupled zone. This zone holds continuous but nonhomogeneous volumetric radiative properties. The scale-coupled approach is compared to the traditional continuous-scale approach, which uses volumetric radiative properties in the equivalent participating medium, and to the direct discrete-scale approach, which employs the real 3D foam geometry obtained by computed tomography. All the analyses are based on geometrical optics. The Monte Carlo ray-tracing procedure is used for computations of the absorbed radiative fluxes and the apparent radiative behaviors of metal foams. The results obtained by the three approaches are in reasonable agreement. The scale-coupled approach is fully validated for calculating the apparent radiative behaviors of metal foams composed of struts ranging from very absorbing to very reflective and from very rough to very smooth. The new approach reduces computational time by approximately one order of magnitude compared to the direct discrete-scale approach, while offering information on the local geometry-dependent behavior together with the equivalent continuous behavior in a single integrated simulation. It thus promises to combine the advantages of the continuous-scale approach (rapid calculations) and the direct discrete-scale approach (accurate prediction of local radiative quantities).
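The Monte Carlo ray-tracing idea can be sketched in its simplest form: a purely absorbing 1D slab, where sampled exponential free paths reproduce the Beer-Lambert absorption law. The actual simulations treat 3D foam geometry and reflective struts; this is only the core sampling step, with names of my own choosing:

```python
import math, random

def mc_absorbed_fraction(kappa, path_len, n_rays=100_000, seed=1):
    """Monte Carlo estimate of the fraction of rays absorbed while
    crossing a purely absorbing slab: sample exponential free paths
    s = -ln(u)/kappa and count rays with s < path_len.  The estimate
    should approach the Beer-Lambert value 1 - exp(-kappa * L)."""
    rng = random.Random(seed)
    absorbed = sum(1 for _ in range(n_rays)
                   if -math.log(rng.random()) / kappa < path_len)
    return absorbed / n_rays

est = mc_absorbed_fraction(kappa=2.0, path_len=0.5)
exact = 1.0 - math.exp(-1.0)
print(est, exact)  # estimate close to ~0.632
```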

  5. GSFC Space Simulation Laboratory Contamination Philosophy: Efficient Space Simulation Chamber Cleaning Techniques

    NASA Technical Reports Server (NTRS)

    Roman, Juan A.; Stitt, George F.; Roman, Felix R.

    1997-01-01

    This paper will provide a general overview of the molecular contamination philosophy of the Space Simulation Test Engineering Section and how the National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC) space simulation laboratory controls and maintains the cleanliness of all its facilities, thereby minimizing downtime between tests. It will also briefly cover the proper selection of, and safety precautions needed when using, chemical solvents for wiping, washing, or spraying thermal shrouds when molecular contaminants rise to unacceptable background levels.

  6. Laboratory simulation of cratering on small bodies

    NASA Technical Reports Server (NTRS)

    Schmidt, Robert M.

    1991-01-01

    A new technique using external pressure was developed to simulate the lithostatic pressure due to self-gravity of small bodies. A 13-in. diameter cylindrical test chamber with an L/D of 1 was fabricated to accommodate firing explosive charges at gas overpressures of up to 6000 psi. The chamber was hydrotested to 9000 psi. The method allows much larger scale factors than can be obtained with existing centrifuges and has the correct spherical geometry of self-gravity. A simulant for jointed rock to be used in this fixture was developed using weakly cemented basalt. Various strength/pressure scaling theories can now be examined and tested.

  7. Global ice sheet/RSL simulations using the higher-order Ice Sheet System Model.

    NASA Astrophysics Data System (ADS)

    Larour, E. Y.; Ivins, E. R.; Adhikari, S.; Schlegel, N.; Seroussi, H. L.; Morlighem, M.

    2017-12-01

    Relative sea-level (RSL) rise is driven by processes that are intimately linked to the evolution of glacial areas and ice sheets in particular. So far, most Earth System models capable of projecting the evolution of RSL on decadal to centennial time scales have relied on offline interactions between RSL and ice sheets. In particular, grounding line and calving front dynamics have not been modeled in a way that is tightly coupled with Elasto-Static Adjustment (ESA) and/or Glacial-Isostatic Adjustment (GIA). Here, we present a new simulation of the entire Earth System in which both the Greenland and Antarctic ice sheets are tightly coupled to an RSL model that includes both ESA and GIA, at resolutions and time scales compatible with processes such as grounding line dynamics for Antarctic ice shelves and calving front dynamics for Greenland marine-terminating glaciers. The simulations rely on the Ice Sheet System Model (ISSM) and show the impact of higher-order ice flow dynamics and coupling feedbacks between ice flow and RSL. We quantify the exact impact of ESA and GIA inclusion on grounding line evolution for large ice shelves such as the Ronne and Ross ice shelves, as well as the Agasea Embayment ice streams, and demonstrate how offline vs. online RSL simulations diverge in the long run, and the consequences for predictions of sea-level rise. This work was performed at the California Institute of Technology's Jet Propulsion Laboratory under a contract with the National Aeronautics and Space Administration's Cryosphere Science Program.

  8. User's manual for flight Simulator Display System (FSDS)

    NASA Technical Reports Server (NTRS)

    Egerdahl, C. C.

    1979-01-01

    The capabilities of the Flight Simulator Display System (FSDS) are described. FSDS is a color raster-scan display generator designed to meet the special needs of flight simulation laboratories. The FSDS can update (revise) the images it generates every 16.6 ms, with limited support from a host processor. This corresponds to the standard TV vertical rate of 60 Hertz and allows the system to carry out display functions in a time-critical environment. Rotation of a complex image in the television raster is possible with minimal hardware.

  9. Laboratories | Energy Systems Integration Facility | NREL

    Science.gov Websites

    The Energy Systems Integration Facility's laboratories can be safely divided into multiple test stand locations (or "capability hubs"). Laboratories include the Fabrication Laboratory, Energy Systems High-Pressure Test Laboratory, Energy Systems Integration Laboratory, Energy Systems Sensor Laboratory, Fuel Cell Development and Test Laboratory, and High-Performance Computing facilities.

  10. Large Eddy Simulations of Colorless Distributed Combustion Systems

    NASA Astrophysics Data System (ADS)

    Abdulrahman, Husam F.; Jaberi, Farhad; Gupta, Ashwani

    2014-11-01

    Development of efficient and low-emission colorless distributed combustion (CDC) systems for gas turbine applications requires careful examination of the role of various flow and combustion parameters. Numerical simulations of CDC in a laboratory-scale combustor have been conducted to carefully examine the effects of these parameters on the CDC. The computational model is based on a hybrid modeling approach combining large eddy simulation (LES) with the filtered mass density function (FMDF) equations, solved with high-order numerical methods and complex chemical kinetics. The simulated combustor operates on the principle of high-temperature air combustion (HiTAC) and has been shown to significantly reduce NOx and CO emissions while improving the reaction pattern factor and stability, without using any flame stabilizer and with low pressure drop and noise. The focus of the current work is to investigate the mixing of air and hydrocarbon fuels and the non-premixed and premixed reactions within the combustor using the LES/FMDF with reduced chemical kinetic mechanisms, for the same flow conditions and configurations investigated experimentally. The main goal is to develop better CDC with higher mixing and efficiency, ultra-low emission levels, and optimum residence time. The computational results establish the consistency and reliability of LES/FMDF and its Lagrangian-Eulerian numerical methodology.

  11. Efficient parallelization of analytic bond-order potentials for large-scale atomistic simulations

    NASA Astrophysics Data System (ADS)

    Teijeiro, C.; Hammerschmidt, T.; Drautz, R.; Sutmann, G.

    2016-07-01

    Analytic bond-order potentials (BOPs) provide a way to compute atomistic properties with controllable accuracy. For large-scale computations of heterogeneous compounds at the atomistic level, both the computational efficiency and memory demand of BOP implementations have to be optimized. Since the evaluation of BOPs is a local operation within a finite environment, the parallelization concepts known from short-range interacting particle simulations can be applied to improve the performance of these simulations. In this work, several efficient parallelization methods for BOPs that use three-dimensional domain decomposition schemes are described. The schemes are implemented into the bond-order potential code BOPfox, and their performance is measured in a series of benchmarks. Systems of up to several millions of atoms are simulated on a high performance computing system, and parallel scaling is demonstrated for up to thousands of processors.
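The core of a three-dimensional domain decomposition is mapping each atom to the rank that owns its spatial cell. A schematic sketch of that mapping (of the scheme in general, not the BOPfox implementation; names are mine):

```python
def rank_of_atom(pos, box, grid):
    """Map an atom position to the rank owning its cell when the
    simulation box is split into grid[0] * grid[1] * grid[2] equal
    subdomains.  Positions on the upper box face are clamped into
    the last cell along each axis."""
    idx = [min(int(pos[d] / box[d] * grid[d]), grid[d] - 1)
           for d in range(3)]
    gx, gy, gz = grid
    ix, iy, iz = idx
    # Row-major linearization of the 3D cell index into a rank id.
    return (ix * gy + iy) * gz + iz

box = (10.0, 10.0, 10.0)
grid = (2, 2, 2)                                  # 8 ranks
print(rank_of_atom((1.0, 6.0, 9.0), box, grid))   # rank 3
```

Because BOP evaluation is local to a finite environment, each rank then only needs its own atoms plus a halo of neighbors from adjacent subdomains.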

  12. MARIKA - A model revision system using qualitative analysis of simulations [of the human orientation system]

    NASA Technical Reports Server (NTRS)

    Groleau, Nicolas; Frainier, Richard; Colombano, Silvano; Hazelton, Lyman; Szolovits, Peter

    1993-01-01

    This paper describes portions of a novel system called MARIKA (Model Analysis and Revision of Implicit Key Assumptions) to automatically revise a model of the normal human orientation system. The revision is based on analysis of discrepancies between experimental results and computer simulations. The discrepancies are calculated from qualitative analysis of quantitative simulations. The experimental and simulated time series are first discretized into time segments. Each segment is then approximated by linear combinations of simple shapes. The domain theory and knowledge are represented as a constraint network. Incompatibilities detected during constraint propagation within the network yield both parameter and structural model alterations. Interestingly, MARIKA diagnosed a data set from the Massachusetts Eye and Ear Infirmary Vestibular Laboratory as abnormal though the data was tagged as normal. Published results from other laboratories confirmed the finding. These encouraging results could lead to a useful clinical vestibular tool and to a scientific discovery system for space vestibular adaptation.

  13. Comparing field investigations with laboratory models to predict landfill leachate emissions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fellner, Johann; Doeberl, Gernot; Allgaier, Gerhard

    2009-06-15

    Investigations into laboratory reactors and landfills are used for simulating and predicting emissions from municipal solid waste landfills. We examined water flow and solute transport through the same waste body at different volumetric scales (laboratory experiment: 0.08 m³; landfill: 80,000 m³), and assessed the differences in water flow and leachate emissions of chloride, total organic carbon and Kjeldahl nitrogen. The results indicate that, due to preferential pathways, the flow of water in field-scale landfills is less uniform than in laboratory reactors. Based on tracer experiments, it can be discerned that in laboratory-scale experiments around 40% of pore water participates in advective solute transport, whereas this fraction amounts to less than 0.2% in the investigated full-scale landfill. Consequences of the difference in water flow and moisture distribution are: (1) leachate emissions from full-scale landfills decrease faster than predicted by laboratory experiments, and (2) the stock of materials remaining in the landfill body, and thus the long-term emission potential, is likely to be underestimated by laboratory landfill simulations.

  14. VIC-CropSyst-v2: A regional-scale modeling platform to simulate the nexus of climate, hydrology, cropping systems, and human decisions

    NASA Astrophysics Data System (ADS)

    Malek, Keyvan; Stöckle, Claudio; Chinnayakanahalli, Kiran; Nelson, Roger; Liu, Mingliang; Rajagopalan, Kirti; Barik, Muhammad; Adam, Jennifer C.

    2017-08-01

    Food supply is affected by a complex nexus of land, atmosphere, and human processes, including short- and long-term stressors (e.g., drought and climate change, respectively). A simulation platform that captures these complex elements can be used to inform policy and best management practices to promote sustainable agriculture. We have developed a tightly coupled framework using the macroscale variable infiltration capacity (VIC) hydrologic model and the CropSyst agricultural model. A mechanistic irrigation module was also developed for inclusion in this framework. Because VIC-CropSyst combines two widely used and mechanistic models (for crop phenology, growth, management, and macroscale hydrology), it can provide realistic and hydrologically consistent simulations of water availability, crop water requirements for irrigation, and agricultural productivity for both irrigated and dryland systems. This allows VIC-CropSyst to provide managers and decision makers with reliable information on regional water stresses and their impacts on food production. Additionally, VIC-CropSyst is being used in conjunction with socioeconomic models, river system models, and atmospheric models to simulate feedback processes between regional water availability, agricultural water management decisions, and land-atmosphere interactions. The performance of VIC-CropSyst was evaluated on both regional (over the US Pacific Northwest) and point scales. Point-scale evaluation involved using two flux tower sites located in agricultural fields in the US (Nebraska and Illinois). The agreement between recorded and simulated evapotranspiration (ET), applied irrigation water, soil moisture, leaf area index (LAI), and yield indicated that, although the model is intended to work on regional scales, it also captures field-scale processes in agricultural areas.

  15. Error simulation of paired-comparison-based scaling methods

    NASA Astrophysics Data System (ADS)

    Cui, Chengwu

    2000-12-01

    Subjective image quality measurement usually resorts to psychophysical scaling. However, it is difficult to evaluate the inherent precision of these scaling methods, and without knowing the potential errors of the measurement, subsequent use of the data can be misleading. In this paper, the errors on scaled values derived from paired-comparison-based scaling methods are simulated with a randomly introduced proportion of choice errors that follow the binomial distribution. Simulation results are given for various combinations of the number of stimuli and the sampling size. The errors are presented as the average standard deviation of the scaled values and can be fitted reasonably well with an empirical equation that can be used for scaling-error estimation and measurement design. The simulation shows that paired-comparison-based scaling methods can have large errors on the derived scaled values when the sampling size and the number of stimuli are small. Examples are also given to show the potential errors on actually scaled values of color image prints as measured by the method of paired comparison.
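    The procedure described can be sketched as a small Monte Carlo experiment in the style of Thurstone Case V scaling: binomially distributed choice errors perturb the observed proportions, which are converted back to scale values via the inverse normal CDF. The scale values and sample sizes below are illustrative, not those of the paper:

```python
import random
from statistics import NormalDist

nd = NormalDist()

def simulate_scaling_error(true_scale, n_judges, n_reps=200, rng=random.Random(1)):
    """RMS error of recovered scale values; each judge's choice in a pair is a
    Bernoulli draw with p = Phi(scale difference), so the number of choice
    errors per pair follows the binomial distribution."""
    k = len(true_scale)
    errs = []
    for _ in range(n_reps):
        z = [[0.0] * k for _ in range(k)]
        for i in range(k):
            for j in range(k):
                if i == j:
                    continue
                p_true = nd.cdf(true_scale[i] - true_scale[j])
                wins = sum(rng.random() < p_true for _ in range(n_judges))
                # keep the observed proportion away from 0/1 before z-transform
                p_hat = min(max(wins / n_judges, 0.5 / n_judges),
                            1 - 0.5 / n_judges)
                z[i][j] = nd.inv_cdf(p_hat)
        est = [sum(row) / k for row in z]      # row means give the scale values
        m_est = sum(est) / k
        m_true = sum(true_scale) / k           # centre both scales to compare
        errs.extend((e - m_est) - (t - m_true) for e, t in zip(est, true_scale))
    return (sum(e * e for e in errs) / len(errs)) ** 0.5

small = simulate_scaling_error([0.0, 0.3, 0.6, 0.9], n_judges=10)
large = simulate_scaling_error([0.0, 0.3, 0.6, 0.9], n_judges=100)
assert large < small   # error shrinks as the sampling size grows
```

    Running such a sweep over the number of stimuli and judges reproduces the qualitative finding above: small panels and few stimuli yield large scale-value errors.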

  16. Particle-based simulation of charge transport in discrete-charge nano-scale systems: the electrostatic problem

    PubMed Central

    2012-01-01

    The fast and accurate computation of the electric forces that drive the motion of charged particles at the nanometer scale represents a computational challenge. For this kind of system, where the discrete nature of the charges cannot be neglected, boundary element methods (BEM) represent a better approach than finite differences/finite elements methods. In this article, we compare two different BEM approaches to a canonical electrostatic problem in a three-dimensional space with inhomogeneous dielectrics, emphasizing their suitability for particle-based simulations: the iterative method proposed by Hoyles et al. and the Induced Charge Computation introduced by Boda et al. PMID:22338640

  17. Particle-based simulation of charge transport in discrete-charge nano-scale systems: the electrostatic problem.

    PubMed

    Berti, Claudio; Gillespie, Dirk; Eisenberg, Robert S; Fiegna, Claudio

    2012-02-16

    The fast and accurate computation of the electric forces that drive the motion of charged particles at the nanometer scale represents a computational challenge. For this kind of system, where the discrete nature of the charges cannot be neglected, boundary element methods (BEM) represent a better approach than finite differences/finite elements methods. In this article, we compare two different BEM approaches to a canonical electrostatic problem in a three-dimensional space with inhomogeneous dielectrics, emphasizing their suitability for particle-based simulations: the iterative method proposed by Hoyles et al. and the Induced Charge Computation introduced by Boda et al.

  18. Lightweight computational steering of very large scale molecular dynamics simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beazley, D.M.; Lomdahl, P.S.

    1996-09-01

    We present a computational steering approach for controlling, analyzing, and visualizing very large scale molecular dynamics simulations involving tens to hundreds of millions of atoms. Our approach relies on extensible scripting languages and an easy-to-use tool for building extensions and modules. The system is extremely easy to modify, works with existing C code, is memory efficient, and can be used from inexpensive workstations and networks. We demonstrate how we have used this system to manipulate data from production MD simulations involving as many as 104 million atoms running on the CM-5 and Cray T3D. We also show how this approach can be used to build systems that integrate common scripting languages (including Tcl/Tk, Perl, and Python), simulation code, user extensions, and commercial data analysis packages.
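    The steering idea, a scripting layer with hooks into a running simulation loop, can be sketched generically. The class and hook API below are hypothetical illustrations, not the authors' CM-5/T3D code:

```python
# Minimal sketch of scripting-based computational steering: the simulation
# loop exposes hooks that user-supplied script functions use to inspect and
# modify a running MD calculation (here a single harmonic oscillator).

class SteerableMD:
    def __init__(self, x=1.0, v=0.0, k=1.0, dt=0.01):
        self.x, self.v, self.k, self.dt = x, v, k, dt
        self.step_count = 0
        self.hooks = []                  # user extensions registered at runtime

    def register(self, every, fn):
        """Call fn(sim) every `every` steps -- analysis or steering."""
        self.hooks.append((every, fn))

    def run(self, n_steps):
        for _ in range(n_steps):
            # velocity Verlet for F = -k x
            a = -self.k * self.x
            self.x += self.v * self.dt + 0.5 * a * self.dt ** 2
            a_new = -self.k * self.x
            self.v += 0.5 * (a + a_new) * self.dt
            self.step_count += 1
            for every, fn in self.hooks:
                if self.step_count % every == 0:
                    fn(self)             # hook may analyze or *change* the run

log = []
sim = SteerableMD()
sim.register(100, lambda s: log.append((s.step_count, s.x)))     # analysis
sim.register(500, lambda s: setattr(s, 'k', s.k * 1.1))          # steering
sim.run(1000)
assert len(log) == 10 and abs(sim.k - 1.21) < 1e-9
```

    The point of the paper's approach is that the hooks live in an embedded scripting language, so analysis and steering logic can be changed without recompiling the C simulation core.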

  19. Performance goals on simulators boost resident motivation and skills laboratory attendance.

    PubMed

    Stefanidis, Dimitrios; Acker, Christina E; Greene, Frederick L

    2010-01-01

    To assess the impact of setting simulator training goals on resident motivation and skills laboratory attendance. Residents followed a proficiency-based laparoscopic curriculum on the 5 Fundamentals of Laparoscopic Surgery and 9 virtual reality tasks. Training goals consisted of the average expert performance on each task + 2 SD (mandatory) and best expert performance (optional). Residents rated the impact of the training goals on their motivation on a 20-point visual analog scale. Performance and attendance data were analyzed and correlated (Spearman's). Data are reported as medians (range). General Surgery residency program at a regional referral Academic Medical Center. General surgery residents (n = 15). During the first 5 months of the curriculum, weekly attendance rate was 51% (range, 8-96). After 153 (range, 21-412) repetitions, resident speed improved by 97% (range, 18-230), errors improved by 17% (range, 0-24), and motion efficiency by 59% (range, 26-114) compared with their baseline. Nine (60%) residents achieved proficiency in 7 (range, 3-14) and the best goals in 3.5 (range, 1-9) tasks; the other 6 residents had attendance rates <30%. Residents rated the impact of setting performance goals on their motivation as 15 (range, 1-18) and setting a best goal as 13 (range, 1-18). Motivation ratings correlated positively with attendance rates, number of repetitions, performance improvement, and achievement of proficiency and best goals (r = 0.59-0.75; p < 0.05) but negatively with postgraduate year (PGY) (-0.67; p = 0.02). Setting training goals on simulators is associated with improved resident motivation to participate in a simulator curriculum. While more stringent goals may potentiate this effect, they have a limited impact on senior residents. Further research is needed to investigate ways to improve skills laboratory attendance. Copyright 2010 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  20. Using Interorganizational Partnerships to Strengthen Public Health Laboratory Systems

    PubMed Central

    Kimsey, Paul; Buehring, Gertrude

    2013-01-01

    Due to the current economic environment, many local and state health departments are faced with budget reductions. Health department administrators and public health laboratory (PHL) directors need to assess strategies to ensure that their PHLs can provide the same level of service with decreased funds. Exploratory case studies of interorganizational partnerships among local PHLs in California were conducted to determine the impact on local PHL testing services and capacity. Our findings suggest that interorganizational forms of cooperation among local PHLs can help bolster laboratory capacity by capturing economies of scale, leveraging scarce resources, and ensuring access to affordable, timely, and quality laboratory testing services. Interorganizational partnerships will help local and state public health departments continue to maintain a strong and robust laboratory system that supports their role in communicable disease surveillance. PMID:23997305

  1. The Berkeley Environmental Simulation Laboratory: Its Use In Environmental Impact Assessment.

    ERIC Educational Resources Information Center

    Appleyard, Donald; And Others

    An environmental simulation laboratory at the University of California, Berkeley, is testing the adequacy of different techniques for simulating environmental experiences. Various levels of realism, with various costs, are available in different presentation modes. The simulations can aid in communication about and the resolution of environmental…

  2. An Evaluation of Student Perceptions of Screen Presentations in Computer-based Laboratory Simulations.

    ERIC Educational Resources Information Center

    Edward, Norrie S.

    1997-01-01

    Evaluates the importance of realism in the screen presentation of the plant in computer-based laboratory simulations for part-time engineering students. Concludes that simulations are less effective than actual laboratories but that realism minimizes the disadvantages. The schematic approach was preferred for ease of use. (AIM)

  3. Large-Scale Modeling of Epileptic Seizures: Scaling Properties of Two Parallel Neuronal Network Simulation Algorithms

    DOE PAGES

    Pesce, Lorenzo L.; Lee, Hyong C.; Hereld, Mark; ...

    2013-01-01

    Our limited understanding of the relationship between the behavior of individual neurons and large neuronal networks is an important limitation in current epilepsy research and may be one of the main causes of our inadequate ability to treat it. Addressing this problem directly via experiments is impossibly complex; thus, we have been developing and studying medium-large-scale simulations of detailed neuronal networks to guide us. Flexibility in the connection schemas and a complete description of the cortical tissue seem necessary for this purpose. In this paper we examine some of the basic issues encountered in these multiscale simulations. We have determined the detailed behavior of two such simulators on parallel computer systems. The observed memory and computation-time scaling behavior for a distributed memory implementation were very good over the range studied, both in terms of network sizes (2,000 to 400,000 neurons) and processor pool sizes (1 to 256 processors). Our simulations required between a few megabytes and about 150 gigabytes of RAM and lasted between a few minutes and about a week, well within the capability of most multinode clusters. Therefore, simulations of epileptic seizures on networks with millions of cells should be feasible on current supercomputers.

  4. Laboratory evaluation of Fecker and Loral optical IR PWI systems

    NASA Technical Reports Server (NTRS)

    Gorstein, M.; Hallock, J. N.; Houten, M.; Mcwilliams, I. G.

    1971-01-01

    A previous flight test of two electro-optical pilot warning indicators, using a flashing xenon strobe and silicon detectors as cooperative elements, pointed out several design deficiencies. The present laboratory evaluation program corrected these faults and calibrated the sensitivity of both systems in azimuth elevation and range. The laboratory tests were performed on an optical bench and consisted of three basic components: (1) a xenon strobe lamp whose output is monitored at the indicator detector to give pulse to pulse information on energy content at the receiver; (2) a strobe light attenuating optical system which is calibrated photometrically to provide simulated range; and (3) a positioning table on which the indicator system under study is mounted and which provides spatial location coordinates for all data points. The test results for both systems are tabulated.

  5. Gyrokinetic Simulations of Transport Scaling and Structure

    NASA Astrophysics Data System (ADS)

    Hahm, Taik Soo

    2001-10-01

    There is accumulating evidence from global gyrokinetic particle simulations with profile variations and experimental fluctuation measurements that microturbulence, with its time-averaged eddy size which scales with the ion gyroradius, can cause ion thermal transport which deviates from the gyro-Bohm scaling. The physics here can be best addressed by large scale (rho* = rho_i/a = 0.001) full torus gyrokinetic particle-in-cell turbulence simulations using our massively parallel, general geometry gyrokinetic toroidal code with field-aligned mesh. Simulation results from device-size scans for realistic parameters show that the ``wave transport'' mechanism is not the dominant contribution for this Bohm-like transport and that transport is mostly diffusive, driven by microscopic scale fluctuations in the presence of self-generated zonal flows. In this work, we analyze the turbulence and zonal flow statistics from simulations and compare to nonlinear theoretical predictions including the radial decorrelation of the transport events by zonal flows and the resulting probability distribution function (PDF). In particular, possible deviation of the characteristic radial size of transport processes from the time-averaged radial size of the density fluctuation eddies will be critically examined.

  6. Large-Scale Optimization for Bayesian Inference in Complex Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Willcox, Karen; Marzouk, Youssef

    2013-11-12

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focused on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. The project was a collaborative effort among MIT, the University of Texas at Austin, Georgia Institute of Technology, and Sandia National Laboratories. The research was directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. The MIT--Sandia component of the SAGUARO Project addressed the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas--Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as ``reduce then sample'' and ``sample then reduce.'' In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve

  7. Probabilistic simulation of multi-scale composite behavior

    NASA Technical Reports Server (NTRS)

    Liaw, D. G.; Shiao, M. C.; Singhal, S. N.; Chamis, Christos C.

    1993-01-01

    A methodology is developed to computationally assess the probabilistic composite material properties at all composite scale levels due to the uncertainties in the constituent (fiber and matrix) properties and in the fabrication process variables. The methodology is computationally efficient for simulating the probability distributions of material properties. The sensitivity of the probabilistic composite material property to each random variable is determined. This information can be used to reduce undesirable uncertainties in material properties at the macro scale of the composite by reducing the uncertainties in the most influential random variables at the micro scale. This methodology was implemented into the computer code PICAN (Probabilistic Integrated Composite ANalyzer). The accuracy and efficiency of this methodology are demonstrated by simulating the uncertainties in the material properties of a typical laminate and comparing the results with the Monte Carlo simulation method. The experimental data of composite material properties at all scales fall within the scatters predicted by PICAN.
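    The core Monte Carlo step, propagating constituent-level uncertainty up to a macro-scale composite property, can be sketched with the longitudinal rule of mixtures. The micromechanics relation is standard, but the nominal values and spreads below are illustrative assumptions, not PICAN's inputs:

```python
import random

def longitudinal_modulus(Ef, Em, Vf):
    """Rule of mixtures: micro-scale constituent properties -> ply modulus."""
    return Vf * Ef + (1.0 - Vf) * Em

def monte_carlo(n=20000, rng=random.Random(42)):
    """Propagate uncertainty in fiber modulus, matrix modulus, and fiber
    volume fraction to the macro-scale longitudinal modulus (GPa)."""
    samples = []
    for _ in range(n):
        Ef = rng.gauss(230.0, 10.0)   # fiber modulus, GPa (illustrative)
        Em = rng.gauss(3.5, 0.3)      # matrix modulus, GPa (illustrative)
        Vf = rng.gauss(0.60, 0.02)    # fiber volume fraction (illustrative)
        samples.append(longitudinal_modulus(Ef, Em, Vf))
    mean = sum(samples) / n
    var = sum((s - mean) ** 2 for s in samples) / (n - 1)
    return mean, var ** 0.5

mean, std = monte_carlo()
nominal = longitudinal_modulus(230.0, 3.5, 0.60)   # 139.4 GPa
assert abs(mean - nominal) < 1.0
```

    Repeating the sampling while holding one input fixed at its nominal value gives the sensitivity information the abstract mentions: the drop in output scatter identifies the most influential micro-scale random variable.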

  8. Vacuum packing: a model system for laboratory-scale silage fermentations.

    PubMed

    Johnson, H E; Merry, R J; Davies, D R; Kell, D B; Theodorou, M K; Griffith, G W

    2005-01-01

    To determine the utility of vacuum-packed polythene bags as a convenient, flexible and cost-effective alternative to fixed volume glass vessels for lab-scale silage studies. Using perennial ryegrass or red clover forage, similar fermentations (as assessed by pH measurement) occurred in glass tube and vacuum-packed silos over a 35-day period. As vacuum-packing devices allow modification of initial packing density, the effect of four different settings (initial packing densities of 0.397, 0.435, 0.492 and 0.534 g cm(-3)) on the silage fermentation over 16 days was examined. Significant differences in pH decline and lactate accumulation were observed at different vacuum settings. Gas accumulation was apparent within all bags, and changes in bag volume with time were observed to vary according to initial packing density. Vacuum-packed silos do provide a realistic model system for lab-scale silage fermentations. Use of vacuum-packed silos holds potential for lab-scale evaluations of silage fermentations, allowing higher throughput of samples and more consistent packing, as well as the possibility of investigating the effects of different initial packing densities and use of different wrapping materials.

  9. Laboratory evaluation and application of microwave absorption properties under simulated conditions for planetary atmospheres

    NASA Technical Reports Server (NTRS)

    Steffes, Paul G.

    1988-01-01

    Radio absorptivity data for planetary atmospheres obtained from spacecraft radio occultation experiments and earth-based radio astronomical observations can be used to infer abundances of microwave absorbing atmospheric constituents in those atmospheres, as long as reliable information regarding the microwave absorbing properties of potential constituents is available. The key activity for this grant year has continued to be laboratory measurements of the microwave and millimeter-wave properties of the simulated atmospheres of the outer planets and their satellites. A Fabry-Perot spectrometer system capable of operation from 32 to 41 GHz was developed. Initially this spectrometer was used to complete laboratory measurements of the 7.5 to 9.3 mm absorption spectrum of ammonia. Laboratory measurements were begun at wavelengths near 3.2 mm, where a large number of observations of the emission from the outer planets were made. A description of this system is presented.

  10. Monte Carlo capabilities of the SCALE code system

    DOE PAGES

    Rearden, Bradley T.; Petrie, Jr., Lester M.; Peplow, Douglas E.; ...

    2014-09-12

    SCALE is a broadly used suite of tools for nuclear systems modeling and simulation that provides comprehensive, verified and validated, user-friendly capabilities for criticality safety, reactor physics, radiation shielding, and sensitivity and uncertainty analysis. For more than 30 years, regulators, licensees, and research institutions around the world have used SCALE for nuclear safety analysis and design. SCALE provides a “plug-and-play” framework that includes three deterministic and three Monte Carlo radiation transport solvers that can be selected based on the desired solution, including hybrid deterministic/Monte Carlo simulations. SCALE includes the latest nuclear data libraries for continuous-energy and multigroup radiation transport as well as activation, depletion, and decay calculations. SCALE’s graphical user interfaces assist with accurate system modeling, visualization, and convenient access to desired results. SCALE 6.2 will provide several new capabilities and significant improvements in many existing features, especially with expanded continuous-energy Monte Carlo capabilities for criticality safety, shielding, depletion, and sensitivity and uncertainty analysis. Finally, an overview of the Monte Carlo capabilities of SCALE is provided here, with emphasis on new features for SCALE 6.2.

  11. Biodegradation modelling of a dissolved gasoline plume applying independent laboratory and field parameters

    NASA Astrophysics Data System (ADS)

    Schirmer, Mario; Molson, John W.; Frind, Emil O.; Barker, James F.

    2000-12-01

    Biodegradation of organic contaminants in groundwater is a microscale process which is often observed on scales of 100s of metres or larger. Unfortunately, there are no known equivalent parameters for characterizing the biodegradation process at the macroscale as there are, for example, in the case of hydrodynamic dispersion. Zero- and first-order degradation rates estimated at the laboratory scale by model fitting generally overpredict the rate of biodegradation when applied to the field scale because limited electron acceptor availability and microbial growth are not considered. On the other hand, field-estimated zero- and first-order rates are often not suitable for predicting plume development because they may oversimplify or neglect several key field scale processes, phenomena and characteristics. This study uses the numerical model BIO3D to link the laboratory and field scales by applying laboratory-derived Monod kinetic degradation parameters to simulate a dissolved gasoline field experiment at the Canadian Forces Base (CFB) Borden. All input parameters were derived from independent laboratory and field measurements or taken from the literature a priori to the simulations. The simulated results match the experimental results reasonably well without model calibration. A sensitivity analysis on the most uncertain input parameters showed only a minor influence on the simulation results. Furthermore, it is shown that the flow field, the amount of electron acceptor (oxygen) available, and the Monod kinetic parameters have a significant influence on the simulated results. It is concluded that laboratory-derived Monod kinetic parameters can adequately describe field scale degradation, provided all controlling factors are incorporated in the field scale model. These factors include advective-dispersive transport of multiple contaminants and electron acceptors and large-scale spatial heterogeneities.
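    The qualitative difference between simple first-order laboratory rates and Monod kinetics under electron-acceptor limitation can be sketched with a dual-Monod rate law. All parameter values below are hypothetical round numbers for illustration, not the calibrated BIO3D inputs:

```python
# Dual-Monod kinetics: substrate degradation is throttled by both substrate S
# and electron-acceptor (oxygen) O availability, with microbial growth X --
# exactly the coupling that fitted zero-/first-order rates leave out.

def simulate(S=10.0, O=8.0, X=0.1, t_end=50.0, dt=0.01,
             k_max=2.0, Ks=0.5, Ko=0.2, Y=0.3, b=0.02, F=3.0):
    """Explicit-Euler integration of substrate S, oxygen O, biomass X (mg/L).
    F is the stoichiometric mass of oxygen consumed per mass of substrate."""
    history = []
    t = 0.0
    while t < t_end:
        rate = k_max * X * (S / (Ks + S)) * (O / (Ko + O))  # Monod x Monod
        S = max(S - rate * dt, 0.0)
        O = max(O - F * rate * dt, 0.0)        # acceptor consumed with substrate
        X = max(X + (Y * rate - b * X) * dt, 0.0)            # growth minus decay
        history.append((t, S, O, X))
        t += dt
    return history

hist = simulate()
t, S, O, X = hist[-1]
# Oxygen (8 mg/L at F = 3) can only support degrading ~2.7 mg/L of substrate,
# so degradation stalls with most of S left -- a first-order fit to the early
# data would wrongly predict S decaying toward zero.
assert O < 0.1 and S > 7.0
```

    This acceptor-limited stall is the mechanism behind the abstract's point that lab-fitted zero- and first-order rates overpredict field-scale biodegradation.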

  12. Data Services and Transnational Access for European Geosciences Multi-Scale Laboratories

    NASA Astrophysics Data System (ADS)

    Funiciello, Francesca; Rosenau, Matthias; Sagnotti, Leonardo; Scarlato, Piergiorgio; Tesei, Telemaco; Trippanera, Daniele; Spires, Chris; Drury, Martyn; Kan-Parker, Mirjam; Lange, Otto; Willingshofer, Ernst

    2016-04-01

    The EC policy for research in the new millennium supports the development of European-scale research infrastructures. In this perspective, existing research infrastructures are being integrated with the objective of increasing their accessibility and enhancing the usability of their multidisciplinary data. Building up integrated Earth Sciences infrastructures in Europe is the mission of the Implementation Phase (IP) of the European Plate Observing System (EPOS) project (2015-2019). The integration of European multiscale laboratories - analytical, experimental petrology and volcanology, magnetic and analogue laboratories - plays a key role in this context and represents a specific task of EPOS IP. In the frame of EPOS IP work package 16 (WP16), European geosciences multiscale laboratories are to be linked, merging local infrastructures into a coherent and collaborative network. In particular, EPOS IP WP16 task 4, "Data services", aims to standardize data and data products, already existing and newly produced by the participating laboratories, and to make them available through a new digital platform.
The following data and repositories have been selected for the purpose: 1) analytical and properties data a) on volcanic ash from explosive eruptions, of interest to the aviation industry, meteorological and government institutes, b) on magmas in the context of eruption and lava flow hazard evaluation, and c) on rock systems of key importance in mineral exploration and mining operations; 2) experimental data describing: a) rock and fault properties of importance for modelling and forecasting natural and induced subsidence, seismicity and associated hazards, b) rock and fault properties relevant for modelling the containment capacity of rock systems for CO2, energy sources and wastes, c) crustal and upper mantle rheology as needed for modelling sedimentary basin formation and crustal stress distributions, d) the composition, porosity, permeability, and

  13. Roles of laboratories and laboratory systems in effective tuberculosis programmes.

    PubMed

    Ridderhof, John C; van Deun, Armand; Kam, Kai Man; Narayanan, P R; Aziz, Mohamed Abdul

    2007-05-01

    Laboratories and laboratory networks are a fundamental component of tuberculosis (TB) control, providing testing for diagnosis, surveillance and treatment monitoring at every level of the health-care system. New initiatives and resources to strengthen laboratory capacity and implement rapid and new diagnostic tests for TB will require recognition that laboratories are systems that require quality standards, appropriate human resources, and attention to safety in addition to supplies and equipment. To prepare the laboratory networks for new diagnostics and expanded capacity, we need to focus efforts on strengthening quality management systems (QMS) through additional resources for external quality assessment programmes for microscopy, culture, drug susceptibility testing (DST) and molecular diagnostics. QMS should also promote development of accreditation programmes to ensure adherence to standards to improve both the quality and credibility of the laboratory system within TB programmes. Corresponding attention must be given to addressing human resources at every level of the laboratory, with special consideration being given to new programmes for laboratory management and leadership skills. Strengthening laboratory networks will also involve setting up partnerships between TB programmes and those seeking to control other diseases in order to pool resources and to promote advocacy for quality standards, to develop strategies to integrate laboratories functions and to extend control programme activities to the private sector. Improving the laboratory system will assure that increased resources, in the form of supplies, equipment and facilities, will be invested in networks that are capable of providing effective testing to meet the goals of the Global Plan to Stop TB.

  14. Virtual geotechnical laboratory experiments using a simulator

    NASA Astrophysics Data System (ADS)

    Penumadu, Dayakar; Zhao, Rongda; Frost, David

    2000-04-01

    The details of a test simulator that provides a realistic environment for performing virtual laboratory experiments in soil mechanics are presented. A computer program, Geo-Sim, that can be used to perform virtual experiments and allows real-time observation of material response is presented. The results of experiments, for a given set of input parameters, are obtained with the test simulator using well-trained artificial-neural-network-based soil models for different soil types and stress paths. Multimedia capabilities are integrated in Geo-Sim, using software that links and controls a laser disc player with real-time parallel processing ability. During the simulation of a virtual experiment, relevant portions of the video image of a previously recorded test on an actual soil specimen are displayed along with the graphical presentation of the response from the feedforward ANN model predictions. The pilot simulator developed to date includes all aspects related to performing a triaxial test on cohesionless soil under undrained and drained conditions. The benefits of the test simulator are also presented.

  15. Development of a continuous motorcycle protection barrier system using computer simulation and full-scale crash testing.

    PubMed

    Atahan, Ali O; Hiekmann, J Marten; Himpe, Jeffrey; Marra, Joseph

    2018-07-01

    Road restraint systems are designed to minimize the undesirable effects of roadside accidents and improve the safety of road users. These systems are placed at either side or in the median section of roads to contain and redirect errant vehicles. Although restraint systems are mainly designed against car, truck and bus impacts, there is increasing pressure from the motorcycle industry to incorporate motorcycle protection systems into them. In this paper, the development details of a new and versatile motorcycle barrier, the CMPS, coupled with an existing vehicle barrier, are presented. The CMPS is intended to safely contain and redirect motorcyclists during a collision event. First, the crash performance of the CMPS design is evaluated by means of the three-dimensional computer simulation program LS-DYNA. Then full-scale crash tests are used to verify the acceptability of the CMPS design. Crash tests were performed at the CSI proving ground facility using a motorcycle dummy in accordance with the prEN 1317-8 specification. Full-scale crash test results show that the CMPS is able to successfully contain and redirect the dummy with minimal injury risk. Damage to the barrier is also minimal, proving the robustness of the CMPS design. Based on the test findings and further review by the authorities, the implementation of the CMPS on highway systems was recommended. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Energy Systems Sensor Laboratory | Energy Systems Integration Facility |

    Science.gov Websites

    NREL Sensor Laboratory Energy Systems Sensor Laboratory The Energy Systems Integration Facility's Energy Systems Sensor Laboratory is designed to support research, development, testing, and evaluation of advanced hydrogen sensor technologies to support the needs of the emerging hydrogen

  17. Cosmological neutrino simulations at extreme scale

    DOE PAGES

    Emberson, J. D.; Yu, Hao-Ran; Inman, Derek; ...

    2017-08-01

    Constraining neutrino mass remains an elusive challenge in modern physics. Precision measurements are expected from several upcoming cosmological probes of large-scale structure. Achieving this goal relies on an equal level of precision from theoretical predictions of neutrino clustering. Numerical simulations of the non-linear evolution of cold dark matter and neutrinos play a pivotal role in this process. We incorporate neutrinos into the cosmological N-body code CUBEP3M and discuss the challenges associated with pushing to the extreme scales demanded by the neutrino problem. We highlight code optimizations made to exploit modern high performance computing architectures and present a novel method of data compression that reduces the phase-space particle footprint from 24 bytes in single precision to roughly 9 bytes. We scale the neutrino problem to the Tianhe-2 supercomputer and provide details of our production run, named TianNu, which uses 86% of the machine (13,824 compute nodes). With a total of 2.97 trillion particles, TianNu is currently the world's largest cosmological N-body simulation and improves upon previous neutrino simulations by two orders of magnitude in scale. We finish with a discussion of the unanticipated computational challenges that were encountered during the TianNu runtime.
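
    The phase-space footprint reduction described above can be illustrated with a simple fixed-point scheme (a sketch only; the actual TianNu compression differs in detail, and CELL and VMAX are assumed scales): storing positions as 16-bit offsets within a coarse mesh cell and velocities as 8-bit quantized values gives 3x2 + 3x1 = 9 bytes per particle.

```python
import numpy as np

# Sketch of fixed-point phase-space compression (illustrative only).
CELL = 4.0     # coarse mesh cell size, grid units (assumption)
VMAX = 1000.0  # velocity clipping scale, km/s (assumption)

def compress(pos, vel):
    """24-byte (6 x float32) phase space -> 9 bytes per particle:
    3 x uint16 position offsets within a coarse cell + 3 x int8 velocities.
    The cell index itself is recoverable from a cell-ordered particle list."""
    cell_idx = np.floor(pos / CELL).astype(np.int64)
    frac = (pos - cell_idx * CELL) / CELL                   # in [0, 1)
    offs = np.minimum(frac * 65536.0, 65535.0).astype(np.uint16)
    qvel = np.clip(np.round(vel / VMAX * 127.0), -127, 127).astype(np.int8)
    return cell_idx, offs, qvel

def decompress(cell_idx, offs, qvel):
    pos = cell_idx * CELL + (offs.astype(np.float64) + 0.5) / 65536.0 * CELL
    vel = qvel.astype(np.float64) / 127.0 * VMAX
    return pos, vel
```

    The round-trip position error is bounded by half a quantization step (CELL/131072), at the cost of coarsely quantized 8-bit velocities.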

  18. Investigating the dependence of SCM simulated precipitation and clouds on the spatial scale of large-scale forcing at SGP [Investigating the scale dependence of SCM simulated precipitation and cloud by using gridded forcing data at SGP]

    DOE PAGES

    Tang, Shuaiqi; Zhang, Minghua; Xie, Shaocheng

    2017-08-05

    Large-scale forcing data, such as vertical velocity and advective tendencies, are required to drive single-column models (SCMs), cloud-resolving models, and large-eddy simulations. Previous studies suggest that some errors in these model simulations could be attributed to the lack of spatial variability in the specified domain-mean large-scale forcing. This study investigates the spatial variability of the forcing and explores its impact on SCM-simulated precipitation and clouds. A gridded large-scale forcing data set from the March 2000 Cloud Intensive Operational Period at the Atmospheric Radiation Measurement program's Southern Great Plains site is used for analysis and to drive the single-column version of the Community Atmospheric Model Version 5 (SCAM5). When the gridded forcing data show large spatial variability, such as during a frontal passage, SCAM5 with the domain-mean forcing is not able to capture the convective systems that are partly located in the domain or that only occupy part of the domain. This problem is largely reduced by using the gridded forcing data, which allow SCAM5 to be run in each subcolumn with the results then averaged within the domain, because the subcolumns have a better chance of capturing the timing of the frontal propagation and the small-scale systems. Finally, other potential uses of the gridded forcing data, such as understanding and testing scale-aware parameterizations, are also discussed.

  19. Future Automotive Systems Technology Simulator (FASTSim)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    An advanced vehicle powertrain systems analysis tool, the Future Automotive Systems Technology Simulator (FASTSim) provides a simple way to compare powertrains and estimate the impact of technology improvements on light-, medium- and heavy-duty vehicle efficiency, performance, cost, and battery life. Created by the National Renewable Energy Laboratory, FASTSim accommodates a range of vehicle types - including conventional vehicles, electric-drive vehicles, and fuel cell vehicles - and is available for free download in Microsoft Excel and Python formats.
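
    FASTSim's vehicle models are more detailed, but the core of any such powertrain simulator is a per-time-step road-load power balance; the following sketch uses generic, illustrative coefficients (not FASTSim's):

```python
# Generic per-time-step road-load power balance of the kind a powertrain
# simulator evaluates; all coefficients are illustrative, not FASTSim's.
RHO, G = 1.2, 9.81  # air density (kg/m^3), gravity (m/s^2)

def tractive_power_kw(v_mps, accel_mps2, mass_kg=1500.0,
                      cd=0.30, area_m2=2.2, crr=0.009):
    drag = 0.5 * RHO * cd * area_m2 * v_mps ** 2   # aerodynamic drag force, N
    rolling = mass_kg * G * crr                    # rolling resistance, N
    inertia = mass_kg * accel_mps2                 # acceleration force, N
    return (drag + rolling + inertia) * v_mps / 1000.0

# Energy over a trivial constant-speed "cycle": 100 one-second steps at 20 m/s.
energy_kj = sum(tractive_power_kw(20.0, 0.0) for _ in range(100))
```

    Summing this power over a standard drive cycle, and dividing by powertrain efficiencies, is the basic route from a speed trace to an energy-consumption estimate.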

  20. Simulating maar-diatreme volcanic systems in bench-scale experiments

    NASA Astrophysics Data System (ADS)

    Andrews, R. G.; White, J. D. L.; Dürig, T.; Zimanowski, B.

    2015-12-01

    Maar-diatreme eruptions are incompletely understood, and explanations for the processes involved in them have been debated for decades. This study extends bench-scale analogue experiments previously conducted on maar-diatreme systems and attempts to scale the results up to both field-scale experimentation and natural volcanic systems, in order to produce a reconstructive toolkit for maar volcanoes. The experimental runs, which include deeper single blasts, series of descending discrete blasts, and series of ascending blasts, produced complex deposits via multiple mechanisms that match many features seen in natural maar-diatreme deposits. The study indicates that debris-jet inception and diatreme formation involve multiple types of granular fountains within diatreme deposits produced under varying initial conditions. The individual energies of blasts in a multiple-blast series cannot be inferred from the final deposits. The depositional record of blast sequences can, however, be ascertained from the proportion of fallback sedimentation versus maar ejecta rim material, the final crater size, and the degree of overturning or slumping of accessory strata. Quantitatively, deeper blasts partition energy roughly equally between crater excavation and mass movement of juvenile material, whereas shallower blasts expend a much greater proportion of energy in crater excavation.

  1. Human System Simulation in Support of Human Performance Technical Basis at NPPs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    David Gertman; Katya Le Blanc; Alan Mecham

    2010-06-01

    This paper focuses on strategies and progress toward establishing the Idaho National Laboratory's (INL's) Human Systems Simulator Laboratory at the Center for Advanced Energy Studies (CAES), a consortium of Idaho universities. The INL is one of the national laboratories of the US Department of Energy. One of the first planned applications for the Human Systems Simulator Laboratory is implementation of a dynamic nuclear power plant (NPP) simulation, in which studies of operator workload, situation awareness, performance and preference will be carried out in simulated control rooms, including nuclear power plant control rooms. Simulation offers a means by which to review operational concepts, improve design practices and provide a technical basis for licensing decisions. In preparation for the next-generation power plant and current government and industry efforts in support of light water reactor sustainability, human operators will be attached to a suite of physiological measurement instruments and, in combination with traditional human factors measurement techniques, carry out control room tasks in simulated advanced digital and hybrid analog/digital control rooms. The current focus of the Human Systems Simulator Laboratory is building core competence in quantitative and qualitative measurement of situation awareness and workload. Of particular interest is whether the introduction of digital systems, including automated procedures, has the potential to reduce workload and enhance safety while improving situation awareness, or whether workload is merely shifted and situation awareness is modified in yet-to-be-determined ways. Data analysis is carried out by engineers and scientists and includes measures of the physical and neurological correlates of human performance. The current approach supports a user-centered design philosophy (see ISO 13407, "Human-Centred Design Processes for Interactive Systems," 1999) wherein the context for task performance along

  2. Results of Small-scale Solid Rocket Combustion Simulator testing at Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    Goldberg, Benjamin E.; Cook, Jerry

    1993-01-01

    The Small-scale Solid Rocket Combustion Simulator (SSRCS) program was established at the Marshall Space Flight Center (MSFC), and used a government/industry team consisting of Hercules Aerospace Corporation, Aerotherm Corporation, United Technology Chemical Systems Division, Thiokol Corporation and MSFC personnel to study the feasibility of simulating the combustion species, temperatures and flow fields of a conventional solid rocket motor (SRM) with a versatile simulator system. The SSRCS design is based on hybrid rocket motor principles. The simulator uses a solid fuel and a gaseous oxidizer. Verification of the feasibility of a SSRCS system as a test bed was completed using flow field and system analyses, as well as empirical test data. A total of 27 hot firings of a subscale SSRCS motor were conducted at MSFC. Testing of the Small-scale SSRCS program was completed in October 1992. This paper, a compilation of reports from the above team members and additional analysis of the instrumentation results, will discuss the final results of the analyses and test programs.

  3. Particle-in-cell simulations of collisionless shock formation via head-on merging of two laboratory supersonic plasma jets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thoma, C.; Welch, D. R.; Hsu, S. C.

    2013-08-15

    We describe numerical simulations, using the particle-in-cell (PIC) and hybrid-PIC code LSP [T. P. Hughes et al., Phys. Rev. ST Accel. Beams 2, 110401 (1999)], of the head-on merging of two laboratory supersonic plasma jets. The goals of these experiments are to form and study astrophysically relevant collisionless shocks in the laboratory. Using the plasma jet initial conditions (density ~10^14-10^16 cm^-3, temperature ~ a few eV, and propagation speed ~20-150 km/s), large-scale simulations of jet propagation demonstrate that interactions between the two jets are essentially collisionless in the merge region. In highly resolved one- and two-dimensional simulations, we show that collisionless shocks are generated by the merging jets when immersed in applied magnetic fields (B ~ 0.1-1 T). At expected plasma jet speeds of up to 150 km/s, our simulations do not give rise to unmagnetized collisionless shocks, which require much higher velocities. The orientation of the magnetic field and the axial and transverse density gradients of the jets have a strong effect on the nature of the interaction. We compare some of our simulation results with those of previously published PIC simulation studies of collisionless shock formation.

  4. Design and test of a simulation system for autonomous optic-navigated planetary landing

    NASA Astrophysics Data System (ADS)

    Cai, Sheng; Yin, Yanhe; Liu, Yanjun; He, Fengyun

    2018-02-01

    In this paper, a simulation system based on a commercial projector is proposed to test optical navigation algorithms for autonomous planetary landing in laboratory scenarios. The design of the optics, mechanics, and synchronization control is described, and the whole simulation system is set up and tested. Through calibration of the system, two main problems are resolved: synchronization between the projector and the CCD, and pixel-level shifting caused by the low repeatability of the DMD used in the projector. The experimental results show that the RMS errors of the pitch, yaw and roll angles are 0.78', 0.48', and 2.95' compared with the theoretical calculation, which fulfills the requirements of experimental simulation for planetary landing in the laboratory.

  5. The TeraShake Computational Platform for Large-Scale Earthquake Simulations

    NASA Astrophysics Data System (ADS)

    Cui, Yifeng; Olsen, Kim; Chourasia, Amit; Moore, Reagan; Maechling, Philip; Jordan, Thomas

    Geoscientific and computer science researchers with the Southern California Earthquake Center (SCEC) are conducting a large-scale, physics-based, computationally demanding earthquake system science research program with the goal of developing predictive models of earthquake processes. The computational demands of this program continue to increase rapidly as these researchers seek to perform physics-based numerical simulations of earthquake processes at ever larger scales. To meet the needs of this research program, a multiple-institution team coordinated by SCEC has integrated several scientific codes into a numerical modeling-based research tool we call the TeraShake computational platform (TSCP). A central component of the TSCP is a highly scalable earthquake wave propagation simulation program called the TeraShake anelastic wave propagation (TS-AWP) code. In this chapter, we describe how we extended an existing, stand-alone, well-validated, finite-difference, anelastic wave propagation modeling code into the highly scalable and widely used TS-AWP and then integrated this code into the TeraShake computational platform, which provides end-to-end (initialization to analysis) research capabilities. We also describe the techniques used to enhance the TS-AWP parallel performance on TeraGrid supercomputers, as well as the TeraShake simulation phases, including input preparation, run time, data archive management, and visualization. As a result of our efforts to improve its parallel efficiency, the TS-AWP has now shown highly efficient strong scaling on over 40K processors on IBM's BlueGene/L Watson computer. In addition, the TSCP has developed into a computational system that is useful to many members of the SCEC community for performing large-scale earthquake simulations.
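
    Strong-scaling efficiency of the kind reported above is conventionally computed relative to a reference processor count; in this sketch the wall-clock timings are invented for illustration:

```python
# Strong scaling: fixed total problem size, growing processor count p.
# Efficiency relative to a reference count p_ref:
#   E(p) = (T(p_ref) * p_ref) / (T(p) * p)
def strong_scaling_efficiency(timings, p_ref):
    t_ref = timings[p_ref]
    return {p: (t_ref * p_ref) / (t * p) for p, t in timings.items()}

# Hypothetical wall-clock timings in seconds (invented for illustration).
timings = {1024: 1000.0, 4096: 260.0, 40960: 30.0}
eff = strong_scaling_efficiency(timings, 1024)
```

    An efficiency near 1.0 at the largest count is what "highly efficient strong scaling" means in practice: quadrupling the processors cuts the run time by close to a factor of four.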

  6. Fate of estrone in laboratory-scale constructed wetlands

    USDA-ARS?s Scientific Manuscript database

    A horizontal, subsurface, laboratory-scale constructed wetland (CW) consisting of four cells in series was used to determine the attenuation of the steroid hormone estrone (E1) present in animal wastewater. Liquid swine manure diluted 1:80 with farm pond water and dosed with [14C]E1 flowed through ...

  7. Roles of laboratories and laboratory systems in effective tuberculosis programmes

    PubMed Central

    van Deun, Armand; Kam, Kai Man; Narayanan, PR; Aziz, Mohamed Abdul

    2007-01-01

    Abstract Laboratories and laboratory networks are a fundamental component of tuberculosis (TB) control, providing testing for diagnosis, surveillance and treatment monitoring at every level of the health-care system. New initiatives and resources to strengthen laboratory capacity and implement rapid and new diagnostic tests for TB will require recognition that laboratories are systems that require quality standards, appropriate human resources, and attention to safety in addition to supplies and equipment. To prepare the laboratory networks for new diagnostics and expanded capacity, we need to focus efforts on strengthening quality management systems (QMS) through additional resources for external quality assessment programmes for microscopy, culture, drug susceptibility testing (DST) and molecular diagnostics. QMS should also promote development of accreditation programmes to ensure adherence to standards to improve both the quality and credibility of the laboratory system within TB programmes. Corresponding attention must be given to addressing human resources at every level of the laboratory, with special consideration being given to new programmes for laboratory management and leadership skills. Strengthening laboratory networks will also involve setting up partnerships between TB programmes and those seeking to control other diseases in order to pool resources and to promote advocacy for quality standards, to develop strategies to integrate laboratories’ functions and to extend control programme activities to the private sector. Improving the laboratory system will assure that increased resources, in the form of supplies, equipment and facilities, will be invested in networks that are capable of providing effective testing to meet the goals of the Global Plan to Stop TB. PMID:17639219

  8. Soil erodibility variability in laboratory and field rainfall simulations

    NASA Astrophysics Data System (ADS)

    Szabó, Boglárka; Szabó, Judit; Jakab, Gergely; Centeri, Csaba; Szalai, Zoltán

    2017-04-01

    Rainfall simulation experiments are the most common way to observe and model soil erosion processes under both in situ and ex situ conditions. When modelling soil erosion, two of the most important factors are the annual soil loss and the soil erodibility, which represent the effect of soil properties on soil loss and the resistance of the soil against water erosion. The amount of runoff and soil loss can differ for the same soil type, while its characteristics determine the soil erodibility factor; this leads to uncertainties regarding soil erodibility. Soil loss and soil erodibility were examined by investigating the same soil under laboratory and field conditions with rainfall simulators. The comparative measurements were carried out in a laboratory on a 0.5 m2 plot and in the field (Shower Power-02) on a 6 m2 plot, with slope angles of 5% and 12% and rainfall intensities of 30 and 90 mm/h. The main idea was to examine and compare the soil erodibility and the variability that arises from testing the same soil with different rainfall simulator types. The applied models were the USLE, the nomograph and other equations that concern single rainfall events. The results show differences between the field and laboratory experiments and between the different calculations. Runoff and soil loss for whole rainfall events were significantly higher in the laboratory experiments, which affected the soil erodibility values as well. These differences may originate from the plot size. The main research questions are: How should we handle the soil erodibility factor and its significant variability? What is the best solution for determining soil erodibility?
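
    The USLE referred to above estimates annual soil loss as the product of five empirical factors; a minimal sketch with purely illustrative factor values (not measurements from this study):

```python
# USLE: annual soil loss A = R * K * LS * C * P.
def usle_soil_loss(R, K, LS, C, P):
    """R: rainfall erosivity, K: soil erodibility, LS: slope length-steepness,
    C: cover management, P: support practice (units must be dimensionally
    consistent with the chosen factor system)."""
    return R * K * LS * C * P

# Purely illustrative factor values, not data from the experiments above.
A = usle_soil_loss(R=800.0, K=0.3, LS=1.2, C=0.2, P=1.0)
```

    Because K multiplies the other factors directly, any uncertainty in the erodibility factor propagates proportionally into the soil-loss estimate, which is why its variability matters so much.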

  9. The spacecraft control laboratory experiment optical attitude measurement system

    NASA Technical Reports Server (NTRS)

    Welch, Sharon S.; Montgomery, Raymond C.; Barsky, Michael F.

    1991-01-01

    A stereo camera tracking system was developed to provide a near real-time measure of the position and attitude of the Spacecraft Control Laboratory Experiment (SCOLE). The SCOLE is a mockup of a shuttle-like vehicle with an attached flexible mast and (simulated) antenna, and was designed to provide a laboratory environment for the verification and testing of control laws for large flexible spacecraft. Actuators and sensors located on the shuttle and antenna sense the states of the spacecraft and allow the position and attitude to be controlled. The stereo camera tracking system consists of two position-sensitive detector cameras which sense the locations of small infrared LEDs attached to the surface of the shuttle. Information on shuttle position and attitude is provided in six degrees of freedom. The design of this optical system, its calibration, and the tracking algorithm are described. The performance of the system is evaluated for yaw only.

  10. Using Multi-Scale Modeling Systems to Study the Precipitation Processes

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo

    2010-01-01

    In recent years, exponentially increasing computer power has extended Cloud Resolving Model (CRM) integrations from hours to months and the number of computational grid points from less than a thousand to close to ten million, and three-dimensional models are now more prevalent. Much attention is devoted to precipitating cloud systems where the crucial 1-km scales are resolved in horizontal domains as large as 10,000 km in two dimensions, and 1,000 x 1,000 km2 in three dimensions. Cloud-resolving models now provide statistical information useful for developing more realistic, physically based parameterizations for climate models and numerical weather prediction (NWP) models. It is also expected that NWP and mesoscale models can be run at grid sizes similar to cloud-resolving models through nesting techniques. Recently, a multi-scale modeling system with unified physics was developed at NASA Goddard. It consists of (1) a cloud-resolving model (the Goddard Cumulus Ensemble model, GCE), (2) a regional-scale model (the NASA-unified Weather Research and Forecasting model, WRF), (3) a coupled CRM and global model (the Goddard Multi-scale Modeling Framework, MMF), and (4) a land modeling system. The same microphysical processes, long- and shortwave radiative transfer, land processes, and explicit cloud-radiation and cloud-land surface interactive processes are applied throughout this multi-scale modeling system. The modeling system has been coupled with a multi-satellite simulator to use NASA high-resolution satellite data to identify the strengths and weaknesses of the cloud and precipitation processes simulated by the model. In this talk, a review of developments and applications of the multi-scale modeling system will be presented. In particular, results from using the multi-scale modeling system to study the interactions between clouds, precipitation, and aerosols will be presented, and the use of the multi-satellite simulator to improve simulated precipitation processes will be discussed.

  11. Simulations of Sea Level Rise Effects on Complex Coastal Systems

    NASA Astrophysics Data System (ADS)

    Niedoroda, A. W.; Ye, M.; Saha, B.; Donoghue, J. F.; Reed, C. W.

    2009-12-01

    It is now established that complex coastal systems with elements such as beaches, inlets, bays, and rivers adjust their morphologies according to time-varying balances between the processes that control the exchange of sediment. Accelerated sea level rise introduces a major perturbation into these sediment-sharing systems. A modeling framework based on the new SL-PR model, an advanced version of the aggregate-scale CST Model, together with the event-scale CMS-2D and CMS-Wave combination, has been used to simulate the recent evolution of a portion of the Florida panhandle coast. This combination of models provides a method to evaluate coefficients in the aggregate-scale model that were previously treated as fitted parameters. That is, by carrying out simulations of a complex coastal system with runs of the event-scale model representing more than a year, it is now possible to directly relate the coefficients in the large-scale SL-PR model to measurable physical parameters in the current and wave fields. This cross-scale modeling procedure has been used to simulate shoreline evolution at Santa Rosa Island, a long barrier island on the northern Gulf Coast that houses significant military infrastructure. The model has been used to simulate 137 years of measured shoreline change and to extend these results to predictions of future rates of shoreline migration.

  12. Dynamic system simulation of small satellite projects

    NASA Astrophysics Data System (ADS)

    Raif, Matthias; Walter, Ulrich; Bouwmeester, Jasper

    2010-11-01

    A prerequisite for accomplishing a system simulation is a system model holding all necessary project information in a centralized repository that can be accessed and edited by all parties involved. At the Institute of Astronautics of the Technische Universitaet Muenchen, a modular approach for modeling and dynamic simulation of satellite systems has been developed, called dynamic system simulation (DySyS). DySyS is based on the platform-independent description language SysML to model a small satellite project with respect to system composition and dynamic behavior. A library of specific building blocks, and of the possible relations between these blocks, has been developed; from this library a system model of the satellite of interest can be created. A mapping into a C++ simulation allows the creation of an executable system model with which simulations are performed to observe the dynamic behavior of the satellite. In this paper DySyS is used to model and simulate the dynamic behavior of small satellites, because small satellite projects can act as precursors to demonstrate the feasibility of a system model, being less complex than large-scale satellite projects.

  13. A system for automatic evaluation of simulation software

    NASA Technical Reports Server (NTRS)

    Ryan, J. P.; Hodges, B. C.

    1976-01-01

    Within the field of computer software, simulation and verification are complementary processes. Simulation methods can be used to verify software by performing variable range analysis. More general verification procedures, such as those described in this paper, can be implicitly viewed as attempts at modeling the end-product software. From the standpoint of software requirements methodology, each component of the verification system has some element of simulation in it. Conversely, general verification procedures can be used to analyze simulation software. A dynamic analyzer is described which can be used to obtain properly scaled variables for an analog simulation, which is first digitally simulated. In a similar way, it is thought that the other system components, and indeed the whole system itself, have the potential of being effectively used in a simulation environment.

  14. Elimination of the Reaction Rate "Scale Effect": Application of the Lagrangian Reactive Particle-Tracking Method to Simulate Mixing-Limited, Field-Scale Biodegradation at the Schoolcraft (MI, USA) Site

    NASA Astrophysics Data System (ADS)

    Ding, Dong; Benson, David A.; Fernández-Garcia, Daniel; Henri, Christopher V.; Hyndman, David W.; Phanikumar, Mantha S.; Bolster, Diogo

    2017-12-01

    Measured (or empirically fitted) reaction rates at groundwater remediation sites are typically much lower than those found in the same material at the batch or laboratory scale. The reduced rates are commonly attributed to poorer mixing at the larger scales, and a variety of methods have been proposed to account for this scaling effect in reactive transport. In this study, we use the Lagrangian particle-tracking and reaction (PTR) method to simulate a field bioremediation experiment at the Schoolcraft, MI site. A denitrifying bacterium, Pseudomonas stutzeri strain KC (KC), was injected into the aquifer, along with sufficient substrate, to degrade the contaminant, carbon tetrachloride (CT), under anaerobic conditions. The PTR method simulates chemical reactions through probabilistic rules of particle collisions, interactions, and transformations to address the scale effect (lower apparent reaction rates at each level of upscaling, from batch to column to field scale). In contrast to a prior Eulerian reaction model, the PTR method is able to match the field-scale experiment using the rate coefficients obtained from batch experiments.
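
    A minimal sketch of the probabilistic collision rule at the heart of a PTR-style method (a hypothetical 1-D A + B -> products toy problem, not the Schoolcraft model): each A particle may react with a nearby B with a probability set by a Gaussian co-location kernel, so poorly mixed particles react less often without any change to the batch rate coefficient.

```python
import numpy as np

# Toy 1-D Lagrangian particle reaction step (A + B -> products). All
# parameter values are illustrative; the domain is unbounded for simplicity.
rng = np.random.default_rng(1)
k, dt, D = 1.0, 0.1, 0.01       # rate coeff, time step, diffusivity
A = rng.uniform(0.0, 1.0, 200)  # positions of A particles
B = rng.uniform(0.0, 1.0, 200)  # positions of B particles

def step(A, B):
    # Diffusive random-walk move.
    A = A + rng.normal(0.0, np.sqrt(2 * D * dt), A.size)
    B = B + rng.normal(0.0, np.sqrt(2 * D * dt), B.size)
    sig2 = 8 * D * dt           # variance of the co-location kernel
    keepA = np.ones(A.size, bool)
    keepB = np.ones(B.size, bool)
    for i in range(A.size):
        # Reaction probability with each surviving B from a Gaussian kernel.
        p = k * dt * np.exp(-(A[i] - B) ** 2 / (2 * sig2)) / np.sqrt(2 * np.pi * sig2)
        j = int(np.argmax(p * keepB))   # most likely surviving partner
        if keepB[j] and rng.random() < min(p[j], 1.0):
            keepA[i] = False
            keepB[j] = False
    return A[keepA], B[keepB]

a, b = A.copy(), B.copy()
for _ in range(10):
    a, b = step(a, b)
```

    Because each reaction removes one A and one B, the two populations deplete in lockstep, and the apparent reaction rate emerges from the particles' spatial arrangement rather than from a fitted field-scale coefficient.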

  15. Probabilistic Simulation of Multi-Scale Composite Behavior

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2012-01-01

    A methodology is developed to computationally assess the non-deterministic composite response at all composite scales (from micro to structural) due to uncertainties in the constituent (fiber and matrix) properties, in the fabrication process, and in structural variables (primitive variables). The methodology is computationally efficient for simulating the probability distributions of composite behavior, such as material properties and laminate and structural responses. By-products of the methodology are probabilistic sensitivities of the composite primitive variables. The methodology has been implemented in the computer codes PICAN (Probabilistic Integrated Composite ANalyzer) and IPACS (Integrated Probabilistic Assessment of Composite Structures). The accuracy and efficiency of this methodology are demonstrated by simulating the uncertainties in typical composite laminates and comparing the results with the Monte Carlo simulation method. Available experimental data on composite laminate behavior at all scales fall within the scatter predicted by PICAN. Multi-scaling is extended to simulate probabilistic thermo-mechanical fatigue and the probabilistic design of a composite radome in order to illustrate its versatility. Results show that probabilistic fatigue can be simulated for different temperature amplitudes and for different cyclic stress magnitudes. Results also show that laminate configurations can be selected to increase the radome reliability by several orders of magnitude without increasing the laminate thickness, a unique feature of structural composites. The old reference indicates that nothing fundamental has been done since that time.
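
    The Monte Carlo comparison mentioned above can be sketched as propagating constituent scatter through a micromechanics relation (the longitudinal rule of mixtures here, with illustrative distributions; not the PICAN/IPACS models):

```python
import numpy as np

# Monte Carlo propagation of constituent scatter to a ply-level property.
# Distributions and the micromechanics relation are illustrative only.
rng = np.random.default_rng(42)
N = 10_000
Ef = rng.normal(230.0, 10.0, N)   # fiber modulus, GPa (assumed)
Em = rng.normal(3.5, 0.3, N)      # matrix modulus, GPa (assumed)
Vf = rng.normal(0.60, 0.02, N)    # fiber volume fraction (assumed)

E1 = Ef * Vf + Em * (1.0 - Vf)    # longitudinal modulus, rule of mixtures
p05, p95 = np.percentile(E1, [5, 95])
```

    From the sampled distribution one reads off scatter bands (e.g. 5th to 95th percentiles) rather than a single deterministic value; fast probabilistic methods like PICAN aim to reproduce such distributions at a fraction of the sampling cost.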

  16. 2000 Numerical Propulsion System Simulation Review

    NASA Technical Reports Server (NTRS)

    Lytle, John; Follen, Greg; Naiman, Cynthia; Veres, Joseph; Owen, Karl; Lopez, Isaac

    2001-01-01

    The technologies necessary to enable detailed numerical simulations of complete propulsion systems are being developed at the NASA Glenn Research Center in cooperation with industry, academia, and other government agencies. Large scale, detailed simulations will be of great value to the nation because they eliminate some of the costly testing required to develop and certify advanced propulsion systems. In addition, time and cost savings will be achieved by enabling design details to be evaluated early in the development process before a commitment is made to a specific design. This concept is called the Numerical Propulsion System Simulation (NPSS). NPSS consists of three main elements: (1) engineering models that enable multidisciplinary analysis of large subsystems and systems at various levels of detail, (2) a simulation environment that maximizes designer productivity, and (3) a cost-effective, high-performance computing platform. A fundamental requirement of the concept is that the simulations must be capable of overnight execution on easily accessible computing platforms. This will greatly facilitate the use of large-scale simulations in a design environment. This paper describes the current status of the NPSS with specific emphasis on the progress made over the past year on air breathing propulsion applications. Major accomplishments include the first formal release of the NPSS object-oriented architecture (NPSS Version 1) and the demonstration of a one order of magnitude reduction in computing cost-to-performance ratio using a cluster of personal computers. The paper also describes the future NPSS milestones, which include the simulation of space transportation propulsion systems in response to increased emphasis on safe, low cost access to space within NASA'S Aerospace Technology Enterprise. In addition, the paper contains a summary of the feedback received from industry partners on the fiscal year 1999 effort and the actions taken over the past year to

  17. 2001 Numerical Propulsion System Simulation Review

    NASA Technical Reports Server (NTRS)

    Lytle, John; Follen, Gregory; Naiman, Cynthia; Veres, Joseph; Owen, Karl; Lopez, Isaac

    2002-01-01

    The technologies necessary to enable detailed numerical simulations of complete propulsion systems are being developed at the NASA Glenn Research Center in cooperation with industry, academia and other government agencies. Large scale, detailed simulations will be of great value to the nation because they eliminate some of the costly testing required to develop and certify advanced propulsion systems. In addition, time and cost savings will be achieved by enabling design details to be evaluated early in the development process before a commitment is made to a specific design. This concept is called the Numerical Propulsion System Simulation (NPSS). NPSS consists of three main elements: (1) engineering models that enable multidisciplinary analysis of large subsystems and systems at various levels of detail, (2) a simulation environment that maximizes designer productivity, and (3) a cost-effective, high-performance computing platform. A fundamental requirement of the concept is that the simulations must be capable of overnight execution on easily accessible computing platforms. This will greatly facilitate the use of large-scale simulations in a design environment. This paper describes the current status of the NPSS with specific emphasis on the progress made over the past year on air breathing propulsion applications. Major accomplishments include the first formal release of the NPSS object-oriented architecture (NPSS Version 1) and the demonstration of a one order of magnitude reduction in computing cost-to-performance ratio using a cluster of personal computers. The paper also describes the future NPSS milestones, which include the simulation of space transportation propulsion systems in response to increased emphasis on safe, low cost access to space within NASA's Aerospace Technology Enterprise. In addition, the paper contains a summary of the feedback received from industry partners on the fiscal year 2000 effort and the actions taken over the past year to

  18. The Programming Language Python In Earth System Simulations

    NASA Astrophysics Data System (ADS)

    Gross, L.; Imranullah, A.; Mora, P.; Saez, E.; Smillie, J.; Wang, C.

    2004-12-01

    Mathematical models in the earth sciences are based on the solution of systems of coupled, non-linear, time-dependent partial differential equations (PDEs). The spatial and time scales vary from planetary scale and millions of years for convection problems to 100 km and 10 years for fault system simulations. Various techniques are in use to deal with the time dependency (e.g. Crank-Nicolson), with the non-linearity (e.g. Newton-Raphson), and with weakly coupled equations (e.g. non-linear Gauss-Seidel). Besides these high-level solution algorithms, discretization methods (e.g. the finite element method (FEM) or the boundary element method (BEM)) are used to deal with spatial derivatives. Typically, large-scale, three-dimensional meshes are required to resolve geometrical complexity (e.g. in the case of fault systems) or features in the solution (e.g. in mantle convection simulations). The modelling environment escript allows the rapid implementation of new physics as required for the development of simulation codes in the earth sciences. Its main objective is to provide a programming language in which the user can define new models and rapidly develop high-level solution algorithms. The current implementation is linked with the finite element package finley as a PDE solver. However, the design is open, and other discretization technologies such as finite differences and boundary element methods could be included. escript is implemented as an extension of the interactive programming environment Python (see www.python.org). Key concepts introduced are Data objects, which hold values on nodes or elements of the finite element mesh, and linearPDE objects, which define linear partial differential equations to be solved by the underlying discretization technology. In this paper we show the basic concepts of escript and how escript is used to implement a simulation code for interacting fault systems. We also show some results of large-scale, parallel simulations on an SGI Altix

  19. Laboratory simulations of cumulus cloud flows explain the entrainment anomaly

    NASA Astrophysics Data System (ADS)

    Narasimha, Roddam; Diwan, Sourabh S.; Subrahmanyam, Duvvuri; Sreenivas, K. R.; Bhat, G. S.

    2010-11-01

    In the present laboratory experiments, cumulus cloud flows are simulated by starting plumes and jets subjected to off-source heat addition in amounts that are dynamically similar to latent heat release due to condensation in real clouds. The setup permits incorporation of features like atmospheric inversion layers and the active control of off-source heat addition. Herein we report, for the first time, simulation of five different cumulus cloud types (and many shapes), including three genera and three species (WMO Atlas 1987), which bear a striking resemblance to real clouds. It is known that the rate of entrainment in cumulus cloud flows is much less than that in classical plumes, which was the main reason for the failure of early entrainment models. Some of the previous studies on steady-state jets and plumes (done in a similar setup) have attributed this anomaly to the disruption of the large-scale turbulent structures upon the addition of off-source heat. We present estimates of entrainment coefficients from these measurements which show a qualitatively consistent variation with height. We propose that this explains the observed entrainment anomaly in cumulus clouds; further experiments are planned to address this question in the context of starting jets and plumes.

  20. Methane production from food waste leachate in laboratory-scale simulated landfill.

    PubMed

    Behera, Shishir Kumar; Park, Jun Mo; Kim, Kyeong Ho; Park, Hung-Suck

    2010-01-01

    Due to the prohibition of food waste landfilling in Korea from 2005 and the subsequent ban on the marine disposal of organic sludge, including leachate generated from food waste recycling facilities from 2012, it is urgent to develop an innovative and sustainable disposal strategy that is eco-friendly, yet economically beneficial. In this study, methane production from food waste leachate (FWL) in landfill sites with landfill gas recovery facilities was evaluated in simulated landfill reactors (lysimeters) for a period of 90 d with four different inoculum-substrate ratios (ISRs) on a volatile solid (VS) basis. Simultaneous biochemical methane potential batch experiments were also conducted at the same ISRs for 30 d to compare with the CH4 yield obtained from the lysimeter studies. Under the experimental conditions, a maximum CH4 yield of 0.272 and 0.294 L/g VS was obtained in the batch and lysimeter studies, respectively, at an ISR of 1:1. The biodegradability of FWL in the batch and lysimeter experiments at an ISR of 1:1 was 64% and 69%, respectively. The calculated data using the modified Gompertz equation for the cumulative CH4 production showed good agreement with the experimental result obtained from the lysimeter study. Based on the results obtained from this study, a field-scale pilot test is required to re-evaluate the existing sanitary landfills with efficient leachate collection and gas recovery facilities as engineered bioreactors to treat non-hazardous liquid organic wastes for energy recovery with optimum utilization of facilities. 2010 Elsevier Ltd. All rights reserved.
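The modified Gompertz model used for the cumulative CH4 data has the standard form M(t) = P exp(-exp(Rm e/P (lambda - t) + 1)). The sketch below evaluates it with the lysimeter ultimate yield P from the abstract; the maximum production rate Rm and lag time lambda are illustrative assumptions, not fitted values from the study.

```python
import math

def modified_gompertz(t, P, Rm, lam):
    """Cumulative CH4 yield (L/g VS) after t days.

    P   : ultimate CH4 yield (L/g VS)
    Rm  : maximum CH4 production rate (L/g VS per day)
    lam : lag-phase duration (days)
    """
    return P * math.exp(-math.exp(Rm * math.e / P * (lam - t) + 1.0))

# P is taken from the lysimeter result at ISR 1:1; Rm and lam are
# hypothetical values for illustration only.
P, Rm, lam = 0.294, 0.01, 5.0
curve = [modified_gompertz(t, P, Rm, lam) for t in range(0, 91, 10)]
```

In practice the three parameters are obtained by non-linear least-squares fitting of the measured cumulative production, after which the curve can be compared against the lysimeter data as the abstract describes.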

  1. Real-time simulation of large-scale floods

    NASA Astrophysics Data System (ADS)

    Liu, Q.; Qin, Y.; Li, G. D.; Liu, Z.; Cheng, D. J.; Zhao, Y. H.

    2016-08-01

    Given the complexity of real-time hydrological conditions, real-time simulation of large-scale floods is very important for flood prevention practice. Model robustness and running efficiency are two critical factors in successful real-time flood simulation. This paper proposed a robust, two-dimensional, shallow water model based on the unstructured Godunov-type finite volume method. A robust wet/dry front method is used to enhance the numerical stability. An adaptive method is proposed to improve the running efficiency. The proposed model is used for large-scale flood simulation on real topography. Results compared to those of MIKE21 show the strong performance of the proposed model.
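The finite-volume update behind such a shallow water model can be sketched in one dimension. This is a hedged illustration using a Rusanov (local Lax-Friedrichs) interface flux rather than the paper's Godunov-type solver, adaptive method, or wet/dry treatment, applied to a standard dam-break problem.

```python
import numpy as np

g = 9.81  # gravitational acceleration (m/s^2)

def step(h, hu, dx, cfl=0.4):
    """One explicit finite-volume step for the 1-D shallow water equations
    with the Rusanov (local Lax-Friedrichs) interface flux."""
    u = hu / h
    c = np.abs(u) + np.sqrt(g * h)                 # max wave speed per cell
    dt = cfl * dx / c.max()                        # CFL-limited time step
    U = np.vstack([h, hu])                         # conserved variables
    F = np.vstack([hu, hu * u + 0.5 * g * h * h])  # physical flux
    a = np.maximum(c[:-1], c[1:])                  # interface speed bound
    Fi = 0.5 * (F[:, :-1] + F[:, 1:]) - 0.5 * a * (U[:, 1:] - U[:, :-1])
    Un = U.copy()
    Un[:, 1:-1] -= dt / dx * (Fi[:, 1:] - Fi[:, :-1])
    return Un[0], Un[1], dt

# dam-break test on [0, 1]: depth 2 m on the left, 1 m on the right
n = 200
dx = 1.0 / n
h = np.where(np.arange(n) < n // 2, 2.0, 1.0)
hu = np.zeros(n)
t = 0.0
while t < 0.05:            # stop before the waves reach the domain edges
    h, hu, dt = step(h, hu, dx)
    t += dt
```

Because the scheme is conservative, total water volume is preserved exactly while the waves remain inside the domain; a production model adds higher-order reconstruction, source terms for real topography, and a positivity-preserving wet/dry front method on top of this skeleton.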

  2. NASA Lewis Propulsion Systems Laboratory Customer Guide Manual

    NASA Technical Reports Server (NTRS)

    Soeder, Ronald H.

    1994-01-01

    This manual describes the Propulsion Systems Laboratory (PSL) at NASA Lewis Research Center. The PSL complex supports two large engine test cells (PSL-3 and PSL-4) that are capable of providing flight simulation to altitudes of 70,000 ft. Facility variables at the engine or test-article inlet, such as pressure, temperature, and Mach number (up to 3.0 for PSL-3 and up to 6.0 planned for PSL-4), are discussed. Support systems such as the heated and cooled combustion air systems; the altitude exhaust system; the hydraulic system; the nitrogen, oxygen, and hydrogen systems; hydrogen burners; rotating screen assemblies; the engine exhaust gas-sampling system; the infrared imaging system; and single- and multiple-axis thrust stands are addressed. Facility safety procedures are also stated.

  3. Specialized Laboratory Information Systems.

    PubMed

    Dangott, Bryan

    2015-06-01

    Some laboratories or laboratory sections have unique needs that traditional anatomic and clinical pathology systems may not address. A specialized laboratory information system (LIS), which is designed to perform a limited number of functions, may perform well in areas where a traditional LIS falls short. Opportunities for specialized LISs continue to evolve with the introduction of new testing methodologies. These systems may take many forms, including stand-alone architecture, a module integrated with an existing LIS, a separate vendor-supplied module, and customized software. This article addresses the concepts underlying specialized LISs, their characteristics, and in what settings they are found. Copyright © 2015 Elsevier Inc. All rights reserved.

  4. Specialized Laboratory Information Systems.

    PubMed

    Dangott, Bryan

    2016-03-01

    Some laboratories or laboratory sections have unique needs that traditional anatomic and clinical pathology systems may not address. A specialized laboratory information system (LIS), which is designed to perform a limited number of functions, may perform well in areas where a traditional LIS falls short. Opportunities for specialized LISs continue to evolve with the introduction of new testing methodologies. These systems may take many forms, including stand-alone architecture, a module integrated with an existing LIS, a separate vendor-supplied module, and customized software. This article addresses the concepts underlying specialized LISs, their characteristics, and in what settings they are found. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. A large scale software system for simulation and design optimization of mechanical systems

    NASA Technical Reports Server (NTRS)

    Dopker, Bernhard; Haug, Edward J.

    1989-01-01

    The concept of an advanced integrated, networked simulation and design system is outlined. Such an advanced system can be developed utilizing existing codes without compromising the integrity and functionality of the system. An example has been used to demonstrate the applicability of the concept of the integrated system outlined here. The development of an integrated system can be done incrementally. Initial capabilities can be developed and implemented without having a detailed design of the global system. Only a conceptual global system must exist. For a fully integrated, user friendly design system, further research is needed in the areas of engineering data bases, distributed data bases, and advanced user interface design.

  6. Scale-Similar Models for Large-Eddy Simulations

    NASA Technical Reports Server (NTRS)

    Sarghini, F.

    1999-01-01

    Scale-similar models employ multiple filtering operations to identify the smallest resolved scales, which have been shown to be the most active in the interaction with the unresolved subgrid scales. They do not assume that the principal axes of the strain-rate tensor are aligned with those of the subgrid-scale stress (SGS) tensor, and allow the explicit calculation of the SGS energy. They can provide backscatter in a numerically stable and physically realistic manner, and predict SGS stresses in regions that are well correlated with the locations where large Reynolds stress occurs. In this paper, eddy viscosity and mixed models, which include an eddy-viscosity part as well as a scale-similar contribution, are applied to the simulation of two flows, a high Reynolds number plane channel flow, and a three-dimensional, nonequilibrium flow. The results show that simulations without models or with the Smagorinsky model are unable to predict nonequilibrium effects. Dynamic models provide an improvement of the results: the adjustment of the coefficient results in more accurate prediction of the perturbation from equilibrium. The Lagrangian-ensemble approach [Meneveau et al., J. Fluid Mech. 319, 353 (1996)] is found to be very beneficial. Models that included a scale-similar term and a dissipative one, as well as the Lagrangian ensemble averaging, gave results in the best agreement with the direct simulation and experimental data.
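The mixed closures discussed in this abstract combine a Bardina-type scale-similar term with an eddy-viscosity term. A common form (exact constants and filter notation vary by author; hats denote the test filter and bars the grid filter) is:

```latex
\tau_{ij} - \frac{\delta_{ij}}{3}\,\tau_{kk}
  \;\approx\;
  \underbrace{C_L\!\left(\widehat{\bar{u}_i \bar{u}_j}
      - \hat{\bar{u}}_i\,\hat{\bar{u}}_j\right)}_{\text{scale-similar part}}
  \;-\;
  \underbrace{2\,C_S\,\bar{\Delta}^2\,
      \lvert \bar{S} \rvert\,\bar{S}_{ij}}_{\text{eddy-viscosity part}},
\qquad
\lvert \bar{S} \rvert = \sqrt{2\,\bar{S}_{ij}\bar{S}_{ij}} .
```

In the dynamic variants described above, the coefficients are adjusted during the simulation (e.g. via the Germano identity, with Lagrangian ensemble averaging along fluid trajectories), which is what yields the improved prediction of nonequilibrium effects.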

  7. Monte Carlo simulation of neutron backscattering from concrete walls in the dense plasma focus laboratory of Bologna University.

    PubMed

    Frignani, M; Mostacci, D; Rocchi, F; Sumini, M

    2005-01-01

    Between 2001 and 2003 a 3.2 kJ dense plasma focus (DPF) device has been built at the Montecuccolino Laboratory of the Department of Energy, Nuclear and Environmental Control Engineering (DIENCA) of the University of Bologna. A DPF is a pulsed device in which deuterium nuclear fusion reactions can be obtained through the pinching effects of electromagnetic fields upon a dense plasma. The empirical scale law that governs the total D-D neutron yield from a single pulse of a DPF predicts for this machine a figure of approximately 10^7 fast neutrons per shot. The aim of the present work is to evaluate the role of backscattering of neutrons from the concrete walls surrounding the Montecuccolino DPF in total neutron yield measurements. The evaluation is performed by MCNP-5 simulations that are aimed at estimating the neutron spectra at a few points of interest in the laboratory, where neutron detectors will be placed during the experimental campaigns. Spectral information from the simulations is essential because the response of detectors is influenced by neutron energy. Comparisons are made with the simple r^-2 law, which holds for a DPF in infinite vacuum. The results from the simulations will ultimately be used both in the design and optimisation of the neutron detectors and in their final calibration and placement inside the laboratory.

  8. Scaling up microbial fuel cells and other bioelectrochemical systems.

    PubMed

    Logan, Bruce E

    2010-02-01

    Scientific research has advanced on different microbial fuel cell (MFC) technologies in the laboratory at an amazing pace, with power densities having reached over 1 kW/m^3 (reactor volume) and 6.9 W/m^2 (anode area) under optimal conditions. The main challenge is to bring these technologies out of the laboratory and engineer practical systems for bioenergy production at larger scales. Recent advances in new types of electrodes, a better understanding of the impact of membranes and separators on performance of these systems, and results from several new pilot-scale tests are all good indicators that commercialization of the technology could be possible within a few years. Some of the newest advances and future challenges are reviewed here with respect to practical applications of these MFCs for renewable energy production and other applications.

  9. Non-destructive evaluation of laboratory scale hydraulic fracturing using acoustic emission

    NASA Astrophysics Data System (ADS)

    Hampton, Jesse Clay

    The primary objective of this research is to develop techniques to characterize hydraulic fractures and fracturing processes using acoustic emission monitoring based on laboratory scale hydraulic fracturing experiments. Individual microcrack AE source characterization is performed to understand the failure mechanisms associated with small failures along pre-existing discontinuities and grain boundaries. Individual microcrack analysis methods include moment tensor inversion techniques to elucidate the mode of failure, crack slip and crack normal direction vectors, and relative volumetric deformation of an individual microcrack. Differentiation between individual microcrack analysis and AE cloud based techniques is studied in efforts to refine discrete fracture network (DFN) creation and regional damage quantification of densely fractured media. Regional damage estimations from combinations of individual microcrack analyses and AE cloud density plotting are used to investigate the usefulness of weighting cloud based AE analysis techniques with microcrack source data. Two granite types were used in several sample configurations including multi-block systems. Laboratory hydraulic fracturing was performed with sample sizes ranging from 15 × 15 × 25 cm³ to 30 × 30 × 25 cm³ in both unconfined and true-triaxially confined stress states using different types of materials. Hydraulic fracture testing in rock block systems containing a large natural fracture was investigated in terms of AE response throughout fracture interactions. Investigations of differing scale analyses showed the usefulness of individual microcrack characterization as well as DFN and cloud based techniques. Individual microcrack characterization weighting cloud based techniques correlated well with post-test damage evaluations.

  10. Transactive Systems Simulation and Valuation Platform Trial Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Widergren, Steven E.; Hammerstrom, Donald J.; Huang, Qiuhua

    Transactive energy systems use principles of value to coordinate responsive supply and demand in energy systems. Work continues within the Transactive Systems Program, which is funded by the U.S. Department of Energy at Pacific Northwest National Laboratory, to understand the value of, understand the theory behind, and simulate the behaviors of transactive energy systems. This report summarizes recent advances made by this program. The main capability advances include a more comprehensive valuation model, including recommended documentation that should make valuation studies of all sorts more transparent, definition of economic metrics with which transactive mechanisms can be evaluated, and multiple improvements to the time-simulation environment that is being used to evaluate transactive scenarios.

  11. The State Public Health Laboratory System.

    PubMed

    Inhorn, Stanley L; Astles, J Rex; Gradus, Stephen; Malmberg, Veronica; Snippes, Paula M; Wilcke, Burton W; White, Vanessa A

    2010-01-01

    This article describes the development since 2000 of the State Public Health Laboratory System in the United States. These state systems collectively are related to several other recent public health laboratory (PHL) initiatives. The first is the Core Functions and Capabilities of State Public Health Laboratories, a white paper that defined the basic responsibilities of the state PHL. Another is the Centers for Disease Control and Prevention National Laboratory System (NLS) initiative, the goal of which is to promote public-private collaboration to assure quality laboratory services and public health surveillance. To enhance the realization of the NLS, the Association of Public Health Laboratories (APHL) launched in 2004 a State Public Health Laboratory System Improvement Program. In the same year, APHL developed a Comprehensive Laboratory Services Survey, a tool to measure improvement through the decade to assure that essential PHL services are provided.

  12. The View of Scientific Inquiry Conveyed by Simulation-Based Virtual Laboratories

    ERIC Educational Resources Information Center

    Chen, Sufen

    2010-01-01

    With an increasing number of studies evincing the effectiveness of simulation-based virtual laboratories (VLs), researchers have discussed replacing traditional laboratories. However, the approach of doing science endorsed by VLs has not been carefully examined. A survey of 233 online VLs revealed that hypothetico-deductive (HD) logic prevails in…

  13. SIMILARITY PROPERTIES AND SCALING LAWS OF RADIATION HYDRODYNAMIC FLOWS IN LABORATORY ASTROPHYSICS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Falize, E.; Bouquet, S.; Michaut, C., E-mail: emeric.falize@cea.fr

    The spectacular recent development of modern high-energy density laboratory facilities which concentrate more and more energy in millimetric volumes allows the astrophysical community to reproduce and to explore, in millimeter-scale targets and during very short times, astrophysical phenomena where radiation and matter are strongly coupled. The astrophysical relevance of these experiments can be checked from the similarity properties and especially scaling law establishment, which constitutes the keystone of laboratory astrophysics. From the radiating optically thin regime to the so-called optically thick radiative pressure regime, we present in this paper, for the first time, a complete analysis of the main radiating regimes that we encounter in laboratory astrophysics with the same formalism based on Lie group theory. The use of the Lie group method appears to be a systematic approach which allows us to construct easily and systematically the scaling laws of a given problem. This powerful tool permits us to unify the recent major advances on scaling laws and to identify new similarity concepts that we discuss in this paper, and suggests important applications for present and future laboratory astrophysics experiments. All these results enable us to demonstrate theoretically that astrophysical phenomena in such radiating regimes can be explored experimentally thanks to powerful facilities. Consequently, the results presented here are a fundamental tool for the high-energy density laboratory astrophysics community in order to quantify the astrophysical relevance and justify laser experiments. Moreover, relying on Lie group theory, this paper constitutes the starting point of any analysis of the self-similar dynamics of radiating fluids.
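The scaling-law idea at the heart of this program can be illustrated in the simplest, non-radiative limit by the classical Euler similarity (Ryutov et al.): the ideal hydrodynamic equations are invariant under the three-parameter rescaling

```latex
x \rightarrow a\,x, \qquad
\rho \rightarrow b\,\rho, \qquad
p \rightarrow c\,p, \qquad
t \rightarrow a\sqrt{b/c}\;t, \qquad
v \rightarrow \sqrt{c/b}\;v ,
```

so that a laboratory target and an astrophysical object with the same dimensionless Euler number, Eu = v (rho/p)^(1/2), evolve identically up to these stretch factors. The radiating regimes treated in the paper add further dimensionless groups to this constraint, which is precisely what the systematic Lie-group construction is designed to handle.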

  14. Testing new approaches to carbonate system simulation at the reef scale: the ReefSam model first results, application to a question in reef morphology and future challenges.

    NASA Astrophysics Data System (ADS)

    Barrett, Samuel; Webster, Jody

    2016-04-01

    Numerical simulation of the stratigraphy and sedimentology of carbonate systems (carbonate forward stratigraphic modelling - CFSM) provides significant insight into the understanding of both the physical nature of these systems and the processes which control their development. It also provides the opportunity to quantitatively test conceptual models concerning stratigraphy, sedimentology or geomorphology, and allows us to extend our knowledge either spatially (e.g. between bore holes) or temporally (forwards or backwards in time). The latter is especially important in determining the likely future development of carbonate systems, particularly regarding the effects of climate change. This application, by its nature, requires successful simulation of carbonate systems on short time scales and at high spatial resolutions. Previous modelling attempts have typically focused on the scales of kilometers and kilo-years or greater (the scale of entire carbonate platforms), rather than at the scale of centuries or decades, and tens to hundreds of meters (the scale of individual reefs). Previous work has identified limitations in common approaches to simulating important reef processes. We present a new CFSM, Reef Sedimentary Accretion Model (ReefSAM), which is designed to test new approaches to simulating reef-scale processes, with the aim of being able to better simulate the past and future development of coral reefs. Four major features have been tested: 1. A simulation of wave based hydrodynamic energy with multiple simultaneous directions and intensities including wave refraction, interaction, and lateral sheltering. 2. Sediment transport simulated as sediment being moved from cell to cell in an iterative fashion until complete deposition. 3. A coral growth model including consideration of local wave energy and composition of the basement substrate (as well as depth). 4. A highly quantitative model testing approach where dozens of output parameters describing the reef

  15. Fast laboratory-based micro-computed tomography for pore-scale research: Illustrative experiments and perspectives on the future

    NASA Astrophysics Data System (ADS)

    Bultreys, Tom; Boone, Marijn A.; Boone, Matthieu N.; De Schryver, Thomas; Masschaele, Bert; Van Hoorebeke, Luc; Cnudde, Veerle

    2016-09-01

    Over the past decade, the widespread implementation of laboratory-based X-ray micro-computed tomography (micro-CT) scanners has revolutionized both the experimental and numerical research on pore-scale transport in geological materials. The availability of these scanners has opened up the possibility to image a rock's pore space in 3D almost routinely to many researchers. While challenges do persist in this field, we treat the next frontier in laboratory-based micro-CT scanning: in-situ, time-resolved imaging of dynamic processes. Extremely fast (even sub-second) micro-CT imaging has become possible at synchrotron facilities over the last few years; however, the restricted accessibility of synchrotrons limits the number of experiments that can be performed. The much smaller X-ray flux in laboratory-based systems bounds the time resolution that can be attained at these facilities. Nevertheless, progress is being made to improve the quality of measurements performed on the sub-minute time scale. We illustrate this by presenting cutting-edge pore scale experiments visualizing two-phase flow and solute transport in real-time with a lab-based environmental micro-CT set-up. To outline the current state of this young field and its relevance to pore-scale transport research, we critically examine its current bottlenecks and their possible solutions, both on the hardware and the software level. Further developments in laboratory-based, time-resolved imaging could prove greatly beneficial to our understanding of transport behavior in geological materials and to the improvement of pore-scale modeling by providing valuable validation.

  16. Laboratory analogue of a supersonic accretion column in a binary star system.

    PubMed

    Cross, J E; Gregori, G; Foster, J M; Graham, P; Bonnet-Bidaud, J-M; Busschaert, C; Charpentier, N; Danson, C N; Doyle, H W; Drake, R P; Fyrth, J; Gumbrell, E T; Koenig, M; Krauland, C; Kuranz, C C; Loupias, B; Michaut, C; Mouchet, M; Patankar, S; Skidmore, J; Spindloe, C; Tubman, E R; Woolsey, N; Yurchak, R; Falize, É

    2016-06-13

    Astrophysical flows exhibit rich behaviour resulting from the interplay of different forms of energy: gravitational, thermal, magnetic and radiative. For magnetic cataclysmic variable stars, material from a late-type main-sequence star is pulled onto a highly magnetized (B>10 MG) white dwarf. The magnetic field is sufficiently large to direct the flow as an accretion column onto the poles of the white dwarf, a star subclass known as AM Herculis. A stationary radiative shock is expected to form 100-1,000 km above the surface of the white dwarf, far too small to be resolved with current telescopes. Here we report the results of a laboratory experiment showing the evolution of a reverse shock when both ionization and radiative losses are important. We find that the stand-off position of the shock agrees with radiation hydrodynamic simulations and is consistent, when scaled to AM Herculis star systems, with theoretical predictions.

  17. Overview of theory and simulations in the Heavy Ion Fusion Science Virtual National Laboratory

    NASA Astrophysics Data System (ADS)

    Friedman, Alex

    2007-07-01

    The Heavy Ion Fusion Science Virtual National Laboratory (HIFS-VNL) is a collaboration of Lawrence Berkeley National Laboratory, Lawrence Livermore National Laboratory, and Princeton Plasma Physics Laboratory. These laboratories, in cooperation with researchers at other institutions, are carrying out a coordinated effort to apply intense ion beams as drivers for studies of the physics of matter at extreme conditions, and ultimately for inertial fusion energy. Progress on this endeavor depends upon coordinated application of experiments, theory, and simulations. This paper describes the state of the art, with an emphasis on the coordination of modeling and experiment; developments in the simulation tools, and in the methods that underly them, are also treated.

  18. (U) Status of Trinity and Crossroads Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Archer, Billy Joe; Lujan, James Westley; Hemmert, K. S.

    2017-01-10

    (U) This paper provides a general overview of current and future plans for the Advanced Simulation and Computing (ASC) Advanced Technology (AT) systems fielded by the New Mexico Alliance for Computing at Extreme Scale (ACES), a collaboration between Los Alamos National Laboratory and Sandia National Laboratories. Additionally, this paper touches on research of technology beyond traditional CMOS. The status of Trinity, ASC's first AT system, and Crossroads, anticipated to succeed Trinity as the third AT system in 2020, will be presented, along with initial performance studies of the Intel Knights Landing Xeon Phi processors, introduced on Trinity. The challenges and opportunities for our production simulation codes on AT systems will also be discussed. Trinity and Crossroads are a joint procurement by ACES and Lawrence Berkeley Laboratory as part of the Alliance for application Performance at EXtreme scale (APEX) http://apex.lanl.gov.

  19. Scaled laboratory experiments explain the kink behaviour of the Crab Nebula jet

    PubMed Central

    Li, C. K.; Tzeferacos, P.; Lamb, D.; Gregori, G.; Norreys, P. A.; Rosenberg, M. J.; Follett, R. K.; Froula, D. H.; Koenig, M.; Seguin, F. H.; Frenje, J. A.; Rinderknecht, H. G.; Sio, H.; Zylstra, A. B.; Petrasso, R. D.; Amendt, P. A.; Park, H. S.; Remington, B. A.; Ryutov, D. D.; Wilks, S. C.; Betti, R.; Frank, A.; Hu, S. X.; Sangster, T. C.; Hartigan, P.; Drake, R. P.; Kuranz, C. C.; Lebedev, S. V.; Woolsey, N. C.

    2016-01-01

    The remarkable discovery by the Chandra X-ray observatory that the Crab nebula's jet periodically changes direction provides a challenge to our understanding of astrophysical jet dynamics. It has been suggested that this phenomenon may be the consequence of magnetic fields and magnetohydrodynamic instabilities, but experimental demonstration in a controlled laboratory environment has remained elusive. Here we report experiments that use high-power lasers to create a plasma jet that can be directly compared with the Crab jet through well-defined physical scaling laws. The jet generates its own embedded toroidal magnetic fields; as it moves, plasma instabilities result in multiple deflections of the propagation direction, mimicking the kink behaviour of the Crab jet. The experiment is modelled with three-dimensional numerical simulations that show exactly how the instability develops and results in changes of direction of the jet. PMID:27713403

  1. Regeneration of Exhausted Arsenic Adsorptive media of a Full Scale Treatment System

    EPA Science Inventory

    This presentation will describe the method and results of laboratory tests showing the feasibility of regenerating exhausted, iron-based, adsorptive media and the results of a follow-up regeneration test at a full-scale system in Twentynine Palms, CA. The laboratory studies on se...

  2. The development of an industrial-scale fed-batch fermentation simulation.

    PubMed

    Goldrick, Stephen; Ştefan, Andrei; Lovett, David; Montague, Gary; Lennox, Barry

    2015-01-10

    This paper describes a simulation of an industrial-scale fed-batch fermentation that can be used as a benchmark in process systems analysis and control studies. The simulation was developed using a mechanistic model and validated using historical data collected from an industrial-scale penicillin fermentation process. Each batch was carried out in a 100,000 L bioreactor that used an industrial strain of Penicillium chrysogenum. The manipulated variables recorded during each batch were used as inputs to the simulator, and the predicted outputs were then compared with the on-line and off-line measurements recorded in the real process. The simulator adapted a previously published structured model to describe the penicillin fermentation and extended it to include the main environmental effects of dissolved oxygen, viscosity, temperature, pH, and dissolved carbon dioxide. In addition, the effects of nitrogen and phenylacetic acid concentrations on the biomass and penicillin production rates were included. The simulated model predictions of all the on-line and off-line process measurements, including the off-gas analysis, were in good agreement with the batch records. The simulator and industrial process data are available to download at www.industrialpenicillinsimulation.com and can be used to evaluate, study, and improve on the current control strategy implemented at this facility. Crown Copyright © 2014. Published by Elsevier B.V. All rights reserved.
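    The mechanistic core of such a fed-batch benchmark (Monod growth on a fed substrate plus non-growth-associated product formation) can be sketched as a few coupled ODEs integrated by forward Euler. This is a minimal illustration with assumed kinetic constants, not the published simulator:

    ```python
    # Minimal fed-batch fermentation sketch. All kinetic constants are assumed
    # for illustration; this is NOT the published model, just a forward-Euler
    # integration of Monod-type kinetics with a constant feed.
    def simulate_fed_batch(t_end=100.0, dt=0.01):
        X, P, S, V = 0.1, 0.0, 10.0, 50.0     # biomass g/L, product g/L, substrate g/L, volume L
        mu_max, K_s, Y_xs = 0.11, 0.15, 0.45  # assumed growth constants
        q_p, F, S_feed = 0.004, 0.05, 400.0   # assumed production rate, feed rate L/h, feed conc g/L
        t = 0.0
        while t < t_end:
            mu = mu_max * S / (K_s + S)                       # Monod specific growth rate (1/h)
            dX = mu * X - (F / V) * X                         # growth minus dilution by feed
            dP = q_p * X - (F / V) * P                        # production minus dilution
            dS = -(mu / Y_xs) * X + (F / V) * (S_feed - S)    # consumption plus feed
            X, P, S, V = X + dX * dt, P + dP * dt, max(S + dS * dt, 0.0), V + F * dt
            t += dt
        return X, P, S, V

    X, P, S, V = simulate_fed_batch()   # biomass grows, product accumulates, volume rises to ~55 L
    ```

    A real benchmark adds the environmental effects listed above (dissolved oxygen, viscosity, pH, temperature); this sketch only shows the skeleton they hang on.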

  3. RANS Simulation (Virtual Blade Model [VBM]) of Array of Three Coaxial Lab Scaled DOE RM1 MHK Turbine with 5D Spacing

    DOE Data Explorer

    Javaherchi, Teymour

    2016-06-08

    Attached are the .cas and .dat files, along with the required User Defined Functions (UDFs) and the look-up table of lift and drag coefficients, for the Reynolds-Averaged Navier-Stokes (RANS) simulation of three coaxially located lab-scaled DOE RM1 turbines implemented in the ANSYS FLUENT CFD package. The lab-scaled DOE RM1 is a redesigned geometry, based on the full-scale DOE RM1 design, that produces the same power output as the full-scale model while operating at matched Tip Speed Ratio values at laboratory-achievable Reynolds numbers (see attached paper). In this case study the flow field around and in the wake of the lab-scaled DOE RM1 turbines in a coaxial array is simulated using the Blade Element Model (a.k.a. Virtual Blade Model) by solving the RANS equations coupled with the k-ω turbulence closure model. It should be highlighted that in this simulation the actual geometry of the rotor blade is not modeled; the effect of the rotating turbine blades is modeled using Blade Element Theory. This simulation provides an accurate estimate of the performance of each device and of the structure of their turbulent far wakes. The results of these simulations were validated against in-house experimental data. Simulations for other turbine configurations are available upon request.
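    The blade-element approach replaces the resolved blade geometry with forces derived from a lift/drag look-up table. A minimal sketch of that conversion, using an illustrative thin-airfoil-style table and made-up rotor dimensions rather than the RM1 inputs distributed with the case files:

    ```python
    import math

    # Blade-element sketch: integrate thrust and torque from a lift/drag
    # look-up table. Table and rotor dimensions are illustrative assumptions.
    def blade_element_loads(U_axial, omega, radii, chord, twist_deg, cl_cd_table,
                            n_blades=2, rho=1000.0):
        """Integrate thrust (N) and torque (N*m) over uniformly spaced radial stations."""
        thrust, torque = 0.0, 0.0
        dr = radii[1] - radii[0]
        for r, c, twist in zip(radii, chord, twist_deg):
            W = math.hypot(U_axial, omega * r)       # relative flow speed at this station
            phi = math.atan2(U_axial, omega * r)     # inflow angle (rad)
            alpha = math.degrees(phi) - twist        # angle of attack (deg)
            cl, cd = cl_cd_table(alpha)              # look-up
            q = 0.5 * rho * W ** 2 * c * dr          # dynamic pressure x element area
            thrust += n_blades * q * (cl * math.cos(phi) - cd * math.sin(phi))
            torque += n_blades * q * (cl * math.sin(phi) + cd * math.cos(phi)) * r
        return thrust, torque

    # Hypothetical thin-airfoil-like table: cl = 2*pi*alpha (alpha in radians), constant cd.
    def table(alpha_deg):
        return 2.0 * math.pi * math.radians(alpha_deg), 0.01
    ```

    In the actual Virtual Blade Model these forces are distributed as momentum source terms over the rotor disk cells rather than summed into scalars.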

  4. Fostering Elementary School Students' Understanding of Simple Electricity by Combining Simulation and Laboratory Activities

    ERIC Educational Resources Information Center

    Jaakkola, T.; Nurmi, S.

    2008-01-01

    Computer simulations and laboratory activities have been traditionally treated as substitute or competing methods in science teaching. The aim of this experimental study was to investigate if it would be more beneficial to combine simulation and laboratory activities than to use them separately in teaching the concepts of simple electricity. Based…

  5. Bioreactor Scalability: Laboratory-Scale Bioreactor Design Influences Performance, Ecology, and Community Physiology in Expanded Granular Sludge Bed Bioreactors

    PubMed Central

    Connelly, Stephanie; Shin, Seung G.; Dillon, Robert J.; Ijaz, Umer Z.; Quince, Christopher; Sloan, William T.; Collins, Gavin

    2017-01-01

    Studies investigating the feasibility of new, or improved, biotechnologies, such as wastewater treatment digesters, inevitably start with laboratory-scale trials. However, it is rarely determined whether laboratory-scale results reflect full-scale performance or microbial ecology. The Expanded Granular Sludge Bed (EGSB) bioreactor, which is a high-rate anaerobic digester configuration, was used as a model to address that knowledge gap in this study. Two laboratory-scale idealizations of the EGSB, a one-dimensional and a three-dimensional scale-down of a full-scale design, were built and operated in triplicate under near-identical conditions to a full-scale EGSB. The laboratory-scale bioreactors were seeded using biomass obtained from the full-scale bioreactor, and spent water from the distillation of whisky from maize was applied as substrate at both scales. Over 70 days, bioreactor performance, microbial ecology, and microbial community physiology were monitored at various depths in the sludge-beds using 16S rRNA gene sequencing (V4 region), specific methanogenic activity (SMA) assays, and a range of physical and chemical monitoring methods. SMA assays indicated dominance of the hydrogenotrophic pathway at full-scale whilst a more balanced activity profile developed during the laboratory-scale trials. At each scale, Methanobacterium was the dominant methanogenic genus present. Bioreactor performance overall was better at laboratory-scale than full-scale. We observed that bioreactor design at laboratory-scale significantly influenced the spatial distribution of microbial community physiology and taxonomy in the bioreactor sludge-bed, with 1-D bioreactor types promoting stratification of each. In the 1-D laboratory bioreactors, increased abundance of Firmicutes was associated with both granule position in the sludge bed and increased activity against acetate and ethanol as substrates. We further observed that stratification in the sludge-bed in 1-D laboratory-scale

  6. Laboratory simulation of the action of weightlessness on the human organism

    NASA Technical Reports Server (NTRS)

    Genin, A. M.

    1977-01-01

    A brief history of attempts by the U.S. and the U.S.S.R. to simulate weightlessness in the laboratory is presented. Models for laboratory simulation of weightlessness included the bed-rest regimen, the clinostat, and water immersion. An outline of the immediate and long-term physiological effects of weightlessness is offered.

  7. A multiscale approach to accelerate pore-scale simulation of porous electrodes

    NASA Astrophysics Data System (ADS)

    Zheng, Weibo; Kim, Seung Hyun

    2017-04-01

    A new method to accelerate pore-scale simulation of porous electrodes is presented. The method combines the macroscopic approach with pore-scale simulation by decomposing a physical quantity into macroscopic and local variations. The multiscale method is applied to the potential equation in pore-scale simulation of a Proton Exchange Membrane Fuel Cell (PEMFC) catalyst layer, and validated with the conventional approach for pore-scale simulation. Results show that the multiscale scheme substantially reduces the computational cost without sacrificing accuracy.
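    The decomposition idea can be illustrated in one dimension: split a fine-grid quantity into a block-averaged macroscopic part plus a zero-mean local variation, so the smooth part can be handled cheaply on the coarse grid. A conceptual toy, not the authors' scheme for the potential equation:

    ```python
    # Toy illustration of a macro/local decomposition: split a fine-grid
    # quantity phi into a block-averaged macroscopic part and a zero-mean
    # local variation. Reconstruction is exact by construction.
    def decompose(phi, block):
        macro, local = [], []
        for start in range(0, len(phi), block):
            chunk = phi[start:start + block]
            avg = sum(chunk) / len(chunk)           # macroscopic (coarse-cell) value
            macro.extend([avg] * len(chunk))
            local.extend(v - avg for v in chunk)    # pore-scale fluctuation
        return macro, local

    phi = [0.1 * i + ((-1) ** i) * 0.05 for i in range(8)]  # smooth trend + oscillation
    macro, local = decompose(phi, block=4)
    # phi[i] == macro[i] + local[i] for every i, and each block's local part sums to zero
    ```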

  8. A comparison of relative toxicity rankings by some small-scale laboratory tests

    NASA Technical Reports Server (NTRS)

    Hilado, C. J.; Cumming, H. J.

    1977-01-01

    Small-scale laboratory tests for fire toxicity, suitable for use in the average laboratory hood, are needed for screening and ranking materials on the basis of relative toxicity. The performance of wool, cotton, and aromatic polyamide under several test procedures is presented.

  9. Space Food Systems Laboratory

    NASA Technical Reports Server (NTRS)

    Perchonok, Michele; Russo, Dane M. (Technical Monitor)

    2001-01-01

    The Space Food Systems Laboratory (SFSL) is a multipurpose laboratory responsible for space food and package research and development. It is located on-site at Johnson Space Center in Building 17. The facility supports the development of flight food, menus, packaging, and food-related hardware for Shuttle, International Space Station, and Advanced Life Support food systems. All foods used to support NASA ground tests and/or missions must meet the highest standards before they are 'accepted' for use on actual space flights. The foods are evaluated for nutritional content, sensory acceptability, safety, storage and shelf life, and suitability for use in micro-gravity. The food packaging is also tested to determine its functionality and suitability for use in space. Food Scientists, Registered Dietitians, Packaging Engineers, Food Systems Engineers, and Technicians staff the Space Food Systems Laboratory.

  10. Numerical modeling of laboratory-scale surface-to-crown fire transition

    NASA Astrophysics Data System (ADS)

    Castle, Drew Clayton

    Understanding the conditions leading to the transition of fire spread from a surface fuel to an elevated (crown) fuel is critical to effective fire risk assessment and management. Surface fires that successfully transition to crown fires can be very difficult to suppress, potentially leading to damages in the natural and built environments. This is relevant to the chaparral shrublands which are common throughout parts of the Southwest U.S. and represent a significant part of the wildland-urban interface. The ability of the Wildland-Urban Interface Fire Dynamics Simulator (WFDS) to model surface-to-crown fire transition was evaluated through comparison to laboratory experiments. The WFDS model is being developed by the U.S. Forest Service (USFS) and the National Institute of Standards and Technology. The experiments were conducted at the USFS Forest Fire Laboratory in Riverside, California. The experiments measured the ignition of chamise (Adenostoma fasciculatum) crown fuel held above a surface fire spreading through excelsior fuel. Cases with different crown fuel bulk densities, crown fuel base heights, and imposed wind speeds were considered. Cold-flow simulations yielded wind speed profiles that closely matched the experimental measurements. Next, fire simulations with only the surface fuel were conducted to verify the rate of spread while factors such as substrate properties were varied. Finally, simulations with both a surface fuel and a crown fuel were completed. Examination of specific surface fire characteristics (rate of spread, flame angle, etc.) and the corresponding experimental surface fire behavior provided a basis for comparing the factors most responsible for the transition from a surface fire to ignition of the raised fuel. The rate of spread was determined by tracking the flame in the Smokeview animations using a tool developed for tracking an actual flame in a video. WFDS simulations produced results in both surface fire spread and raised fuel bed
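    The rate-of-spread extraction described above amounts to fitting a line to tracked flame-front positions over time. A sketch with synthetic positions, not the study's data:

    ```python
    # Rate of spread from tracked flame-front positions: the slope of a
    # least-squares line through (time, position) samples. Synthetic data.
    def rate_of_spread(times, positions):
        n = len(times)
        t_mean = sum(times) / n
        x_mean = sum(positions) / n
        num = sum((t - t_mean) * (x - x_mean) for t, x in zip(times, positions))
        den = sum((t - t_mean) ** 2 for t in times)
        return num / den                     # slope of position vs. time

    times = [0.0, 10.0, 20.0, 30.0, 40.0]       # s, frame timestamps
    front = [0.00, 0.21, 0.39, 0.62, 0.80]      # m, tracked flame-front position
    ros = rate_of_spread(times, front)          # about 0.02 m/s
    ```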

  11. UAS-Systems Integration, Validation, and Diagnostics Simulation Capability

    NASA Technical Reports Server (NTRS)

    Buttrill, Catherine W.; Verstynen, Harry A.

    2014-01-01

    As part of the Phase 1 efforts of NASA's UAS-in-the-NAS Project a task was initiated to explore the merits of developing a system simulation capability for UAS to address airworthiness certification requirements. The core of the capability would be a software representation of an unmanned vehicle, including all of the relevant avionics and flight control system components. The specific system elements could be replaced with hardware representations to provide Hardware-in-the-Loop (HWITL) test and evaluation capability. The UAS Systems Integration and Validation Laboratory (UAS-SIVL) was created to provide a UAS-systems integration, validation, and diagnostics hardware-in-the-loop simulation capability. This paper discusses how SIVL provides a robust and flexible simulation framework that permits the study of failure modes, effects, propagation paths, criticality, and mitigation strategies to help develop safety, reliability, and design data that can assist with the development of certification standards, means of compliance, and design best practices for civil UAS.

  12. GFDL's unified regional-global weather-climate modeling system with variable resolution capability for severe weather predictions and regional climate simulations

    NASA Astrophysics Data System (ADS)

    Lin, S. J.

    2015-12-01

    The NOAA/Geophysical Fluid Dynamics Laboratory has been developing a unified regional-global modeling system with variable-resolution capabilities that can be used for severe weather predictions (e.g., tornado outbreak events and category-5 hurricanes) and ultra-high-resolution (1-km) regional climate simulations within a consistent global modeling framework. The foundation of this flexible regional-global modeling system is the non-hydrostatic extension of the vertically Lagrangian dynamical core (Lin 2004, Monthly Weather Review), known in the community as FV3 (finite-volume on the cubed-sphere). Because of its flexibility and computational efficiency, FV3 is one of the final candidates for NOAA's Next Generation Global Prediction System (NGGPS). We have built into the modeling system a stretched (single) grid capability, a two-way (regional-global) multiple nested grid capability, and the combination of the stretched and two-way nests, so as to make convection-resolving regional climate simulation within a consistent global modeling system feasible using today's High Performance Computing systems. One of our main scientific goals is to enable simulations of high-impact weather phenomena (such as tornadoes, thunderstorms, and category-5 hurricanes) within an IPCC-class climate modeling system previously regarded as impossible. In this presentation I will demonstrate that it is computationally feasible to simulate not only supercell thunderstorms, but also the subsequent genesis of tornadoes, using a global model that was originally designed for century-long climate simulations. As a unified weather-climate modeling system, we evaluated the performance of the model with horizontal resolution ranging from 1 km to as low as 200 km. In particular, for downscaling studies, we have developed various tests to ensure that the large-scale circulation within the global variable-resolution system is well simulated while at the same time the small scales can be accurately captured

  13. An Integrated Modeling and Simulation Methodology for Intelligent Systems Design and Testing

    DTIC Science & Technology

    2002-08-01

    simulation and actual execution. KEYWORDS: Model Continuity, Modeling, Simulation, Experimental Frame, Real Time Systems, Intelligent Systems...the methodology for a stand-alone real time system. Then it will scale up to distributed real time systems. For both systems, step-wise simulation...MODEL CONTINUITY: Intelligent real time systems monitor, respond to, or control an external environment. This environment is connected to the digital

  14. The Numerical Propulsion System Simulation: A Multidisciplinary Design System for Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Lytle, John K.

    1999-01-01

    Advances in computational technology and in physics-based modeling are making large-scale, detailed simulations of complex systems possible within the design environment. For example, the integration of computing, communications, and aerodynamics has reduced the time required to analyze major propulsion system components from days and weeks to minutes and hours. This breakthrough has enabled the detailed simulation of major propulsion system components to become a routine part of the design process and to provide the designer with critical information about the components early in the design process. This paper describes the development of the Numerical Propulsion System Simulation (NPSS), a multidisciplinary system of analysis tools that is focused on extending the simulation capability from components to the full system. This will provide the product developer with a "virtual wind tunnel" that will reduce the number of hardware builds and tests required during the development of advanced aerospace propulsion systems.

  15. Scaling analysis and SE simulation of the tilted cylinder-interface capillary interaction

    NASA Astrophysics Data System (ADS)

    Gao, S. Q.; Zhang, X. Y.; Zhou, Y. H.

    2018-06-01

    The capillary interaction induced by a tilted cylinder at an interface is the basic configuration of many complex systems, such as micro-pillar array clustering, the super-hydrophobicity of hairy surfaces, water-walking insects, and fiber aggregation. We systematically analyzed, by SE simulation and experiment, the scaling laws relating tilt angle, contact angle, and cylinder radius to the contact-line shape. An in-depth analysis of the characteristic parameters (shift, stretch, and distortion) of the deformed contact lines reveals the self-similar shape of the contact line. A general capillary force scaling law is then proposed that captures all the simulated and experimental data through a straightforward ellipse-approximation approach.
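    The ellipse-approximation idea can be illustrated by fitting an axis-aligned ellipse to sampled contact-line points via linear least squares in the inverse squared semi-axes. This is a simplified stand-in for the paper's approach, with synthetic points:

    ```python
    import math

    # Fit x^2/a^2 + y^2/b^2 = 1 to sampled points by linear least squares in
    # u = 1/a^2, v = 1/b^2 (normal equations for min sum (u x^2 + v y^2 - 1)^2).
    # A simplified, axis-aligned stand-in; the sample points are synthetic.
    def fit_ellipse(points):
        sxx = sum(x ** 4 for x, _ in points)
        syy = sum(y ** 4 for _, y in points)
        sxy = sum((x ** 2) * (y ** 2) for x, y in points)
        sx = sum(x ** 2 for x, _ in points)
        sy = sum(y ** 2 for _, y in points)
        det = sxx * syy - sxy ** 2
        u = (sx * syy - sy * sxy) / det
        v = (sy * sxx - sx * sxy) / det
        return 1.0 / math.sqrt(u), 1.0 / math.sqrt(v)   # semi-axes (a, b)

    pts = [(2.0 * math.cos(t), 1.5 * math.sin(t))
           for t in [k * math.pi / 10.0 for k in range(20)]]
    a, b = fit_ellipse(pts)    # recovers a = 2.0, b = 1.5
    ```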

  16. Numerical Propulsion System Simulation (NPSS) 1999 Industry Review

    NASA Technical Reports Server (NTRS)

    Lytle, John; Follen, Greg; Naiman, Cynthia; Evans, Austin

    2000-01-01

    The technologies necessary to enable detailed numerical simulations of complete propulsion systems are being developed at the NASA Glenn Research Center in cooperation with industry, academia, and other government agencies. Large scale, detailed simulations will be of great value to the nation because they eliminate some of the costly testing required to develop and certify advanced propulsion systems. In addition, time and cost savings will be achieved by enabling design details to be evaluated early in the development process before a commitment is made to a specific design. This concept is called the Numerical Propulsion System Simulation (NPSS). NPSS consists of three main elements: (1) engineering models that enable multidisciplinary analysis of large subsystems and systems at various levels of detail, (2) a simulation environment that maximizes designer productivity, and (3) a cost-effective, high-performance computing platform. A fundamental requirement of the concept is that the simulations must be capable of overnight execution on easily accessible computing platforms. This will greatly facilitate the use of large-scale simulations in a design environment. This paper describes the current status of the NPSS with specific emphasis on the progress made over the past year on air breathing propulsion applications. In addition, the paper contains a summary of the feedback received from industry partners in the development effort and the actions taken over the past year to respond to that feedback. The NPSS development was supported in FY99 by the High Performance Computing and Communications Program.

  17. Argonne simulation framework for intelligent transportation systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ewing, T.; Doss, E.; Hanebutte, U.

    1996-04-01

    A simulation framework has been developed which defines a high-level architecture for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS). The simulator is designed to run on parallel computers and distributed (networked) computer systems; however, a version for a stand-alone workstation is also available. The ITS simulator includes an Expert Driver Model (EDM) of instrumented "smart" vehicles with in-vehicle navigation units. The EDM is capable of performing optimal route planning and communicating with Traffic Management Centers (TMC). A dynamic road map database is used for optimum route planning, where the data is updated periodically to reflect any changes in road or weather conditions. The TMC has probe vehicle tracking capabilities (display position and attributes of instrumented vehicles) and can provide two-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces that incorporate human-factors studies to support safety and operational research. Realistic modeling of variations of the posted driving speed is based on human-factors studies that take into consideration weather, road conditions, the driver's personality and behavior, and vehicle type. The simulator has been developed on a distributed system of networked UNIX computers, but is designed to run on ANL's IBM SP-X parallel computer system for large-scale problems. A novel feature of the developed simulator is that vehicles are represented by autonomous computer processes, each with a behavior model which performs independent route selection and reacts to external traffic events much like real vehicles. Vehicle processes interact with each other and with ITS components by exchanging messages. With this approach, one will be able to take advantage of emerging massively parallel processor (MPP) systems.
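    The vehicle-as-autonomous-process design can be sketched with message passing: a vehicle agent plans a route, then re-plans when a TMC advisory updates a link travel time. A toy network with threads standing in for processes, not ANL's simulator:

    ```python
    import queue
    import threading

    # Toy agent sketch: a vehicle thread picks the fastest route, then re-plans
    # when a Traffic Management Center advisory updates a link time. The network
    # and travel times are made-up illustrations.
    LINKS = {"A": {"B": 5, "C": 9}, "B": {"D": 4}, "C": {"D": 2}, "D": {}}
    ROUTES = [["A", "B", "D"], ["A", "C", "D"]]

    def route_time(path):
        return sum(LINKS[a][b] for a, b in zip(path, path[1:]))

    def vehicle(inbox, results):
        path = min(ROUTES, key=route_time)          # initial optimal route
        while True:
            msg = inbox.get()
            if msg == "depart":
                results.append((tuple(path), route_time(path)))
                return
            (src, dst), new_time = msg              # TMC advisory: updated link time
            LINKS[src][dst] = new_time
            path = min(ROUTES, key=route_time)      # react by re-planning

    inbox, results = queue.Queue(), []
    worker = threading.Thread(target=vehicle, args=(inbox, results))
    worker.start()
    inbox.put((("A", "B"), 20))    # congestion reported on A->B
    inbox.put("depart")
    worker.join()
    # the vehicle switches to A->C->D (travel time 11) before departing
    ```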

  18. Simulation of long-term influence from technical systems on permafrost with various short-scale and hourly operation modes in Arctic region

    NASA Astrophysics Data System (ADS)

    Vaganova, N. A.

    2017-12-01

    Technogenic and climatic influences have a significant impact on the degradation of permafrost. Long-term forecasts of such changes over long time periods must be taken into account in the oil and gas and construction industries in view of the development of the Arctic and Subarctic regions. We consider both constantly operating technical systems (for example, oil and gas wells) that affect changes in permafrost, and technical systems that have a short-term impact on permafrost (for example, flare systems for emergency flaring of associated gas). The second type of technical system is rather complex to simulate, since the computations must resolve both short and long time scales, with variable time steps describing the complex technological processes. The main attention is paid to the simulation of long-term influence on the permafrost from the second type of technical system.
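    The two-time-scale issue can be illustrated with explicit 1-D heat conduction: a brief surface heat pulse (a flare event) resolved with small time steps, followed by a long relaxation taken with much coarser, still-stable steps. All values are illustrative assumptions, not the paper's model:

    ```python
    # Explicit 1-D heat conduction with a variable time step: fine steps during
    # a short surface pulse, coarse steps for the long relaxation afterwards.
    # Illustrative soil properties, not the paper's model.
    def heat_step(T, alpha, dx, dt):
        r = alpha * dt / dx ** 2                    # stability requires r < 0.5
        return [T[0]] + [T[i] + r * (T[i - 1] - 2 * T[i] + T[i + 1])
                         for i in range(1, len(T) - 1)] + [T[-1]]

    T = [-5.0] * 21                # initial permafrost temperature profile (deg C)
    alpha, dx = 1e-6, 0.1          # thermal diffusivity (m^2/s), grid spacing (m)
    T[0] = 50.0                    # one-hour surface pulse, fine 1 s steps
    for _ in range(3600):
        T = heat_step(T, alpha, dx, dt=1.0)
    T[0] = -5.0                    # surface back to ambient; coarse 900 s steps (~10 days)
    for _ in range(1000):
        T = heat_step(T, alpha, dx, dt=900.0)
    # the pulse's heat is still relaxing: near-surface nodes remain above -5 deg C
    ```

    Both step sizes keep the explicit stability number alpha*dt/dx² below 0.5, which is what makes the coarse phase admissible.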

  19. Opportunities for Breakthroughs in Large-Scale Computational Simulation and Design

    NASA Technical Reports Server (NTRS)

    Alexandrov, Natalia; Alter, Stephen J.; Atkins, Harold L.; Bey, Kim S.; Bibb, Karen L.; Biedron, Robert T.; Carpenter, Mark H.; Cheatwood, F. McNeil; Drummond, Philip J.; Gnoffo, Peter A.

    2002-01-01

    Opportunities for breakthroughs in the large-scale computational simulation and design of aerospace vehicles are presented. Computational fluid dynamics tools to be used within multidisciplinary analysis and design methods are emphasized. The opportunities stem from speedups and robustness improvements in the underlying unit operations associated with simulation (geometry modeling, grid generation, physical modeling, analysis, etc.). Further, an improved programming environment can synergistically integrate these unit operations to leverage the gains. The speedups result from reducing the problem setup time through geometry modeling and grid generation operations, and reducing the solution time through the operation counts associated with solving the discretized equations to a sufficient accuracy. The opportunities are addressed only at a general level here, but an extensive list of references containing further details is included. The opportunities discussed are being addressed through the Fast Adaptive Aerospace Tools (FAAST) element of the Advanced Systems Concept to Test (ASCoT) and the third Generation Reusable Launch Vehicles (RLV) projects at NASA Langley Research Center. The overall goal is to enable greater inroads into the design process with large-scale simulations.

  20. Partitioning dynamics of unsaturated flows in fractured porous media: Laboratory studies and three-dimensional multi-scale smoothed particle hydrodynamics simulations of gravity-driven flow in fractures

    NASA Astrophysics Data System (ADS)

    Kordilla, J.; Bresinsky, L. T.; Shigorina, E.; Noffz, T.; Dentz, M.; Sauter, M.; Tartakovsky, A. M.

    2017-12-01

    Preferential flow dynamics in unsaturated fractures remain a challenging topic on various scales. On pore and fracture scales, the highly erratic gravity-driven flow dynamics often provoke a strong deviation from classical volume-effective approaches. Against the common notion that flow in fractures (or macropores) can only occur under equilibrium conditions, i.e., if the surrounding porous matrix is fully saturated and capillary pressures are high enough to allow filling of the fracture void space, arrival times suggest the existence of rapid preferential flow along fractures, fracture networks, and fault zones, even if the matrix is not fully saturated. Modeling such flows requires efficient numerical techniques to cover various flow-relevant physics, such as surface tension, static and dynamic contact angles, free-surface (multi-phase) interface dynamics, and the formation of singularities. Here we demonstrate the importance of such flow modes for the partitioning dynamics at simple fracture intersections, with a combination of laboratory experiments, analytical solutions, and numerical simulations using our newly developed massively parallel smoothed particle hydrodynamics (SPH) code. Flow modes heavily influence the "bypass" behavior of water flowing along a fracture junction. Flows favoring the formation of droplets exhibit a much stronger bypass capacity compared to rivulet flows, where nearly the whole fluid mass is initially stored within the horizontal fracture. This behavior is demonstrated for a multi-inlet laboratory setup where the inlet-specific flow rate is chosen so that either a droplet or a rivulet flow persists. The effect of fluid buffering within the horizontal fracture is presented in terms of dimensionless fracture inflow so that characteristic scaling regimes can be recovered. For both cases (rivulets and droplets), flow within the horizontal fracture transitions into a Washburn regime until a critical threshold is reached and the bypass efficiency

  1. Design and process aspects of laboratory scale SCF particle formation systems.

    PubMed

    Vemavarapu, Chandra; Mollan, Matthew J; Lodaya, Mayur; Needham, Thomas E

    2005-03-23

    Consistent production of solid drug materials of desired particle and crystallographic morphologies under cGMP conditions is a frequent challenge to pharmaceutical researchers. Supercritical fluid (SCF) technology has gained significant attention in pharmaceutical research by not only showing promise in this regard but also accommodating the principles of green chemistry. Given that this technology attained commercialization in coffee decaffeination and in the extraction of hops and other essential oils, the majority of off-the-shelf SCF instrumentation is designed for extraction purposes. Only a select few vendors appear to be in the early stages of manufacturing equipment designed for particle formation. The scarcity of information on the design and process engineering of laboratory-scale equipment is recognized as a significant shortcoming to the technology's progress. The purpose of this article is therefore to provide the information and resources necessary for startup research involving particle formation using supercritical fluids. The various stages of particle formation by supercritical fluid processing can be broadly classified into delivery, reaction, pre-expansion, expansion, and collection. The importance of each of these processes in tailoring the particle morphology is discussed in this article, along with various alternatives for performing these operations.

  2. Simulated and Virtual Science Laboratory Experiments: Improving Critical Thinking and Higher-Order Learning Skills

    NASA Astrophysics Data System (ADS)

    Simon, Nicole A.

    Virtual laboratory experiments using interactive computer simulations are not being employed as viable alternatives to laboratory science curricula at extensive enough rates within higher education. Rote traditional lab experiments are currently the norm and do not address inquiry, Critical Thinking, and cognition throughout the laboratory experience or link with educational technologies (Pyatt & Sims, 2007; 2011; Trundle & Bell, 2010). A causal-comparative quantitative study was conducted with 150 learners enrolled at a two-year community college to determine the effects of simulation laboratory experiments on Higher-Order Learning, Critical Thinking Skills, and Cognitive Load. The treatment population used simulated experiments, while the non-treatment sections performed traditional expository experiments. A comparison was made using the Revised Two-Factor Study Process survey, the Motivated Strategies for Learning Questionnaire, and the Scientific Attitude Inventory survey, using a Repeated Measures ANOVA test for treatment or non-treatment. A main effect of simulated laboratory experiments was found for both Higher-Order Learning [F(1, 148) = 30.32, p = 0.00, η² = 0.12] and Critical Thinking Skills [F(1, 148) = 14.64, p = 0.00, η² = 0.17], such that simulations showed greater increases than traditional experiments. Post-lab treatment group self-reports indicated increased marginal means (+4.86) in Higher-Order Learning and Critical Thinking Skills, compared to the non-treatment group (+4.71). Simulations also improved scientific skills and mastery of basic scientific subject matter. It is recommended that additional research recognize that learners' Critical Thinking Skills change due to the different instructional methodologies that occur throughout a semester.
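    The reported F and eta-squared statistics come from an analysis-of-variance decomposition of score variance. A minimal one-way sketch (simpler than the study's repeated-measures design, with made-up scores rather than the study's data) shows the computation:

    ```python
    # One-way ANOVA for two groups: F statistic and eta-squared effect size.
    # The scores below are invented for illustration only.
    def anova_eta_squared(group_a, group_b):
        n_a, n_b = len(group_a), len(group_b)
        grand = (sum(group_a) + sum(group_b)) / (n_a + n_b)
        mean_a, mean_b = sum(group_a) / n_a, sum(group_b) / n_b
        ss_between = n_a * (mean_a - grand) ** 2 + n_b * (mean_b - grand) ** 2
        ss_within = (sum((x - mean_a) ** 2 for x in group_a)
                     + sum((x - mean_b) ** 2 for x in group_b))
        df_within = n_a + n_b - 2                        # df_between = 1 for two groups
        f_stat = ss_between / (ss_within / df_within)
        eta_sq = ss_between / (ss_between + ss_within)   # proportion of variance explained
        return f_stat, eta_sq

    treatment = [78, 85, 90, 88, 76, 84]
    control = [70, 72, 81, 68, 75, 73]
    f_stat, eta_sq = anova_eta_squared(treatment, control)
    ```

    With one between-groups degree of freedom, eta-squared and F are tied by η² = F/(F + df_within), which is a handy consistency check on reported values.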

  3. Energy Systems High-Pressure Test Laboratory | Energy Systems Integration

    Science.gov Websites

    In the Energy Systems Integration Facility's High-Pressure Test Laboratory, researchers can safely test high-pressure hydrogen components.

  4. Hardware in the Loop at Megawatt-Scale Power | Energy Systems Integration

    Science.gov Websites

    Hardware-in-the-loop simulation is not new, but the Energy Systems Integration Facility enables it at megawatt-scale power through power hardware-in-the-loop co-simulation. For more information, read the power hardware-in-the-loop factsheet.

  5. Ideas in Practice (3): A Simulated Laboratory Experience in Digital Design.

    ERIC Educational Resources Information Center

    Cleaver, Thomas G.

    1988-01-01

    Gives an example of the use of a simplified logic simulator in a logic design course. Discusses some problems in logic design classes, commercially available software, and software problems. Describes computer-aided engineering (CAE) software. Lists 14 experiments in the simulated laboratory and presents students' evaluation of the course. (YP)

  6. Laboratory and pilot-scale bioremediation of pentaerythritol tetranitrate (PETN) contaminated soil.

    PubMed

    Zhuang, Li; Gui, Lai; Gillham, Robert W; Landis, Richard C

    2014-01-15

    PETN (pentaerythritol tetranitrate), a munitions constituent, is commonly encountered in munitions-contaminated soils and poses a serious threat to aquatic organisms. This study investigated anaerobic remediation of PETN-contaminated soil at a site near Denver, Colorado. Both granular iron and organic carbon amendments were used in laboratory and pilot-scale tests. The laboratory results showed that, with various organic carbon amendments, PETN at initial concentrations of between 4500 and 5000 mg/kg was effectively removed within 84 days. In the field trial, after a test period of 446 days, PETN mass removal of up to 53,071 mg/kg (80%) was achieved with an organic carbon amendment (DARAMEND) of 4% by weight. In previous laboratory studies, granular iron had been shown to be highly effective in degrading PETN. However, in both the laboratory and pilot-scale tests, granular iron proved ineffective, a consequence of passivation of the iron surfaces caused by the very high concentrations of nitrate in the contaminated soil. This study indicated that a low concentration of organic carbon was a key factor limiting bioremediation of PETN in the contaminated soil. Furthermore, the addition of organic carbon amendments such as the DARAMEND materials or brewers grain proved highly effective in stimulating the biodegradation of PETN and could provide the basis for full-scale remediation of PETN-contaminated sites. Copyright © 2013 Elsevier B.V. All rights reserved.

  7. Analysis of large-scale tablet coating: Modeling, simulation and experiments.

    PubMed

    Boehling, P; Toschkoff, G; Knop, K; Kleinebudde, P; Just, S; Funke, A; Rehbaum, H; Khinast, J G

    2016-07-30

    This work concerns a tablet coating process in an industrial-scale drum coater. We set up a full-scale Design of Simulation Experiment (DoSE) using the Discrete Element Method (DEM) to investigate the influence of various process parameters (the spray rate, the number of nozzles, the rotation rate and the drum load) on the coefficient of inter-tablet coating variation (cv,inter). The coater was filled with up to 290 kg of material, equivalent to 1,028,369 tablets. To mimic the tablet shape, the glued-sphere approach was followed, and each modeled tablet consisted of eight spheres. We simulated the process via the eXtended Particle System (XPS), proving that it is possible to accurately simulate the tablet coating process on the industrial scale. The process time required to reach a uniform tablet coating was extrapolated from the simulated data and was in good agreement with experimental results. The results are provided at various levels of detail, from a thorough investigation of the influence that the process parameters have on cv,inter and the number of tablets that visit the spray zone during the simulated 90 s, to the velocity in the spray zone and the spray and bed cycle times. It was found that increasing the number of nozzles and decreasing the spray rate had the greatest influence on cv,inter. Although increasing the drum load and the rotation rate increased the tablet velocity, this did not have a relevant influence on cv,inter or the process time. Copyright © 2015 Elsevier B.V. All rights reserved.
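The target quantity cv,inter is simply the relative standard deviation of per-tablet coating mass across the batch. A minimal sketch with hypothetical masses (the paper's tablet data are not reproduced here):

```python
import numpy as np

def cv_inter(coating_masses):
    """Coefficient of inter-tablet coating variation: the relative
    standard deviation of per-tablet coating mass across the batch."""
    m = np.asarray(coating_masses, dtype=float)
    return m.std(ddof=1) / m.mean()

# Toy batch of per-tablet coating masses (mg); values are illustrative.
masses = [4.8, 5.1, 5.0, 4.7, 5.3, 4.9, 5.2, 5.0]
cv = cv_inter(masses)
```

A lower cv,inter means a more uniform coating; in a DEM study, the per-tablet mass is accumulated from each tablet's residence time in the spray zone.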

  8. Transcriptional and metabolic response of recombinant Escherichia coli to spatial dissolved oxygen tension gradients simulated in a scale-down system.

    PubMed

    Lara, Alvaro R; Leal, Lidia; Flores, Noemí; Gosset, Guillermo; Bolívar, Francisco; Ramírez, Octavio T

    2006-02-05

    Escherichia coli, expressing recombinant green fluorescent protein (GFP), was subjected to dissolved oxygen tension (DOT) oscillations in a two-compartment system for simulating gradients that can occur in large-scale bioreactors. Cells were continuously circulated between the anaerobic (0% DOT) and aerobic (10% DOT) vessels of the scale-down system to mimic an overall circulation time of 50 s, and mean residence times in the anaerobic and aerobic compartments of 33 and 17 s, respectively. Transcription levels of mixed acid fermentation genes (ldhA, poxB, frdD, ackA, adhE, pflD, and fdhF), measured by quantitative RT-PCR, increased between 1.5- to over 6-fold under oscillatory DOT compared to aerobic cultures (constant 10% DOT). In addition, the transcription level of fumB increased whereas it decreased for sucA and sucB, suggesting that the tricarboxylic acid cycle was functioning as two open branches. Gene transcription levels revealed that cytochrome bd, which has a higher affinity for oxygen but lower energy efficiency, was preferred over cytochrome bo3 in oscillatory DOT cultures. Post-transcriptional processing limited heterologous protein production in the scale-down system, as inferred from similar gfp transcription but 19% lower GFP concentration compared to aerobic cultures. Simulated DOT gradients also affected the transcription of genes of the glyoxylate shunt (aceA), of global regulators of aerobic and anaerobic metabolism (fnr, arcA, and arcB), and of other relevant genes (luxS, sodA, fumA, and sdhB). Transcriptional changes explained the observed alterations in overall stoichiometric and kinetic parameters, and in the production of ethanol and organic acids. Differences in transcription levels between the aerobic and anaerobic compartments were also observed, indicating that E. coli can respond very fast to intermittent DOT conditions. The transcriptional responses of E. coli to DOT gradients reported here are useful for establishing rational scale-up criteria and
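Fold changes such as those reported above are conventionally computed from qRT-PCR threshold cycles. A minimal sketch of the standard 2^-ΔΔCt (Livak) calculation; the Ct values are hypothetical and the study's exact normalization scheme is not specified in the abstract:

```python
def fold_change(ct_target_treated, ct_ref_treated,
                ct_target_control, ct_ref_control):
    """Relative expression by the 2^-ddCt (Livak) method: normalize the
    target gene to a reference gene in each condition, then compare
    conditions. A lower Ct means more transcript."""
    d_ct_treated = ct_target_treated - ct_ref_treated
    d_ct_control = ct_target_control - ct_ref_control
    return 2.0 ** -(d_ct_treated - d_ct_control)

# Hypothetical Ct values for a fermentation gene vs. a reference gene:
fc = fold_change(20.0, 15.0, 22.0, 15.0)  # -> 4.0, i.e. 4-fold up-regulation
```

The method assumes near-100% amplification efficiency for both genes; efficiency-corrected variants (e.g. the Pfaffl method) relax that assumption.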

  9. Evidence of Biot Slow Waves in Electroseismic Measurements at the Laboratory Scale

    NASA Astrophysics Data System (ADS)

    Devi, M. S.

    2015-12-01

    Electroseismic methods, the reverse of seismo-electric methods, have received little investigation up to now, especially at the near-surface scale. These methods generate solid-fluid relative movement induced by an electric potential in fluid-filled porous media, as a response of electro-osmosis arising from the electrical double layer. Laboratory experiments and numerical simulations of electroseismic studies have been performed. Electroseismic measurements were conducted in micro glass beads saturated with demineralized water. A pair of 37 x 37 mm square aluminium grids with 2 mm aperture and 4 mm spacing was used as the electric dipole, connected to an electric power source with a voltage output of 150 V. A laser Doppler vibrometer was used to measure the velocity of vibrating objects during the measurements, with a line of reflective paper placed on the surface of the media to scatter back a helium-neon laser. The results in homogeneous media show that compressional waves are induced by an electric signal. We confirm that the results are not an effect of thermal expansion. We also noticed that two kinds of compressional waves are recorded: fast and slow P-waves. The latter, Biot slow waves, exhibit the dominant amplitude. Moreover, we found that the transition frequency (ωc) of the Biot slow waves depends on mechanical parameters such as porosity and permeability. The ωc is not affected when varying the conductivity of the fluid from 25 - 320 μS/cm, although the amplitude changes slightly. The results in two-layer media, with a sandstone as the top layer, show a large amount of transmitted seismic waves (apparently Biot slow waves) rather than converted electromagnetic-to-seismic waves. These properties have also been simulated with full-waveform numerical simulations relying on Pride's (1994) equations, using our computer code (Garambois & Dietrich, 2002). If it is true that the electric source in
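The dependence of ωc on porosity and permeability but not fluid conductivity is consistent with the usual Biot transition frequency. A minimal sketch, assuming the common form ωc = ηφ / (α∞ ρf κ); conventions vary (some definitions omit tortuosity α∞ or divide by 2π), and the sample values are illustrative, not those of the experiment:

```python
def biot_angular_frequency(porosity, permeability, viscosity,
                           fluid_density, tortuosity=1.0):
    """Biot transition (critical) angular frequency,
    omega_c = eta * phi / (alpha_inf * rho_f * kappa).
    Only mechanical/fluid parameters enter -- fluid conductivity does not,
    matching the observation above."""
    return viscosity * porosity / (tortuosity * fluid_density * permeability)

# Water-saturated glass beads (illustrative values):
# eta ~ 1e-3 Pa s, phi ~ 0.4, kappa ~ 1e-11 m^2, rho_f ~ 1000 kg/m^3
wc = biot_angular_frequency(porosity=0.4, permeability=1e-11,
                            viscosity=1e-3, fluid_density=1000.0)
```

Below ωc, viscous forces dominate and the slow wave is diffusive; above it, inertial forces dominate and the slow wave propagates.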

  10. Pore-scale simulation of CO2-water-rock interactions

    NASA Astrophysics Data System (ADS)

    Deng, H.; Molins, S.; Steefel, C. I.; DePaolo, D. J.

    2017-12-01

    In Geologic Carbon Storage (GCS) systems, the migration of scCO2 versus CO2-acidified brine ultimately determines the extent of mineral trapping and caprock integrity, i.e. the long-term storage efficiency and security. While continuum-scale multiphase reactive transport models are valuable for large-scale investigations, they typically (over-)simplify pore-scale dynamics and cannot capture local heterogeneities that may be important. Therefore, pore-scale models are needed in order to provide a mechanistic understanding of how fine-scale structural variations and heterogeneous processes influence transport and geochemistry in the context of multiphase flow, and to inform the parameterization of continuum-scale modeling. In this study, we investigate the interplay of different processes at the pore scale (e.g. diffusion, reactions, and multiphase flow) through the coupling of a well-developed multiphase flow simulator with a sophisticated reactive transport code. The objectives are to understand where brine displaced by scCO2 will reside in a rough pore/fracture, and how the CO2-water-rock interactions may affect the redistribution of the different phases. In addition, the coupled code will provide a platform for model testing in pore-scale multiphase reactive transport problems.

  11. Cryosphere Science Outreach using the NASA/JPL Virtual Earth System Laboratory

    NASA Astrophysics Data System (ADS)

    Larour, E. Y.; Cheng, D. L. C.; Quinn, J.; Halkides, D. J.; Perez, G. L.

    2016-12-01

    Understanding the role of Cryosphere Science within the larger context of Sea Level Rise is both a technical and an educational challenge that needs to be addressed if the public at large is to truly understand the implications and consequences of Climate Change. Within this context, we propose a new approach in which scientific tools are used directly inside a mobile/website platform geared towards Education/Outreach. Here, we apply this approach by using the Ice Sheet System Model (ISSM), a state-of-the-art Cryosphere model developed at NASA, integrated within a Virtual Earth System Laboratory, with the goal of bringing Cryosphere science to K-12 and college-level students. The approach mixes laboratory experiments, interactive classes/lessons on a website, and a simplified interface to a full-fledged instance of ISSM to validate the classes/lessons. This novel approach leverages new insights from the Outreach/Educational community and the interest of new generations in web-based technologies and simulation tools, all delivered in a seamlessly integrated web platform relying on a state-of-the-art climate model and live simulations.

  12. Robust large-scale parallel nonlinear solvers for simulations.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bader, Brett William; Pawlowski, Roger Patrick; Kolda, Tamara Gibson

    2005-11-01

    This report documents research to develop robust and efficient solution techniques for solving large-scale systems of nonlinear equations. The most widely used method for solving systems of nonlinear equations is Newton's method. While much research has been devoted to augmenting Newton-based solvers (usually with globalization techniques), little has been devoted to exploring the application of different models. Our research has been directed at evaluating techniques using different models than Newton's method: a lower-order model, Broyden's method, and a higher-order model, the tensor method. We have developed large-scale versions of each of these models and have demonstrated their use in important applications at Sandia. Broyden's method replaces the Jacobian with an approximation, allowing codes that cannot evaluate a Jacobian, or that have an inaccurate Jacobian, to converge to a solution. Limited-memory methods, which have been successful in optimization, allow us to extend this approach to large-scale problems. We compare the robustness and efficiency of Newton's method, modified Newton's method, the Jacobian-free Newton-Krylov method, and our limited-memory Broyden method. Comparisons are carried out for large-scale applications of fluid flow simulations and electronic circuit simulations. Results show that, in cases where the Jacobian was inaccurate or could not be computed, Broyden's method converged in some cases where Newton's method failed to converge. We identify conditions where Broyden's method can be more efficient than Newton's method. We also present modifications to a large-scale tensor method, originally proposed by Bouaricha, for greater efficiency, better robustness, and wider applicability. Tensor methods are an alternative to Newton-based methods and compute a step based on a local quadratic model rather than a linear model. The advantage of Bouaricha's method is that it can use any existing linear solver, which makes it simple
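The key idea of Broyden's method, replacing Jacobian evaluations with rank-one secant updates, can be sketched compactly. This is the textbook dense version for illustration only; the report's limited-memory variant avoids storing the full matrix:

```python
import numpy as np

def broyden(f, x0, tol=1e-10, max_iter=100):
    """Broyden's 'good' method: quasi-Newton root finding that maintains
    an approximate Jacobian B and corrects it with a rank-one update
    enforcing the secant condition B_new @ s = y. No analytic Jacobian
    is ever evaluated."""
    x = np.asarray(x0, dtype=float)
    B = np.eye(len(x))                 # initial Jacobian approximation
    fx = f(x)
    for _ in range(max_iter):
        if np.linalg.norm(fx) < tol:
            break
        s = np.linalg.solve(B, -fx)    # quasi-Newton step
        x_new = x + s
        fx_new = f(x_new)
        y = fx_new - fx
        B += np.outer(y - B @ s, s) / (s @ s)  # rank-one secant update
        x, fx = x_new, fx_new
    return x

# Example: solve x0^2 + x1^2 = 1 and x0 = x1 (root at (1/sqrt(2), 1/sqrt(2)))
root = broyden(lambda x: np.array([x[0]**2 + x[1]**2 - 1.0, x[0] - x[1]]),
               [0.8, 0.6])
```

The update costs O(n²) per iteration versus O(n²) function-evaluation work for a finite-difference Jacobian at every step, which is why Broyden-type methods pay off when f is expensive.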

  13. Towards a physically-based multi-scale ecohydrological simulator for semi-arid regions

    NASA Astrophysics Data System (ADS)

    Caviedes-Voullième, Daniel; Josefik, Zoltan; Hinz, Christoph

    2017-04-01

    The use of numerical models as tools for describing and understanding complex ecohydrological systems has enabled testing hypotheses and proposing fundamental, process-based explanations of the system behaviour as a whole as well as its internal dynamics. Reaction-diffusion equations have been used to describe and generate organized patterns such as bands, spots, and labyrinths using simple feedback mechanisms and boundary conditions. Alternatively, pattern-matching cellular automaton models have been used to generate vegetation self-organization in arid and semi-arid regions, also using simple descriptions of surface hydrological processes. A key question is: how much physical realism is needed in order to adequately capture the pattern formation processes in semi-arid regions while reliably representing the water balance dynamics at the relevant time scales? In fact, redistribution of water by surface runoff at the hillslope scale occurs at a temporal resolution of minutes, while vegetation development requires much lower temporal resolution and longer time spans. This generates a fundamental spatio-temporal multi-scale problem to be solved, for which high-resolution rainfall and surface topography are required. Accordingly, the objective of this contribution is to provide proof-of-concept that the governing processes can be described numerically at those multiple scales. The requirements for simulating ecohydrological processes and pattern formation with increased physical realism are, amongst others: i. high-resolution rainfall that adequately captures the triggers of growth, as vegetation dynamics of arid regions respond as pulsed systems; ii. complex, natural topography in order to accurately model drainage patterns, as surface water redistribution is highly sensitive to topographic features; iii. microtopography and hydraulic roughness, as small-scale variations do impact large-scale hillslope behaviour; iv. moisture-dependent infiltration as temporal
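The reaction-diffusion pattern formation mentioned above can be illustrated with a generic two-species model. This sketch uses the Gray-Scott system, a classic pattern former (spots, stripes, labyrinths), purely as an illustration of the mechanism; it is not the authors' ecohydrological simulator, and the parameters are common literature values, not calibrated to any field site:

```python
import numpy as np

def gray_scott_step(u, v, du, dv, f, k, dt=0.2):
    """One explicit Euler step of the Gray-Scott reaction-diffusion model
    on a periodic grid (5-point Laplacian via np.roll)."""
    lap = lambda a: (np.roll(a, 1, 0) + np.roll(a, -1, 0) +
                     np.roll(a, 1, 1) + np.roll(a, -1, 1) - 4.0 * a)
    uvv = u * v * v
    u_new = u + dt * (du * lap(u) - uvv + f * (1.0 - u))
    v_new = v + dt * (dv * lap(v) + uvv - (f + k) * v)
    return u_new, v_new

# 64x64 domain seeded with a small square of the activator species v.
u = np.ones((64, 64))
v = np.zeros((64, 64))
v[28:36, 28:36] = 0.5
for _ in range(2000):
    u, v = gray_scott_step(u, v, du=0.16, dv=0.08, f=0.035, k=0.065)
```

The pattern-forming ingredient is the same as in the vegetation models cited above: local activation (the u·v² feedback) combined with faster-spreading inhibition (the larger diffusivity of u), the Turing mechanism.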

  14. Investigating the dynamics of Vulcanian explosions using scaled laboratory experiments

    NASA Astrophysics Data System (ADS)

    Clarke, A. B.; Phillips, J. C.; Chojnicki, K. N.

    2005-12-01

    Laboratory experiments were conducted to investigate the dynamics of Vulcanian eruptions. A reservoir containing a mixture of water and methanol plus solid particles was pressurized and suddenly released via a rapid-release valve into a 2 ft by 2 ft by 4 ft plexiglass tank containing fresh water. Water and methanol created a light interstitial fluid to simulate buoyant volcanic gases in erupted mixtures. The duration of the subsequent experiments was not pre-determined, but instead was limited by the potential energy associated with the pressurized fluid, rather than by the volume of available fluid. Suspending liquid density was varied between 960 and 1000 kg m-3 by changing methanol concentrations from 5 to 20%. Particle size (4 & 45 microns) and concentration (1 to 5 vol%) were varied in order to change particle settling characteristics and control bulk mixture density. Variations in reservoir pressure and vent size allowed exploration of the controlling source parameters, buoyancy flux (Bo) and momentum flux (Mo). The velocity-height relationship of each experiment was documented by high-speed video, permitting classification of the laboratory flows, which ranged from long continuously accelerating jets, to starting plumes, to low-energy thermals, to collapsing fountains generating density currents. Field-documented Vulcanian explosions exhibit this same wide range of behavior (Self et al. 1979, Nature 277; Sparks & Wilson 1982, Geophys. J. R. astr. Soc. 69; Druitt et al. 2002, Geol. Soc. London, 21), demonstrating that flows obtained in the laboratory are relevant to natural systems. A generalized framework of results was defined as follows. 
Increasing Mo/Bo for small particles (4 microns; settling time > experiment duration) pushes the system from low-energy thermals toward high-energy, continuously accelerating jets; increasing Mo/Bo for large particles (>45 microns; settling time < experiment duration) pushes the system from a low collapsing fountain to a
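The source parameters Mo and Bo used to classify these flows can be estimated from vent geometry and fluid properties. A minimal sketch, assuming the standard kinematic plume-theory (Morton-Taylor-Turner style) definitions; the numerical values are illustrative and not taken from the experiments:

```python
import math

def source_fluxes(radius, velocity, rho_jet, rho_ambient, g=9.81):
    """Kinematic source fluxes for a round jet/plume:
    momentum flux  Mo = pi r^2 U^2,
    buoyancy flux  Bo = g' pi r^2 U,
    with reduced gravity g' = g (rho_ambient - rho_jet) / rho_ambient."""
    area = math.pi * radius ** 2
    g_prime = g * (rho_ambient - rho_jet) / rho_ambient
    mo = area * velocity ** 2
    bo = g_prime * area * velocity
    return mo, bo

# Illustrative laboratory values: 1 cm vent radius, 1 m/s exit velocity,
# 980 kg/m^3 methanol-water mixture released into 1000 kg/m^3 fresh water.
mo, bo = source_fluxes(radius=0.01, velocity=1.0,
                       rho_jet=980.0, rho_ambient=1000.0)
ratio = mo / bo
```

For a uniform exit profile the ratio reduces to U/g', so raising reservoir pressure (higher U) or cutting the methanol fraction (smaller g') both push Mo/Bo upward, toward the jet-like end of the regime diagram described above.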

  15. WRF/CMAQ AQMEII3 Simulations of US Regional-Scale ...

    EPA Pesticide Factsheets

    Chemical boundary conditions are a key input to regional-scale photochemical models. In this study, performed during the third phase of the Air Quality Model Evaluation International Initiative (AQMEII3), we perform annual simulations over North America with chemical boundary conditions prepared from four different global models. Results indicate that the impacts of different boundary conditions are significant for ozone throughout the year and most pronounced outside the summer season. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  16. Bioprocess scale-up/down as integrative enabling technology: from fluid mechanics to systems biology and beyond.

    PubMed

    Delvigne, Frank; Takors, Ralf; Mudde, Rob; van Gulik, Walter; Noorman, Henk

    2017-09-01

    Efficient optimization of microbial processes is a critical issue for achieving a number of sustainable development goals, considering the impact of microbial biotechnology in the agrofood, environment, biopharmaceutical and chemical industries. Many of these applications require scale-up after proof of concept. However, the behaviour of microbial systems remains (at least partially) unpredictable when shifting from laboratory-scale to industrial conditions. Robust microbial systems are thus highly needed in this context, as is a better understanding of the interactions between fluid mechanics and cell physiology. For that purpose, a full scale-up/down computational framework is already available. This framework links computational fluid dynamics (CFD), metabolic flux analysis and agent-based modelling (ABM) for a better understanding of the cell lifelines in a heterogeneous environment. Ultimately, this framework can be used for the design of scale-down simulators and/or metabolically engineered cells able to cope with the environmental fluctuations typically found in large-scale bioreactors. However, this framework still needs some refinements, such as a better integration of gas-liquid flows in CFD and taking into account intrinsic biological noise in ABM. © 2017 The Authors. Microbial Biotechnology published by John Wiley & Sons Ltd and Society for Applied Microbiology.

  17. Pulverized solid injection system. Application to laboratory burners and pyrometric temperature measurements

    NASA Astrophysics Data System (ADS)

    Therssen, E.; Delfosse, L.

    1995-08-01

    The design and setting up of a pulverized solid injection system for use in laboratory burners is presented. The original dual system consists of a screw feeder coupled to an acoustic sower. This laboratory device provides good regularity and stability of the particle-gas mixture transported to the burner over a wide range of powder mass flow and carrier-gas flow rates. The thermal history of the particles has been followed by optical measurements. The quality of the particle cloud injected into the burner has been validated by the good agreement between experimental and modeled particle temperatures.

  18. Modelling high Reynolds number wall-turbulence interactions in laboratory experiments using large-scale free-stream turbulence.

    PubMed

    Dogan, Eda; Hearst, R Jason; Ganapathisubramani, Bharathram

    2017-03-13

    A turbulent boundary layer subjected to free-stream turbulence is investigated in order to ascertain the scale interactions that dominate the near-wall region. The results are discussed in relation to a canonical high Reynolds number turbulent boundary layer because previous studies have reported considerable similarities between these two flows. Measurements were acquired simultaneously from four hot wires mounted on a rake which was traversed through the boundary layer. Particular focus is given to two main features of both canonical high Reynolds number boundary layers and boundary layers subjected to free-stream turbulence: (i) the footprint of the large scales in the logarithmic region on the near-wall small scales, specifically the modulating interaction between these scales, and (ii) the phase difference in amplitude modulation. The potential for a turbulent boundary layer subjected to free-stream turbulence to 'simulate' high Reynolds number wall-turbulence interactions is discussed. The results of this study have encouraging implications for future investigations of the fundamental scale interactions that take place in high Reynolds number flows, as they demonstrate that these can be achieved at typical laboratory scales. This article is part of the themed issue 'Toward the development of high-fidelity models of wall turbulence at large Reynolds number'. © 2017 The Author(s).
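The amplitude-modulation diagnostic referred to above is commonly computed by correlating the large-scale velocity with the envelope of the small-scale velocity. A minimal single-point sketch in the spirit of that analysis (the scale decomposition here uses a simple brick-wall FFT filter and a synthetic signal; the study's actual filter cutoffs and data are not reproduced):

```python
import numpy as np

def lowpass_fft(x, fs, cutoff):
    """Brick-wall low-pass filter in the frequency domain."""
    X = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    X[freqs > cutoff] = 0.0
    return np.fft.irfft(X, n=len(x))

def envelope(x):
    """Instantaneous amplitude via an FFT-based Hilbert transform."""
    n = len(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    return np.abs(np.fft.ifft(np.fft.fft(x) * h))

def amplitude_modulation_coefficient(u, fs, cutoff):
    """Correlate the large-scale velocity with the low-passed envelope of
    the small-scale velocity -- a single-point modulation measure."""
    u = u - np.mean(u)
    u_large = lowpass_fft(u, fs, cutoff)
    env = envelope(u - u_large)
    env_large = lowpass_fft(env - env.mean(), fs, cutoff)
    return np.corrcoef(u_large, env_large)[0, 1]

# Synthetic check: small scales whose amplitude follows the large scales
# should yield a correlation close to +1.
fs = 10_000.0
t = np.arange(0, 2.0, 1.0 / fs)
large = np.sin(2 * np.pi * 5 * t)
small = (1.0 + 0.5 * large) * np.sin(2 * np.pi * 500 * t)
R = amplitude_modulation_coefficient(large + small, fs, cutoff=50.0)
```

A positive coefficient indicates that energetic large-scale excursions amplify the near-wall small scales, which is the footprint/modulation behaviour the abstract describes.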

  19. A daylong clinical laboratory: from gaming to high-fidelity simulators.

    PubMed

    Bantz, Diana; Dancer, Michelle Mattice; Hodson-Carlton, Kay; Van Hove, Sharon

    2007-01-01

    Meeting required objectives in the clinical setting can be difficult because of low exposure to critical events. This has been further compounded by an increase in the number of enrolling students without a reciprocal rise in the number of field-related clinical sites. As simulation gains popularity in nursing, exploration of its use and benefits to teach nursing-related concepts is desirable. The authors discuss a variety of teaching strategies ranging from the use of games to high-fidelity simulators that have been incorporated into an all-day clinical simulation campus laboratory.

  20. Application of lab derived kinetic biodegradation parameters at the field scale

    NASA Astrophysics Data System (ADS)

    Schirmer, M.; Barker, J. F.; Butler, B. J.; Frind, E. O.

    2003-04-01

    Estimating the intrinsic remediation potential of an aquifer typically requires the accurate assessment of the biodegradation kinetics, the level of available electron acceptors and the flow field. Zero- and first-order degradation rates derived at the laboratory scale generally overpredict the rate of biodegradation when applied to the field scale, because limited electron acceptor availability and microbial growth are typically not considered. On the other hand, field estimated zero- and first-order rates are often not suitable to forecast plume development because they may be an oversimplification of the processes at the field scale and ignore several key processes, phenomena and characteristics of the aquifer. This study uses the numerical model BIO3D to link the laboratory and field scale by applying laboratory derived Monod kinetic degradation parameters to simulate a dissolved gasoline field experiment at Canadian Forces Base (CFB) Borden. All additional input parameters were derived from laboratory and field measurements or taken from the literature. The simulated results match the experimental results reasonably well without having to calibrate the model. An extensive sensitivity analysis was performed to estimate the influence of the most uncertain input parameters and to define the key controlling factors at the field scale. It is shown that the most uncertain input parameters have only a minor influence on the simulation results. Furthermore it is shown that the flow field, the amount of electron acceptor (oxygen) available and the Monod kinetic parameters have a significant influence on the simulated results. Under the field conditions modelled and the assumptions made for the simulations, it can be concluded that laboratory derived Monod kinetic parameters can adequately describe field scale degradation processes, if all controlling factors are incorporated in the field scale modelling that are not necessarily observed at the lab scale. In this way
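The distinction drawn above between zero-/first-order rates and Monod kinetics is that Monod degradation slows as substrate depletes and accelerates as biomass grows. A minimal sketch of the coupled substrate-biomass system; the parameter values are illustrative, not the BIO3D/Borden calibration:

```python
def monod_step(c, x, k_max, ks, y_xc, dt):
    """One explicit Euler step of Monod-limited biodegradation:
    dC/dt = -k_max * C / (Ks + C) * X   (substrate utilization)
    dX/dt = Y * k_max * C / (Ks + C) * X (microbial growth)"""
    rate = k_max * c / (ks + c) * x      # substrate utilization rate
    c_new = max(c - rate * dt, 0.0)
    x_new = x + y_xc * (c - c_new)       # growth tied to substrate consumed
    return c_new, x_new

# Illustrative parameters: 10 mg/L substrate, 0.1 mg/L initial biomass.
c, x = 10.0, 0.1
for _ in range(1000):
    c, x = monod_step(c, x, k_max=0.5, ks=2.0, y_xc=0.3, dt=0.1)
```

At high C (C >> Ks) the rate looks zero-order; at low C (C << Ks) it looks first-order, which is why single fitted zero- or first-order constants transfer poorly between laboratory and field conditions.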

  1. Overcoming time scale and finite size limitations to compute nucleation rates from small scale well tempered metadynamics simulations

    NASA Astrophysics Data System (ADS)

    Salvalaglio, Matteo; Tiwary, Pratyush; Maggioni, Giovanni Maria; Mazzotti, Marco; Parrinello, Michele

    2016-12-01

    Condensation of a liquid droplet from a supersaturated vapour phase is initiated by a prototypical nucleation event. As such, it is challenging to compute its rate from atomistic molecular dynamics simulations. In fact, at realistic supersaturation conditions condensation occurs on time scales that far exceed what can be reached with conventional molecular dynamics methods. Another known problem in this context is the distortion of the free energy profile associated with nucleation due to the small, finite size of typical simulation boxes. In this work the problem of time scale is addressed with a recently developed enhanced sampling method while contextually correcting for finite size effects. We demonstrate our approach by studying the condensation of argon and showing that characteristic nucleation times on the order of hours can be reliably calculated. Nucleation rates spanning a range of 10 orders of magnitude are computed at moderate supersaturation levels, thus bridging the gap between what standard molecular dynamics simulations can do and real physical systems.

  2. Overcoming time scale and finite size limitations to compute nucleation rates from small scale well tempered metadynamics simulations.

    PubMed

    Salvalaglio, Matteo; Tiwary, Pratyush; Maggioni, Giovanni Maria; Mazzotti, Marco; Parrinello, Michele

    2016-12-07

    Condensation of a liquid droplet from a supersaturated vapour phase is initiated by a prototypical nucleation event. As such, it is challenging to compute its rate from atomistic molecular dynamics simulations. In fact, at realistic supersaturation conditions condensation occurs on time scales that far exceed what can be reached with conventional molecular dynamics methods. Another known problem in this context is the distortion of the free energy profile associated with nucleation due to the small, finite size of typical simulation boxes. In this work the problem of time scale is addressed with a recently developed enhanced sampling method while contextually correcting for finite size effects. We demonstrate our approach by studying the condensation of argon and showing that characteristic nucleation times on the order of hours can be reliably calculated. Nucleation rates spanning a range of 10 orders of magnitude are computed at moderate supersaturation levels, thus bridging the gap between what standard molecular dynamics simulations can do and real physical systems.
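Rare-event rates like these are typically extracted by treating nucleation as a Poisson process: waiting times should be exponentially distributed, and the rate is the inverse mean waiting time. A minimal sketch of that analysis step on synthetic data; in the enhanced-sampling setting the raw simulation times must first be rescaled by the metadynamics acceleration factor, which this sketch does not cover:

```python
import numpy as np

def nucleation_rate(first_passage_times):
    """For a Poisson (rare-event) process, waiting times are exponential
    and the rate is 1/<t>. Also returns the Kolmogorov-Smirnov distance
    between the empirical CDF and 1 - exp(-t/<t>) as a sanity check on
    the exponential assumption."""
    t = np.sort(np.asarray(first_passage_times, dtype=float))
    tau = t.mean()
    ecdf = np.arange(1, len(t) + 1) / len(t)
    model = 1.0 - np.exp(-t / tau)
    ks_distance = np.max(np.abs(ecdf - model))
    return 1.0 / tau, ks_distance

# Synthetic example: 200 exponential waiting times with true rate 1e-3/ns.
rng = np.random.default_rng(1)
rate, ks = nucleation_rate(rng.exponential(scale=1000.0, size=200))
```

A large KS distance signals that the sampled transitions are not Poissonian, e.g. because the bias was deposited too aggressively, and the rate estimate should not be trusted.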

  3. Formalizing Knowledge in Multi-Scale Agent-Based Simulations

    PubMed Central

    Somogyi, Endre; Sluka, James P.; Glazier, James A.

    2017-01-01

    Multi-scale, agent-based simulations of cellular and tissue biology are increasingly common. These simulations combine and integrate a range of components from different domains. Simulations continuously create, destroy and reorganize constituent elements causing their interactions to dynamically change. For example, the multi-cellular tissue development process coordinates molecular, cellular and tissue scale objects with biochemical, biomechanical, spatial and behavioral processes to form a dynamic network. Different domain specific languages can describe these components in isolation, but cannot describe their interactions. No current programming language is designed to represent in human readable and reusable form the domain specific knowledge contained in these components and interactions. We present a new hybrid programming language paradigm that naturally expresses the complex multi-scale objects and dynamic interactions in a unified way and allows domain knowledge to be captured, searched, formalized, extracted and reused. PMID:29338063

  4. Formalizing Knowledge in Multi-Scale Agent-Based Simulations.

    PubMed

    Somogyi, Endre; Sluka, James P; Glazier, James A

    2016-10-01

    Multi-scale, agent-based simulations of cellular and tissue biology are increasingly common. These simulations combine and integrate a range of components from different domains. Simulations continuously create, destroy and reorganize constituent elements causing their interactions to dynamically change. For example, the multi-cellular tissue development process coordinates molecular, cellular and tissue scale objects with biochemical, biomechanical, spatial and behavioral processes to form a dynamic network. Different domain specific languages can describe these components in isolation, but cannot describe their interactions. No current programming language is designed to represent in human readable and reusable form the domain specific knowledge contained in these components and interactions. We present a new hybrid programming language paradigm that naturally expresses the complex multi-scale objects and dynamic interactions in a unified way and allows domain knowledge to be captured, searched, formalized, extracted and reused.

  5. The ideal laboratory information system.

    PubMed

    Sepulveda, Jorge L; Young, Donald S

    2013-08-01

    Laboratory information systems (LIS) are critical components of the operation of clinical laboratories. However, the functionalities of LIS have lagged significantly behind the capacities of current hardware and software technologies, while the complexity of the information produced by clinical laboratories has been increasing over time and will soon undergo rapid expansion with the use of new, high-throughput and high-dimensionality laboratory tests. In the broadest sense, LIS are essential to manage the flow of information between health care providers, patients, and laboratories and should be designed to optimize not only laboratory operations but also personalized clinical care. Objective: to list suggestions for designing LIS with the goal of optimizing the operation of clinical laboratories while improving clinical care by intelligent management of laboratory information. Data sources: literature review, interviews with laboratory users, and personal experience and opinion. Conclusions: laboratory information systems can improve laboratory operations and improve patient care. Specific suggestions for improving the function of LIS are listed under the following sections: (1) Information Security, (2) Test Ordering, (3) Specimen Collection, Accessioning, and Processing, (4) Analytic Phase, (5) Result Entry and Validation, (6) Result Reporting, (7) Notification Management, (8) Data Mining and Cross-sectional Reports, (9) Method Validation, (10) Quality Management, (11) Administrative and Financial Issues, and (12) Other Operational Issues.

  6. Spacecraft contamination programs within the Air Force Systems Command Laboratories

    NASA Technical Reports Server (NTRS)

    Murad, Edmond

    1990-01-01

    Spacecraft contamination programs exist in six independent AFSC organizations: Geophysics Laboratory (GL), Arnold Engineering and Development Center (AEDC), Rome Air Development Center (RADC/OSCE), Wright Research and Development Center (MLBT), Armament Laboratory (ATL/SAI), and Space Systems Division (SSD/OL-AW). In addition, a sizable program exists at Aerospace Corp. These programs are complementary, each effort addressing a specific area of expertise: GL's effort is aimed at addressing the effects of on-orbit contamination; AEDC's effort is aimed at ground simulation and measurement of optical contamination; RADC's effort addresses the accumulation, measurement, and removal of contamination on large optics; MLBT's effort is aimed at understanding the effect of contamination on materials; ATL's effort is aimed at understanding the effect of plume contamination on systems; SSD's effort is confined to the integration of some contamination experiments sponsored by SSD/CLT; and Aerospace Corp.'s effort is aimed at supporting the needs of the using System Program Offices (SPO) in specific areas, such as contamination during ground handling and the ascent phase, laboratory measurements aimed at understanding on-orbit contamination, and mass loss and mass gain in on-orbit operations. These programs are described in some detail, with emphasis on GL's program.

  7. Aeration of the teuftal landfill: Field scale concept and lab scale simulation.

    PubMed

    Ritzkowski, Marco; Walker, Beat; Kuchta, Kerstin; Raga, Roberto; Stegmann, Rainer

    2016-09-01

    Long-lasting post-closure care (PCC) is often the major financial burden for operators of municipal solid waste (MSW) landfills. Besides the costs of installing and maintaining technical equipment and barriers, long-term treatment of leachate and landfill gas in particular has to be paid from capital surplus. Estimations based on laboratory experiments project time periods of many decades until leachate quality allows for direct discharge (i.e. no need for further purification). Projections based on leachate samples collected over the last 37 years from 35 German landfills confirm these assumptions. Moreover, the data illustrate that ammonium nitrogen concentrations in particular are likely to fall below limit values only after a period of 300 years. In order to avoid long-lasting PCC, the operator of the Teuftal landfill, located in the Swiss canton of Bern, decided to biologically stabilize the landfill by means of a combined in situ aeration and moisturization approach. In December 2014 aeration started in a landfill section containing approximately 30% of the total landfill volume. From summer 2016 onwards the remaining part of the landfill will be aerated. Landfill aeration through horizontal gas and leachate drains is being carried out at field scale for the first time in Europe. The technical concept is described in the paper. In parallel to field-scale aeration, investigations of carbon and nitrogen turnover are carried out by means of both simulated aerated landfills and simulated anaerobic landfills. The results presented in this paper demonstrate that aeration is capable of enhancing both carbon mobilization and discharge via the gas phase. This effect comes along with a significant increase in bio-stabilization of the waste organic fraction, which positively affects the landfill's long-term emission behavior. In terms of leachate pollution reduction, it could be demonstrated that the organic load decreases rapidly and largely independently of the adjusted aeration

  8. Small-scale impacts as potential trigger for landslides on small Solar system bodies

    NASA Astrophysics Data System (ADS)

    Hofmann, Marc; Sierks, Holger; Blum, Jürgen

    2017-07-01

    We conducted a set of experiments to investigate whether millimetre-sized impactors impinging on a granular material at several m s⁻¹ are able to trigger avalanches on small, atmosphereless planetary bodies. These experiments were carried out at the Zentrum für angewandte Raumfahrttechnologie und Mikrogravitation (ZARM) drop tower facility in Bremen, Germany to provide a reduced-gravity environment. Additional data were gathered at Earth gravity in the laboratory. As sample materials we used a ground Howardites, Eucrites and Diogenites (HED) meteorite and the Johnson Space Center (JSC) Mars-1 Martian soil simulant. We found that this type of small-scale impact can trigger avalanches with moderate probability if the target material is tilted to an angle close to the angle of repose. We additionally simulated a small-scale impact using the discrete element method code esys-particle. These simulations show that energy transfer from the impactor to the target material is most efficient at low and moderate impactor inclinations, and that the transferred energy is retained in particles close to the surface owing to rapid dissipation of energy in lower material layers by inelastic collisions. Through Monte Carlo simulations we estimate the time-scale on which small-scale impacts with the observed characteristics will trigger avalanches covering all steep slopes on the surface of a small planetary body to be of the order of 10⁵ yr.
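    The final Monte Carlo estimate can be illustrated with a toy coupon-collector-style calculation in Python; the impact rate and patch count below are invented placeholders, not the paper's values:

    ```python
    # Toy Monte Carlo, loosely analogous to the paper's estimate: given an
    # assumed rate of avalanche-triggering impacts per steep-slope patch,
    # sample how long until every patch has hosted at least one avalanche.
    # All numbers here are illustrative, not the paper's.
    import random

    random.seed(0)

    RATE = 1e-4      # assumed triggering impacts per patch per year
    N_PATCH = 50     # steep-slope patches on the body

    def time_to_cover():
        """Years until all patches have been triggered at least once."""
        t, covered = 0.0, set()
        while len(covered) < N_PATCH:
            t += random.expovariate(RATE * N_PATCH)   # next triggering impact
            covered.add(random.randrange(N_PATCH))    # it hits a random patch
        return t

    estimates = [time_to_cover() for _ in range(200)]
    mean_t = sum(estimates) / len(estimates)
    ```

    With these placeholder numbers the mean covering time lands around a few times 10⁴ years, i.e. the same kind of order-of-magnitude argument the abstract reports.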

  9. Field-scale simulation of chemical flooding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saad, N.

    1989-01-01

    A three-dimensional compositional chemical flooding simulator (UTCHEM) has been improved. The new mathematical formulation, boundary conditions, and a description of the physicochemical models of the simulator are presented. This improved simulator has been used to study the low-tension pilot project at the Big Muddy field near Casper, Wyoming. Both the tracer injection conducted prior to the injection of the chemical slug and the chemical flooding stages of the pilot project have been analyzed. Not only the oil recovery but also the tracer, polymer, alcohol, and chloride histories have been successfully matched with field results. Simulation results indicate that, for this fresh-water reservoir, the salinity gradient during the preflush and the resulting calcium pickup by the surfactant slug played a major role in the success of the project. In addition, analysis of the effects of crossflow on the performance of the pilot project indicates that, for the well spacing of the pilot, crossflow does not play as important a role as it might in a large-scale project. To improve the numerical efficiency of the simulator, a third-order convective differencing scheme has been applied. This method can be used with non-uniform meshes and is therefore suited for simulation studies of large-scale, multiwell, heterogeneous reservoirs. Comparison of the results with one- and two-dimensional analytical solutions shows that this method is effective in eliminating numerical dispersion using relatively large grid blocks. Results of one-, two- and three-dimensional miscible water/tracer flow, waterflooding, polymer flooding, and micellar-polymer flooding test problems, and results of grid orientation studies, are presented.
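    The numerical-dispersion problem that motivates the third-order scheme can be seen in a few lines of Python: a first-order upwind discretization of 1-D linear advection (a deliberately simplified stand-in, not UTCHEM's scheme) smears an initially sharp tracer front:

    ```python
    # First-order upwind differencing of the 1-D advection equation on a
    # periodic grid: the exact solution just translates the sharp front,
    # but the scheme spreads it over many cells (numerical dispersion).
    N, CFL, STEPS = 100, 0.5, 100
    c = [1.0 if i < 20 else 0.0 for i in range(N)]   # sharp injected slug
    for _ in range(STEPS):
        # c[i-1] wraps around at i = 0, giving periodic boundaries
        c = [c[i] - CFL * (c[i] - c[i - 1]) for i in range(N)]
    # cells holding intermediate values are the artificially smeared front
    front_width = sum(1 for x in c if 0.01 < x < 0.99)
    ```

    A higher-order convective scheme shrinks this artificial spreading, which is why the improved simulator can eliminate numerical dispersion while using relatively large grid blocks.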

  10. 42 CFR 493.1230 - Condition: General laboratory systems.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    42 Public Health 5 (2011-10-01): SERVICES (CONTINUED), STANDARDS AND CERTIFICATION, LABORATORY REQUIREMENTS, Quality System for Nonwaived Testing, General Laboratory Systems. § 493.1230 Condition: General laboratory systems. Each laboratory that...

  11. Advances and issues from the simulation of planetary magnetospheres with recent supercomputer systems

    NASA Astrophysics Data System (ADS)

    Fukazawa, K.; Walker, R. J.; Kimura, T.; Tsuchiya, F.; Murakami, G.; Kita, H.; Tao, C.; Murata, K. T.

    2016-12-01

    Planetary magnetospheres are very large, while phenomena within them occur on meso- and micro-scales, ranging from tens of planetary radii down to kilometers. To understand the dynamics of these multi-scale systems, numerical simulations have been performed on supercomputer systems. We have long studied the magnetospheres of Earth, Jupiter, and Saturn using 3-dimensional magnetohydrodynamic (MHD) simulations; however, we have not captured phenomena near the limits of the MHD approximation, in particular the meso-scale phenomena that can in principle be addressed by MHD. Recently we performed an MHD simulation of Earth's magnetosphere on the K computer, the first 10-PFlops supercomputer, and obtained multi-scale flow vorticity for both northward and southward IMF. Furthermore, we have access to supercomputer systems with Xeon, SPARC64, and vector-type CPUs and can compare simulation results across these systems. Finally, we have compared the results of our parameter survey of the magnetosphere with observations from the HISAKI spacecraft. We have encountered a number of difficulties in using the latest supercomputer systems effectively. First, the size of the simulation output has increased greatly: a simulation group now produces over 1 PB of output, and storing and analyzing this much data is difficult. The traditional way to analyze simulation results is to move them to the investigator's home institution, which takes over three months on an end-to-end 10 Gbps network; in practice, bottlenecks at some nodes, such as firewalls, can increase the transfer time to over one year. Another issue is post-processing: handling even a few TB of simulation output is hard given the memory limitations of a post-processing computer.
    To overcome these issues, we have developed and introduced parallel network storage, a highly efficient network protocol, and CUI-based visualization tools. In this study, we
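    The data-movement bottleneck quoted above can be sanity-checked with back-of-envelope arithmetic (assuming the stated 1 PB output volume and 10 Gbps end-to-end link):

    ```python
    # Best-case transfer time for 1 PB over a 10 Gbps link: even at full
    # line rate it takes on the order of ten days, so the reported three
    # months implies effective throughput well below line rate
    # (firewalls, shared links, protocol overhead).
    SIZE_BYTES = 1e15          # 1 PB of simulation output
    LINE_RATE_BPS = 10e9       # 10 Gbps end-to-end
    best_case_days = SIZE_BYTES * 8 / LINE_RATE_BPS / 86400
    # roughly 9.3 days at line rate; ~90 observed days implies
    # effective utilization of only about 10%
    implied_utilization = best_case_days / 90.0
    ```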

  12. Brain-wave measures of workload in advanced cockpits: The transition of technology from laboratory to cockpit simulator, phase 2

    NASA Technical Reports Server (NTRS)

    Horst, Richard L.; Mahaffey, David L.; Munson, Robert C.

    1989-01-01

    The present Phase 2 small business innovation research study was designed to address issues related to scalp-recorded event-related potential (ERP) indices of mental workload and to transition this technology from the laboratory to cockpit simulator environments for use as a systems engineering tool. The project involved five main tasks: (1) Two laboratory studies confirmed the generality of the ERP indices of workload obtained in the Phase 1 study and revealed two additional ERP components related to workload. (2) A task analysis of flight scenarios and pilot tasks in the Advanced Concepts Flight Simulator (ACFS) defined cockpit events (i.e., displays, messages, alarms) that would be expected to elicit ERPs related to workload. (3) Software was developed to support ERP data analysis. An existing ARD-proprietary package of ERP data analysis routines was upgraded, new graphics routines were developed to enhance interactive data analysis, and routines were developed to compare alternative single-trial analysis techniques using simulated ERP data. (4) Working in conjunction with NASA Langley research scientists and simulator engineers, preparations were made for an ACFS validation study of ERP measures of workload. (5) A design specification was developed for a general-purpose, computerized workload assessment system that can function in simulators such as the ACFS.

  13. A Monte Carlo Simulation of the in vivo measurement of lung activity in the Lawrence Livermore National Laboratory torso phantom.

    PubMed

    Acha, Robert; Brey, Richard; Capello, Kevin

    2013-02-01

    A torso phantom was developed by the Lawrence Livermore National Laboratory (LLNL) that serves as a standard for intercomparison and intercalibration of detector systems used to measure low-energy photons from radionuclides, such as americium deposited in the lungs. DICOM images of the second-generation Human Monitoring Laboratory-Lawrence Livermore National Laboratory (HML-LLNL) torso phantom were segmented and converted into three-dimensional (3D) voxel phantoms to simulate the response of high purity germanium (HPGe) detector systems, as found in the HML new lung counter using a Monte Carlo technique. The photon energies of interest in this study were 17.5, 26.4, 45.4, 59.5, 122, 244, and 344 keV. The detection efficiencies at these photon energies were predicted for different chest wall thicknesses (1.49 to 6.35 cm) and compared to measured values obtained with lungs containing (241)Am (34.8 kBq) and (152)Eu (10.4 kBq). It was observed that no statistically significant differences exist at the 95% confidence level between the mean values of simulated and measured detection efficiencies. Comparisons between the simulated and measured detection efficiencies reveal a variation of 20% at 17.5 keV and 1% at 59.5 keV. It was found that small changes in the formulation of the tissue substitute material caused no significant change in the outcome of Monte Carlo simulations.

  14. Simulation and Laboratory results of the Hard X-ray Polarimeter: X-Calibur

    NASA Astrophysics Data System (ADS)

    Guo, Qingzhen; Beilicke, M.; Kislat, F.; Krawczynski, H.

    2014-01-01

    X-ray polarimetry promises to give qualitatively new information about high-energy sources, such as binary black hole (BH) systems, microquasars, active galactic nuclei (AGN), and GRBs. We designed, built, and tested a hard X-ray polarimeter, 'X-Calibur', to be flown in the focal plane of the InFOCuS grazing-incidence hard X-ray telescope in 2014. X-Calibur combines a low-Z Compton scatterer with a CZT detector assembly to measure the polarization of 20-80 keV X-rays, making use of the fact that polarized photons Compton scatter preferentially perpendicular to the E-field orientation. X-Calibur achieves a detection efficiency of order unity. We optimized the design of the instrument based on Monte Carlo simulations of polarized and unpolarized X-ray beams and of the most important background components. We have calibrated and tested X-Calibur extensively in the laboratory at Washington University and at the Cornell High-Energy Synchrotron Source (CHESS). Measurements using the highly polarized synchrotron beam at CHESS confirm the polarization sensitivity of the instrument. In this talk we report on the optimization of the instrument design based on Monte Carlo simulations, as well as results of laboratory calibration measurements characterizing the performance of the instrument.

  15. Micron-scale Reactive Atomistic Simulation of Void Collapse and Hotspot Growth in PETN

    NASA Astrophysics Data System (ADS)

    Thompson, Aidan; Shan, Tzu-Ray; Wixom, Ryan

    2015-06-01

    Material defects and other heterogeneities such as dislocations, micro-porosity, and grain boundaries play key roles in the shock-induced initiation of detonation in energetic materials. We performed non-equilibrium molecular dynamics simulations to explore the effect of nanoscale voids on hotspot growth and initiation in micron-scale pentaerythritol tetranitrate (PETN) crystals under weak shock loading (Up = 1.25 km/s; Us = 4.5 km/s). We used the ReaxFF potential implemented in LAMMPS. We built a pseudo-2D PETN crystal with dimensions 0.3 μm × 0.22 μm × 1.3 nm containing a 20 nm cylindrical void. Once the initial shockwave traversed the entire sample, the shock-front absorbing boundary condition was applied, allowing the simulation to continue beyond 1 nanosecond. Results show an exponentially increasing hotspot growth rate. The hotspot morphology is initially symmetric about the void axis, but strong asymmetry develops at later times, due to strong coupling between exothermic chemistry, temperature, and divergent secondary shockwaves emanating from the collapsing void. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. DOE National Nuclear Security Administration under Contract DE-AC04-94AL85000.

  16. Pore-scale simulation of microbial growth using a genome-scale metabolic model: Implications for Darcy-scale reactive transport

    NASA Astrophysics Data System (ADS)

    Tartakovsky, G. D.; Tartakovsky, A. M.; Scheibe, T. D.; Fang, Y.; Mahadevan, R.; Lovley, D. R.

    2013-09-01

    Recent advances in microbiology have enabled the quantitative simulation of microbial metabolism and growth based on genome-scale characterization of metabolic pathways and fluxes. We have incorporated a genome-scale metabolic model of the iron-reducing bacterium Geobacter sulfurreducens into a pore-scale simulation of microbial growth based on coupling of iron reduction to oxidation of a soluble electron donor (acetate). In our model, fluid flow and solute transport are governed by a combination of the Navier-Stokes and advection-diffusion-reaction equations. Microbial growth occurs only on the surface of soil grains where solid-phase mineral iron oxides are available. Mass fluxes of chemical species associated with microbial growth are described by the genome-scale microbial model, implemented using a constraint-based metabolic model, and provide the Robin-type boundary condition for the advection-diffusion equation at soil grain surfaces. Conventional models of microbially-mediated subsurface reactions use a lumped reaction model that does not consider individual microbial reaction pathways, and describe reaction rates using empirically-derived rate formulations such as Monod-type kinetics. We have used our pore-scale model to explore the relationship between genome-scale metabolic models and Monod-type formulations, and to assess the manifestation of pore-scale variability (microenvironments) in terms of apparent Darcy-scale microbial reaction rates. The genome-scale model predicted a lower biomass yield, and a different stoichiometry for iron consumption, in comparison to prior Monod formulations based on energetics considerations. By modifying the reaction stoichiometry and biomass yield coefficient, we were able to fit an equivalent Monod model that effectively matched results of the genome-scale simulation of microbial behaviors under excess-nutrient conditions, but predictions of the fitted Monod model deviated from those of the genome-scale model
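    The Monod-type formulation contrasted with the genome-scale model above has a simple closed form; the sketch below uses illustrative parameter values, not the study's fitted ones:

    ```python
    # Monod kinetics: specific growth rate saturates with substrate
    # concentration S. Parameter values are illustrative placeholders.
    def monod_rate(S, mu_max=1.0, Ks=0.5):
        """Specific growth rate (1/h) for substrate concentration S (mM)."""
        return mu_max * S / (Ks + S)

    # Under excess nutrient (S >> Ks) the rate approaches mu_max, which is
    # the regime in which a refitted Monod model could match the
    # genome-scale predictions in the study.
    low = monod_rate(0.05)    # substrate-limited regime
    high = monod_rate(50.0)   # excess-nutrient regime, near mu_max
    ```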

  17. Pore-scale simulation of microbial growth using a genome-scale metabolic model: Implications for Darcy-scale reactive transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tartakovsky, Guzel D.; Tartakovsky, Alexandre M.; Scheibe, Timothy D.

    2013-09-07

    Recent advances in microbiology have enabled the quantitative simulation of microbial metabolism and growth based on genome-scale characterization of metabolic pathways and fluxes. We have incorporated a genome-scale metabolic model of the iron-reducing bacterium Geobacter sulfurreducens into a pore-scale simulation of microbial growth based on coupling of iron reduction to oxidation of a soluble electron donor (acetate). In our model, fluid flow and solute transport are governed by a combination of the Navier-Stokes and advection-diffusion-reaction equations. Microbial growth occurs only on the surface of soil grains where solid-phase mineral iron oxides are available. Mass fluxes of chemical species associated with microbial growth are described by the genome-scale microbial model, implemented using a constraint-based metabolic model, and provide the Robin-type boundary condition for the advection-diffusion equation at soil grain surfaces. Conventional models of microbially-mediated subsurface reactions use a lumped reaction model that does not consider individual microbial reaction pathways, and describe reaction rates using empirically-derived rate formulations such as Monod-type kinetics. We have used our pore-scale model to explore the relationship between genome-scale metabolic models and Monod-type formulations, and to assess the manifestation of pore-scale variability (microenvironments) in terms of apparent Darcy-scale microbial reaction rates. The genome-scale model predicted a lower biomass yield, and a different stoichiometry for iron consumption, in comparison to prior Monod formulations based on energetics considerations. By modifying the reaction stoichiometry and biomass yield coefficient, we were able to fit an equivalent Monod model that effectively matched results of the genome-scale simulation of microbial behaviors under excess-nutrient conditions, but predictions of the fitted Monod model deviated from those of the genome-scale

  18. Pore-scale simulation of microbial growth using a genome-scale metabolic model: Implications for Darcy-scale reactive transport

    NASA Astrophysics Data System (ADS)

    Scheibe, T. D.; Tartakovsky, G.; Tartakovsky, A. M.; Fang, Y.; Mahadevan, R.; Lovley, D. R.

    2012-12-01

    Recent advances in microbiology have enabled the quantitative simulation of microbial metabolism and growth based on genome-scale characterization of metabolic pathways and fluxes. We have incorporated a genome-scale metabolic model of the iron-reducing bacterium Geobacter sulfurreducens into a pore-scale simulation of microbial growth based on coupling of iron reduction to oxidation of a soluble electron donor (acetate). In our model, fluid flow and solute transport are governed by a combination of the Navier-Stokes and advection-diffusion-reaction equations. Microbial growth occurs only on the surface of soil grains where solid-phase mineral iron oxides are available. Mass fluxes of chemical species associated with microbial growth are described by the genome-scale microbial model, implemented using a constraint-based metabolic model, and provide the Robin-type boundary condition for the advection-diffusion equation at soil grain surfaces. Conventional models of microbially-mediated subsurface reactions use a lumped reaction model that does not consider individual microbial reaction pathways, and describe reaction rates using empirically-derived rate formulations such as Monod-type kinetics. We have used our pore-scale model to explore the relationship between genome-scale metabolic models and Monod-type formulations, and to assess the manifestation of pore-scale variability (microenvironments) in terms of apparent Darcy-scale microbial reaction rates. The genome-scale model predicted a lower biomass yield, and a different stoichiometry for iron consumption, in comparison to prior Monod formulations based on energetics considerations. By modifying the reaction stoichiometry and biomass yield coefficient, we were able to fit an equivalent Monod model that effectively matched results of the genome-scale simulation of microbial behaviors under excess-nutrient conditions, but predictions of the fitted Monod model deviated from those of the genome-scale model

  19. Field scale test of multi-dimensional flow and morphodynamic simulations used for restoration design analysis

    USGS Publications Warehouse

    McDonald, Richard R.; Nelson, Jonathan M.; Fosness, Ryan L.; Nelson, Peter O.; Constantinescu, George; Garcia, Marcelo H.; Hanes, Dan

    2016-01-01

    Two- and three-dimensional morphodynamic simulations are becoming common in studies of channel form and process. The performance of these simulations is often validated against measurements from laboratory studies. Collecting channel-change information in natural settings for model validation is difficult because it is expensive, and under most channel-forming flows the resulting channel change is generally small. Several channel restoration projects on the Kootenai River, ID, designed in part to armor large meanders with several large spurs constructed of wooden piles, have resulted in rapid bed elevation change following construction. Monitoring of these restoration projects includes post-restoration (as-built) Digital Elevation Models (DEMs) as well as additional channel surveys following high, channel-forming flows post-construction. The resulting sequence of measured bathymetry provides excellent validation data for morphodynamic simulations at the reach scale of a real river. In this paper we test the performance of a quasi-three-dimensional morphodynamic simulation against the measured elevation change. The simulations predict the pattern of channel change reasonably well, but many of the details, such as the maximum scour, are underpredicted.

  20. What Works Clearinghouse Quick Review: "Conceptualizing Astronomical Scale: Virtual Simulations on Handheld Tablet Computers Reverse Misconceptions"

    ERIC Educational Resources Information Center

    What Works Clearinghouse, 2014

    2014-01-01

    This study examined how using two different ways of displaying the solar system--a true-to-scale mode vs. an orrery mode--affected students' knowledge of astronomical concepts. Solar system displays were presented in a software application on a handheld tablet computer. In the true-to-scale mode, users navigated a simulated three-dimensional solar…

  1. Influence of capillary end effects on steady-state relative permeability estimates from direct pore-scale simulations

    NASA Astrophysics Data System (ADS)

    Guédon, Gaël Raymond; Hyman, Jeffrey De'Haven; Inzoli, Fabio; Riva, Monica; Guadagnini, Alberto

    2017-12-01

    We investigate and characterize the influence of capillary end effects on steady-state relative permeabilities obtained in pore-scale numerical simulations of two-phase flows. Our study is motivated by the observation that capillary end effects documented in two-phase laboratory-scale experiments can significantly influence permeability estimates. While numerical simulations of two-phase flows in reconstructed pore-spaces are increasingly employed to characterize relative permeabilities, a phenomenon which is akin to capillary end effects can also arise in such analyses due to the constraints applied at the boundaries of the computational domain. We profile the relative strength of these capillary end effects on the calculation of steady-state relative permeabilities obtained within randomly generated porous micro-structures using a finite volume-based two-phase flow solver. We suggest a procedure to estimate the extent of the regions influenced by these capillary end effects, which in turn allows for the alleviation of bias in the estimation of relative permeabilities.

  2. A pore-scale numerical method for simulating low-salinity waterflooding in porous media

    NASA Astrophysics Data System (ADS)

    Jiang, F.; Yang, J.; Tsuji, T.

    2017-12-01

    Low-salinity (LS) water injection has attracted attention in recent years as a practical oil recovery technique because of its low cost and high efficiency. Many researchers have conducted laboratory experiments and observed its significant benefits compared to conventional high-salinity (HS) waterflooding. However, the fundamental mechanisms remain poorly understood. Different mechanisms, such as fines migration and wettability alteration, have been proposed to explain this low-salinity effect. Here, we focus on investigating the effect of wettability alteration on recovery efficiency. For this purpose, we propose a pore-scale numerical method to quantitatively evaluate the impact of salinity concentration on sweep efficiency. We first developed the pore-scale model by coupling a convection-diffusion model, which tracks the concentration change, with a lattice Boltzmann model for two-phase flow behavior, assuming that a reduction of water salinity leads to localised wettability alteration. The model is then validated by simulating the contact angle change of an oil droplet attached to a clay substrate. Finally, the method was applied to a real rock geometry extracted from micro-CT images of Berea sandstone. The results indicate that the initial wettability state of the system and the extent of wettability alteration are important in predicting the improvement in oil recovery due to LS brine injection. This work was supported by JSPS KAKENHI Grant Number 16K18331.
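    The localized wettability-alteration assumption named above can be sketched as a simple mapping from local salinity to contact angle; the piecewise-linear form and all thresholds below are illustrative assumptions, not the paper's calibrated model:

    ```python
    # Hypothetical wettability-alteration rule: the local contact angle
    # relaxes from an oil-wet value at high salinity toward a water-wet
    # value as salinity drops. Form and thresholds are illustrative.
    def contact_angle(salinity, theta_hs=150.0, theta_ls=60.0,
                      s_low=0.2, s_high=1.0):
        """Contact angle (degrees) versus normalized salinity in [0, 1]."""
        if salinity >= s_high:
            return theta_hs                  # unaltered high-salinity wetting
        if salinity <= s_low:
            return theta_ls                  # fully altered low-salinity wetting
        frac = (salinity - s_low) / (s_high - s_low)
        return theta_ls + frac * (theta_hs - theta_ls)
    ```

    In a pore-scale solver, a rule of this kind would be evaluated per boundary cell from the locally advected-diffused salinity field, which is how the concentration model and the two-phase flow model become coupled.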

  3. Large-scale ground motion simulation using GPGPU

    NASA Astrophysics Data System (ADS)

    Aoi, S.; Maeda, T.; Nishizawa, N.; Aoki, T.

    2012-12-01

    Huge computational resources are required to perform large-scale ground motion simulations with high accuracy using the 3-D finite difference method (FDM) for realistic and complex models. Furthermore, thousands of simulations are necessary to evaluate the variability of the assessment caused by uncertainty in the assumed source models of future earthquakes. To overcome the problem of restricted computational resources, we introduced GPGPU (general-purpose computing on graphics processing units), the technique of using a GPU as an accelerator for computation traditionally carried out by the CPU. We employed the CPU version of GMS (Ground motion Simulator; Aoi et al., 2004) as the original code and implemented GPU calculation using CUDA (Compute Unified Device Architecture). GMS is a total system for seismic wave propagation simulation based on a 3-D FDM scheme using discontinuous grids (Aoi & Fujiwara, 1999), which includes the solver as well as preprocessor tools (a parameter generation tool) and postprocessor tools (a filter tool, a visualization tool, and so on). The computational model is decomposed in the two horizontal directions and each decomposed subdomain is allocated to a different GPU. We evaluated the performance of our newly developed GPU version of GMS on TSUBAME2.0, one of Japan's fastest supercomputers, operated by the Tokyo Institute of Technology. First we performed a strong-scaling test using a model with about 22 million grid points and achieved speed-ups of 3.2 and 7.3 times using 4 and 16 GPUs, respectively. Next, we examined a weak-scaling test in which the model size (number of grid points) increases in proportion to the degree of parallelism (number of GPUs).
    The result showed almost perfect linearity up to a simulation with 22 billion grid points using 1024 GPUs, where the calculation speed reached 79.7 TFlops, about 34 times faster than the CPU calculation using the same number
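    The strong-scaling figures quoted above imply the following parallel efficiencies (assuming, as a reading of the abstract, that the speed-ups are measured relative to a single GPU):

    ```python
    # Parallel efficiency = speedup / number of processors. Applied to the
    # quoted strong-scaling results (3.2x on 4 GPUs, 7.3x on 16 GPUs),
    # assuming a single-GPU baseline.
    def efficiency(speedup, n_gpus):
        return speedup / n_gpus

    eff_4 = efficiency(3.2, 4)     # 0.80: strong scaling holds up well
    eff_16 = efficiency(7.3, 16)   # ~0.46: communication costs grow
    ```

    The drop from 80% to about 46% efficiency is the usual strong-scaling pattern: as each GPU's subdomain shrinks, halo exchange between subdomains takes a growing share of the runtime, which is why the weak-scaling test (fixed work per GPU) shows near-perfect linearity instead.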

  4. Development of the Transport Class Model (TCM) Aircraft Simulation From a Sub-Scale Generic Transport Model (GTM) Simulation

    NASA Technical Reports Server (NTRS)

    Hueschen, Richard M.

    2011-01-01

    A six degree-of-freedom, flat-earth dynamics, non-linear, and non-proprietary aircraft simulation was developed that is representative of a generic mid-sized twin-jet transport aircraft. The simulation was developed from a non-proprietary, publicly available, sub-scale twin-jet transport aircraft simulation using scaling relationships and a modified aerodynamic database. The simulation has an extended aerodynamics database with aero data outside the normal transport operating envelope (large angle-of-attack and sideslip values). The simulation has representative transport aircraft surface actuator models with variable rate limits and generally fixed position limits. The simulation contains a generic 40,000-lb sea-level-thrust engine model. The engine model is a first-order dynamic model with a variable time constant that changes according to simulation conditions. The simulation provides interfaces through which a flight control system can read the simulation's sensor variables and command the surface actuators and the throttle position of the engine model.
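    The actuator and engine elements described above are standard first-order components; the sketch below is a generic illustration with invented values, not the TCM implementation:

    ```python
    # Generic rate-limited actuator and first-order engine lag, of the kind
    # the abstract describes. All numbers are illustrative placeholders.
    def actuator_step(pos, cmd, rate_limit, dt):
        """Move toward cmd, but no faster than rate_limit (deg/s)."""
        delta = max(-rate_limit * dt, min(rate_limit * dt, cmd - pos))
        return pos + delta

    def engine_step(thrust, cmd, tau, dt):
        """First-order lag: d(thrust)/dt = (cmd - thrust) / tau."""
        return thrust + dt * (cmd - thrust) / tau

    pos, thrust = 0.0, 0.0
    for _ in range(100):                 # 1 s of simulated time at dt = 0.01
        pos = actuator_step(pos, 10.0, rate_limit=5.0, dt=0.01)
        thrust = engine_step(thrust, 40000.0, tau=1.0, dt=0.01)
    ```

    After one second the surface has moved exactly 5 degrees (rate-limited at 5 deg/s), while the engine thrust has only reached about 63% of its 40,000 lb command, the expected one-time-constant response of a first-order lag. A variable time constant, as in the TCM engine model, would simply make `tau` a function of the simulation state.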

  5. NIF laboratory astrophysics simulations investigating the effects of a radiative shock on hydrodynamic instabilities

    NASA Astrophysics Data System (ADS)

    Angulo, A. A.; Kuranz, C. C.; Drake, R. P.; Huntington, C. M.; Park, H.-S.; Remington, B. A.; Kalantar, D.; MacLaren, S.; Raman, K.; Miles, A.; Trantham, Matthew; Kline, J. L.; Flippo, K.; Doss, F. W.; Shvarts, D.

    2016-10-01

    This poster will describe simulations based on results from ongoing laboratory astrophysics experiments at the National Ignition Facility (NIF) relevant to the effects of radiative shocks on hydrodynamically unstable surfaces. The experiments performed on NIF uniquely provide the conditions required to emulate the radiative shocks that occur in astrophysical systems. The core-collapse explosions of red supergiant stars are one such example, wherein the interaction between the supernova ejecta and the circumstellar medium creates a region susceptible to Rayleigh-Taylor (R-T) instabilities. Radiative and nonradiative experiments were performed to show that R-T growth should be reduced by the effects of the radiative shocks that occur during this core collapse. Simulations were performed with the radiation hydrodynamics code Hyades, using the experimental conditions, to find the mean interface acceleration of the instability; the results were then further analyzed with the buoyancy-drag model to observe how material expansion contributes to the mix-layer growth. This work is funded by the NNSA-DS and SC-OFES Joint Program in High-Energy-Density Laboratory Plasmas under Grant Number DE-FG52-09NA29548.
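The buoyancy-drag model mentioned above balances buoyant driving of the bubble front against drag; a minimal sketch, with illustrative coefficients and a generic decaying acceleration history rather than the Hyades-derived values:

```python
# Buoyancy-drag mix-layer growth: front velocity v driven by A*g(t)
# (Atwood number times interface acceleration) and limited by a drag
# term ~ v^2 / h, where h is the mix-layer width. All values are
# illustrative, not the experimental or Hyades-derived ones.
A, Cd = 0.5, 3.0           # Atwood number, drag coefficient (assumed)
h, v, dt = 1e-3, 0.0, 1e-4
for i in range(20000):      # 2 dimensionless time units, forward Euler
    g = 1.0 / (1.0 + i * dt)        # decaying, blast-wave-like drive
    dv = A * g - Cd * v * v / h
    v += dt * dv
    h += dt * v
print(f"mix-layer width ~ {h:.3f}, front velocity ~ {v:.3f}")
```

A reduced interface acceleration (as produced by a radiative shock) lowers the buoyant drive and hence the late-time mix-layer width.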

  6. Hierarchical coarse-graining strategy for protein-membrane systems to access mesoscopic scales

    PubMed Central

    Ayton, Gary S.; Lyman, Edward

    2014-01-01

    An overall multiscale simulation strategy for large-scale coarse-grained simulations of membrane protein systems is presented. The protein is modeled as a heterogeneous elastic network, while the lipids are modeled using the hybrid analytic-systematic (HAS) methodology; in both cases atomistic-level information obtained from molecular dynamics simulation is used to parameterize the model. A feature of this approach is that liposome length scales are employed in the simulation from the outset (i.e., on the order of half a million lipids plus protein). A route to develop highly coarse-grained models from molecular-scale information is proposed and results for N-BAR domain protein remodeling of a liposome are presented. PMID:20158037

  7. Multi-scale gyrokinetic simulation of Alcator C-Mod tokamak discharges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Howard, N. T., E-mail: nthoward@psfc.mit.edu; White, A. E.; Greenwald, M.

    2014-03-15

    Alcator C-Mod tokamak discharges have been studied with nonlinear gyrokinetic simulations simultaneously spanning both ion and electron spatiotemporal scales. These multi-scale simulations utilized the gyrokinetic model implemented in the GYRO code [J. Candy and R. E. Waltz, J. Comput. Phys. 186, 545 (2003)] and the approximation of reduced electron mass (μ = (m_D/m_e)^{1/2} = 20.0) to qualitatively study a pair of Alcator C-Mod discharges: a low-power discharge, previously demonstrated (using realistic-mass, ion-scale simulation) to display an under-prediction of the electron heat flux, and a high-power discharge displaying agreement in both the ion and electron heat flux channels [N. T. Howard et al., Nucl. Fusion 53, 123011 (2013)]. These multi-scale simulations demonstrate the importance of electron-scale turbulence in the core of conventional tokamak discharges and suggest it is a viable candidate for explaining the observed under-prediction of electron heat flux. In this paper, we investigate the coupling of turbulence at the ion (k_θρ_s ~ O(1.0)) and electron (k_θρ_e ~ O(1.0)) scales for experimental plasma conditions exhibiting both strong (high-power) and marginally stable (low-power) low-k (k_θρ_s < 1.0) turbulence. It is found that reduced-mass simulation of the plasma exhibiting marginally stable low-k turbulence fails to provide even qualitative insight into the turbulence present under the realistic plasma conditions. In contrast, multi-scale simulation of the plasma condition exhibiting strong turbulence provides valuable insight into the coupling of the ion and electron scales.

  8. A study of the parallel algorithm for large-scale DC simulation of nonlinear systems

    NASA Astrophysics Data System (ADS)

    Cortés Udave, Diego Ernesto; Ogrodzki, Jan; Gutiérrez de Anda, Miguel Angel

    Newton-Raphson DC analysis of large-scale nonlinear circuits may be an extremely time-consuming process even if sparse matrix techniques and bypassing of nonlinear model calculations are used. The time required for this task may be somewhat reduced on multi-core, multithreaded computers if the calculation of the mathematical models of the nonlinear elements, as well as the stamp management of the sparse matrix entries, is handled by concurrent processes. The numerical complexity can be further reduced via circuit decomposition and parallel solution of blocks, taking the BBD matrix structure as a departure point. This block-parallel approach may yield considerable gains, though it is strongly dependent on the system topology and, of course, on the processor type. This contribution presents an easily parallelizable decomposition-based algorithm for DC simulation and provides a detailed study of its effectiveness.
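The per-iteration work being parallelized is the evaluation of the nonlinear device models and their Jacobian "stamps". A minimal single-node sketch of Newton-Raphson DC analysis (circuit values are illustrative):

```python
import math

# One-node nonlinear circuit: a 5 V source through a 1 kOhm resistor
# into a diode to ground. Newton-Raphson solves KCL at the node.
Vs, R = 5.0, 1e3
Is, Vt = 1e-12, 0.02585   # diode saturation current, thermal voltage

def f(v):      # KCL residual: resistor current minus diode current
    return (Vs - v) / R - Is * (math.exp(v / Vt) - 1.0)

def dfdv(v):   # Jacobian (the linearized-diode "stamp")
    return -1.0 / R - (Is / Vt) * math.exp(v / Vt)

v = 0.6        # initial guess
for _ in range(50):
    dv = -f(v) / dfdv(v)
    v += dv
    if abs(dv) < 1e-12:
        break
print(round(v, 3))  # converges to the diode forward drop, ~0.57 V
```

In a full simulator the same model evaluation and stamping is repeated for every nonlinear element on every iteration, which is what makes concurrent evaluation and block decomposition attractive.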

  9. High Fidelity Simulations of Large-Scale Wireless Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Onunkwo, Uzoma; Benz, Zachary

    The worldwide proliferation of wirelessly connected devices continues to accelerate. There are tens of billions of wireless links across the planet, with an additional explosion of new wireless usage anticipated as the Internet of Things develops. Wireless technologies not only provide convenience for mobile applications, but are also extremely cost-effective to deploy. This trend towards wireless connectivity will therefore only continue, and Sandia must develop the necessary simulation technology to proactively analyze the associated emerging vulnerabilities. Wireless networks are marked by mobility and proximity-based connectivity. The de facto standard for exploratory studies of wireless networks is discrete event simulation (DES). However, the simulation of large-scale wireless networks is extremely difficult due to prohibitively long turnaround times. A path forward is to expedite simulations with parallel discrete event simulation (PDES) techniques. The mobility and distance-based connectivity associated with wireless simulations, however, typically doom PDES to poor scaling (e.g., in the OPNET and ns-3 simulators). We propose a PDES-based tool aimed at reducing the communication overhead between processors. The proposed solution will use light-weight processes to dynamically distribute the computational workload while mitigating the communication overhead associated with synchronization. This work is vital to the analytics and validation capabilities of simulation and emulation at Sandia. We have years of experience in Sandia's simulation and emulation projects (e.g., MINIMEGA and FIREWHEEL). Sandia's current highly regarded capabilities in large-scale emulation have focused on wired networks, where two assumptions prevent scalable wireless studies: (a) the connections between objects are mostly static, and (b) the nodes have fixed locations.
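The DES paradigm referred to above processes timestamped events in causal order from a priority queue; a minimal sketch (the topology, event kinds, and delays are illustrative, not any particular simulator's API):

```python
import heapq

def simulate(events, handlers):
    """Minimal DES loop: pop the earliest event, dispatch it, and push
    any newly scheduled events back onto the queue."""
    queue = list(events)
    heapq.heapify(queue)
    log = []
    while queue:
        t, kind, payload = heapq.heappop(queue)
        log.append((t, kind))
        for new_ev in handlers[kind](t, payload):
            heapq.heappush(queue, new_ev)
    return log

def on_send(t, node):
    # A send schedules a receive at a neighbor after a propagation delay.
    return [(t + 0.5, "recv", node + 1)]

def on_recv(t, node):
    return []  # sink: no further events

log = simulate([(0.0, "send", 1), (1.0, "send", 7)],
               {"send": on_send, "recv": on_recv})
print(log)  # events emerge in timestamp order
```

PDES partitions this single queue across processors; mobile, proximity-based connectivity forces frequent cross-partition events, which is the synchronization overhead the proposed tool targets.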

  10. Large-Scale Reactive Atomistic Simulation of Shock-induced Initiation Processes in Energetic Materials

    NASA Astrophysics Data System (ADS)

    Thompson, Aidan

    2013-06-01

    Initiation in energetic materials is fundamentally dependent on the interaction between a host of complex chemical and mechanical processes, occurring on scales ranging from intramolecular vibrations through molecular crystal plasticity up to hydrodynamic phenomena at the mesoscale. A variety of methods (e.g. quantum electronic structure methods (QM), non-reactive classical molecular dynamics (MD), mesoscopic continuum mechanics) exist to study processes occurring on each of these scales in isolation, but cannot describe how these processes interact with each other. In contrast, the ReaxFF reactive force field, implemented in the LAMMPS parallel MD code, allows us to routinely perform multimillion-atom reactive MD simulations of shock-induced initiation in a variety of energetic materials. This is done either by explicitly driving a shock-wave through the structure (NEMD) or by imposing thermodynamic constraints on the collective dynamics of the simulation cell e.g. using the Multiscale Shock Technique (MSST). These MD simulations allow us to directly observe how energy is transferred from the shockwave into other processes, including intramolecular vibrational modes, plastic deformation of the crystal, and hydrodynamic jetting at interfaces. These processes in turn cause thermal excitation of chemical bonds leading to initial chemical reactions, and ultimately to exothermic formation of product species. Results will be presented on the application of this approach to several important energetic materials, including pentaerythritol tetranitrate (PETN) and ammonium nitrate/fuel oil (ANFO). In both cases, we validate the ReaxFF parameterizations against QM and experimental data. For PETN, we observe initiation occurring via different chemical pathways, depending on the shock direction. For PETN containing spherical voids, we observe enhanced sensitivity due to jetting, void collapse, and hotspot formation, with sensitivity increasing with void size. For ANFO, we

  11. Simulation of Left Atrial Function Using a Multi-Scale Model of the Cardiovascular System

    PubMed Central

    Pironet, Antoine; Dauby, Pierre C.; Paeme, Sabine; Kosta, Sarah; Chase, J. Geoffrey; Desaive, Thomas

    2013-01-01

    During a full cardiac cycle, the left atrium successively behaves as a reservoir, a conduit and a pump. This complex behavior makes it unrealistic to apply the time-varying elastance theory to characterize the left atrium, first, because this theory has known limitations, and second, because it is still uncertain whether the load-independence hypothesis holds. In this study, we aim to bypass this uncertainty by relying on another kind of mathematical model of the cardiac chambers. In the present work, we describe both the left atrium and the left ventricle with a multi-scale model. The multi-scale property of this model comes from the fact that pressure inside a cardiac chamber is derived from a model of the sarcomere behavior. Macroscopic model parameters are identified from reference dog hemodynamic data. The multi-scale model of the cardiovascular system including the left atrium is then simulated to show that the physiological roles of the left atrium are correctly reproduced. These include a biphasic pressure wave and a figure-of-eight-shaped pressure-volume loop. We also test the validity of our model in non-basal conditions by reproducing a preload-reduction experiment by inferior vena cava occlusion with the model. We compute the variation of eight indices before and after this experiment and obtain the same variation as experimentally observed for seven of the eight indices. In summary, the multi-scale mathematical model presented in this work is able to correctly account for the three roles of the left atrium and also exhibits a realistic left atrial pressure-volume loop. Furthermore, the model has been previously presented and validated for the left ventricle. This makes it a proper alternative to the time-varying elastance theory if the focus is set on precisely representing the left atrial and left ventricular behaviors. PMID:23755183

  12. Toward simulating complex systems with quantum effects

    NASA Astrophysics Data System (ADS)

    Kenion-Hanrath, Rachel Lynn

    Quantum effects like tunneling, coherence, and zero point energy often play a significant role in phenomena on the scales of atoms and molecules. However, the exact quantum treatment of a system scales exponentially with dimensionality, making it impractical for characterizing reaction rates and mechanisms in complex systems. An ongoing effort in the field of theoretical chemistry and physics is extending scalable, classical trajectory-based simulation methods capable of capturing quantum effects to describe dynamic processes in many-body systems; in the work presented here we explore two such techniques. First, we detail an explicit-electron, path integral (PI)-based simulation protocol for predicting the rate of electron transfer in condensed-phase transition metal complex systems. Using a PI representation of the transferring electron and a classical representation of the transition metal complex and solvent atoms, we compute the outer-sphere free energy barrier and dynamical recrossing factor of the electron transfer rate while accounting for quantum tunneling and zero point energy effects. We are able to achieve this employing only a single set of force field parameters to describe the system rather than parameterizing along the reaction coordinate. Following our success in describing a simple model system, we discuss our next steps in extending our protocol to technologically relevant materials systems. The latter half focuses on the Mixed Quantum-Classical Initial Value Representation (MQC-IVR) of real-time correlation functions, a semiclassical method which has demonstrated its ability to "tune" between quantum- and classical-limit correlation functions while maintaining dynamic consistency. Specifically, this is achieved through a parameter that determines the quantumness of individual degrees of freedom. 
Here, we derive a semiclassical correction term for the MQC-IVR to systematically characterize the error introduced by different choices of simulation

  13. Laboratory evaluation and application of microwave absorption properties under simulated conditions for planetary atmospheres

    NASA Technical Reports Server (NTRS)

    Steffes, Paul G.

    1991-01-01

    Laboratory measurements of the microwave and millimeter-wave properties of the simulated atmospheres of the outer planets and their satellites have continued. One focus is the development of a radiative transfer model of the Jovian atmosphere at wavelengths from 1 mm to 10 cm. This modeling effort led to laboratory measurements of the millimeter-wave opacity of hydrogen sulfide (H2S) under simulated Jovian conditions. Descriptions of the modeling effort, the laboratory experiment, and the observations are presented. Correlative studies of Pioneer-Venus radio occultation measurements with longer-wavelength emission measurements have provided new ways of characterizing temporal and spatial variations in the abundances of the gases H2SO4 and SO2, and of modeling their roles in the subcloud atmosphere. Laboratory measurements were conducted of the 1.35 cm (and 13 cm) opacity of gaseous SO2, and of the absorptivity of gaseous SO2 at a wavelength of 3.2 mm, under simulated Venus conditions. Laboratory measurements of the millimeter-wave dielectric properties of liquid H2SO4 were completed in order to model the effect of the opacity of the clouds of Venus on the millimeter-wave emission spectrum.

  14. Simulation of diurnal thermal energy storage systems: Preliminary results

    NASA Astrophysics Data System (ADS)

    Katipamula, S.; Somasundaram, S.; Williams, H. R.

    1994-12-01

    This report describes the results of a simulation of thermal energy storage (TES) integrated with a simple-cycle gas turbine cogeneration system. Integrating TES with cogeneration can serve the electrical and thermal loads independently while firing all fuel in the gas turbine. The detailed engineering and economic feasibility of diurnal TES systems integrated with cogeneration systems has been described in two previous PNL reports. The objective of this study was to lay the groundwork for optimization of TES system designs using a simulation tool called TRNSYS (TRaNsient SYstem Simulation). TRNSYS is a transient simulation program with a sequential-modular structure developed at the Solar Energy Laboratory, University of Wisconsin-Madison. The two TES systems selected for the base-case simulations were: (1) a one-tank storage model to represent the oil/rock TES system; and (2) a two-tank storage model to represent the molten nitrate salt TES system. Results of the study clearly indicate that an engineering optimization of the TES system using TRNSYS is possible. The one-tank stratified oil/rock storage model described here is a good starting point for parametric studies of a TES system. Further developments to the TRNSYS library of available models (economizer, evaporator, gas turbine, etc.) are recommended so that phase-change processes are accurately treated.
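A one-tank stratified store can be sketched as a stack of isothermal layers with plug-flow charging from the top; a simplified illustration (layer count, temperatures, flow fraction, and the neglect of losses and inter-layer conduction are all assumptions, not the report's TRNSYS configuration):

```python
def charge_step(layers, t_in, frac):
    """Charge the tank from the top: each layer mixes a fraction `frac`
    of one layer volume arriving from the layer above (plug flow),
    with the inlet fluid entering the top layer."""
    new = []
    t_from_above = t_in
    for t in layers:
        new.append(frac * t_from_above + (1.0 - frac) * t)
        t_from_above = t      # pre-update temperature flows downward
    return new

layers = [60.0] * 5           # five layers, tank initially at 60 C
for _ in range(40):           # charge with 300 C heat-transfer oil
    layers = charge_step(layers, 300.0, 0.25)
print([round(t) for t in layers])  # hot layers on top, thermocline below
```

The stratification (a monotone top-to-bottom temperature profile) is what distinguishes this model from a fully mixed tank and is the feature parametric studies would probe.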

  15. From micro-scale 3D simulations to macro-scale model of periodic porous media

    NASA Astrophysics Data System (ADS)

    Crevacore, Eleonora; Tosco, Tiziana; Marchisio, Daniele; Sethi, Rajandrea; Messina, Francesca

    2015-04-01

    In environmental engineering, the transport of colloidal suspensions in porous media is studied to understand the fate of potentially harmful nano-particles and to design new remediation technologies. In this perspective, averaging techniques applied to micro-scale numerical simulations are a powerful tool to extrapolate accurate macro-scale models. Choosing two simplified packing configurations of soil grains and starting from a single elementary cell (module), it is possible to take advantage of the periodicity of the structures to reduce the computational cost of full 3D simulations. Steady-state flow simulations for an incompressible fluid in the laminar regime are implemented. Transport simulations are based on the pore-scale advection-diffusion equation, which can be enriched by introducing the Stokes velocity (to account for gravity) and the interception mechanism. Simulations are carried out on a domain composed of several elementary modules, which serve as control volumes in a finite volume method at the macro-scale. The periodicity of the medium implies the periodicity of the flow field, which is of great importance during the up-scaling procedure, allowing relevant simplifications. Micro-scale numerical data are processed to compute the mean concentrations (volume and area averages) and fluxes on each module. The simulation results are used to compare the micro-scale averaged equation with the integral form of the macroscopic one, distinguishing between those terms that can be computed exactly and those for which a closure is needed. Of particular interest is the investigation of the origin of macro-scale terms such as dispersion and tortuosity, seeking to describe them with known micro-scale quantities. Traditionally, many simplifications are introduced to study colloidal transport, such as ultra-simplified geometries that account for only a single collector. 
Gradual removal of such hypotheses leads to a

  16. A time domain simulation of a beam control system

    NASA Astrophysics Data System (ADS)

    Mitchell, J. R.

    1981-02-01

    The Airborne Laser Laboratory (ALL) is being developed by the Air Force to investigate the integration and operation of high energy laser components in a dynamic airborne environment and to study the propagation of laser light from an airborne vehicle to an airborne target. The ALL is composed of several systems; among these are the Airborne Pointing and Tracking System (APT) and the Automatic Alignment System (AAS). This report presents the results of a time domain dynamic simulation of an integrated beam control system composed of the APT and AAS. The simulation is performed on a digital computer using the MIMIC language. It includes models of the dynamics of the system and of disturbances. The rationales and developments of these models are also presented. The data from the simulation code are summarized in several plots, and results from analyzing the data with waveform-analysis packages are also presented. The results are discussed and conclusions are drawn.

  17. Dynamics analysis of the fast-slow hydro-turbine governing system with different time-scale coupling

    NASA Astrophysics Data System (ADS)

    Zhang, Hao; Chen, Diyi; Wu, Changzhi; Wang, Xiangyu

    2018-01-01

    Multi-time-scale modeling of the hydro-turbine governing system is crucial for precise modeling of hydropower plants and supports the stability analysis of the system. Considering the inertia and response time of the hydraulic servo system, the hydro-turbine governing system is transformed into a fast-slow hydro-turbine governing system. The effects of the time-scale on the dynamical behavior of the system are analyzed, and the fast-slow dynamical behaviors of the system are investigated for different time-scales. Furthermore, a theoretical analysis of the stable regions is presented. The influence of the time-scale on the stable region is analyzed by simulation, and the simulation results confirm the theoretical analysis. More importantly, the methods and results of this paper provide a perspective on multi-time-scale modeling of hydro-turbine governing systems and contribute to the optimization analysis and control of the system.
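The fast-slow structure can be illustrated with a generic two-time-scale system in which a small parameter eps multiplies the fast (servo-like) state's dynamics; the equations and parameters below are generic, not the actual governing-system model:

```python
def step(x, y, eps, dt):
    """Forward-Euler step of a fast-slow pair: x is the slow state,
    y the fast state relaxing toward -x on the short time scale eps."""
    dx = y - x
    dy = (-x - y) / eps
    return x + dt * dx, y + dt * dy

eps, dt = 0.01, 0.001     # dt must resolve the fast scale (dt < eps)
x, y = 1.0, 0.0
for _ in range(5000):     # 5 slow time units
    x, y = step(x, y, eps, dt)
# After the fast transient, y is slaved to the slow manifold y = -x,
# and the reduced slow dynamics dx/dt = -2x drives both states to zero.
print(abs(y + x), abs(x))
```

Stability analysis of such systems typically studies the reduced slow dynamics on this manifold, with the time-scale ratio eps controlling how the stable region deforms.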

  18. The Protein Information Management System (PiMS): a generic tool for any structural biology research laboratory

    PubMed Central

    Morris, Chris; Pajon, Anne; Griffiths, Susanne L.; Daniel, Ed; Savitsky, Marc; Lin, Bill; Diprose, Jonathan M.; Wilter da Silva, Alan; Pilicheva, Katya; Troshin, Peter; van Niekerk, Johannes; Isaacs, Neil; Naismith, James; Nave, Colin; Blake, Richard; Wilson, Keith S.; Stuart, David I.; Henrick, Kim; Esnouf, Robert M.

    2011-01-01

    The techniques used in protein production and structural biology have been developing rapidly, but techniques for recording the laboratory information produced have not kept pace. One approach is the development of laboratory information-management systems (LIMS), which typically use a relational database schema to model and store results from a laboratory workflow. The underlying philosophy and implementation of the Protein Information Management System (PiMS), a LIMS development specifically targeted at the flexible and unpredictable workflows of protein-production research laboratories of all scales, is described. PiMS is a web-based Java application that uses either Postgres or Oracle as the underlying relational database-management system. PiMS is available under a free licence to all academic laboratories either for local installation or for use as a managed service. PMID:21460443

  19. The Protein Information Management System (PiMS): a generic tool for any structural biology research laboratory.

    PubMed

    Morris, Chris; Pajon, Anne; Griffiths, Susanne L; Daniel, Ed; Savitsky, Marc; Lin, Bill; Diprose, Jonathan M; da Silva, Alan Wilter; Pilicheva, Katya; Troshin, Peter; van Niekerk, Johannes; Isaacs, Neil; Naismith, James; Nave, Colin; Blake, Richard; Wilson, Keith S; Stuart, David I; Henrick, Kim; Esnouf, Robert M

    2011-04-01

    The techniques used in protein production and structural biology have been developing rapidly, but techniques for recording the laboratory information produced have not kept pace. One approach is the development of laboratory information-management systems (LIMS), which typically use a relational database schema to model and store results from a laboratory workflow. The underlying philosophy and implementation of the Protein Information Management System (PiMS), a LIMS development specifically targeted at the flexible and unpredictable workflows of protein-production research laboratories of all scales, is described. PiMS is a web-based Java application that uses either Postgres or Oracle as the underlying relational database-management system. PiMS is available under a free licence to all academic laboratories either for local installation or for use as a managed service.

  20. Direct geoelectrical evidence of mass transfer at the laboratory scale

    NASA Astrophysics Data System (ADS)

    Swanson, Ryan D.; Singha, Kamini; Day-Lewis, Frederick D.; Binley, Andrew; Keating, Kristina; Haggerty, Roy

    2012-10-01

    Previous field-scale experimental data and numerical modeling suggest that the dual-domain mass transfer (DDMT) of electrolytic tracers has an observable geoelectrical signature. Here we present controlled laboratory experiments confirming the electrical signature of DDMT and demonstrate the use of time-lapse electrical measurements in conjunction with concentration measurements to estimate the parameters controlling DDMT, i.e., the mobile and immobile porosity and rate at which solute exchanges between mobile and immobile domains. We conducted column tracer tests on unconsolidated quartz sand and a material with a high secondary porosity: the zeolite clinoptilolite. During NaCl tracer tests we collected nearly colocated bulk direct-current electrical conductivity (σb) and fluid conductivity (σf) measurements. Our results for the zeolite show (1) extensive tailing and (2) a hysteretic relation between σf and σb, thus providing evidence of mass transfer not observed within the quartz sand. To identify best-fit parameters and evaluate parameter sensitivity, we performed over 2700 simulations of σf, varying the immobile and mobile domain and mass transfer rate. We emphasized the fit to late-time tailing by minimizing the Box-Cox power transformed root-mean square error between the observed and simulated σf. Low-field proton nuclear magnetic resonance (NMR) measurements provide an independent quantification of the volumes of the mobile and immobile domains. The best-fit parameters based on σf match the NMR measurements of the immobile and mobile domain porosities and provide the first direct electrical evidence for DDMT. Our results underscore the potential of using electrical measurements for DDMT parameter inference.
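The DDMT behavior described above can be sketched with a two-compartment, first-order exchange model in a well-mixed cell: mobile concentration Cm is flushed by flow while immobile concentration Cim exchanges with it at rate alpha. Parameter values are illustrative, not the fitted zeolite values:

```python
def ddmt_step(cm, cim, c_in, q, theta_m, theta_im, alpha, dt):
    """Forward-Euler step of the dual-domain mass balance."""
    dcm = (q * (c_in - cm) - alpha * (cm - cim)) / theta_m
    dcim = alpha * (cm - cim) / theta_im
    return cm + dt * dcm, cim + dt * dcim

theta_m, theta_im, alpha, q = 0.30, 0.10, 0.02, 0.5  # assumed values
cm = cim = 0.0
breakthrough = []
for i in range(4000):                 # tracer in for half the run, then flush
    c_in = 1.0 if i < 2000 else 0.0
    cm, cim = ddmt_step(cm, cim, c_in, q, theta_m, theta_im, alpha, dt=0.01)
    breakthrough.append(cm)
# Late-time tailing: Cm stays above zero long after the tracer input
# stops, because the immobile domain releases its stored mass slowly.
print(round(breakthrough[-1], 4))
```

The hysteresis between fluid and bulk conductivity arises because bulk conductivity senses both domains while the fluid sample senses only the mobile one, so the two signals separate during loading and flushing.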

  1. Direct geoelectrical evidence of mass transfer at the laboratory scale

    USGS Publications Warehouse

    Swanson, Ryan D.; Singha, Kamini; Day-Lewis, Frederick D.; Binley, Andrew; Keating, Kristina; Haggerty, Roy

    2012-01-01

    Previous field-scale experimental data and numerical modeling suggest that the dual-domain mass transfer (DDMT) of electrolytic tracers has an observable geoelectrical signature. Here we present controlled laboratory experiments confirming the electrical signature of DDMT and demonstrate the use of time-lapse electrical measurements in conjunction with concentration measurements to estimate the parameters controlling DDMT, i.e., the mobile and immobile porosity and rate at which solute exchanges between mobile and immobile domains. We conducted column tracer tests on unconsolidated quartz sand and a material with a high secondary porosity: the zeolite clinoptilolite. During NaCl tracer tests we collected nearly colocated bulk direct-current electrical conductivity (σb) and fluid conductivity (σf) measurements. Our results for the zeolite show (1) extensive tailing and (2) a hysteretic relation between σf and σb, thus providing evidence of mass transfer not observed within the quartz sand. To identify best-fit parameters and evaluate parameter sensitivity, we performed over 2700 simulations of σf, varying the immobile and mobile domain and mass transfer rate. We emphasized the fit to late-time tailing by minimizing the Box-Cox power transformed root-mean square error between the observed and simulated σf. Low-field proton nuclear magnetic resonance (NMR) measurements provide an independent quantification of the volumes of the mobile and immobile domains. The best-fit parameters based on σf match the NMR measurements of the immobile and mobile domain porosities and provide the first direct electrical evidence for DDMT. Our results underscore the potential of using electrical measurements for DDMT parameter inference.

  2. Practical recipes for the model order reduction, dynamical simulation and compressive sampling of large-scale open quantum systems

    NASA Astrophysics Data System (ADS)

    Sidles, John A.; Garbini, Joseph L.; Harrell, Lee E.; Hero, Alfred O.; Jacky, Jonathan P.; Malcomb, Joseph R.; Norman, Anthony G.; Williamson, Austin M.

    2009-06-01

    Practical recipes are presented for simulating high-temperature and nonequilibrium quantum spin systems that are continuously measured and controlled. The notion of a spin system is broadly conceived, in order to encompass macroscopic test masses as the limiting case of large-j spins. The simulation technique has three stages: first the deliberate introduction of noise into the simulation, then the conversion of that noise into an equivalent continuous measurement and control process, and finally, projection of the trajectory onto state-space manifolds having reduced dimensionality and possessing a Kähler potential of multilinear algebraic form. These state-spaces can be regarded as ruled algebraic varieties upon which a projective quantum model order reduction (MOR) is performed. The Riemannian sectional curvature of ruled Kählerian varieties is analyzed, and proved to be non-positive upon all sections that contain a rule. These manifolds are shown to contain Slater determinants as a special case and their identity with Grassmannian varieties is demonstrated. The resulting simulation formalism is used to construct a positive P-representation for the thermal density matrix. Single-spin detection by magnetic resonance force microscopy (MRFM) is simulated, and the data statistics are shown to be those of a random telegraph signal with additive white noise. Larger-scale spin-dust models are simulated, having no spatial symmetry and no spatial ordering; the high-fidelity projection of numerically computed quantum trajectories onto low-dimensionality Kähler state-space manifolds is demonstrated. The reconstruction of quantum trajectories from sparse random projections is demonstrated, the onset of Donoho-Stodden breakdown at the Candès-Tao sparsity limit is observed, a deterministic construction for sampling matrices is given, and methods for quantum state optimization by Dantzig selection are described.

  3. Science Laboratory Depth of Learning: Interactive Multimedia Simulation and Virtual Dissection Software

    ERIC Educational Resources Information Center

    Yuza, Steve C.

    2010-01-01

    The purpose of this study was to determine the effects of interactive multimedia simulations and virtual dissection software on depth of learning among students participating in biology and chemistry laboratory courses. By understanding more about how simulation and virtual dissection software changes depth of learning, educators will have the…

  4. Measuring ignitability for in situ burning of oil spills weathered under Arctic conditions: from laboratory studies to large-scale field experiments.

    PubMed

    Fritt-Rasmussen, Janne; Brandvik, Per Johan

    2011-08-01

    This paper compares the ignitability of Troll B crude oil weathered under simulated Arctic conditions (0%, 50% and 90% ice cover). The experiments were performed in different scales at SINTEF's laboratories in Trondheim, field research station on Svalbard and in broken ice (70-90% ice cover) in the Barents Sea. Samples from the weathering experiments were tested for ignitability using the same laboratory burning cell. The measured ignitability from the experiments in these different scales showed a good agreement for samples with similar weathering. The ice conditions clearly affected the weathering process, and 70% ice or more reduces the weathering and allows a longer time window for in situ burning. The results from the Barents Sea revealed that weathering and ignitability can vary within an oil slick. This field use of the burning cell demonstrated that it can be used as an operational tool to monitor the ignitability of oil spills.

  5. A Novel Simulation Technician Laboratory Design: Results of a Survey-Based Study

    PubMed Central

    Hughes, Patrick G; Friedl, Ed; Ortiz Figueroa, Fabiana; Cepeda Brito, Jose R; Frey, Jennifer; Birmingham, Lauren E; Atkinson, Steven Scott

    2016-01-01

    Objective: The purpose of this study was to elicit feedback from simulation technicians prior to developing the first simulation technician-specific simulation laboratory in Akron, OH. Background: Simulation technicians serve a vital role in simulation centers within hospitals/health centers around the world. The first simulation technician degree program in the US has been approved in Akron, OH. To satisfy the requirements of this program and to meet the needs of this special audience of learners, a customized simulation lab is essential. Method: A web-based survey was circulated to simulation technicians prior to completion of the lab for the new program. The survey consisted of questions aimed at identifying structural and functional design elements of a novel simulation center for the training of simulation technicians. Quantitative methods were utilized to analyze data. Results: Over 90% of technicians (n=65) think that a lab designed explicitly for the training of technicians is novel and beneficial. Approximately 75% of respondents think that the space provided appropriate audiovisual (AV) infrastructure and space to evaluate the ability of technicians to be independent. The respondents think that the lab needed more storage space, visualization space for a large number of students, and more space in the technical/repair area. Conclusions: A space designed for the training of simulation technicians was considered to be beneficial. This laboratory requires distinct space for technical repair, adequate bench space for the maintenance and repair of simulators, an appropriate AV infrastructure, and space to evaluate the ability of technicians to be independent. PMID:27096134

  6. A Novel Simulation Technician Laboratory Design: Results of a Survey-Based Study.

    PubMed

    Ahmed, Rami; Hughes, Patrick G; Friedl, Ed; Ortiz Figueroa, Fabiana; Cepeda Brito, Jose R; Frey, Jennifer; Birmingham, Lauren E; Atkinson, Steven Scott

    2016-03-16

    OBJECTIVE: The purpose of this study was to elicit feedback from simulation technicians prior to developing the first simulation technician-specific simulation laboratory in Akron, OH. Simulation technicians serve a vital role in simulation centers within hospitals/health centers around the world. The first simulation technician degree program in the US has been approved in Akron, OH. To satisfy the requirements of this program and to meet the needs of this special audience of learners, a customized simulation lab is essential. A web-based survey was circulated to simulation technicians prior to completion of the lab for the new program. The survey consisted of questions aimed at identifying structural and functional design elements of a novel simulation center for the training of simulation technicians. Quantitative methods were utilized to analyze data. Over 90% of technicians (n=65) think that a lab designed explicitly for the training of technicians is novel and beneficial. Approximately 75% of respondents think that the space provided appropriate audiovisual (AV) infrastructure and space to evaluate the ability of technicians to be independent. The respondents think that the lab needed more storage space, visualization space for a large number of students, and more space in the technical/repair area. CONCLUSIONS: A space designed for the training of simulation technicians was considered to be beneficial. This laboratory requires distinct space for technical repair, adequate bench space for the maintenance and repair of simulators, an appropriate AV infrastructure, and space to evaluate the ability of technicians to be independent.

  7. Progress in fast, accurate multi-scale climate simulations

    DOE PAGES

    Collins, W. D.; Johansen, H.; Evans, K. J.; ...

    2015-06-01

    We present a survey of physical and computational techniques that have the potential to contribute to the next generation of high-fidelity, multi-scale climate simulations. Examples of the climate science problems that can be investigated with more depth with these computational improvements include the capture of remote forcings of localized hydrological extreme events, an accurate representation of cloud features over a range of spatial and temporal scales, and parallel, large ensembles of simulations to more effectively explore model sensitivities and uncertainties. Numerical techniques, such as adaptive mesh refinement, implicit time integration, and separate treatment of fast physical time scales are enabling improved accuracy and fidelity in simulation of dynamics and allowing more complete representations of climate features at the global scale. At the same time, partnerships with computer science teams have focused on taking advantage of evolving computer architectures such as many-core processors and GPUs. As a result, approaches which were previously considered prohibitively costly have become both more efficient and scalable. In combination, progress in these three critical areas is poised to transform climate modeling in the coming decades.
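
    The payoff of implicit time integration for fast physical time scales can be seen on a textbook stiff problem (a generic illustration, not one of the climate codes surveyed): with a step size far larger than the fast scale, explicit Euler is unstable while backward Euler tracks the slow solution.

```python
import math

# Stiff relaxation toward a slowly varying state: dy/dt = -lam*(y - cos t).
# The fast time scale 1/lam is much shorter than the chosen step dt.
lam, dt, steps = 1000.0, 0.01, 200

t, y_exp, y_imp = 0.0, 0.0, 0.0
for _ in range(steps):
    # Explicit Euler: amplification factor |1 - lam*dt| = 9 -> blow-up
    y_exp = y_exp + dt * (-lam * (y_exp - math.cos(t)))
    # Implicit (backward) Euler: solve for the new value in closed form
    t += dt
    y_imp = (y_imp + dt * lam * math.cos(t)) / (1.0 + dt * lam)

print(abs(y_imp - math.cos(t)))   # small: tracks the slow solution
print(abs(y_exp))                 # astronomically large: unstable
```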

  8. Simulator training to minimize ionizing radiation exposure in the catheterization laboratory.

    PubMed

    Katz, Aric; Shtub, Avraham; Solomonica, Amir; Poliakov, Adva; Roguin, Ariel

    2017-03-01

    To educate catheterization laboratory personnel about ionizing radiation and how to reduce exposure to it. Patients and operators are routinely exposed to high doses of ionizing radiation during catheterization procedures. This increased exposure to ionizing radiation is partially due to a lack of awareness of the effects of ionizing radiation and a lack of knowledge of the distribution and behavior of scattered radiation. A simulator, which incorporates data on scattered ionizing radiation, was built based on multiple phantom measurements and used for teaching radiation safety. The validity of the simulator was confirmed in three catheterization laboratories and tested by 20 interventional cardiologists. All evaluators were tested by an objective knowledge examination before, immediately following, and 12 weeks after simulator-based learning and training. A subjective Likert questionnaire on satisfaction with simulation-based learning and training was also completed. The 20 evaluators learned and retained the knowledge that they gained from using the simulator: the average score on the knowledge examination before simulator training was 54 ± 15% (mean ± standard deviation), and this score significantly increased after training to 94 ± 10% (p < 0.001). The evaluators also reported high levels of satisfaction following simulation-based learning and training according to the results of the subjective Likert questionnaire. Simulators can be used to train cardiology staff and fellows and to further educate experienced personnel on radiation safety. As a result of simulator training, the operator gains knowledge, which can then be applied in the catheterization laboratory in order to reduce radiation doses to the patient and to the operator, thereby improving the safety of the intervention.

  9. Microphysics in the Multi-Scale Modeling Systems with Unified Physics

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo; Chern, J.; Lang, S.; Matsui, T.; Shen, B.; Zeng, X.; Shi, R.

    2011-01-01

    In recent years, exponentially increasing computer power has extended Cloud Resolving Model (CRM) integrations from hours to months, and the number of computational grid points from less than a thousand to close to ten million. Three-dimensional models are now more prevalent. Much attention is devoted to precipitating cloud systems where the crucial 1-km scales are resolved in horizontal domains as large as 10,000 km in two dimensions, and 1,000 x 1,000 km2 in three dimensions. Cloud resolving models now provide statistical information useful for developing more realistic physically based parameterizations for climate models and numerical weather prediction models. It is also expected that NWP and mesoscale models can be run at grid sizes similar to cloud resolving models through nesting techniques. Recently, a multi-scale modeling system with unified physics was developed at NASA Goddard. It consists of (1) a cloud-resolving model (Goddard Cumulus Ensemble model, GCE model), (2) a regional scale model (a NASA unified Weather Research and Forecast model, WRF), (3) a coupled CRM and global model (Goddard Multi-scale Modeling Framework, MMF), and (4) a land modeling system. The same microphysical processes, long- and short-wave radiative transfer, land processes, and the explicit cloud-radiation and cloud-surface interactive processes are applied in this multi-scale modeling system. This modeling system has been coupled with a multi-satellite simulator to use NASA high-resolution satellite data to identify the strengths and weaknesses of cloud and precipitation processes simulated by the model. In this talk, the microphysics developments of the multi-scale modeling system will be presented. In particular, the results from using the multi-scale modeling system to study heavy precipitation processes will be presented.

  10. Computationally Efficient Modeling and Simulation of Large Scale Systems

    NASA Technical Reports Server (NTRS)

    Jain, Jitesh (Inventor); Koh, Cheng-Kok (Inventor); Balakrishnan, Vankataramanan (Inventor); Cauley, Stephen F (Inventor); Li, Hong (Inventor)

    2014-01-01

    A system for simulating operation of a VLSI interconnect structure having capacitive and inductive coupling between nodes thereof, including a processor, and a memory, the processor configured to perform obtaining a matrix X and a matrix Y containing different combinations of passive circuit element values for the interconnect structure, the element values for each matrix including inductance L and inverse capacitance P, obtaining an adjacency matrix A associated with the interconnect structure, storing the matrices X, Y, and A in the memory, and performing numerical integration to solve first and second equations.
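
    As a generic illustration of this kind of interconnect simulation (not the patented formulation itself), a uniform RC line driven by a step input can be integrated with the trapezoidal rule, the workhorse scheme in circuit simulators; the segment values and node count here are assumptions for the sketch.

```python
import numpy as np

# Uniform RC ladder driven by a unit step at node 0; nodal equations
# C dv/dt = -G v + b*u, integrated with the trapezoidal rule.
n, R, C = 10, 1.0, 1.0        # nodes, per-segment R and C (illustrative)

G = np.zeros((n, n))          # tridiagonal conductance matrix
for i in range(n):
    G[i, i] = 2.0 / R
    if i > 0:
        G[i, i - 1] = G[i - 1, i] = -1.0 / R
G[-1, -1] = 1.0 / R           # open far end: only one neighboring segment
b = np.zeros(n)
b[0] = 1.0 / R                # step source enters through the first resistor

Cmat = C * np.eye(n)
dt, u = 0.05, 1.0
lhs = Cmat / dt + G / 2.0     # trapezoidal rule:
rhs = Cmat / dt - G / 2.0     #   lhs @ v_next = rhs @ v + b*u
v = np.zeros(n)
for _ in range(10000):
    v = np.linalg.solve(lhs, rhs @ v + b * u)

print(v[-1])                  # far-end node charges to the input voltage
```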

  11. Enhancing nursing informatics competencies and critical thinking skills using wireless clinical simulation laboratories.

    PubMed

    Cholewka, Patricia A; Mohr, Bernard

    2009-01-01

    Nursing students at New York City College of Technology are assigned client care experiences that focus on common alterations in health status. However, due to the unpredictability of client census within any healthcare facility, it is not possible for all students to have the same opportunity to care for clients with specific medical conditions. But with the use of patient simulators in a dedicated Clinical Simulation Laboratory setting, students can be universally, consistently, and repeatedly exposed to programmed scenarios that connect theory with the clinical environment. Outcomes from using patient simulators include improved nursing knowledge base, enhanced critical thinking, reflective learning, and increased understanding of information technology for using a Personal Digital Assistant and documenting care by means of an electronic Patient Record System. An innovative nursing education model using a wireless, inter-connective data network was developed by this college in response to the need for increasing nursing informatics competencies and critical thinking skills by students in preparation for client care.

  12. The scaling of relativistic double-layer widths - Poisson-Vlasov solutions and particle-in-cell simulations

    NASA Technical Reports Server (NTRS)

    Sulkanen, Martin E.; Borovsky, Joseph E.

    1992-01-01

    Relativistic plasma double layers are studied through the solution of the one-dimensional, unmagnetized, steady-state Poisson-Vlasov equations and by means of one-dimensional, unmagnetized, particle-in-cell simulations. The thickness vs. potential-drop scaling law is extended to relativistic potential drops and relativistic plasma temperatures. The transition in the scaling law for 'strong' double layers suggested by the analytical two-beam models of Carlqvist (1982) is confirmed, and causality problems of standard double-layer simulation techniques applied to relativistic plasma systems are discussed.

  13. A Process for Comparing Dynamics of Distributed Space Systems Simulations

    NASA Technical Reports Server (NTRS)

    Cures, Edwin Z.; Jackson, Albert A.; Morris, Jeffery C.

    2009-01-01

    The paper describes a process that was developed for comparing the primary orbital dynamics behavior between space systems distributed simulations. This process is used to characterize and understand the fundamental fidelities and compatibilities of the modeling of orbital dynamics between spacecraft simulations. This is required for high-latency distributed simulations such as NASA's Integrated Mission Simulation and must be understood when reporting results from simulation executions. This paper presents 10 principal comparison tests along with their rationale and examples of the results. The Integrated Mission Simulation (IMSim) (formerly known as the Distributed Space Exploration Simulation (DSES)) is a NASA research and development project focusing on the technologies and processes that are related to the collaborative simulation of complex space systems involved in the exploration of our solar system. Currently, the NASA centers that are actively participating in the IMSim project are the Ames Research Center, the Jet Propulsion Laboratory (JPL), the Johnson Space Center (JSC), the Kennedy Space Center, the Langley Research Center and the Marshall Space Flight Center. In concept, each center participating in IMSim has its own set of simulation models and environment(s). These simulation tools are used to build the various simulation products that are used for scientific investigation, engineering analysis, system design, training, planning, operations and more. Working individually, these production simulations provide important data to various NASA projects.

  14. A Multi-scale Modeling System with Unified Physics to Study Precipitation Processes

    NASA Astrophysics Data System (ADS)

    Tao, W. K.

    2017-12-01

    In recent years, exponentially increasing computer power has extended Cloud Resolving Model (CRM) integrations from hours to months, and the number of computational grid points from less than a thousand to close to ten million. Three-dimensional models are now more prevalent. Much attention is devoted to precipitating cloud systems where the crucial 1-km scales are resolved in horizontal domains as large as 10,000 km in two dimensions, and 1,000 x 1,000 km2 in three dimensions. Cloud resolving models now provide statistical information useful for developing more realistic physically based parameterizations for climate models and numerical weather prediction models. It is also expected that NWP and mesoscale models can be run at grid sizes similar to cloud resolving models through nesting techniques. Recently, a multi-scale modeling system with unified physics was developed at NASA Goddard. It consists of (1) a cloud-resolving model (Goddard Cumulus Ensemble model, GCE model), (2) a regional scale model (a NASA unified Weather Research and Forecast model, WRF), and (3) a coupled CRM and global model (Goddard Multi-scale Modeling Framework, MMF). The same microphysical processes, long- and short-wave radiative transfer, land processes, and the explicit cloud-radiation and cloud-land surface interactive processes are applied in this multi-scale modeling system. This modeling system has been coupled with a multi-satellite simulator to use NASA high-resolution satellite data to identify the strengths and weaknesses of cloud and precipitation processes simulated by the model. In this talk, a review of developments and applications of the multi-scale modeling system will be presented. In particular, the results from using the multi-scale modeling system to study precipitation processes and their sensitivity to model resolution and microphysics schemes will be presented. The use of the multi-satellite simulator to improve the representation of precipitation processes will also be discussed.

  15. Computer and laboratory simulation in the teaching of neonatal nursing: innovation and impact on learning 1

    PubMed Central

    Fonseca, Luciana Mara Monti; Aredes, Natália Del' Angelo; Fernandes, Ananda Maria; Batalha, Luís Manuel da Cunha; Apóstolo, Jorge Manuel Amado; Martins, José Carlos Amado; Rodrigues, Manuel Alves

    2016-01-01

    ABSTRACT Objectives: to evaluate the cognitive learning of nursing students in neonatal clinical evaluation from a blended course with the use of computer and laboratory simulation; to compare the cognitive learning of students in a control and experimental group testing the laboratory simulation; and to assess the extracurricular blended course offered on the clinical assessment of preterm infants, according to the students. Method: a quasi-experimental study with 14 Portuguese students, containing pretest, midterm test and post-test. The technologies offered in the course were the serious game e-Baby, instructional software of semiology and semiotechnique, and laboratory simulation. Data collection tools developed for this study were used for the course evaluation and characterization of the students. Nonparametric statistics were used: Mann-Whitney and Wilcoxon. Results: the use of validated digital technologies and laboratory simulation demonstrated a statistically significant difference (p = 0.001) in the learning of the participants. The students evaluated the course as very satisfactory. The laboratory simulation alone did not produce a significant difference in learning. Conclusions: the cognitive learning of participants increased significantly. The use of technology can be partly responsible for the course's success, showing it to be an important teaching tool for innovation and motivation of learning in healthcare. PMID:27737376

  16. [The future of clinical laboratory database management system].

    PubMed

    Kambe, M; Imidy, D; Matsubara, A; Sugimoto, Y

    1999-09-01

    To assess the present status of the clinical laboratory database management system, the difference between the Clinical Laboratory Information System and Clinical Laboratory System was explained in this study. Although three kinds of database management systems (DBMS) were shown including the relational model, tree model and network model, the relational model was found to be the best DBMS for the clinical laboratory database based on our experience and developments of some clinical laboratory expert systems. As a future clinical laboratory database management system, the IC card system connected to an automatic chemical analyzer was proposed for personal health data management and a microscope/video system was proposed for dynamic data management of leukocytes or bacteria.

  17. Propulsion System Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Tai, Jimmy C. M.; McClure, Erin K.; Mavris, Dimitri N.; Burg, Cecile

    2002-01-01

    The Aerospace Systems Design Laboratory at the School of Aerospace Engineering in Georgia Institute of Technology has developed a core competency that enables propulsion technology managers to make technology investment decisions substantiated by propulsion and airframe technology system studies. This method assists the designer/manager in selecting appropriate technology concepts while accounting for the presence of risk and uncertainty as well as interactions between disciplines. This capability is incorporated into a single design simulation system that is described in this paper. This propulsion system design environment is created with commercially available software called iSIGHT, which is a generic computational framework, and with analysis programs for engine cycle, engine flowpath, mission, and economic analyses. iSIGHT is used to integrate these analysis tools within a single computer platform and facilitate information transfer amongst the various codes. The resulting modeling and simulation (M&S) environment in conjunction with the response surface method provides the designer/decision-maker an analytical means to examine the entire design space from either a subsystem and/or system perspective. The results of this paper will enable managers to analytically play what-if games to gain insight into the benefits (and/or degradation) of changing engine cycle design parameters. Furthermore, the propulsion design space will be explored probabilistically to show the feasibility and viability of the propulsion system integrated with a vehicle.

  18. Scale-Up of GRCop: From Laboratory to Rocket Engines

    NASA Technical Reports Server (NTRS)

    Ellis, David L.

    2016-01-01

    GRCop is a high temperature, high thermal conductivity copper-based series of alloys designed primarily for use in regeneratively cooled rocket engine liners. It began with laboratory-level production of a few grams of ribbon produced by chill block melt spinning and has grown to commercial-scale production of large-scale rocket engine liners. Along the way, a variety of methods of consolidating and working the alloy were examined, a database of properties was developed and a variety of commercial and government applications were considered. This talk will briefly address the basic material properties used for selection of compositions to scale up, the methods used to go from simple ribbon to rocket engines, the need to develop a suitable database, and the issues related to getting the alloy into a rocket engine or other application.

  19. Effect of Aperture Field Variability, Flow Rate, and Ionic Strength on Colloid Transport in Single Fractures: Laboratory-Scale Experiments and Numerical Simulation

    NASA Astrophysics Data System (ADS)

    Zheng, Q.; Dickson, S.; Guo, Y.

    2007-12-01

    A good understanding of the physico-chemical processes (i.e., advection, dispersion, attachment/detachment, straining, sedimentation etc.) governing colloid transport in fractured media is imperative in order to develop appropriate bioremediation and/or bioaugmentation strategies for contaminated fractured aquifers, form management plans for groundwater resources to prevent pathogen contamination, and identify suitable radioactive waste disposal sites. However, research in this field is still in its infancy due to the complex heterogeneous nature of fractured media and the resulting difficulty in characterizing this media. The goal of this research is to investigate the effects of aperture field variability, flow rate and ionic strength on colloid transport processes in well characterized single fractures. A combination of laboratory-scale experiments, numerical simulations, and imaging techniques were employed to achieve this goal. Transparent replicas were cast from natural rock fractures, and a light transmission technique was employed to measure their aperture fields directly. The surface properties of the synthetic fractures were characterized by measuring the zeta-potential under different ionic strengths. A 3³ (three factors at three levels) factorial experiment was implemented to investigate the influence of aperture field variability, flow rate, and ionic strength on different colloid transport processes in the laboratory-scale fractures, specifically dispersion and attachment/detachment. A fluorescent stain technique was employed to photograph the colloid transport processes, and an analytical solution to the one-dimensional transport equation was fit to the colloid breakthrough curves to calculate the average transport velocity, dispersion coefficient, and attachment/detachment coefficient.
The Reynolds equation was solved to obtain the flow field in the measured aperture fields, and the random walk particle tracking technique was employed to model the
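
    The analytical solution typically fitted to such breakthrough curves is the classical Ogata-Banks solution of the 1-D advection-dispersion equation for a continuous source. A sketch with hypothetical parameter values (not the study's measured velocities or dispersion coefficients):

```python
import numpy as np
from scipy.special import erfc

def breakthrough(x, t, v, D, C0=1.0):
    """Ogata-Banks: relative concentration C/C0 at distance x, time t."""
    a = (x - v * t) / (2.0 * np.sqrt(D * t))
    b = (x + v * t) / (2.0 * np.sqrt(D * t))
    return 0.5 * C0 * (erfc(a) + np.exp(v * x / D) * erfc(b))

x, v, D = 1.0, 0.5, 0.1            # m, m/s, m^2/s (hypothetical values)
t = np.array([0.5, 2.0, 100.0])    # early, mid, late sampling times
curve = breakthrough(x, t, v, D)
print(curve)                       # rises monotonically toward C0 = 1
```

    Fitting v and D of this expression to a measured curve (e.g. by nonlinear least squares) yields the average transport velocity and dispersion coefficient referred to in the abstract.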

  20. Multiple-access phased array antenna simulator for a digital beam-forming system investigation

    NASA Technical Reports Server (NTRS)

    Kerczewski, Robert J.; Yu, John; Walton, Joanne C.; Perl, Thomas D.; Andro, Monty; Alexovich, Robert E.

    1992-01-01

    Future versions of data relay satellite systems are currently being planned by NASA. Being given consideration for implementation are on-board digital beamforming techniques which will allow multiple users to simultaneously access a single S-band phased array antenna system. To investigate the potential performance of such a system, a laboratory simulator has been developed at NASA's Lewis Research Center. This paper describes the system simulator, and in particular, the requirements, design and performance of a key subsystem, the phased array antenna simulator, which provides realistic inputs to the digital processor including multiple signals, noise, and nonlinearities.

  2. Length scale effects of friction in particle compaction using atomistic simulations and a friction scaling model

    NASA Astrophysics Data System (ADS)

    Stone, T. W.; Horstemeyer, M. F.

    2012-09-01

    The objective of this study is to illustrate and quantify the length scale effects related to interparticle friction under compaction. Previous studies have shown that as the length scale of a specimen decreases, the strength of a single crystal metal or ceramic increases. The question underlying this research effort continues that line of thought: if there is a length scale parameter related to the strength of a material, is there a length scale parameter related to friction? To explore the length scale effects of friction, molecular dynamics (MD) simulations using an embedded atom method potential were performed to analyze the compression of two spherical FCC nickel nanoparticles at different contact angles. In the MD model study, we applied a macroscopic plastic contact formulation to determine the normal plastic contact force at the particle interfaces and used the average shear stress from the MD simulations to determine the tangential contact forces. Combining this information with the Coulomb friction law, we quantified the MD interparticle coefficient of friction and showed good agreement with experimental studies and a Discrete Element Method prediction as a function of contact angle. Lastly, we compared our MD simulation friction values to the tribological predictions of Bhushan and Nosonovsky (BN), who developed a friction scaling model based on strain gradient plasticity and dislocation-assisted sliding that included a length scale parameter. The comparison revealed that the BN elastic friction scaling model did a much better job than the BN plastic scaling model of predicting the coefficient of friction values obtained from the MD simulations.
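
    The Coulomb-law extraction step described above amounts to decomposing the contact force at the particle interface into normal and tangential parts and taking their ratio. A sketch with hypothetical force values (not the paper's MD data):

```python
import numpy as np

def coulomb_mu(force, normal):
    """Coefficient of friction mu = |F_t| / |F_n| from a contact force
    vector and the contact normal direction."""
    n = np.asarray(normal, dtype=float)
    n = n / np.linalg.norm(n)        # unit contact normal
    f = np.asarray(force, dtype=float)
    f_n = np.dot(f, n)               # signed normal component
    f_t = f - f_n * n                # tangential (in-plane) component
    return np.linalg.norm(f_t) / abs(f_n)

# Example: 30-degree contact with normal force 100 and tangential force 25
theta = np.deg2rad(30.0)
n = np.array([np.sin(theta), np.cos(theta), 0.0])   # contact normal
t = np.array([np.cos(theta), -np.sin(theta), 0.0])  # in-plane tangent
F = 100.0 * n + 25.0 * t
print(coulomb_mu(F, n))   # 0.25
```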

  3. Ground Contact Model for Mars Science Laboratory Mission Simulations

    NASA Technical Reports Server (NTRS)

    Raiszadeh, Behzad; Way, David

    2012-01-01

    The Program to Optimize Simulated Trajectories II (POST 2) has been successful in simulating the flight of launch vehicles and entry bodies on Earth and other planets. POST 2 has been the primary simulation tool for the Entry, Descent, and Landing (EDL) phase of numerous Mars lander missions such as Mars Pathfinder in 1997, the twin Mars Exploration Rovers (MER-A and MER-B) in 2004, and the Mars Phoenix lander in 2007, and it is now the main trajectory simulation tool for Mars Science Laboratory (MSL) in 2012. In all previous missions, the POST 2 simulation ended before ground impact, and a tool other than POST 2 simulated landing dynamics. It would be ideal for one tool to simulate the entire EDL sequence, thus avoiding errors that could be introduced by handing off position, velocity, or other flight parameters from one simulation to the other. The desire to have one continuous end-to-end simulation was the motivation for developing the ground interaction model in POST 2. Rover landing, including the detection of the post-landing state, is a very critical part of the MSL mission, as the EDL landing sequence continues for a few seconds after landing. The method explained in this paper illustrates how a simple ground force interaction model has been added to POST 2, which allows simulation of the entire EDL from atmospheric entry through touchdown.
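
    A minimal penalty-based version of such a ground force interaction model (an illustrative sketch with assumed stiffness and damping values, not POST 2's actual implementation) applies a spring-damper normal force whenever the vehicle penetrates the ground plane:

```python
# Point mass dropped onto a ground plane at z = 0; when it penetrates
# (z < 0), a penalty spring-damper force pushes back. Semi-implicit
# Euler integration keeps the stiff contact stable.
m, g = 1.0, 9.81          # kg, m/s^2
k, c = 1.0e4, 1.0e2       # contact stiffness N/m, damping N*s/m (assumed)
dt = 1.0e-4               # s

z, vz = 1.0, 0.0          # dropped from 1 m, at rest
for _ in range(40000):    # 4 s of simulated time
    f = -m * g            # gravity
    if z < 0.0:           # in contact: spring + damper normal force
        f += -k * z - c * vz
    vz += dt * f / m      # update velocity first (semi-implicit Euler)
    z += dt * vz

# After the bounces decay, the mass rests at static penetration z = -m*g/k
print(z, vz)
```

    The touchdown-detection logic mentioned in the abstract can then be a simple condition on sustained contact with near-zero velocity.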

  4. Nitrate reduction in a simulated free-water surface wetland system.

    PubMed

    Misiti, Teresa M; Hajaya, Malek G; Pavlostathis, Spyros G

    2011-11-01

    The feasibility of using a constructed wetland for treatment of nitrate-contaminated groundwater resulting from the land application of biosolids was investigated for a site in the southeastern United States. Biosolids degradation led to the release of ammonia, which upon oxidation resulted in nitrate concentrations in the upper aquifer in the range of 65-400 mg N/L. A laboratory-scale system was constructed in support of a pilot-scale project to investigate the effect of temperature, hydraulic retention time (HRT) and nitrate and carbon loading on denitrification using soil and groundwater from the biosolids application site. The maximum specific reduction rates (MSRR), measured in batch assays conducted with an open to the atmosphere reactor at four initial nitrate concentrations from 70 to 400 mg N/L, showed that the nitrate reduction rate was not affected by the initial nitrate concentration. The MSRR values at 22 °C for nitrate and nitrite were 1.2 ± 0.2 and 0.7 ± 0.1 mg N/mg VSS(COD)-day, respectively. MSRR values were also measured at 5, 10, 15 and 22 °C and the temperature coefficient for nitrate reduction was estimated at 1.13. Based on the performance of laboratory-scale continuous-flow reactors and model simulations, wetland performance can be maintained at high nitrogen removal efficiency (>90%) with an HRT of 3 days or higher and at temperature values as low as 5 °C, as long as there is sufficient biodegradable carbon available to achieve complete denitrification. The results of this study show that based on the climate in the southeastern United States, a constructed wetland can be used for the treatment of nitrate-contaminated groundwater to low, acceptable nitrate levels. Copyright © 2011 Elsevier Ltd. All rights reserved.
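
    The reported temperature coefficient implies the standard theta-model correction used in biological treatment kinetics, r(T) = r_ref * theta^(T - T_ref). A short sketch using the values reported above (22 °C reference rate of 1.2 mg N/mg VSS(COD)-day and theta = 1.13):

```python
# Theta-model temperature correction for the maximum specific nitrate
# reduction rate (MSRR), using the study's reported values.
r_ref, T_ref, theta = 1.2, 22.0, 1.13   # mg N/mg VSS(COD)-day at 22 C

def msrr(T):
    """MSRR at temperature T (deg C) via r(T) = r_ref * theta**(T - T_ref)."""
    return r_ref * theta ** (T - T_ref)

for T in (22, 15, 10, 5):
    print(T, round(msrr(T), 3))   # rate falls steeply as T drops
```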

  5. Finite Element Simulation of Three Full-Scale Crash Tests for Cessna 172 Aircraft

    NASA Technical Reports Server (NTRS)

    Mason, Brian H.; Warren, Jerry E., Jr.

    2017-01-01

    The NASA Emergency Locator Transmitter Survivability and Reliability (ELT-SAR) project was initiated in 2013 to assess the crash performance standards for the next generation of emergency locator transmitter (ELT) systems. Three Cessna 172 aircraft were acquired to perform crash testing at NASA Langley Research Center's Landing and Impact Research Facility. Full-scale crash tests were conducted in the summer of 2015 and each test article was subjected to severe, but survivable, impact conditions including a flare-to-stall during emergency landing, and two controlled-flight-into-terrain scenarios. Full-scale finite element analyses were performed using a commercial explicit solver, ABAQUS. The first test simulated impacting a concrete surface represented analytically by a rigid plane. Tests 2 and 3 simulated impacting a dirt surface represented analytically by an Eulerian grid of brick elements using a Mohr-Coulomb material model. The objective of this paper is to summarize the test and analysis results for the three full-scale crash tests. Simulation models of the airframe which correlate well with the tests are needed for future studies of alternate ELT mounting configurations.

  6. Simulation-Based Probabilistic Seismic Hazard Assessment Using System-Level, Physics-Based Models: Assembling Virtual California

    NASA Astrophysics Data System (ADS)

    Rundle, P. B.; Rundle, J. B.; Morein, G.; Donnellan, A.; Turcotte, D.; Klein, W.

    2004-12-01

    The research community is rapidly moving towards the development of an earthquake forecast technology based on the use of complex, system-level earthquake fault system simulations. Using these topologically and dynamically realistic simulations, it is possible to develop ensemble forecasting methods similar to those used in weather and climate research. To effectively carry out such a program, one needs 1) a topologically realistic model to simulate the fault system; 2) data sets to constrain the model parameters through a systematic program of data assimilation; 3) a computational technology making use of modern paradigms of high performance and parallel computing systems; and 4) software to visualize and analyze the results. In particular, we focus attention on a new version of our code Virtual California (version 2001), in which we model all of the major strike-slip faults in California, from the Mexico-California border to the Mendocino Triple Junction. Virtual California is a "backslip model", meaning that the long-term rate of slip on each fault segment in the model is matched to the observed rate. We use the historic data set of earthquakes of magnitude M > 6 to define the frictional properties of 650 fault segments (degrees of freedom) in the model. To compute the dynamics and the associated surface deformation, we use message passing as implemented in the MPICH standard distribution on a Beowulf cluster consisting of more than 10 CPUs. We will also report results from implementing the code on significantly larger machines so that we can begin to examine much finer spatial scales of resolution and assess the scaling properties of the code. We present results of simulations both as static images and as MPEG movies, so that the dynamical aspects of the computation can be assessed by the viewer. We compute a variety of statistics from the simulations, including magnitude-frequency relations, and compare these with data from real fault systems. 
We report recent
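
    The "backslip" loading described above can be caricatured in a few lines. This is a toy sketch with invented segments, not Virtual California's actual algorithm: each hypothetical fault segment accumulates slip deficit at its observed long-term rate and slips when a failure threshold is reached.

```python
def simulate(slip_rates, thresholds, years):
    """Toy backslip-style loop: each fault segment accumulates slip deficit at
    its long-term rate (mm/yr) and slips when a failure threshold (mm) is
    reached, crudely mimicking segment loading in a backslip model."""
    deficit = [0] * len(slip_rates)
    events = []
    for year in range(years):
        for i, rate in enumerate(slip_rates):
            deficit[i] += rate               # interseismic (backslip) loading
            if deficit[i] >= thresholds[i]:  # frictional failure criterion
                events.append((year, i, deficit[i]))
                deficit[i] = 0               # coseismic slip resets the deficit
    return events

# Hypothetical segments: long-term slip rates (mm/yr) and slip thresholds (mm)
events = simulate([30, 10, 5], [3000, 2000, 1500], years=500)
print(f"{len(events)} events in 500 simulated years")
```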

  7. Investigating Brittle Rock Failure and Associated Seismicity Using Laboratory Experiments and Numerical Simulations

    NASA Astrophysics Data System (ADS)

    Zhao, Qi

    The rock failure process is a complex phenomenon that involves elastic and plastic deformation, microscopic cracking, macroscopic fracturing, and frictional slipping of fractures. Understanding this complex behaviour has been the focus of a significant amount of research. In this work, the combined finite-discrete element method (FDEM) was first employed to study (1) the influence of rock discontinuities on hydraulic fracturing and associated seismicity and (2) the influence of in-situ stress on seismic behaviour. Simulated seismic events were analyzed using post-processing tools including frequency-magnitude distribution (b-value), spatial fractal dimension (D-value), seismic rate, and fracture clustering. These simulations demonstrated that at the local scale, fractures tended to propagate following the rock mass discontinuities, while at the reservoir scale they developed in the direction parallel to the maximum in-situ stress. Moreover, the seismic signature (i.e., b-value, D-value, and seismic rate) can help distinguish different phases of the failure process. The FDEM modelling technique and developed analysis tools were then coupled with laboratory experiments to further investigate the different phases of the progressive rock failure process. Firstly, a uniaxial compression experiment, monitored using a time-lapse ultrasonic tomography method, was carried out and reproduced by the numerical model. Using this combination of technologies, the entire deformation and failure processes were studied at macroscopic and microscopic scales. The results not only illustrated the rock failure and seismic behaviours at different stress levels, but also suggested several precursory behaviours indicating the catastrophic failure of the rock. Secondly, rotary shear experiments were conducted using a newly developed rock physics experimental apparatus (ERDμ-T) that was paired with X-ray micro-computed tomography (μCT). This combination of technologies has significant advantages
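
    One of the seismic signatures mentioned, the b-value, is commonly estimated from an event catalog with Aki's maximum-likelihood formula, b = log10(e) / (mean(M) - Mc). A minimal sketch on a hypothetical catalog of simulated event magnitudes:

```python
import math

def b_value(magnitudes, m_c):
    """Aki (1965) maximum-likelihood b-value for events at or above the
    completeness magnitude m_c: b = log10(e) / (mean(M) - m_c)."""
    mags = [m for m in magnitudes if m >= m_c]
    mean_excess = sum(mags) / len(mags) - m_c
    return math.log10(math.e) / mean_excess

# Hypothetical catalog of simulated event magnitudes
catalog = [6.1, 6.3, 6.0, 6.8, 6.2, 7.1, 6.4, 6.0, 6.5, 6.9]
print(f"b-value: {b_value(catalog, 6.0):.2f}")
```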

  8. Global-Scale Hydrology: Simple Characterization of Complex Simulation

    NASA Technical Reports Server (NTRS)

    Koster, Randal D.

    1999-01-01

    Atmospheric general circulation models (AGCMs) are unique and valuable tools for the analysis of large-scale hydrology. AGCM simulations of climate provide tremendous amounts of hydrological data with a spatial and temporal coverage unmatched by observation systems. To the extent that the AGCM behaves realistically, these data can shed light on the nature of the real world's hydrological cycle. In the first part of the seminar, I will describe the hydrological cycle in a typical AGCM, with some emphasis on the validation of simulated precipitation against observations. The second part of the seminar will focus on a key goal in large-scale hydrology studies, namely the identification of simple, overarching controls on hydrological behavior hidden amidst the tremendous amounts of data produced by the highly complex AGCM parameterizations. In particular, I will show that a simple 50-year-old climatological relation (and a recent extension we made to it) successfully predicts, to first order, both the annual mean and the interannual variability of simulated evaporation and runoff fluxes. The seminar will conclude with an example of a practical application of global hydrology studies. The accurate prediction of weather statistics several months in advance would have tremendous societal benefits, and conventional wisdom today points at the use of coupled ocean-atmosphere-land models for such seasonal-to-interannual prediction. Understanding the hydrological cycle in AGCMs is critical to establishing the potential for such prediction. Our own studies show, among other things, that soil moisture retention can lead to significant precipitation predictability in many midlatitude and tropical regions.
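
    The "simple 50-year-old climatological relation" is plausibly Budyko's curve, which partitions mean annual precipitation into evaporation and runoff using only the dryness index PET/P. A sketch under that assumption, with hypothetical forcing values:

```python
import math

def budyko_evaporation(p, pet):
    """Budyko curve: mean annual evaporation E from precipitation P and
    potential evaporation PET (same units), via the dryness index phi = PET/P."""
    phi = pet / p
    return p * math.sqrt(phi * math.tanh(1.0 / phi) * (1.0 - math.exp(-phi)))

p, pet = 800.0, 1200.0        # hypothetical annual forcing, mm/yr
e = budyko_evaporation(p, pet)
runoff = p - e                # long-term water balance: R = P - E
print(f"E = {e:.0f} mm/yr, runoff = {runoff:.0f} mm/yr")
```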

  9. Laboratory simulations of Martian gullies on sand dunes

    NASA Astrophysics Data System (ADS)

    Védie, E.; Costard, F.; Font, M.; Lagarde, J. L.

    2008-11-01

    Small gullies, observed on Mars, could be formed by groundwater seepage from an underground aquifer or may result from the melting of near-surface ground ice at high obliquity. To test these different hypotheses, a cold room-based laboratory simulation has been performed. The experimental slope was designed to simulate debris flows on sand dune slopes at a range of angles, different granulometry and permafrost characteristics. Preliminary results suggest that the typical morphology of gullies observed on Mars can best be reproduced by the formation of linear debris flows related to the melting of a near-surface ground ice with silty materials. This physical modelling highlights the role of the periglacial conditions, especially the active-layer thickness during debris-flow formation.

  10. Strengthening laboratory systems in resource-limited settings.

    PubMed

    Olmsted, Stuart S; Moore, Melinda; Meili, Robin C; Duber, Herbert C; Wasserman, Jeffrey; Sama, Preethi; Mundell, Ben; Hilborne, Lee H

    2010-09-01

    Considerable resources have been invested in recent years to improve laboratory systems in resource-limited settings. We reviewed published reports, interviewed major donor organizations, and conducted case studies of laboratory systems in 3 countries to assess how countries and donors have worked together to improve laboratory services. While infrastructure and the provision of services have seen improvement, important opportunities remain for further advancement. Implementation of national laboratory plans is inconsistent, human resources are limited, and quality laboratory services rarely extend to lower tier laboratories (eg, health clinics, district hospitals). Coordination within, between, and among governments and donor organizations is also frequently problematic. Laboratory standardization and quality control are improving but remain challenging, making accreditation a difficult goal. Host country governments and their external funding partners should coordinate their efforts effectively around a host country's own national laboratory plan to advance sustainable capacity development throughout a country's laboratory system.

  11. Automatic Selection of Order Parameters in the Analysis of Large Scale Molecular Dynamics Simulations.

    PubMed

    Sultan, Mohammad M; Kiss, Gert; Shukla, Diwakar; Pande, Vijay S

    2014-12-09

    Given the large number of crystal structures and NMR ensembles that have been solved to date, classical molecular dynamics (MD) simulations have become powerful tools in the atomistic study of the kinetics and thermodynamics of biomolecular systems on ever increasing time scales. By virtue of the high-dimensional conformational state space that is explored, the interpretation of large-scale simulations faces difficulties not unlike those in the big data community. We address this challenge by introducing a method called clustering based feature selection (CB-FS) that employs a posterior analysis approach. It combines supervised machine learning (SML) and feature selection with Markov state models to automatically identify the relevant degrees of freedom that separate conformational states. We highlight the utility of the method in the evaluation of large-scale simulations and show that it can be used for the rapid and automated identification of relevant order parameters involved in the functional transitions of two exemplary cell-signaling proteins central to human disease states.
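
    The CB-FS method itself is detailed in the paper; the general idea it builds on, supervised feature ranking given conformational-state labels, can be illustrated with a simple class-separation score. This is a stand-in for illustration, not the authors' algorithm:

```python
def rank_features(samples, labels):
    """Rank features by how well they separate two state labels:
    |difference of class means| / pooled standard deviation (a t-like score)."""
    n_feat = len(samples[0])
    scores = []
    for j in range(n_feat):
        a = [s[j] for s, l in zip(samples, labels) if l == 0]
        b = [s[j] for s, l in zip(samples, labels) if l == 1]
        mean_a, mean_b = sum(a) / len(a), sum(b) / len(b)
        var = (sum((x - mean_a) ** 2 for x in a)
               + sum((x - mean_b) ** 2 for x in b)) / (len(a) + len(b) - 2)
        scores.append((abs(mean_a - mean_b) / (var ** 0.5 + 1e-12), j))
    return sorted(scores, reverse=True)  # best-separating feature first

# Hypothetical MD frames: feature 0 is noise, feature 1 tracks the state label
frames = [(0.2, 1.0), (0.4, 1.1), (0.3, 3.0), (0.1, 3.2)]
states = [0, 0, 1, 1]
print(rank_features(frames, states)[0][1])  # index of most informative feature
```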

  12. Epidemiological Simulation System, Version 2.4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2004-01-30

    EpiSims uses a detailed simulation of disease spread to evaluate demographically and geographically targeted biological threat reduction strategies. Abstract: EpiSims simulates the spread of disease and analyzes the consequences of intervention strategies in a large urban area at the level of individuals. The simulation combines models of three dynamical systems: urban social networks, disease transmission, and within-host progression of a disease. Validated population mobility and activity generation technology provides the social network models. Disease models are based on a fusion of expert opinion and available data. EpiSims provides a previously unavailable detailed representation of the course of an outbreak in an urban area. A letter of August 16, 2002 from the Office of Homeland Security states: "Ability of EpiSims to provide comprehensive data on daily activity patterns of individuals makes it far superior to traditional SIR models — clearly had an impact on pre-attack smallpox vaccination policy." EpiSims leverages a unique Los Alamos National Laboratory resource — the population mobility and activity data developed by TRANSIMS (Transportation Analysis and SiMulation System) — to create epidemiological analyses at an unprecedented level of detail. We create models of microscopic (individual-level) physical and biological processes from which, through simulation, emerge the macroscopic (urban regional level) quantities that are the inputs to alternative models. For example, the contact patterns of individuals in different demographic groups determine the overall mixing rates of those groups. The characteristics of person-to-person transmission together with contact patterns determine the reproductive numbers — how many people will be infected on average by each case. Mixing rates and reproductive numbers are the basic parameters of other epidemiological models. Because interventions — and people’s reactions to them — are ultimately applied
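
    The relationship between mixing rates, transmission characteristics, and reproductive numbers that EpiSims feeds into "traditional SIR models" can be seen in a minimal discrete-time SIR sketch. The parameters here are hypothetical, not EpiSims outputs:

```python
def sir(beta, gamma, s0, i0, steps):
    """Minimal discrete-time SIR model; beta folds together contact rate and
    per-contact transmission probability, gamma is the recovery rate."""
    s, i, r = s0, i0, 0.0
    n = s0 + i0
    for _ in range(steps):
        new_inf = beta * s * i / n
        new_rec = gamma * i
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
    return s, i, r

beta, gamma = 0.4, 0.2                 # hypothetical per-step rates
print(f"R0 = {beta / gamma:.1f}")      # basic reproductive number
s, i, r = sir(beta, gamma, s0=9999.0, i0=1.0, steps=200)
print(f"final attack fraction: {r / 10000:.2f}")
```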

  13. Advanced manned space flight simulation and training: An investigation of simulation host computer system concepts

    NASA Technical Reports Server (NTRS)

    Montag, Bruce C.; Bishop, Alfred M.; Redfield, Joe B.

    1989-01-01

    The findings of a preliminary investigation by Southwest Research Institute (SwRI) into simulation host computer concepts are presented. It is designed to aid NASA in evaluating simulation technologies for use in spaceflight training. The focus of the investigation is on the next generation of space simulation systems that will be utilized in training personnel for Space Station Freedom operations. SwRI concludes that NASA should pursue a distributed simulation host computer system architecture for the Space Station Training Facility (SSTF) rather than a centralized mainframe-based arrangement. A distributed system offers many advantages and is seen by SwRI as the only architecture that will allow NASA to achieve established functional goals and operational objectives over the life of the Space Station Freedom program. Several distributed, parallel computing systems are available today that offer real-time capabilities for time-critical, man-in-the-loop simulation. These systems are flexible in terms of connectivity and configurability, and are easily scaled to meet increasing demands for more computing power.

  14. Validation of laboratory-scale recycling test method of paper PSA label products

    Treesearch

    Carl Houtman; Karen Scallon; Richard Oldack

    2008-01-01

    Starting with test methods and a specification developed by the U.S. Postal Service (USPS) Environmentally Benign Pressure Sensitive Adhesive Postage Stamp Program, a laboratory-scale test method and a specification were developed and validated for pressure-sensitive adhesive labels. By comparing results from this new test method and pilot-scale tests, which have been...

  15. Simulation of load-sharing in standalone distributed generation system

    NASA Astrophysics Data System (ADS)

    Ajewole, Titus O.; Craven, Robert P. M.; Kayode, Olakunle; Babalola, Olufisayo S.

    2018-05-01

    This paper presents a study of load-sharing among the component generating units of a multi-source electric microgrid that is operated as an autonomous AC supply-mode system. Emerging trends in power system development permit the deployment of microgrids for standalone or stand-by applications, thereby requiring active- and reactive-power sharing among the discrete generating units contained in hybrid-source microgrids. In this study, therefore, a laboratory-scale model of a microgrid energized by three renewable energy-based sources is employed as a simulation platform to investigate power sharing among the generating units. Each source is represented by a source emulator that captures the real operational characteristics of the mimicked generating unit and, with the implementation of real-life weather data and load profiles on the model, the sharing of the load among the generating units is investigated. There is a proportionate generation of power by the three source emulators, with their frequencies perfectly synchronized at the point of common coupling as a result of the balanced flow of power among them. This hybrid topology of renewable energy-based microgrid could therefore be seamlessly adapted into the national energy mix by indigenous electric utility providers in Nigeria.
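
    The proportionate sharing and common frequency described above are the textbook behavior of droop control. A minimal sketch, assuming equal per-unit droop slopes and hypothetical unit ratings (not the paper's emulator parameters):

```python
def droop_share(load, ratings, f0=50.0, m=0.5):
    """Droop-based load sharing: with an equal per-unit droop slope m (Hz per
    unit of total rated power), all units settle at the same steady-state
    frequency, so each supplies power in proportion to its rating."""
    total = sum(ratings)
    f = f0 - m * load / total                 # common steady-state frequency
    powers = [load * r / total for r in ratings]
    return f, powers

# Hypothetical three-source microgrid (ratings in kW) serving a 6 kW load
f, powers = droop_share(6.0, [3.0, 2.0, 1.0])
print(f"f = {f:.2f} Hz, unit powers = {powers} kW")
```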

  16. The Data Acquisition and Control Systems of the Jet Noise Laboratory at the NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Jansen, B. J., Jr.

    1998-01-01

    The features of the data acquisition and control systems of the NASA Langley Research Center's Jet Noise Laboratory are presented. The Jet Noise Laboratory is a facility that simulates realistic mixed flow turbofan jet engine nozzle exhaust systems in simulated flight. The system is capable of acquiring data for a complete take-off assessment of noise and nozzle performance. This paper describes the development of an integrated system to control and measure the behavior of model jet nozzles featuring dual independent high pressure combusting air streams with wind tunnel flow. The acquisition and control system is capable of simultaneous measurement of forces, moments, static and dynamic model pressures and temperatures, and jet noise. The design concepts for the coordination of the control computers and multiple data acquisition computers and instruments are discussed. The control system design and implementation are explained, describing the features, equipment, and the experiences of using a primarily Personal Computer based system. Areas for future development are examined.

  17. Operating characteristic analysis of a 400 mH class HTS DC reactor in connection with a laboratory scale LCC type HVDC system

    NASA Astrophysics Data System (ADS)

    Kim, Sung-Kyu; Kim, Kwangmin; Park, Minwon; Yu, In-Keun; Lee, Sangjin

    2015-11-01

    High temperature superconducting (HTS) devices are being developed because of the advantages they offer over conventional conductors. Most line-commutated converter based high voltage direct current (HVDC) transmission systems for long-distance transmission require a DC reactor with large inductance; however, copper-based reactors generally incur considerable electrical losses during system operation. This is driving researchers to develop a new type of DC reactor using HTS wire. The authors have developed a 400 mH class HTS DC reactor and a laboratory-scale test-bed for a line-commutated converter type HVDC system, and have applied the HTS DC reactor to the HVDC system to investigate their operating characteristics. The 400 mH class HTS DC reactor is designed using a toroid-type magnet. The HVDC system is designed in the form of a mono-pole system with thyristor-based 12-pulse power converters. In this paper, the investigation results of the HTS DC reactor in connection with the HVDC system are described. The operating characteristics of the HTS DC reactor are analyzed under various operating conditions of the system. Based on these results, the applicability of an HTS DC reactor in an HVDC system is discussed in detail.

  18. Degradation modeling of high temperature proton exchange membrane fuel cells using dual time scale simulation

    NASA Astrophysics Data System (ADS)

    Pohl, E.; Maximini, M.; Bauschulte, A.; vom Schloß, J.; Hermanns, R. T. E.

    2015-02-01

    HT-PEM fuel cells suffer from performance losses due to degradation effects, so the durability of HT-PEM is currently an important focus of research and development. In this paper a novel approach is presented for an integrated short-term and long-term simulation of HT-PEM accelerated lifetime testing. The physical phenomena of short-term and long-term effects are commonly modeled separately because of the different time scales involved. In accelerated lifetime testing, however, long-term degradation effects have a crucial impact on the short-term dynamics. Our approach addresses this problem by applying a novel method for dual time scale simulation. A transient system simulation is performed for an open voltage cycle test on a HT-PEM fuel cell for a physical time of 35 days. The analysis describes the system dynamics by numerical electrochemical impedance spectroscopy. Furthermore, a performance assessment is performed in order to demonstrate the efficiency of the approach. The presented approach reduces the simulation time by approximately 73% compared to a conventional simulation approach, with minimal loss of accuracy. The approach promises a comprehensive perspective considering short-term dynamic behavior and long-term degradation effects.
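
    The coupling pattern of a dual time-scale simulation can be sketched generically: a slow outer loop advances a degradation state, and a fast inner model is re-evaluated with the degraded parameter at sampled operating points. This is an illustrative caricature with invented numbers, not the authors' model:

```python
def run_dual_time_scale(days, decay_per_day=0.0002):
    """Sketch of dual time-scale coupling: the outer loop advances slow
    degradation (a hypothetical linear voltage-decay rate), and the degraded
    voltage is fed into a fast inner model sampled at a few load steps."""
    v0 = 0.70  # hypothetical beginning-of-life cell voltage at zero load, V
    history = []
    for day in range(days):
        v_degraded = v0 - decay_per_day * day        # slow (daily) time scale
        v_cycle = [v_degraded - 0.05 * load          # fast (seconds) scale:
                   for load in (0.0, 0.5, 1.0)]      # sampled load steps
        history.append(min(v_cycle))                 # worst-case cycle voltage
    return history

h = run_dual_time_scale(35)   # 35 days of accelerated testing, as in the study
print(f"voltage drop over test: {h[0] - h[-1]:.4f} V")
```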

  19. Laboratory Simulation of Electrical Discharge in Surface Lunar Regolith

    NASA Astrophysics Data System (ADS)

    Shusterman, M.; Izenberg, N.; Wing, B. R.; Liang, S.

    2016-12-01

    Physical, chemical, and optical characteristics of space-weathered surface materials on airless bodies are produced primarily from bombardment by solar energetic particles and micrometeoroid impacts. On bodies such as the Moon and Mercury, soils in permanently shadowed regions (PSRs) are very cold, have low electrical conductivities, and are subjected to a high flux of incoming energetic particles accelerated by solar events. Theoretical models predict that up to 25% of gardened soils in the lunar polar regions are altered by dielectric breakdown; a potentially significant weathering process that is currently unconfirmed. Although electrical properties of lunar soils have been studied in relation to flight electronics and spacecraft safety, no studies have characterized potential alterations to soils resulting from electrical discharge. To replicate the surface charge field in PSRs, lunar regolith simulant JSC-1A was placed between two parallel plane electrodes under both low and high vacuum environments, 10⁻³ torr and 2.5 × 10⁻⁶ torr, respectively. Voltage was increased until discharge occurred within the sample. Grains were analyzed using an SVC fiber-fed point spectrometer, Olympus BX51 upright metallurgical microscope, and a Hitachi TM3000 scanning electron microscope with Bruker Quantax-70 X-ray spectrometer. Discharges occurring in samples under low vacuum resulted in surficial melting, silicate vapor deposition, coalescence of metallic iron, and micro-scale changes to surface topography. Samples treated under a high vacuum environment showed similar types of effects, but fewer in number compared to low vacuum samples. The variation in alteration abundances between the two environments implies that discharges may be occurring across surface contaminants, even at high vacuum conditions, inhibiting dielectric breakdown in our laboratory simulations.

  20. A laboratory system for the investigation of rain fade compensation techniques for Ka-band satellites

    NASA Technical Reports Server (NTRS)

    Svoboda, James S.; Kachmar, Brian A.

    1993-01-01

    The design and performance of a rain fade simulation/counteraction system on a laboratory simulated 30/20 GHz, time division multiple access (TDMA) satellite communications testbed is evaluated. Severe rain attenuation of electromagnetic radiation at 30/20 GHz occurs due to the carrier wavelength approaching the water droplet size. Rain in the downlink path lowers the signal power present at the receiver, resulting in a higher number of bit errors induced in the digital ground terminal. The laboratory simulation performed at NASA Lewis Research Center uses a programmable PIN diode attenuator to simulate 20 GHz satellite downlink geographic rain fade profiles. A computer based network control system monitors the downlink power and informs the network of any power threshold violations, which then prompts the network to issue commands that temporarily increase the gain of the satellite based traveling wave tube (TWT) amplifier. After the rain subsides, the network returns the TWT to the normal energy conserving power mode. Bit error rate (BER) data taken at the receiving ground terminal serves as a measure of the severity of rain degradation, and also evaluates the extent to which the network can improve the faded channel.
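
    The network's counteraction logic amounts to threshold-based gain scheduling: boost the TWT gain while the received downlink power sits below a clear-sky threshold, then return to the energy-conserving mode. A sketch with hypothetical power levels and gain values (the actual thresholds and gains are not given in the abstract):

```python
def control_gain(rx_power_dbm, threshold_dbm=-60.0, nominal_gain_db=30.0,
                 boost_db=6.0):
    """Threshold-based fade counteraction: raise the TWT gain while the
    received downlink power is below the clear-sky threshold."""
    if rx_power_dbm < threshold_dbm:
        return nominal_gain_db + boost_db   # fade detected: boost TWT drive
    return nominal_gain_db                  # clear sky: normal power mode

# Hypothetical fade profile (received power in dBm over successive frames)
profile = [-55, -58, -63, -67, -62, -59, -54]
print([control_gain(p) for p in profile])
```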

  1. Scaling of hydrologic and erosion parameters derived from rainfall simulation

    NASA Astrophysics Data System (ADS)

    Sheridan, Gary; Lane, Patrick; Noske, Philip; Sherwin, Christopher

    2010-05-01

    Rainfall simulation experiments conducted at the temporal scale of minutes and the spatial scale of meters are often used to derive parameters for erosion and water quality models that operate at much larger temporal and spatial scales. While such parameterization is convenient, there has been little effort to validate this approach via nested experiments across these scales. In this paper we first review the literature relevant to some of these long acknowledged issues. We then present rainfall simulation and erosion plot data from a range of sources, including mining, roading, and forestry, to explore the issues associated with the scaling of parameters such as infiltration properties and erodibility coefficients.

  2. Use of a large-scale rainfall simulator reveals novel insights into stemflow generation

    NASA Astrophysics Data System (ADS)

    Levia, D. F., Jr.; Iida, S. I.; Nanko, K.; Sun, X.; Shinohara, Y.; Sakai, N.

    2017-12-01

    Detailed knowledge of stemflow generation and its effects on both hydrological and biogeochemical cycling is important to achieve a holistic understanding of forest ecosystems. Field studies and a smaller set of experiments performed under laboratory conditions have increased our process-based knowledge of stemflow production. Building upon these earlier works, a large-scale rainfall simulator was employed to deepen our understanding of stemflow generation processes. The large-scale rainfall simulator provides a unique opportunity to examine a range of rainfall intensities under constant conditions, which is difficult in the field because of the variable nature of natural rainfall. Stemflow generation was examined for three species, Cryptomeria japonica D. Don (Japanese cedar), Chamaecyparis obtusa (Siebold & Zucc.) Endl. (Japanese cypress), and Zelkova serrata Thunb. (Japanese zelkova), under both leafed and leafless conditions at several different rainfall intensities (15, 20, 30, 40, 50, and 100 mm h⁻¹) using the large-scale rainfall simulator at the National Research Institute for Earth Science and Disaster Resilience (Tsukuba, Japan). Stemflow production rates and funneling ratios were examined in relation to both rainfall intensity and canopy structure. Preliminary results indicate a dynamic and complex response of the funneling ratios of individual trees to different rainfall intensities among the species examined. This is partly the result of different canopy structures, hydrophobicity of vegetative surfaces, and differential wet-up processes across species and rainfall intensities. This presentation delves into these differences and attempts to distill them into generalizable patterns, which can advance our theories of stemflow generation processes and ultimately permit better stewardship of forest resources. 
________________ Funding note: This research was supported by JSPS Invitation Fellowship for Research in
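
    The funneling ratio referred to above is conventionally defined (after Herwitz) as stemflow volume divided by the rain that would have fallen on the trunk basal area alone. A sketch with hypothetical tree and rainfall values:

```python
import math

def funneling_ratio(stemflow_l, rainfall_mm, trunk_diameter_cm):
    """Funneling ratio: stemflow volume divided by the rain intercepted by the
    trunk basal area alone; values > 1 mean the canopy concentrates rainfall
    toward the stem."""
    basal_area_m2 = math.pi * (trunk_diameter_cm / 200.0) ** 2
    equivalent_depth_mm = stemflow_l / basal_area_m2   # 1 L/m^2 == 1 mm depth
    return equivalent_depth_mm / rainfall_mm

# Hypothetical tree: 12 L of stemflow from 30 mm of simulated rain, 20 cm DBH
print(f"funneling ratio: {funneling_ratio(12.0, 30.0, 20.0):.1f}")
```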

  3. Comparative Study of the Effectiveness of Three Learning Environments: Hyper-Realistic Virtual Simulations, Traditional Schematic Simulations and Traditional Laboratory

    ERIC Educational Resources Information Center

    Martinez, Guadalupe; Naranjo, Francisco L.; Perez, Angel L.; Suero, Maria Isabel; Pardo, Pedro J.

    2011-01-01

    This study compared the educational effects of computer simulations developed in a hyper-realistic virtual environment with the educational effects of either traditional schematic simulations or a traditional optics laboratory. The virtual environment was constructed on the basis of Java applets complemented with a photorealistic visual output.…

  4. Characteristics of Tornado-Like Vortices Simulated in a Large-Scale Ward-Type Simulator

    NASA Astrophysics Data System (ADS)

    Tang, Zhuo; Feng, Changda; Wu, Liang; Zuo, Delong; James, Darryl L.

    2018-02-01

    Tornado-like vortices are simulated in a large-scale Ward-type simulator to further advance the understanding of such flows, and to facilitate future studies of tornado wind loading on structures. Measurements of the velocity fields near the simulator floor and the resulting floor surface pressures are interpreted to reveal the mean and fluctuating characteristics of the flow as well as the characteristics of the static-pressure deficit. We focus on the manner in which the swirl ratio and the radial Reynolds number affect these characteristics. The transition of the tornado-like flow from a single-celled vortex to a dual-celled vortex with increasing swirl ratio and the impact of this transition on the flow field and the surface-pressure deficit are closely examined. The mean characteristics of the surface-pressure deficit caused by tornado-like vortices simulated at a number of swirl ratios compare well with the corresponding characteristics recorded during full-scale tornadoes.

  5. Theory and Simulations of Solar System Plasmas

    NASA Technical Reports Server (NTRS)

    Goldstein, Melvyn L.

    2011-01-01

    "Theory and simulations of solar system plasmas" aims to highlight results from microscopic to global scales, achieved by theoretical investigations and numerical simulations of the plasma dynamics in the solar system. The theoretical approach must allow evidencing the universality of the phenomena being considered, whatever the region is where their role is studied; at the Sun, in the solar corona, in the interplanetary space or in planetary magnetospheres. All possible theoretical issues concerning plasma dynamics are welcome, especially those using numerical models and simulations, since these tools are mandatory whenever analytical treatments fail, in particular when complex nonlinear phenomena are at work. Comparative studies for ongoing missions like Cassini, Cluster, Demeter, Stereo, Wind, SDO, Hinode, as well as those preparing future missions and proposals, like, e.g., MMS and Solar Orbiter, are especially encouraged.

  6. Using Multi-Scale Modeling Systems and Satellite Data to Study the Precipitation Processes

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo; Chern, J.; Lang, S.; Matsui, T.; Shen, B.; Zeng, X.; Shi, R.

    2010-01-01

    In recent years, exponentially increasing computer power has extended Cloud Resolving Model (CRM) integrations from hours to months, and the number of computational grid points from less than a thousand to close to ten million. Three-dimensional models are now more prevalent. Much attention is devoted to precipitating cloud systems where the crucial 1-km scales are resolved in horizontal domains as large as 10,000 km in two dimensions, and 1,000 x 1,000 sq km in three dimensions. Cloud resolving models now provide statistical information useful for developing more realistic physically based parameterizations for climate models and numerical weather prediction models. It is also expected that NWP and mesoscale models can be run at grid sizes similar to cloud resolving models through nesting techniques. Recently, a multi-scale modeling system with unified physics was developed at NASA Goddard. It consists of (1) a cloud-resolving model (the Goddard Cumulus Ensemble model, GCE); (2) a regional-scale model (the NASA-unified Weather Research and Forecasting model, WRF); (3) a coupled CRM and global model (the Goddard Multi-scale Modeling Framework, MMF); and (4) a land modeling system. The same microphysical processes, long- and shortwave radiative transfer, land processes, and the explicit cloud-radiation and cloud-land surface interactive processes are applied in this multi-scale modeling system. This modeling system has been coupled with a multi-satellite simulator to use NASA high-resolution satellite data to identify the strengths and weaknesses of cloud and precipitation processes simulated by the model. In this talk, a review of developments and applications of the multi-scale modeling system will be presented. In particular, results from using the multi-scale modeling system to study the interactions between clouds, precipitation, and aerosols will be presented. How the multi-satellite simulator can be used to improve the representation of precipitation processes will also be discussed.

  7. Simulation of the hydraulic performance of highway filter drains through laboratory models and stormwater management tools.

    PubMed

    Sañudo-Fontaneda, Luis A; Jato-Espino, Daniel; Lashford, Craig; Coupe, Stephen J

    2017-05-23

    Road drainage is one of the most relevant assets in transport infrastructure due to its inherent influence on traffic management and road safety. Highway filter drains (HFDs), also known as "French drains", are the main drainage system currently in use in the UK, covering 7000 km of its strategic road network. Despite being a widespread technique across the whole country, little research has been completed on their design considerations and the subsequent impact on their hydraulic performance, representing a gap in the field. Laboratory experiments have proven to be a reliable indicator for simulating the hydraulic performance of stormwater best management practices (BMPs). In addition, stormwater management tools (SMT) are the design tools preferred for BMPs by practitioners all over the world. In this context, this research investigates the hydraulic performance of HFDs by comparing the results from laboratory simulation with two widely used SMT: the US EPA's Storm Water Management Model (SWMM) and MicroDrainage®. Statistical analyses were applied to a series of simulated rainfall scenarios, showing a high level of accuracy between the results obtained in the laboratory and with the SMT, as indicated by the high Nash-Sutcliffe and R² coefficients and low root-mean-square error (RMSE) reached, which validates the usefulness of SMT for determining the hydraulic performance of HFDs.
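    The goodness-of-fit measures cited in this record (Nash-Sutcliffe efficiency and RMSE) are standard and straightforward to reproduce; a minimal sketch, with illustrative hydrograph values that are not from the study:

    ```python
    import math

    def nash_sutcliffe(observed, simulated):
        """Nash-Sutcliffe efficiency: 1 is a perfect fit; values <= 0 mean
        the model is no better than the mean of the observations."""
        mean_obs = sum(observed) / len(observed)
        ss_res = sum((o - s) ** 2 for o, s in zip(observed, simulated))
        ss_tot = sum((o - mean_obs) ** 2 for o in observed)
        return 1.0 - ss_res / ss_tot

    def rmse(observed, simulated):
        """Root-mean-square error, in the same units as the data."""
        n = len(observed)
        return math.sqrt(sum((o - s) ** 2 for o, s in zip(observed, simulated)) / n)

    # Hypothetical outflow hydrographs (L/s): laboratory run vs. SMT run
    obs = [0.0, 1.2, 3.4, 5.1, 4.0, 2.2, 0.8]
    sim = [0.0, 1.0, 3.6, 5.0, 4.3, 2.0, 0.9]
    print(f"NSE  = {nash_sutcliffe(obs, sim):.3f}")
    print(f"RMSE = {rmse(obs, sim):.3f} L/s")
    ```

    An NSE close to 1 together with a small RMSE is the pattern the authors report when comparing laboratory and SMT results.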

  8. Emission characteristics of PBDEs during flame-retardant plastics extruding process: field investigation and laboratorial simulation.

    PubMed

    Deng, Chao; Li, Ying; Li, Jinhui; Chen, Yuan; Li, Huafen

    2017-10-01

    Though mechanical recycling of WEEE (waste electrical and electronic equipment) plastics is considered a promising method, PBDE release and the resulting contamination during processing remain unclear. The distribution of PBDE pollution along production lines was investigated at two flame-retardant plastic modification plants in Southern China, followed by laboratory simulation experiments to characterize the emission processes. PBDE concentrations ranged from 37 to 31,305 ng/L in cooling water and from 40,043 to 216,653 ng/g dry wt in solid samples taken during the field investigation. In the laboratory simulation, concentrations ranged from 146 to 433 ng/L in cooling water and from 411,436 to 747,516 ng/Nm³ in flue gas. All samples were dominated by BDE-209 among the congeners. Temperature and impurities in the plastic substrate can significantly affect PBDE release. Special attention should be paid to the risks of direct discharge of water from the cooling system, especially for the biological sludge and sediments, as well as to flue gas emissions to the environment.

  9. Multiscale simulations of patchy particle systems combining Molecular Dynamics, Path Sampling and Green's Function Reaction Dynamics

    NASA Astrophysics Data System (ADS)

    Bolhuis, Peter

    Important reaction-diffusion processes, such as biochemical networks in living cells or self-assembling soft matter, span many orders of magnitude in length and time scales. In these systems, the reactants' spatial dynamics at mesoscopic length and time scales of microns and seconds is coupled to reactions between the molecules at microscopic length and time scales of nanometers and milliseconds. This wide range of scales makes these systems notoriously difficult to simulate. While mean-field rate equations cannot describe such processes, the mesoscopic Green's Function Reaction Dynamics (GFRD) method enables efficient simulation at the particle level, provided the microscopic dynamics can be integrated out. Yet many processes exhibit non-trivial microscopic dynamics that can qualitatively change the macroscopic behavior, calling for an atomistic, microscopic description. The recently developed multiscale Molecular Dynamics Green's Function Reaction Dynamics (MD-GFRD) approach combines GFRD, for simulating the system at the mesoscopic scale where particles are far apart, with microscopic Molecular (or Brownian) Dynamics, for simulating the system at the microscopic scale where reactants are in close proximity. The association and dissociation of particles are treated with rare-event path sampling techniques. I will illustrate the efficiency of this method for patchy particle systems. Replacing the microscopic simulation with a Markov State Model (MSM) avoids explicit microscopic dynamics completely; the MSM is pre-computed using advanced path-sampling techniques such as multistate transition interface sampling. I illustrate this approach on patchy particle systems that show multiple modes of binding. MD-GFRD is generic and can be used to efficiently simulate reaction-diffusion systems at the particle level, including orientational dynamics, opening up the possibility of large-scale simulations of e.g. protein signaling networks.

  10. Fumigation of a laboratory-scale HVAC system with hydrogen peroxide for decontamination following a biological contamination incident.

    PubMed

    Meyer, K M; Calfee, M W; Wood, J P; Mickelsen, L; Attwood, B; Clayton, M; Touati, A; Delafield, R

    2014-03-01

    To evaluate hydrogen peroxide vapour (H2O2) for its ability to inactivate Bacillus spores within a laboratory-scale heating, ventilation and air-conditioning (HVAC) duct system. Experiments were conducted in a closed-loop duct system, constructed of either internally lined or unlined galvanized metal. Bacterial spores were aerosol-deposited onto 18-mm-diameter test material coupons and strategically placed at several locations within the duct environment. Various concentrations of H2O2 and exposure times were evaluated to determine the sporicidal efficacy and minimum exposure needed for decontamination. For the unlined duct, high variability was observed in the recovery of spores between sample locations, likely due to complex, unpredictable flow patterns within the ducts. In comparison, the lined duct exhibited significant desorption of the H2O2 following the fumigant dwell period and thus resulted in complete decontamination at all sampling locations. These findings suggest that decontamination of Bacillus spore-contaminated unlined HVAC ducts by hydrogen peroxide fumigation may require more stringent conditions (higher concentrations, longer dwell duration) than internally insulated ductwork. These data may help emergency responders when developing remediation plans during building decontamination. © 2013 The Society for Applied Microbiology. This article has been contributed to by US Government employees and their work is in the public domain in the USA.

  11. Particle Size Distribution of Serratia marcescens Aerosols Created During Common Laboratory Procedures and Simulated Laboratory Accidents

    PubMed Central

    Kenny, Michael T.; Sabel, Fred L.

    1968-01-01

    Andersen air samplers were used to determine the particle size distribution of Serratia marcescens aerosols created during several common laboratory procedures and simulated laboratory accidents. Over 1,600 viable particles per cubic foot of air sampled were aerosolized during blending operations. More than 98% of these particles were less than 5 μ in size. In contrast, 80% of the viable particles aerosolized by handling lyophilized cultures were larger than 5 μ. Harvesting infected eggs, sonic treatment, centrifugation, mixing cultures, and dropping infectious material produced aerosols composed primarily of particles in the 1.0- to 7.5-μ size range. PMID:4877498

  12. Bacterial communities in full-scale wastewater treatment systems.

    PubMed

    Cydzik-Kwiatkowska, Agnieszka; Zielińska, Magdalena

    2016-04-01

    Bacterial metabolism determines the effectiveness of biological wastewater treatment. It is therefore important to define the relations between species structure and the performance of full-scale installations. Although there is much laboratory data on microbial consortia, our understanding of the dependencies between microbial structure and the operational parameters of full-scale wastewater treatment plants (WWTP) is limited. This mini-review presents the types of microbial consortia in WWTP. Information is given on the production of extracellular polymeric substances as a key factor in the formation of spatial structures of microorganisms. Additionally, we discuss data on microbial groups, including nitrifiers, denitrifiers, Anammox bacteria, and phosphate- and glycogen-accumulating bacteria, in full-scale aerobic systems, obtained with the use of molecular techniques including high-throughput sequencing, to shed light on the dependencies between the microbial ecology of biomass and the overall efficiency and functional stability of wastewater treatment systems. Sludge bulking in WWTPs is addressed, as well as the microbial composition of consortia involved in antibiotic and micropollutant removal.

  13. UAS in the NAS Project: Large-Scale Communication Architecture Simulations with NASA GRC Gen5 Radio Model

    NASA Technical Reports Server (NTRS)

    Kubat, Gregory

    2016-01-01

    This report describes and characterizes the performance of the large-scale, relay-architecture UAS communications simulation capability developed for the NASA GRC UAS in the NAS Project. The system uses a validated model of the GRC Gen5 CNPC flight-test radio. The report covers the simulation system and its model components, recent changes made to the system to improve performance, the descriptions and objectives of sample simulations used for test and verification, and a sampling of results and performance data, with observations.

  14. Development of high-resolution multi-scale modelling system for simulation of coastal-fluvial urban flooding

    NASA Astrophysics Data System (ADS)

    Comer, Joanne; Indiana Olbert, Agnieszka; Nash, Stephen; Hartnett, Michael

    2017-02-01

    Urban developments in coastal zones are often exposed to natural hazards such as flooding. In this research, a state-of-the-art multi-scale nested flood model (MSN_Flood) is applied to simulate complex coastal-fluvial urban flooding due to the combined effects of tides, surges and river discharges, with Cork city on Ireland's southwest coast as the case study. The flood modelling system comprises a cascade of four dynamically linked models that resolve the hydrodynamics of Cork Harbour and/or its sub-region at four scales: 90, 30, 6 and 2 m. Results demonstrate that the internalization of the nested boundary, through the use of ghost cells combined with a tailored adaptive interpolation technique, creates a highly dynamic moving boundary that permits flooding and drying of the nested boundary. This novel feature of MSN_Flood provides a high degree of choice regarding the location of the boundaries of the nested domain, and therefore flexibility in model application. Through dynamic downscaling, the nested MSN_Flood model delivers significant improvements in the accuracy of model output without incurring the computational expense of high spatial resolution over the entire model domain. The urban flood model provides the full characteristics of water levels and flow regimes necessary for flood hazard identification and flood risk assessment.

  15. Evaluation of Cobas Integra 800 under simulated routine conditions in six laboratories.

    PubMed

    Redondo, Francisco L; Bermudez, Pilar; Cocco, Claudio; Colella, Francesca; Graziani, Maria Stella; Fiehn, Walter; Hierla, Thomas; Lemoël, Gisèle; Belliard, AnneMarie; Manene, Dieudonne; Meziani, Mourad; Liebel, Maryann; McQueen, Matthew J; Stockmann, Wolfgang

    2003-03-01

    The new selective access analyser Cobas Integra 800 from Roche Diagnostics was evaluated in an international multicentre study at six sites. Routine simulation experiments showed good performance and full functionality of the instrument, and deliberate provocation of anomalous situations generated no problems. The new features of the Cobas Integra 800, namely clot detection and dispensing control, worked according to specifications. The imprecision of the Cobas Integra 800 fulfilled the proposed quality specifications for imprecision of analytical systems for clinical chemistry, with few exceptions. Claims for linearity, drift and carry-over were all within the defined specifications, except urea linearity. Interference exists in some cases, as could be expected given the chemistries applied. Accuracy met the proposed quality specifications, except in some special cases. Method comparisons with the Cobas Integra 700 showed good agreement; comparisons with other analysis systems yielded explicable deviations in several cases. The practicability of the Cobas Integra 800 met or exceeded the requirements for more than 95% of all attributes rated. The strong points of the new analysis system were reagent handling, long stability of calibration curves, the high number of tests on board, compatibility of the sample carrier with other Roche systems, and the sample integrity check for more reliable analytical results. The improved workflow offered by the 5-position rack and STAT handling makes the Cobas Integra 800 attractive for further consolidation in the medium-sized laboratory, for dedicated use for special analytes, and/or as back-up in the large routine laboratory.

  16. LABORATORY-SCALE SIMULATION OF RUNOFF RESPONSE FROM PERVIOUS-IMPERVIOUS SYSTEMS

    EPA Science Inventory

    Urban development yields landscapes that are composites of impervious and pervious areas, with a consequent reduction in infiltration and increase in stormwater runoff. Although basic rainfall-runoff models are used in the vast majority of runoff prediction in urban landscapes, t...

  17. High Fidelity, “Faster than Real-Time” Simulator for Predicting Power System Dynamic Behavior - Final Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flueck, Alex

    The “High Fidelity, Faster than Real-Time Simulator for Predicting Power System Dynamic Behavior” was designed and developed by Illinois Institute of Technology with critical contributions from Electrocon International, Argonne National Laboratory, Alstom Grid and McCoy Energy. Also essential to the project were our two utility partners: Commonwealth Edison and AltaLink. The project was a success due to several major breakthroughs in the area of large-scale power system dynamics simulation, including (1) a validated faster-than-real-time simulation of both stable and unstable transient dynamics in a large-scale positive-sequence transmission grid model, (2) a three-phase unbalanced simulation platform for modeling new grid devices, such as independently controlled single-phase static var compensators (SVCs), (3) the world’s first high-fidelity three-phase unbalanced dynamics and protection simulator based on Electrocon’s CAPE program, and (4) a first-of-its-kind implementation of a single-phase induction motor model with stall capability. The simulator results will aid power grid operators in their true time of need, when there is a significant risk of cascading outages. The simulator will accelerate performance and enhance accuracy of dynamics simulations, enabling operators to maintain reliability and steer clear of blackouts. In the long term, the simulator will form the backbone of the newly conceived hybrid real-time protection and control architecture that will coordinate local controls, wide-area measurements, wide-area controls and advanced real-time prediction capabilities. The nation’s citizens will benefit in several ways, including (1) less downtime from power outages due to the faster-than-real-time simulator’s predictive capability, (2) higher levels of reliability due to the detailed dynamics-plus-protection simulation capability, and (3) more resiliency due to the three-phase unbalanced simulator.

  18. JPL-20171130-EARTHf-0001-DIY Glacier Modeling with Virtual Earth System Laboratory

    NASA Image and Video Library

    2017-11-30

    Eric Larour, JPL climate scientist, explains the NASA research tool VESL (Virtual Earth System Laboratory), which allows anyone to run their own climate experiment. The user can move a slider to simulate an increase or decrease in the amount of snowfall on a particular glacier, then see a video of the results, including the effect of the glacier's melting on sea level.

  19. Simulated convective systems using a cloud resolving model: Impact of large-scale temperature and moisture forcing using observations and GEOS-3 reanalysis

    NASA Technical Reports Server (NTRS)

    Shie, C.-L.; Tao, W.-K.; Hou, A.; Lin, X.

    2006-01-01

    The GCE (Goddard Cumulus Ensemble) model, developed and improved at NASA Goddard Space Flight Center over the past two decades, is considered one of the finest, state-of-the-art CRMs (cloud resolving models) in the research community. As the chosen CRM for a NASA Interdisciplinary Science (IDS) project, the GCE has recently been successfully upgraded to an MPI (Message Passing Interface) version, with great improvement in computational efficiency, scalability, and portability. Using the large-scale temperature and moisture advective forcing, as well as the temperature, water vapor and wind fields obtained from TRMM (Tropical Rainfall Measuring Mission) field experiments such as SCSMEX (South China Sea Monsoon Experiment) and KWAJEX (Kwajalein Experiment), our recent 2-D and 3-D GCE simulations were able to capture detailed convective systems typical of the targeted (simulated) regions. The GEOS-3 [Goddard EOS (Earth Observing System) Version-3] reanalysis data have also been proposed and successfully implemented for use in the proposed long-term GCE simulations (aimed at producing a massive simulated cloud dataset, the Cloud Library), compensating for the scarcity of real field-experiment data in both time and space (location). Preliminary 2-D and 3-D pilot results using GEOS-3 data have generally shown good qualitative agreement (with some quantitative differences) with the respective numerical results using the SCSMEX observations. The first objective of this paper is to ensure the GEOS-3 data quality by comparing the model results obtained from several pairs of simulations using the real observations and the GEOS-3 reanalysis data. The different large-scale advective forcing obtained from these two resources (i.e., sounding observations and GEOS-3 reanalysis) has been considered a major critical factor in producing the various model results. The second objective of this paper is therefore to

  20. Identification of oxidative coupling products of xylenols arising from laboratory-scale phytoremediation.

    PubMed

    Poerschmann, J; Schultze-Nobre, L; Ebert, R U; Górecki, T

    2015-01-01

    Oxidative coupling reactions take place during the passage of xylenols through a laboratory-scale helophyte-based constructed wetland system. Typical coupling product groups, including tetramethyl-[1,1'-biphenyl] diols and tetramethyl diphenylether monools as stable organic intermediates, could be identified by a combination of pre-chromatographic derivatization and GC/MS analysis. Structural assignment of individual analytes was performed using an increment system developed by Zenkevich to pre-calculate retention sequences. The most abundant analyte turned out to be 3,3',5,5'-tetramethyl-[1,1'-biphenyl]-4,4'-diol, which can be formed by a combination of two 2,6-xylenol-based radicals or by an attack of a 2,6-xylenol-based radical on 2,6-xylenol. Organic intermediates originating from oxidative coupling could also be identified in anaerobic constructed wetland systems. This finding suggests the presence of (at least partly) oxic conditions in the rhizosphere. Copyright © 2014 Elsevier Ltd. All rights reserved.

  1. Comparison of different types of medium scale field rainfall simulators

    NASA Astrophysics Data System (ADS)

    Dostál, Tomáš; Strauss, Peter; Schindewolf, Marcus; Kavka, Petr; Schmidt, Jürgen; Bauer, Miroslav; Neumann, Martin; Kaiser, Andreas; Iserloh, Thomas

    2015-04-01

    Rainfall simulators are used in numerous experiments to study runoff and soil erosion characteristics. However, they usually differ in their construction details, rainfall generation, plot size and other technical parameters. As field experiments using medium- to large-scale rainfall simulators (plot length 3-8 m) are very time- and labor-consuming, close cooperation between individual teams and comparability of results are highly desirable to enlarge the database of results. Two experimental campaigns were organized to compare three field rainfall simulators of similar scale (plot size) but with different technical parameters. The results were then compared to identify the parameters that are crucial for soil loss and surface runoff formation, and to test whether results from individual devices can be reliably compared. The rainfall simulators compared were: the field rainfall simulator of CTU Prague (Czech Republic) (Kavka et al., 2012; EGU2015-11025), the field simulator of BAW (Austria) (Strauss et al., 2002) and the field simulator of TU Bergakademie Freiberg (Germany) (Schindewolf & Schmidt, 2012). The device of CTU Prague is usually applied to a plot size of 9.5 x 2 m, employing 4 SS Full Jet 40WSQ nozzles mounted on a folding arm; the working pressure is 0.8 bar and the nozzle height is 2.65 m. Rainfall intensity is regulated electronically, keeping each nozzle open for only part of the time. The rainfall simulator of BAW is constructed as a modular system, usually applied over a length of 5 m (area 2 x 5 m), using 6 SS Full Jet 40WSQ nozzles. The usual working pressure is 0.25 bar and the nozzle elevation is 2.6 m. Rainfall intensity is regulated electronically in the same way. The device of TU Bergakademie Freiberg is also a standard modular system, usually working with a plot size of 3 x 1 m and using 3 oscillating VeeJet 80/100 nozzles with a usual operating pressure of 0.5 bar.
Intensity is regulated by the frequency of sweeps above

  2. Subgrid Scale Modeling in Solar Convection Simulations using the ASH Code

    NASA Technical Reports Server (NTRS)

    Young, Y.-N.; Miesch, M.; Mansour, N. N.

    2003-01-01

    The turbulent solar convection zone has remained one of the most challenging and important subjects in physics. Understanding the complex dynamics in the solar convection zone is crucial for gaining insight into the solar dynamo problem. Many solar observatories have generated revealing data with great detail on large-scale motions in the solar convection zone. For example, a strong differential rotation is observed: the angular rotation is faster at the equator than near the poles, not only near the solar surface but also deep in the convection zone. On the other hand, due to the wide range of dynamical scales of turbulence in the solar convection zone, both theory and simulation have had limited success. Thus, cutting-edge solar models and numerical simulations of the solar convection zone have focused more narrowly on a few key features, such as the time-averaged differential rotation. For example, Brun & Toomre (2002) report computational findings of differential rotation in an anelastic model for solar convection. A critical shortcoming of this model is that the viscous dissipation is based on applying mixing-length theory to stellar dynamics with some ad hoc parameter tuning. The goal of our work is to implement the subgrid scale model developed at CTR into the solar simulation code and examine how the differential rotation is affected as a result. Specifically, we implement a Smagorinsky-Lilly subgrid scale model into the ASH (anelastic spherical harmonic) code developed over the years by various authors. This paper is organized as follows. In §2 we briefly formulate the anelastic system that describes the solar convection. In §3 we formulate the Smagorinsky-Lilly subgrid scale model for unstably stratified convection. We then present some preliminary results in §4, where we also provide some conclusions and future directions.
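    The Smagorinsky-Lilly closure referenced in this record models unresolved turbulence through an eddy viscosity ν_t = (C_s Δ)² |S̄| computed from the resolved strain rate. A minimal 2-D sketch of that formula (the value of C_s and the shear flow are illustrative assumptions, not taken from the ASH implementation):

    ```python
    import numpy as np

    def smagorinsky_viscosity(u, v, dx, cs=0.17):
        """Eddy viscosity nu_t = (cs * dx)**2 * |S| on a uniform 2-D grid,
        where |S| = sqrt(2 S_ij S_ij) is the magnitude of the resolved
        strain-rate tensor. Axis 0 is y and axis 1 is x ('ij' indexing)."""
        dudy, dudx = np.gradient(u, dx, dx)
        dvdy, dvdx = np.gradient(v, dx, dx)
        s11 = dudx
        s22 = dvdy
        s12 = 0.5 * (dudy + dvdx)
        s_mag = np.sqrt(2.0 * (s11**2 + s22**2 + 2.0 * s12**2))
        return (cs * dx) ** 2 * s_mag

    # Illustrative uniform shear flow u = y, v = 0 on a small grid
    dx = 0.1
    y, x = np.meshgrid(np.arange(0, 1, dx), np.arange(0, 1, dx), indexing="ij")
    nu_t = smagorinsky_viscosity(y, np.zeros_like(y), dx)
    print(nu_t.mean())  # uniform shear yields a spatially uniform nu_t
    ```

    In a real LES code the filter width Δ follows the local grid spacing and the model is applied in the momentum equation; this sketch only shows how the viscosity itself is formed.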

  3. Potential for improved radiation thermometry measurement uncertainty through implementing a primary scale in an industrial laboratory

    NASA Astrophysics Data System (ADS)

    Willmott, Jon R.; Lowe, David; Broughton, Mick; White, Ben S.; Machin, Graham

    2016-09-01

    A primary temperature scale requires realising a unit in terms of its definition. For high temperature radiation thermometry in terms of the International Temperature Scale of 1990 this means extrapolating from the signal measured at the freezing temperature of gold, silver or copper using Planck’s radiation law. The difficulty in doing this means that primary scales above 1000 °C require specialist equipment and careful characterisation in order to achieve the extrapolation with sufficient accuracy. As such, maintenance of the scale at high temperatures is usually only practicable for National Metrology Institutes, and calibration laboratories have to rely on a scale calibrated against transfer standards. At lower temperatures it is practicable for an industrial calibration laboratory to have its own primary temperature scale, which reduces the number of steps between the primary scale and end user. Proposed changes to the SI that will introduce internationally accepted high temperature reference standards might make it practicable to have a primary high temperature scale in a calibration laboratory. In this study such a scale was established by calibrating radiation thermometers directly to high temperature reference standards. The possible reduction in uncertainty to an end user as a result of the reduced calibration chain was evaluated.
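    The extrapolation described in this record follows from Planck's law: the ratio of spectral radiances at one wavelength between an unknown temperature and a fixed point (e.g. the gold freezing point, 1337.33 K on the ITS-90) determines the unknown temperature. A minimal sketch of the ratio and its inversion (the 650 nm operating wavelength is an illustrative assumption):

    ```python
    import math

    C2 = 1.4388e-2        # second radiation constant c2, m*K
    T_AU = 1337.33        # gold freezing point, K (ITS-90)

    def planck_ratio(t, t_ref, wavelength):
        """Ratio of spectral radiances L(t)/L(t_ref) from Planck's law
        at a single wavelength (the -1 terms are kept, so it is exact)."""
        return (math.exp(C2 / (wavelength * t_ref)) - 1.0) / \
               (math.exp(C2 / (wavelength * t)) - 1.0)

    def temperature_from_ratio(ratio, t_ref, wavelength):
        """Invert the measured radiance ratio for temperature: the
        extrapolation above the fixed point used in radiation thermometry."""
        x = (math.exp(C2 / (wavelength * t_ref)) - 1.0) / ratio
        return C2 / (wavelength * math.log(1.0 + x))

    wl = 650e-9  # a typical radiation-thermometer operating wavelength
    r = planck_ratio(2000.0, T_AU, wl)
    print(temperature_from_ratio(r, T_AU, wl))
    ```

    The difficulty the abstract alludes to is not this arithmetic but characterizing the thermometer (spectral responsivity, size-of-source effect, non-linearity) well enough that the measured signal ratio faithfully represents the radiance ratio.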

  4. The impact of SLMTA in improving laboratory quality systems in the Caribbean Region.

    PubMed

    Guevara, Giselle; Gordon, Floris; Irving, Yvette; Whyms, Ismae; Parris, Keith; Beckles, Songee; Maruta, Talkmore; Ndlovu, Nqobile; Albalak, Rachel; Alemnji, George

    Past efforts to improve laboratory quality systems and to achieve accreditation for better patient care in the Caribbean Region have been slow. To describe the impact of the Strengthening of Laboratory Management Toward Accreditation (SLMTA) training programme and mentorship amongst five clinical laboratories in the Caribbean after 18 months. Five national reference laboratories from four countries participated in the SLMTA programme that incorporated classroom teaching and implementation of improvement projects. Mentors were assigned to the laboratories to guide trainees on their improvement projects and to assist in the development of Quality Management Systems (QMS). Audits were conducted at baseline, six months, exit (at 12 months) and post-SLMTA (at 18 months) using the Stepwise Laboratory Quality Improvement Process Towards Accreditation (SLIPTA) checklist to measure changes in implementation of the QMS during the period. At the end of each audit, a comprehensive implementation plan was developed in order to address gaps. Baseline audit scores ranged from 19% to 52%, corresponding to 0 stars on the SLIPTA five-star scale. After 18 months, one laboratory reached four stars, two reached three stars and two reached two stars. There was a corresponding decrease in nonconformities and development of over 100 management and technical standard operating procedures in each of the five laboratories. The tremendous improvement in these five Caribbean laboratories shows that SLMTA coupled with mentorship is an effective, user-friendly, flexible and customisable approach to the implementation of laboratory QMS. It is recommended that other laboratories in the region consider using the SLMTA training programme as they engage in quality systems improvement and preparation for accreditation.

  5. The impact of SLMTA in improving laboratory quality systems in the Caribbean Region

    PubMed Central

    Gordon, Floris; Irving, Yvette; Whyms, Ismae; Parris, Keith; Beckles, Songee; Maruta, Talkmore; Ndlovu, Nqobile; Albalak, Rachel; Alemnji, George

    2014-01-01

    Background Past efforts to improve laboratory quality systems and to achieve accreditation for better patient care in the Caribbean Region have been slow. Objective To describe the impact of the Strengthening of Laboratory Management Toward Accreditation (SLMTA) training programme and mentorship amongst five clinical laboratories in the Caribbean after 18 months. Method Five national reference laboratories from four countries participated in the SLMTA programme that incorporated classroom teaching and implementation of improvement projects. Mentors were assigned to the laboratories to guide trainees on their improvement projects and to assist in the development of Quality Management Systems (QMS). Audits were conducted at baseline, six months, exit (at 12 months) and post-SLMTA (at 18 months) using the Stepwise Laboratory Quality Improvement Process Towards Accreditation (SLIPTA) checklist to measure changes in implementation of the QMS during the period. At the end of each audit, a comprehensive implementation plan was developed in order to address gaps. Results Baseline audit scores ranged from 19% to 52%, corresponding to 0 stars on the SLIPTA five-star scale. After 18 months, one laboratory reached four stars, two reached three stars and two reached two stars. There was a corresponding decrease in nonconformities and development of over 100 management and technical standard operating procedures in each of the five laboratories. Conclusion The tremendous improvement in these five Caribbean laboratories shows that SLMTA coupled with mentorship is an effective, user-friendly, flexible and customisable approach to the implementation of laboratory QMS. It is recommended that other laboratories in the region consider using the SLMTA training programme as they engage in quality systems improvement and preparation for accreditation. PMID:27066396

  6. SCALE-UP OF RAPID SMALL-SCALE ADSORPTION TESTS TO FIELD-SCALE ADSORBERS: THEORETICAL AND EXPERIMENTAL BASIS

    EPA Science Inventory

    Design of full-scale adsorption systems typically includes expensive and time-consuming pilot studies to simulate full-scale adsorber performance. Accordingly, the rapid small-scale column test (RSSCT) was developed and evaluated experimentally. The RSSCT can simulate months of f...
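    The RSSCT approach rests on a scaling relation from the adsorption scale-up literature: the small-column empty-bed contact time (EBCT) equals the full-scale EBCT multiplied by the particle-diameter ratio raised to the power (2 − X), where X characterizes how intraparticle diffusivity is assumed to scale with particle size. A minimal sketch with hypothetical design numbers (not from this abstract):

    ```python
    def rssct_ebct(ebct_lc, d_lc, d_sc, x=0.0):
        """Scale the large-column EBCT down to the small column:
        EBCT_sc = EBCT_lc * (d_sc / d_lc)**(2 - x), where x = 0 assumes
        constant intraparticle diffusivity and x = 1 assumes diffusivity
        proportional to particle size."""
        return ebct_lc * (d_sc / d_lc) ** (2.0 - x)

    # Hypothetical design: 10 min full-scale EBCT with 1.0 mm adsorbent,
    # crushed to 0.1 mm for the small column, constant-diffusivity case
    print(rssct_ebct(10.0, 1.0e-3, 0.1e-3))
    ```

    Because run time shrinks by the same factor as the EBCT, a small column can reproduce months of full-scale adsorber operation in days, which is the time saving the abstract refers to.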

  7. Rover Attitude and Pointing System Simulation Testbed

    NASA Technical Reports Server (NTRS)

    Vanelli, Charles A.; Grinblat, Jonathan F.; Sirlin, Samuel W.; Pfister, Sam

    2009-01-01

    The MER (Mars Exploration Rover) Attitude and Pointing System Simulation Testbed Environment (RAPSSTER) provides a simulation platform for the development and test of GNC (guidance, navigation, and control) flight algorithm designs for the Mars rovers. It was specifically tailored to the MERs, but has since been used in the development of rover algorithms for the Mars Science Laboratory (MSL) as well. The software provides an integrated simulation and software testbed environment for the development of Mars rover attitude and pointing flight software. It is able to run the MER GNC flight software directly (as opposed to running an algorithmic model of the MER GNC flight code), which improves simulation fidelity and confidence in the results. Furthermore, the simulation environment allows the user to single-step through its execution, pausing and restarting at will. The system also provides for the introduction of simulated faults specific to Mars rover environments that cannot be replicated in other testbed platforms, to stress-test the GNC flight algorithms under examination. The software provides facilities to run these stress tests in ways that cannot be done in the real-time flight system testbeds, such as time-jumping (both forwards and backwards) and the introduction of simulated actuator faults that would be difficult, expensive, and/or destructive to implement in the real-time testbeds. Actual flight-quality code can be incorporated back into the development-test suite of GNC developers, closing the loop between the GNC developers and the flight software developers. The software provides fully automated scripting, allowing multiple tests to be run with varying parameters without human supervision.

  8. Laboratory automation in clinical bacteriology: what system to choose?

    PubMed

    Croxatto, A; Prod'hom, G; Faverjon, F; Rochais, Y; Greub, G

    2016-03-01

    Automation was introduced many years ago in several diagnostic disciplines such as chemistry, haematology and molecular biology. The first laboratory automation system for clinical bacteriology was released in 2006, and it rapidly proved its value by increasing productivity, allowing a continuous increase in sample volumes despite limited budgets and personnel shortages. Today, two major manufacturers, BD Kiestra and Copan, are commercializing partial or complete laboratory automation systems for bacteriology. The laboratory automation systems are rapidly evolving to provide improved hardware and software solutions to optimize laboratory efficiency. However, the complex parameters of the laboratory and automation systems must be considered to determine the best system for each given laboratory. We address several topics on laboratory automation that may help clinical bacteriologists to understand the particularities and operative modalities of the different systems. We present (a) a comparison of the engineering and technical features of the various elements composing the two different automated systems currently available, (b) the system workflows of partial and complete laboratory automation, which define the basis for laboratory reorganization required to optimize system efficiency, (c) the concept of digital imaging and telebacteriology, (d) the connectivity of laboratory automation to the laboratory information system, (e) the general advantages and disadvantages as well as the expected impacts provided by laboratory automation and (f) the laboratory data required to conduct a workflow assessment to determine the best configuration of an automated system for the laboratory activities and specificities. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  9. Impact disruption of gravity-dominated bodies: New simulation data and scaling

    NASA Astrophysics Data System (ADS)

    Movshovitz, N.; Nimmo, F.; Korycansky, D. G.; Asphaug, E.; Owen, J. M.

    2016-09-01

    We present results from a suite of 169 hydrocode simulations of collisions between planetary bodies with radii from 100 to 1000 km. The simulation data are used to derive a simple scaling law for the threshold for catastrophic disruption, defined as a collision that leads to half the total colliding mass escaping the system post impact. For a target of radius 100 km ≤ R_T ≤ 1000 km and mass M_T, and a projectile of radius r_p ≤ R_T and mass m_p, we find that a head-on impact with velocity magnitude v is catastrophic if the kinetic energy of the system in the center-of-mass frame, K = (1/2) M_T m_p v² / (M_T + m_p), exceeds a threshold value K* that is a few times U = (3/5) G M_T²/R_T + (3/5) G m_p²/r_p + G M_T m_p/(R_T + r_p), the gravitational binding energy of the system at the moment of impact; G is the gravitational constant. In all head-on collision runs we find K* = (5.5 ± 2.9) U. Oblique impacts are catastrophic when the fraction of kinetic energy contained in the volume of the projectile intersecting the target during impact exceeds ∼2 K* for 30° impacts and ∼3.5 K* for 45° impacts. We compare predictions made with this scaling to those made with existing scaling laws in the literature, extrapolated from numerical studies on smaller targets. We find significant divergence between predictions; in general our results suggest a lower threshold for disruption, except for highly oblique impacts with r_p ≪ R_T. This has implications for the efficiency of collisional grinding in the asteroid belt (Morbidelli et al. [2009], Icarus, 204, 558-573), Kuiper belt (Greenstreet et al. [2015], Icarus, 258, 267-288), and early Solar System accretion (Chambers [2013], Icarus, 224, 43-56).
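    The head-on criterion above can be sketched in code. This is a minimal illustration using the abstract's symbols; the function names and example values are ours, and the 5.5 prefactor is the reported mean of K* = (5.5 ± 2.9) U.

```python
# Hedged sketch of the head-on catastrophic-disruption criterion.
# M_T, R_T: target mass [kg] and radius [m]; m_p, r_p: projectile mass and
# radius; v: impact speed [m/s]. All names are illustrative, not the paper's.
G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def binding_energy(M_T, R_T, m_p, r_p):
    """U: gravitational binding energy of the two-body system at contact."""
    return (0.6 * G * M_T**2 / R_T
            + 0.6 * G * m_p**2 / r_p
            + G * M_T * m_p / (R_T + r_p))

def is_catastrophic(M_T, R_T, m_p, r_p, v, k_star_factor=5.5):
    """True if the center-of-mass kinetic energy exceeds K* = 5.5 U (head-on)."""
    K = 0.5 * M_T * m_p * v**2 / (M_T + m_p)
    return K > k_star_factor * binding_energy(M_T, R_T, m_p, r_p)
```

    For example, with a roughly 500 km target (M_T = 1e21 kg, R_T = 5e5 m) and a small projectile, a 10 m/s impact falls far below the threshold while a 1000 km/s impact far exceeds it (values purely illustrative).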

  10. Multimodel Simulation of Water Flow: Uncertainty Analysis

    USDA-ARS?s Scientific Manuscript database

    Simulations of soil water flow require measurements of soil hydraulic properties which are particularly difficult at the field scale. Laboratory measurements provide hydraulic properties at scales finer than the field scale, whereas pedotransfer functions (PTFs) integrate information on hydraulic pr...

  11. Region 7 Laboratory Information Management System

    EPA Pesticide Factsheets

    This is metadata documentation for the Region 7 Laboratory Information Management System (R7LIMS) which maintains records for the Regional Laboratory. Any Laboratory analytical work performed is stored in this system which replaces LIMS-Lite, and before that LAST. The EPA and its contractors may use this database. The Office of Policy & Management (PLMG) Division at EPA Region 7 is the primary managing entity; contractors can access this database but it is not accessible to the public.

  12. Improvement of CFD Methods for Modeling Full Scale Circulating Fluidized Bed Combustion Systems

    NASA Astrophysics Data System (ADS)

    Shah, Srujal; Klajny, Marcin; Myöhänen, Kari; Hyppänen, Timo

    With the currently available methods of computational fluid dynamics (CFD), the task of simulating full scale circulating fluidized bed combustors is very challenging. In order to simulate the complex fluidization process, the size of the calculation cells should be small and the calculation should be transient with a small time step size. For full scale systems, these requirements lead to very large meshes and very long calculation times, so that simulation is difficult in practice. This study investigates the requirements on cell size and time step size for accurate simulations, and the filtering effects caused by a coarser mesh and a longer time step. A modeling study of a full scale CFB furnace is presented and the model results are compared with experimental data.
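    The coupling between cell size and stable time step that drives these requirements is commonly expressed through a Courant (CFL) condition. A minimal sketch follows, with illustrative values that are not taken from this study:

```python
def cfl_time_step(cell_size, velocity, courant=0.5):
    """Largest stable explicit time step dt = C * dx / u for cell size dx [m],
    characteristic velocity u [m/s], and Courant number C (here 0.5)."""
    return courant * cell_size / velocity

# E.g. a 5 mm cell and a 5 m/s gas velocity (illustrative values only)
# give dt = 0.5 * 0.005 / 5.0 = 5e-4 s; halving the cell halves the step.
dt = cfl_time_step(0.005, 5.0)
```

    This is why mesh refinement is doubly expensive for transient runs: each halving of the cell size multiplies both the cell count and the number of time steps.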

  13. GenASiS Basics: Object-oriented utilitarian functionality for large-scale physics simulations

    DOE PAGES

    Cardall, Christian Y.; Budiardja, Reuben D.

    2015-06-11

    Aside from numerical algorithms and problem setup, large-scale physics simulations on distributed-memory supercomputers require more basic utilitarian functionality, such as physical units and constants; display to the screen or standard output device; message passing; I/O to disk; and runtime parameter management and usage statistics. Here we describe and make available Fortran 2003 classes furnishing extensible object-oriented implementations of this sort of rudimentary functionality, along with individual `unit test' programs and larger example problems demonstrating their use. Lastly, these classes compose the Basics division of our developing astrophysics simulation code GenASiS (General Astrophysical Simulation System), but their fundamental nature makes them useful for physics simulations in many fields.

  14. Creation and Validation of a Novel Mobile Simulation Laboratory for High Fidelity, Prehospital, Difficult Airway Simulation.

    PubMed

    Bischof, Jason J; Panchal, Ashish R; Finnegan, Geoffrey I; Terndrup, Thomas E

    2016-10-01

    Endotracheal intubation (ETI) is a complex clinical skill complicated by the inherent challenge of providing care in the prehospital setting. The literature reports a low success rate for prehospital ETI attempts, partly due to the care environment and partly to the lack of consistent, standardized training opportunities for prehospital providers in ETI. The availability of a mobile simulation laboratory (MSL) to study clinically critical interventions is needed in the prehospital setting to enhance instruction and maintain proficiency. This report covers the development and validation of a prehospital airway simulator and MSL that mimic in situ care provided in an ambulance. The MSL was a Type 3 ambulance with four cameras allowing audio-video recordings of observable behaviors. The prehospital airway simulator is a modified airway mannequin with increased static tongue pressure and a rigid cervical collar. Airway experts validated the model in a static setting through ETI at varying tongue pressures with a goal of a Grade 3 Cormack-Lehane (CL) laryngeal view. Following completion of this development, the MSL was launched with the prehospital airway simulator to distant communities utilizing a single facilitator/driver. Paramedics were recruited to perform ETI in the MSL, and the detailed airway management observations were stored for further analysis. Nineteen airway experts performed 57 ETI attempts at varying tongue pressures, demonstrating increased CL views at higher tongue pressures. A tongue pressure of 60 mm Hg generated a 31% Grade 3/4 CL view and was chosen for the prehospital trials. The MSL was then launched and tested by 18 paramedics. First-pass success was 33%, with another 33% failing to intubate within three attempts. The MSL was configured to deliver, record, and assess intubator behaviors with a difficult airway simulation. The MSL created a reproducible, high fidelity, mobile learning environment for assessment of

  15. Australia's marine virtual laboratory

    NASA Astrophysics Data System (ADS)

    Proctor, Roger; Gillibrand, Philip; Oke, Peter; Rosebrock, Uwe

    2014-05-01

    In all modelling studies of realistic scenarios, a researcher has to go through a number of steps to set up a model in order to produce a model simulation of value. The steps are generally the same, independent of the modelling system chosen. These steps include determining the time and space scales and processes of the required simulation; obtaining data for the initial set up and for input during the simulation time; obtaining observation data for validation or data assimilation; implementing scripts to run the simulation(s); and running utilities or custom-built software to extract results. These steps are time-consuming and resource-hungry, and have to be done every time irrespective of the simulation - the more complex the processes, the more effort is required to set up the simulation. The Australian Marine Virtual Laboratory (MARVL) is a new development in modelling frameworks for researchers in Australia. MARVL uses the TRIKE framework, a Java-based control system developed by CSIRO that allows a non-specialist user to configure and run a model, to automate many of the modelling preparation steps needed to bring the researcher faster to the stage of simulation and analysis. The tool is seen as enhancing the efficiency of researchers and marine managers, and is being considered as an educational aid in teaching.
In MARVL we are developing a web-based open source application which provides a number of model choices and provides search and recovery of relevant observations, allowing researchers to: a) efficiently configure a range of different community ocean and wave models for any region, for any historical time period, with model specifications of their choice, through a user-friendly web application, b) access data sets to force a model and nest a model into, c) discover and assemble ocean observations from the Australian Ocean Data Network (AODN, http://portal.aodn.org.au/webportal/) in a format that is suitable for model evaluation or data assimilation, and

  16. Optical laboratory solution and error model simulation of a linear time-varying finite element equation

    NASA Technical Reports Server (NTRS)

    Taylor, B. K.; Casasent, D. P.

    1989-01-01

    The use of simplified error models to accurately simulate and evaluate the performance of an optical linear-algebra processor is described. The optical architecture used to perform banded matrix-vector products is reviewed, along with a linear dynamic finite-element case study. The laboratory hardware and the ac-modulation technique used are presented. The individual processor error-source models and their simulator implementation are detailed. Several significant simplifications are introduced to ease the computational requirements and complexity of the simulations. The error models are verified against a laboratory implementation of the processor, and are used to evaluate its potential performance.
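    As a hedged illustration of the banded matrix-vector product at the heart of such a processor, here is a minimal pure-Python sketch; the diagonal-storage scheme and function name are our assumptions, not the paper's.

```python
def banded_matvec(n, bands, x):
    """Compute y = A @ x for an n x n banded matrix stored by diagonals.

    bands: dict mapping diagonal offset k (0 = main diagonal, +1 = first
    superdiagonal, -1 = first subdiagonal, ...) to a list of that
    diagonal's entries A[i, i+k] in row order. Off-band entries are zero.
    """
    y = [0.0] * n
    for k, diag in bands.items():
        for idx, a in enumerate(diag):
            i = idx if k >= 0 else idx - k   # row index (idx + |k| for k < 0)
            j = i + k                        # column index
            y[i] += a * x[j]
    return y
```

    For a symmetric tridiagonal example, `banded_matvec(3, {0: [2, 2, 2], 1: [1, 1], -1: [1, 1]}, [1, 2, 3])` evaluates the matrix [[2,1,0],[1,2,1],[0,1,2]] against [1,2,3]. An optical processor exploits exactly this structure: only the nonzero diagonals need to be encoded and multiplied.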

  17. Large scale in-situ BOrehole and Geofluid Simulator (i.BOGS) for the development and testing of borehole technologies at reservoir conditions

    NASA Astrophysics Data System (ADS)

    Duda, Mandy; Bracke, Rolf; Stöckhert, Ferdinand; Wittig, Volker

    2017-04-01

    A fundamental problem of technological applications related to the exploration and provision of geothermal energy is the inaccessibility of subsurface processes. As a result, actual reservoir properties can only be determined using (a) indirect measurement techniques such as seismic surveys, machine feedback and geophysical borehole logging, (b) laboratory experiments capable of simulating in-situ properties, but failing to preserve temporal and spatial scales, or vice versa, and (c) numerical simulations. Moreover, technological applications related to the drilling process, the completion and cementation of a wellbore or the stimulation and exploitation of the reservoir are exposed to high pressure and temperature conditions as well as corrosive environments resulting from both rock formation and geofluid characteristics. To address fundamental and applied questions in the context of geothermal energy provision and subsurface exploration in general, one of Europe's largest geoscientific laboratory infrastructures is introduced. The in-situ Borehole and Geofluid Simulator (i.BOGS) allows the simulation of quasi scale-preserving processes at reservoir conditions up to depths of 5000 m and represents a large scale pressure vessel for iso-/hydrostatic and pore pressures up to 125 MPa and temperatures from -10°C to 180°C. The autoclave can either be filled with large rock core samples (25 cm in diameter, up to 3 m length) or with fluids and technical borehole devices (e.g. pumps, sensors). The pressure vessel is equipped with an ultrasound system for active transmission and passive recording of acoustic emissions, and can be complemented by additional sensors. The i.BOGS forms the basic module for the Match.BOGS, finally consisting of three modules, i.e. (A) the i.BOGS, (B) the Drill.BOGS, a drilling module to be attached to the i.BOGS capable of applying realistic torques and contact forces to a drilling device that enters the i.BOGS, and (C) the Fluid.BOGS, a geofluid

  18. Teaching Engineering Design in a Laboratory Setting

    ERIC Educational Resources Information Center

    Hummon, Norman P.; Bullen, A. G. R.

    1974-01-01

    Discusses the establishment of an environmental systems laboratory at the University of Pittsburgh with the support of the Sloan Foundation. Indicates that the "real world" can be brought into the laboratory by simulating on computers, software systems, and data bases. (CC)

  19. A View on Future Building System Modeling and Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wetter, Michael

    This chapter presents what a future environment for building system modeling and simulation may look like. As buildings continue to require increased performance and better comfort, their energy and control systems are becoming more integrated and complex. We therefore focus in this chapter on the modeling, simulation and analysis of building energy and control systems. Such systems can be classified as heterogeneous systems because they involve multiple domains, such as thermodynamics, fluid dynamics, heat and mass transfer, electrical systems, control systems and communication systems. Also, they typically involve multiple temporal and spatial scales, and their evolution can be described by coupled differential equations, discrete equations and events. Modeling and simulating such systems requires a higher level of abstraction and modularisation to manage the increased complexity compared to what is used in today's building simulation programs. Therefore, the trend towards more integrated building systems is likely to be a driving force for changing the status quo of today's building simulation programs. This chapter discusses evolving modeling requirements and outlines a path toward a future environment for modeling and simulation of heterogeneous building systems. A range of topics that would require many additional pages of discussion has been omitted. Examples include computational fluid dynamics for air and particle flow in and around buildings, people movement, daylight simulation, uncertainty propagation and optimisation methods for building design and controls. For different discussions and perspectives on the future of building modeling and simulation, we refer to Sahlin (2000), Augenbroe (2001) and Malkawi and Augenbroe (2004).

  20. Large-scale three-dimensional phase-field simulations for phase coarsening at ultrahigh volume fraction on high-performance architectures

    NASA Astrophysics Data System (ADS)

    Yan, Hui; Wang, K. G.; Jones, Jim E.

    2016-06-01

    A parallel algorithm for large-scale three-dimensional phase-field simulations of phase coarsening is developed and implemented on high-performance architectures. From the large-scale simulations, a new kinetics of phase coarsening in the region of ultrahigh volume fraction is found. The parallel implementation is capable of harnessing the greater computer power available from high-performance architectures. The parallelized code enables an increase in three-dimensional simulation system size up to a 512³ grid cube. Through the parallelized code, practical runtimes can be achieved for three-dimensional large-scale simulations, and the statistical significance of the results from these high-resolution parallel simulations is greatly improved over that obtainable from serial simulations. A detailed performance analysis of speed-up and scalability is presented, showing good scalability which improves with increasing problem size. In addition, a model for prediction of runtime is developed, which shows good agreement with actual runtimes from numerical tests.
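    Speed-up and parallel efficiency of the kind analyzed here are conventionally defined as below; the Amdahl bound is included as a standard reference model only, and is not the runtime-prediction model developed in the paper.

```python
def speedup(t_serial, t_parallel):
    """Speed-up S = T_1 / T_p: serial runtime over parallel runtime."""
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, n_procs):
    """Parallel efficiency E = S / p; 1.0 means perfect scaling."""
    return speedup(t_serial, t_parallel) / n_procs

def amdahl_bound(n_procs, serial_fraction):
    """Amdahl's-law speed-up limit for a fixed problem size, where
    serial_fraction is the non-parallelizable share of the work."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_procs)
```

    For instance, a run that takes 100 s serially and 25 s on 8 processes has speed-up 4 and efficiency 0.5. Scalability that improves with problem size, as reported here, is the hallmark of weak scaling: a larger grid dilutes the fixed serial and communication overheads.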