Integration of Irma tactical scene generator into directed-energy weapon system simulation
NASA Astrophysics Data System (ADS)
Owens, Monte A.; Cole, Madison B., III; Laine, Mark R.
2003-08-01
Integrated high-fidelity physics-based simulations that include engagement models, image generation, electro-optical hardware models and control system algorithms have previously been developed by Boeing-SVS for various tracking and pointing systems. These simulations, however, had always used images with featureless or random backgrounds and simple target geometries. With the requirement to engage tactical ground targets in the presence of cluttered backgrounds, a new type of scene generation tool was required to fully evaluate system performance in this challenging environment. To answer this need, Irma was integrated into the existing suite of Boeing-SVS simulation tools, allowing scene generation capabilities with unprecedented realism. Irma is a US Air Force research tool used for high-resolution rendering and prediction of target and background signatures. The MATLAB/Simulink-based simulation achieves closed-loop tracking by running track algorithms on the Irma-generated images, processing the track errors through optical control algorithms, and moving simulated electro-optical elements. The geometry of these elements determines the sensor orientation with respect to the Irma database containing the three-dimensional background and target models. This orientation is dynamically passed to Irma through a Simulink S-function to generate the next image. This integrated simulation provides a test-bed for development and evaluation of tracking and control algorithms against representative images including complex background environments and realistic targets calibrated using field measurements.
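The closed-loop structure described above (render a scene for the current sensor orientation, run the tracker on it, convert track error to a control command, repoint, and render again) can be sketched in miniature. This is a hypothetical toy loop, not Boeing-SVS or Irma code: the renderer, centroid tracker, and proportional gain are all illustrative stand-ins for the S-function interface the abstract describes.

```python
import numpy as np

def render_scene(orientation):
    """Stand-in for the Irma S-function call: return an image for a sensor orientation."""
    # Hypothetical renderer: target pixel drifts from center with pointing error.
    img = np.zeros((64, 64))
    tx = 32 + int(10 * orientation)
    img[32, np.clip(tx, 0, 63)] = 1.0
    return img

def track_error(img):
    """Centroid tracker: offset of brightest pixel from image center, in pixels."""
    y, x = np.unravel_index(np.argmax(img), img.shape)
    return x - img.shape[1] // 2

orientation = 1.0      # initial pointing error (arbitrary units)
gain = 0.05            # proportional control gain per pixel of track error
for _ in range(200):   # closed loop: render -> track -> control -> repoint
    err = track_error(render_scene(orientation))
    orientation -= gain * err
print(abs(orientation))  # residual pointing error, driven toward zero
```

In the real simulation the control step is the optical control algorithm and the render step is a call into the Irma database through the Simulink S-function; only the loop topology is represented here.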
Background and Recent Progress in Anomalous Transport Simulation
2017-07-19
Briefing charts, 19 July 2017 (reporting period 14 June 2017 to 19 July 2017). Justin Koo, AFRL/RQRS, Edwards AFB, CA. DISTRIBUTION A: approved for public release. Cites Baalrud, S. D., and Chabert, P., "Theory for the anomalous electron transport in Hall effect thrusters. I. Insights from particle-in-cell simulations."
NASA Astrophysics Data System (ADS)
Sun, Anbang; Teunissen, Jannis; Ebert, Ute
2014-11-01
We investigate discharge inception in air, in uniform background electric fields above and below the breakdown threshold. We perform 3D particle simulations that include a natural level of background ionization in the form of positive and O₂⁻ ions. In background fields below breakdown, we use a strongly ionized seed of electrons and positive ions to enhance the field locally. In the region of enhanced field, we observe the growth of positive streamers, as in previous simulations with 2D plasma fluid models. The inclusion of background ionization has little effect in this case. When the background field is above the breakdown threshold, the situation is very different. Electrons can then detach from O₂⁻ and start ionization avalanches in the whole volume. These avalanches together create one extended discharge, in contrast to the ‘double-headed’ streamers found in many fluid simulations.
NASA Technical Reports Server (NTRS)
Lipatov, A. S.; Sittler, E. C., Jr.; Hartle, R. E.; Cooper, J. F.; Simpson, D. G.
2011-01-01
In this report we discuss the ion velocity distribution dynamics from the 3D hybrid simulation. In our model the background, pickup, and ionospheric ions are treated as particles, whereas the electrons are described as a fluid. Inhomogeneous photoionization, electron-impact ionization, and charge exchange are included in our model. We also take into account collisions between the ions and neutrals. The current simulation shows that mass loading by the pickup ions H+, H2+, CH4+, and N2+ is stronger than in the previous simulations in which O+ ions are introduced into the background plasma. In our hybrid simulations we use Chamberlain profiles for the atmospheric components. We also include a simple ionosphere model with average-mass M = 28 amu ions generated inside the ionosphere. The moon is considered as a weakly conducting body. Special attention is paid to comparing the simulated pickup ion velocity distribution with CAPS T9 observations. Our simulation shows an asymmetry of the ion density distribution and the magnetic field, including the formation of Alfvén wing-like structures. The simulation also shows that the ring-like velocity distribution for pickup ions relaxes to a Maxwellian core and a shell-like halo.
High-Fidelity Simulation in Biomedical and Aerospace Engineering
NASA Technical Reports Server (NTRS)
Kwak, Dochan
2005-01-01
Contents include the following: Introduction / Background. Modeling and Simulation Challenges in Aerospace Engineering. Modeling and Simulation Challenges in Biomedical Engineering. Digital Astronaut. Project Columbia. Summary and Discussion.
NASA Astrophysics Data System (ADS)
Ekberg, Joakim; Timpka, Toomas; Morin, Magnus; Jenvald, Johan; Nyce, James M.; Gursky, Elin A.; Eriksson, Henrik
Computer simulations have emerged as important tools in the preparation for outbreaks of infectious disease. To support collaborative planning for and response to outbreaks, reports from simulations need to be transparent (accessible) with regard to the underlying parameter settings. This paper presents a design for generating simulation reports in which the background settings used in the simulation models are automatically visualized. We extended the ontology-management system Protégé to tag different settings into categories, and included these in report generation alongside the simulation outcomes. The report generator takes advantage of an XSLT specification and collects the documentation of the particular simulation settings into abridged XML files that also include summarized results. We conclude that even though the inclusion of critical background settings in reports may not increase the accuracy of infectious disease simulations, it can prevent misunderstandings and less-than-optimal public health decisions.
Simulation of PEP-II Accelerator Backgrounds Using TURTLE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barlow, R.J.; Fieguth, T.; /SLAC
2006-02-15
We present studies of accelerator-induced backgrounds in the BaBar detector at the SLAC B-Factory, carried out using LPTURTLE, a modified version of the DECAY TURTLE simulation package. Lost-particle backgrounds in PEP-II are dominated by a combination of beam-gas bremsstrahlung, beam-gas Coulomb scattering, radiative-Bhabha events, and beam-beam blow-up. The radiation damage and detector occupancy caused by the associated electromagnetic shower debris can limit the usable luminosity. In order to understand and mitigate such backgrounds, we have performed a full program of beam-gas and luminosity-background simulations that include the effects of the detector solenoidal field, detailed modeling of limiting apertures in both collider rings, and optimization of the betatron collimation scheme in the presence of large transverse tails.
NASA Astrophysics Data System (ADS)
Jiang, Houshuo; Grosenbaugh, Mark A.
2002-11-01
Numerical simulations are used to study laminar vortex ring formation in the presence of a background flow. The numerical setup includes a round-headed axisymmetric body with a sharp-wedged opening at the posterior end, where a column of fluid is pushed out by a piston inside the body. The piston motion is explicitly included in the simulations by using a deforming mesh. The numerical method is verified by simulating the standard vortex ring formation process in quiescent fluid for a wide range of piston-stroke to cylinder-diameter ratios (Lm/D). The results from these simulations confirm the existence of a universal formation time scale (formation number) found by others in experimental and numerical studies. For the case of vortex ring formation by the piston/cylinder arrangement in a constant background flow (i.e. the background flow is in the direction of the piston motion), the results show that a smaller fraction of the ejected circulation is delivered into the leading vortex ring, thereby decreasing the formation number. The mechanism behind this reduction is believed to be related to the modification of the shear layer profile between the jet flow and the background flow by the external boundary layer on the outer surface of the cylinder. In effect, the vorticity in the jet is cancelled by the opposite-signed vorticity in the external boundary layer. Simulations using different end geometries confirm the general nature of the phenomenon. The thrust generated by the jet and the drag forces acting on the body are calculated with and without background flow for different piston programs. The implications of these results for squid propulsion are discussed.
Monte Carlo Simulations of Background Spectra in Integral Imager Detectors
NASA Technical Reports Server (NTRS)
Armstrong, T. W.; Colborn, B. L.; Dietz, K. L.; Ramsey, B. D.; Weisskopf, M. C.
1998-01-01
Predictions of the expected gamma-ray backgrounds in the ISGRI (CdTe) and PICsIT (CsI) detectors on INTEGRAL due to cosmic-ray interactions and the diffuse gamma-ray background have been made using a coupled set of Monte Carlo radiation transport codes (HETC, FLUKA, EGS4, and MORSE) and a detailed, 3-D mass model of the spacecraft and detector assemblies. The simulations include both the prompt background component from induced hadronic and electromagnetic cascades and the delayed component due to emissions from induced radioactivity. Background spectra have been obtained with and without the use of active (BGO) shielding and charged-particle rejection to evaluate the effectiveness of anticoincidence counting on background rejection.
Can muon-induced backgrounds explain the DAMA data?
NASA Astrophysics Data System (ADS)
Klinger, Joel; Kudryavtsev, Vitaly A.
2016-05-01
We present an accurate simulation of the muon-induced background in the DAMA/LIBRA experiment. Muon sampling underground has been performed using the MUSIC/MUSUN codes, and subsequent interactions in the rock around the DAMA/LIBRA detector cavern and in the experimental setup, including shielding, have been simulated with GEANT4.9.6. In total we simulate the equivalent of 20 years of muon data. We have calculated the total muon-induced neutron flux in the DAMA/LIBRA detector cavern as Φ_μ-n = 1.0 × 10⁻⁹ cm⁻² s⁻¹, which is consistent with other simulations. After selecting events which satisfy the DAMA/LIBRA signal criteria, our simulation predicts 3.49 × 10⁻⁵ cpd/kg/keV, which accounts for less than 0.3% of the DAMA/LIBRA modulation amplitude. We conclude from our work that muon-induced backgrounds cannot account for the observed signal modulation.
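The final comparison is simple arithmetic. A minimal sketch, assuming a DAMA/LIBRA annual-modulation amplitude of roughly 1.1 × 10⁻² cpd/kg/keV (a representative value from DAMA publications, not stated in this abstract):

```python
# Predicted muon-induced rate from the simulation (cpd/kg/keV), from the abstract.
muon_rate = 3.49e-5

# Assumed DAMA/LIBRA annual-modulation amplitude (cpd/kg/keV);
# roughly 1.1e-2 in DAMA reports -- an assumption here, not from this abstract.
dama_amplitude = 1.1e-2

fraction = muon_rate / dama_amplitude
print(f"muon-induced rate is {100 * fraction:.2f}% of the modulation amplitude")
```

With this assumed amplitude the ratio lands near the quoted sub-percent level, consistent with the abstract's conclusion.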
NASA Technical Reports Server (NTRS)
Koga, J. K.; Lin, C. S.; Winglee, R. M.
1989-01-01
Injections of nonrelativistic electron beams from an isolated equipotential conductor into a uniform background of plasma and neutral gas were simulated using a 2-D electrostatic particle code. The ionization effects on spacecraft charging are examined by including interactions of electrons with neutral gas. The simulations show that the conductor charging potential decreases with increasing neutral background density due to the production of secondary electrons near the conductor surface. In the spacecraft wake, the background electrons accelerated towards the charged spacecraft produce an enhancement of secondary electrons and ions. Simulations run for longer times indicate that the spacecraft potential is further reduced and short wavelength beam-plasma oscillations appear. The results are applied to explain the spacecraft charging potential measured during the SEPAC experiments from Spacelab 1.
ATLAS Simulation using Real Data: Embedding and Overlay
NASA Astrophysics Data System (ADS)
Haas, Andrew; ATLAS Collaboration
2017-10-01
For some physics processes studied with the ATLAS detector, a more accurate simulation can in some respects be achieved by including real data in simulated events, with substantial potential improvements in the CPU, disk space, and memory usage of the standard simulation configuration, at the cost of significant database and networking challenges. Real proton-proton background events can be overlaid (at the detector digitization output stage) on a simulated hard-scatter process to account for pileup background (from nearby bunch crossings), cavern background, and detector noise. A similar method is used to account for the large underlying event in heavy-ion collisions, rather than directly simulating the full collision. Embedding replaces the muons found in Z→μμ decays in data with simulated taus at the same 4-momenta, thus preserving the underlying event and pileup from the original data event. In all these cases, care must be taken to exactly match detector conditions (beamspot, magnetic fields, alignments, dead sensors, etc.) between the real data event and the simulation. We will discuss the status of these overlay and embedding techniques within ATLAS software and computing.
STS-132 crew during their PDRS N-TSK MRM training in the building 16 cupola trainer.
2009-12-22
JSC2009-E-286974 (22 Dec. 2009) --- Astronauts Ken Ham (left background), STS-132 commander; Tony Antonelli (right background), pilot; and Mike Good, mission specialist, participate in an exercise in the systems engineering simulator in the Avionics Systems Laboratory at NASA's Johnson Space Center. The facility includes moving scenes of full-sized International Space Station components over a simulated Earth.
Web-Based Simulation in Psychiatry Residency Training: A Pilot Study
ERIC Educational Resources Information Center
Gorrindo, Tristan; Baer, Lee; Sanders, Kathy M.; Birnbaum, Robert J.; Fromson, John A.; Sutton-Skinner, Kelly M.; Romeo, Sarah A.; Beresin, Eugene V.
2011-01-01
Background: Medical specialties, including surgery, obstetrics, anesthesia, critical care, and trauma, have adopted simulation technology for measuring clinical competency as a routine part of their residency training programs; yet, simulation technologies have rarely been adapted or used for psychiatry training. Objective: The authors describe…
Background Error Correlation Modeling with Diffusion Operators
2013-01-01
Book chapter, 7 October 2013. Responsible person: Max Yaremchuk, (228) 688-5259. Chapter 8, "Background error correlation modeling with diffusion operators." Only fragments of the abstract survive, noting that such a structure "simulates enhanced diffusive transport of model errors in the regions of strong currents on the background of …"
Manual for the Jet Event and Background Simulation Library(JEBSimLib)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heinz, Matthias; Soltz, Ron; Angerami, Aaron
Jets are the collimated streams of particles resulting from hard scattering in the initial state of high-energy collisions. In heavy-ion collisions, jets interact with the quark-gluon plasma (QGP) before freezeout, providing a probe into the internal structure and properties of the QGP. In order to study jets, background must be subtracted from the measured event, potentially introducing a bias. We aim to understand and quantify this subtraction bias. PYTHIA, a library to simulate pure jet events, is used to simulate a model for a signature with one pure jet (a photon) and one quenched jet, where all quenched particle momenta are reduced by a user-defined constant fraction. Background for the event is simulated using multiplicity values generated by the TRENTO initial-state model of heavy-ion collisions fed into a thermal model consisting of a 3-dimensional Boltzmann distribution for particle types and momenta. Data from the simulated events are used to train a statistical model, which computes a posterior distribution of the quench factor for a data set. The model was tested first on pure jet events and then on full events including the background. This model will allow for a quantitative determination of biases induced by various methods of background subtraction.
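The thermal background step described above can be sketched as momentum sampling from a Boltzmann distribution. This is a minimal non-relativistic illustration, not the library's actual model: the temperature and mass values are arbitrary placeholders, and the real code presumably also handles particle-type assignment and relativistic kinematics.

```python
import numpy as np

rng = np.random.default_rng(0)

def thermal_background(multiplicity, T=0.3, mass=0.14):
    """Sample momenta (GeV/c) for a thermal particle background.

    A non-relativistic Boltzmann distribution is Gaussian in each Cartesian
    momentum component with variance mass*T (natural units). T and mass are
    illustrative values, not the paper's tuned parameters.
    """
    sigma = np.sqrt(mass * T)
    return rng.normal(0.0, sigma, size=(multiplicity, 3))

# A single background event: multiplicity would come from TRENTO in the real chain.
p = thermal_background(5000)
print(np.linalg.norm(p, axis=1).mean())  # mean |p| follows the Maxwell distribution
```

In the actual pipeline the multiplicity argument is drawn per event from the TRENTO initial-state model rather than fixed by hand.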
Simulating environmental and psychological acoustic factors of the operating room.
Bennett, Christopher L; Dudaryk, Roman; Ayers, Andrew L; McNeer, Richard R
2015-12-01
In this study, an operating room simulation environment was adapted to include quadraphonic speakers, which were used to recreate a composed clinical soundscape. To assess the validity of the composed soundscape, several acoustic parameters of this simulated environment were acquired in the presence of alarms only, background noise only, or both. These parameters were also measured for comparison from size-matched operating rooms at Jackson Memorial Hospital. The parameters examined included sound level, reverberation time, and predictive metrics of speech intelligibility in quiet and in noise. It was found that the sound levels and acoustic parameters were comparable between the simulated environment and the actual operating rooms. The background noise was then examined and found to have little impact on the audibility of the medical alarms. This study is a first-of-its-kind report comparing the environmental and psychological acoustic parameters of a hospital simulation environment with those of actual operating rooms.
NASA Technical Reports Server (NTRS)
Lytle, John
2001-01-01
This report provides an overview presentation of the 2000 NPSS (Numerical Propulsion System Simulation) Review and Planning Meeting. Topics include: 1) a background of the program; 2) 1999 Industry Feedback; 3) FY00 Status, including resource distribution and major accomplishments; 4) FY01 Major Milestones; and 5) Future direction for the program. Specifically, simulation environment/production software and NPSS CORBA Security Development are discussed.
Studies of the Low-energy Gamma Background
NASA Astrophysics Data System (ADS)
Bikit, K.; Mrđa, D.; Bikit, I.; Slivka, J.; Veskovic, M.; Knezevic, D.
Investigations of the contributions to the low-energy part of the background gamma spectrum (below 100 keV), and knowledge of the detection efficiency in this region, are important for both fundamental and applied research. In this work, the components contributing to the low-energy region of the background gamma spectrum for a shielded detector are analyzed, including the production and spectral distribution of muon-induced continuous low-energy radiation in the vicinity of a high-purity germanium detector. In addition, the detection efficiency for the low-energy gamma region is determined using the GEANT4 simulation package. This technique offers an excellent opportunity to predict the detector response in this region. Unfortunately, the often poorly known dead-layer thickness on the surface of the extended-range detector, as well as some processes that are not incorporated in the simulation (e.g., charge collection from the detector active volume), may limit the reliability of the simulation technique. Thus, the 14, 17, 21, 26, 33, and 59.5 keV transitions in a calibrated 241Am point source were used to check the simulated efficiencies.
Multidisciplinary research leading to utilization of extraterrestrial resources
NASA Technical Reports Server (NTRS)
1972-01-01
Progress of the research accomplished during fiscal year 1972 is reported. The summaries presented include: (1) background analysis and coordination, (2) surface properties of rock in simulated lunar environment, (3) rock failure processes, strength and elastic properties in simulated lunar environment, (4) thermal fragmentation, and thermophysical and optical properties in simulated lunar environment, and (5) use of explosives on the moon.
Impact analysis of composite aircraft structures
NASA Technical Reports Server (NTRS)
Pifko, Allan B.; Kushner, Alan S.
1993-01-01
The impact analysis of composite aircraft structures is discussed. Topics discussed include: background remarks on aircraft crashworthiness; comments on modeling strategies for crashworthiness simulation; initial study of simulation of progressive failure of an aircraft component constructed of composite material; and research direction in composite characterization for impact analysis.
Cascaded analysis of signal and noise propagation through a heterogeneous breast model.
Mainprize, James G; Yaffe, Martin J
2010-10-01
The detectability of lesions in radiographic images can be impaired by patterns caused by the surrounding anatomic structures. The presence of such patterns is often referred to as anatomic noise. Others have previously extended signal and noise propagation theory to include variable background structure as an additional noise term and have used it in simulations for analysis by human and ideal observers. Here, the analytic forms of the signal and noise transfer are derived to obtain an exact expression for any input random distribution and the "power law" filter used to generate the texture of the tissue distribution. A cascaded analysis of propagation through a heterogeneous model is derived for x-ray projection through simulated heterogeneous backgrounds. This is achieved by considering transmission through the breast as a correlated amplification point process. The analytic forms of the cascaded analysis were compared to monoenergetic Monte Carlo simulations of x-ray propagation through power-law structured backgrounds. As expected, it was found that although the quantum noise power component scales linearly with the x-ray signal, the anatomic noise scales with the square of the x-ray signal. There was good agreement between results obtained using the analytic expressions for the noise power and those from Monte Carlo simulations for different background textures, random input functions, and x-ray fluences. Analytic equations for the signal and noise properties of heterogeneous backgrounds were derived. These may be used in direct analysis or as a tool to validate simulations in evaluating detectability.
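The central scaling result (quantum noise power linear in the x-ray signal, anatomic noise power quadratic) can be checked with a small numerical sketch. The texture model here is a generic random pattern, not the paper's power-law filter, and the fluence values are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)

# A fixed "anatomic" background: spatially varying transmission (illustrative texture,
# not a power-law filtered field as in the paper).
structure = np.clip(1.0 + 0.1 * rng.standard_normal(100_000), 0.5, 1.5)

def image_variance(fluence):
    """Variance of a simulated projection at a given mean x-ray fluence."""
    mean_counts = fluence * structure   # anatomic modulation scales with the signal
    counts = rng.poisson(mean_counts)   # quantum (Poisson) noise added on top
    return counts.var()

v1 = image_variance(1_000.0)
v10 = image_variance(10_000.0)

# Raising fluence 10x scales the quantum variance ~10x but the anatomic
# variance ~100x, so the total variance ratio falls between 10 and 100.
print(v10 / v1)
```

At low fluence the quantum term dominates; at high fluence the anatomic term takes over, which is why the ratio sits between the two pure scaling limits.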
No Friends but the Mountains: A Simulation on Kurdistan.
ERIC Educational Resources Information Center
Major, Marc R.
1996-01-01
Presents a simulation that focuses on Kurdish nationalism and the struggle for autonomy and independence from the states that rule over Kurdish lands. Students assume the roles of either one of the countries directly involved or the governing body of the United Nations. Includes extensive background material. (MJP)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Merkin, V. G.; Lionello, R.; Linker, J.
2016-11-01
Two well-established magnetohydrodynamic (MHD) codes are coupled to model the solar corona and the inner heliosphere. The corona is simulated using the MHD algorithm outside a sphere (MAS) model. The Lyon–Fedder–Mobarry (LFM) model is used in the heliosphere. The interface between the models is placed in a spherical shell above the critical point and allows both models to work in either a rotating or an inertial frame. Numerical tests are presented examining the coupled model solutions from 20 to 50 solar radii. The heliospheric simulations are run with both LFM and the MAS extension into the heliosphere, and use the same polytropic coronal MAS solutions as the inner boundary condition. The coronal simulations are performed for idealized magnetic configurations, with an out-of-equilibrium flux rope inserted into an axisymmetric background, with and without including the solar rotation. The temporal evolution at the inner boundary of the LFM and MAS solutions is shown to be nearly identical, as are the steady-state background solutions, prior to the insertion of the flux rope. However, after the coronal mass ejection has propagated through a significant portion of the simulation domain, the heliospheric solutions diverge. Additional simulations with different resolution are then performed and show that the MAS heliospheric solutions approach those of LFM when run with progressively higher resolution. Following these detailed tests, a more realistic simulation driven by the thermodynamic coronal MAS is presented, which includes solar rotation and an azimuthally asymmetric background and extends to the Earth's orbit.
The ACT Vision Mission Study Simulation Effort
NASA Astrophysics Data System (ADS)
Wunderer, C. B.; Kippen, R. M.; Bloser, P. F.; Boggs, S. E.; McConnell, M. L.; Hoover, A.; Oberlack, U.; Sturner, S.; Tournear, D.; Weidenspointner, G.; Zoglauer, A.
2004-12-01
The Advanced Compton Telescope (ACT) has been selected by NASA for a one-year "Vision Mission" study. The main goal of this study is to determine feasible instrument configurations to achieve ACT's sensitivity requirements, and to give recommendations for technology development. Space-based instruments operating in the energy range of nuclear lines are subject to complex backgrounds generated by cosmic-ray interactions and diffuse gamma rays; typically measurements are significantly background-dominated. Therefore accurate, detailed simulations of the background induced in different ACT configurations, and exploration of event selection and reconstruction techniques for reducing these backgrounds, are crucial to determining both the capabilities of a given instrument configuration and the technology enhancements that would result in the most significant performance improvements. The ACT Simulation team has assembled a complete suite of tools that allows the generation of particle backgrounds for a given orbit (based on CREME96), their propagation through any instrument and spacecraft geometry (using MGGPOD) - including delayed photon emission from instrument activation - as well as the event selection and reconstruction of Compton-scatter events in the given detectors (MEGAlib). The package can deal with polarized photon beams as well as e.g. anticoincidence shields. We will report on the progress of the ACT simulation effort and the suite of tools used. We thank Elena Novikova at NRL for her contributions, and NASA for support of this research.
Acceleration techniques for dependability simulation. M.S. Thesis
NASA Technical Reports Server (NTRS)
Barnette, James David
1995-01-01
As computer systems increase in complexity, the need to project system performance from the earliest design and development stages increases. We have to employ simulation for detailed dependability studies of large systems. However, as the complexity of the simulation model increases, the time required to obtain statistically significant results also increases. This paper discusses an approach that is application independent and can be readily applied to any process-based simulation model. Topics include background on classical discrete event simulation and techniques for random variate generation and statistics gathering to support simulation.
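Random variate generation of the kind mentioned above is commonly done by inverse-transform sampling; a minimal sketch for exponential inter-arrival times follows (the thesis's specific techniques are not detailed in this abstract, so this shows only the standard textbook method):

```python
import math
import random

def exponential_variate(rate, u=None):
    """Inverse-transform sampling of an exponential inter-arrival time.

    Discrete-event simulators draw event times this way: invert the CDF
    F(t) = 1 - exp(-rate * t) at a uniform random number u in [0, 1).
    """
    if u is None:
        u = random.random()
    return -math.log(1.0 - u) / rate

random.seed(42)
samples = [exponential_variate(2.0) for _ in range(100_000)]
mean = sum(samples) / len(samples)
print(mean)  # converges to 1/rate = 0.5
```

The same inversion pattern works for any distribution with an invertible CDF, which is why it is a staple of process-based simulation libraries.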
Magnetospheric Reconnection in Modified Current-Sheet Equilibria
NASA Astrophysics Data System (ADS)
Newman, D. L.; Goldman, M. V.; Lapenta, G.; Markidis, S.
2012-10-01
Particle simulations of magnetic reconnection in Earth's magnetosphere are frequently initialized with a current-carrying Harris equilibrium superposed on a current-free uniform background plasma. The Harris equilibrium satisfies local charge neutrality, but requires that the sheet current be dominated by the hotter species -- often the ions in Earth's magnetosphere. This constraint is not necessarily consistent with observations. A modified kinetic equilibrium that relaxes this constraint on the currents was proposed by Yamada et al. [Phys. Plasmas., 7, 1781 (2000)] with no background population. These modified equilibria were characterized by an asymptotic converging or diverging electrostatic field normal to the current sheet. By reintroducing the background plasma, we have developed new families of equilibria where the asymptotic fields are suppressed by Debye shielding. Because the electrostatic potential profiles of these new equilibria contain wells and/or barriers capable of spatially isolating different populations of electrons and/or ions, these solutions can be further generalized to include classes of asymmetric kinetic equilibria. Examples of both symmetric and asymmetric equilibria will be presented. The dynamical evolution of these equilibria, when perturbed, will be further explored by means of implicit 2D PIC reconnection simulations, including comparisons with simulations employing standard Harris-equilibrium initializations.
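For reference, the standard Harris-sheet profiles mentioned above can be written down and checked numerically: the kinetic pressure of the sheet population balances the magnetic pressure across the layer. Normalized units and parameter values here are illustrative; the uniform background population is current-free, so it drops out of the balance.

```python
import numpy as np

# Harris current-sheet equilibrium (standard textbook form):
#   B_x(z) = B0 * tanh(z / L)          asymptotic field B0, half-thickness L
#   n(z)   = n0 * sech^2(z / L) + nb   sheet density plus uniform background nb
B0, L, n0, nb = 1.0, 0.5, 1.0, 0.2    # illustrative normalized values
z = np.linspace(-5, 5, 1001)
Bx = B0 * np.tanh(z / L)
n = n0 / np.cosh(z / L) ** 2 + nb

# Pressure balance (mu0 = 1): n0 * T = B0^2 / 2, so the sheet's kinetic
# pressure plus the magnetic pressure is constant across the layer.
T = B0**2 / (2 * n0)
P_total = (n - nb) * T + Bx**2 / 2    # background excluded: it carries no current
print(P_total.max() - P_total.min())  # ~0: equilibrium holds
```

The Yamada-type modified equilibria discussed in the abstract relax the constraint that this sheet current be carried mostly by the hotter species, at the price of an asymptotic electrostatic field that the background plasma can Debye-shield.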
Description of the dynamic infrared background/target simulator (DIBS)
NASA Astrophysics Data System (ADS)
Lujan, Ignacio
1988-01-01
The purpose of the Dynamic Infrared Background/Target Simulator (DIBS) is to project dynamic infrared scenes to a test sensor; e.g., a missile seeker that is sensitive to infrared energy. The projected scene will include target(s) and background. This system was designed to present flicker-free infrared scenes in the 8 micron to 12 micron wavelength region. The major subassemblies of the DIBS are the laser write system (LWS), vanadium dioxide modulator assembly, scene data buffer (SDB), and the optical image translator (OIT). This paper describes the overall concept and design of the infrared scene projector followed by some details of the LWS and VO2 modulator. Also presented are brief descriptions of the SDB and OIT.
Ishizaki, Makiko; Maeda, Hatsuo; Okamoto, Ikuko
2014-01-01
Color-weak persons, who in Japan represent approximately 5% of the male and 0.2% of the female population, may be unable to discriminate among the colors of tablets. Using color-weak simulation with Variantor™, we therefore evaluated the effects of background colors (light, medium, and dark gray; purple; blue; and blue-green) on discrimination among yellow, yellow-red, red, and mixed-group tablets by our established method. In addition, the influence of white 10-mm ruled squares on the background sheets was examined, and the change in color of the tablets and background sheets through the simulation was measured. Variance analysis of the data obtained from 42 volunteers demonstrated that with color-weak vision, the best discrimination among yellow, yellow-red, or mixed-group tablets was achieved on a dark gray background sheet, and a blue background sheet was useful for discriminating among each tablet group in all colors, including red. These results were compared with those previously obtained with healthy and cataractous vision, suggesting that gaps in hue, chroma, and value between background sheets and tablets affect discrimination with color-weak vision. The observed positive effects of the white ruled squares, in contrast to those observed with healthy and cataractous vision, demonstrate that a background sheet arranged in two colors allows color-weak persons to discriminate among all sets of tablets in a sharp and feasible manner.
Sales Role Play: An Online Simulation
ERIC Educational Resources Information Center
Newberry, Robert; Collins, Marianne
2017-01-01
The online role play simulation as described in this article addresses critical skills as identified by practitioners and includes background materials, buyer and seller profiles, a sale/no-sale decision matrix, as well as a grading rubric, thereby facilitating a variety of selling scenarios. Both the buyer and the seller have integral roles in…
MassiveNuS: cosmological massive neutrino simulations
NASA Astrophysics Data System (ADS)
Liu, Jia; Bird, Simeon; Zorrilla Matilla, José Manuel; Hill, J. Colin; Haiman, Zoltán; Madhavacheril, Mathew S.; Petri, Andrea; Spergel, David N.
2018-03-01
The non-zero mass of neutrinos suppresses the growth of cosmic structure on small scales. Since the level of suppression depends on the sum of the masses of the three active neutrino species, the evolution of large-scale structure is a promising tool to constrain the total mass of neutrinos and possibly shed light on the mass hierarchy. In this work, we investigate these effects via a large suite of N-body simulations that include massive neutrinos using an analytic linear-response approximation: the Cosmological Massive Neutrino Simulations (MassiveNuS). The simulations include the effects of radiation on the background expansion, as well as the clustering of neutrinos in response to the nonlinear dark matter evolution. We allow three cosmological parameters to vary: the neutrino mass sum Mν in the range of 0–0.6 eV, the total matter density Ωm, and the primordial power spectrum amplitude As. The rms density fluctuation in spheres of 8 comoving Mpc/h (σ8) is a derived parameter as a result. Our data products include N-body snapshots, halo catalogues, merger trees, ray-traced galaxy lensing convergence maps for four source redshift planes between zs=1–2.5, and ray-traced cosmic microwave background lensing convergence maps. We describe the simulation procedures and code validation in this paper. The data are publicly available at http://columbialensing.org.
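The abstract notes that σ8, the rms density fluctuation in spheres of 8 comoving Mpc/h, is a derived parameter. As a reminder of how such a value is obtained, here is a minimal sketch of the standard top-hat-window integral over a linear matter power spectrum; the power-law `pk` below is a toy stand-in with arbitrary normalization, not the MassiveNuS spectrum:

```python
import numpy as np

def tophat_window(kR):
    # Fourier transform of a 3D spherical top-hat of radius R
    return 3.0 * (np.sin(kR) - kR * np.cos(kR)) / kR**3

def sigma_R(pk, k, R=8.0):
    # sigma^2(R) = (1 / 2 pi^2) * integral of k^2 P(k) W^2(kR) dk
    integrand = k**2 * pk(k) * tophat_window(k * R)**2
    integral = np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(k))
    return np.sqrt(integral / (2.0 * np.pi**2))

# toy stand-in for a linear matter power spectrum (units of h/Mpc for k)
k = np.logspace(-4, 2, 2000)
pk = lambda kk: 2e4 * kk / (1.0 + (kk / 0.02)**3)

print(sigma_R(pk, k, R=8.0))
```

Smoothing over a larger radius suppresses more of the small-scale power, so σ(R) decreases monotonically with R for any realistic spectrum.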
NASA Astrophysics Data System (ADS)
Huang, Z.; Jia, X.; Rubin, M.; Fougere, N.; Gombosi, T. I.; Tenishev, V.; Combi, M. R.; Bieler, A. M.; Toth, G.; Hansen, K. C.; Shou, Y.
2014-12-01
We study the plasma environment of the comet Churyumov-Gerasimenko, which is the target of the Rosetta mission, by performing large scale numerical simulations. Our model is based on BATS-R-US within the Space Weather Modeling Framework that solves the governing multifluid MHD equations, which describe the behavior of the cometary heavy ions, the solar wind protons, and electrons. The model includes various mass loading processes, including ionization, charge exchange, dissociative ion-electron recombination, as well as collisional interactions between different fluids. The neutral background used in our MHD simulations is provided by a kinetic Direct Simulation Monte Carlo (DSMC) model. We will simulate how the cometary plasma environment changes at different heliocentric distances.
NASA Technical Reports Server (NTRS)
Jago, S.; Baty, D.; Oconnor, S.; Palmer, E.
1981-01-01
The concept of a cockpit display of traffic information (CDTI) includes the integration of air traffic, navigation, and other pertinent information in a single electronic display in the cockpit. Concise display symbology was developed for use in later full-mission simulator evaluations of the CDTI concept. Experimental variables included the update interval, the motion of the aircraft, the update type (that is, whether the two aircraft were updated at the same interval or not), the background (grid pattern or no background), and the encounter type (straight or curved). Only the type of encounter affected performance.
Traffic and Driving Simulator Based on Architecture of Interactive Motion.
Paz, Alexander; Veeramisti, Naveen; Khaddar, Romesh; de la Fuente-Mella, Hanns; Modorcea, Luiza
2015-01-01
This study proposes an architecture for an interactive motion-based traffic simulation environment. In order to enhance modeling realism involving actual human beings, the proposed architecture integrates multiple types of simulation, including: (i) motion-based driving simulation, (ii) pedestrian simulation, (iii) motorcycling and bicycling simulation, and (iv) traffic flow simulation. The architecture has been designed to enable the simulation of the entire network; as a result, the actual driver, pedestrian, and bike rider can navigate anywhere in the system. In addition, the background traffic interacts with the actual human beings. This is accomplished by using a hybrid meso-microscopic traffic flow simulation modeling approach. The mesoscopic traffic flow simulation model loads the results of a user equilibrium traffic assignment solution and propagates the corresponding traffic through the entire system. The microscopic traffic flow simulation model provides background traffic around the vicinities where actual human beings are navigating the system. The two traffic flow simulation models interact continuously to update system conditions based on the interactions between actual humans and the fully simulated entities. Implementation efforts are currently in progress and some preliminary tests of individual components have been conducted. The implementation of the proposed architecture faces significant challenges ranging from multiplatform and multilanguage integration to multievent communication and coordination.
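The hybrid meso-microscopic coupling described above can be illustrated with a toy sketch: background vehicles within some radius of the human driver are handed to the microscopic model, while the rest remain in the mesoscopic flow model. The `Vehicle` class, the 500 m radius, and the handoff rule are all hypothetical; the paper does not specify how the regimes are assigned:

```python
import math
from dataclasses import dataclass

MICRO_RADIUS = 500.0  # meters around the human driver simulated microscopically (assumed)

@dataclass
class Vehicle:
    x: float
    y: float
    regime: str = "meso"

def assign_regimes(vehicles, driver_pos, radius=MICRO_RADIUS):
    """Promote background vehicles near the human driver to the microscopic
    model; leave the rest in the mesoscopic flow model."""
    for v in vehicles:
        d = math.hypot(v.x - driver_pos[0], v.y - driver_pos[1])
        v.regime = "micro" if d <= radius else "meso"
    return vehicles
```

In a real implementation this reassignment would run every simulation step, with state translated between models at the boundary (e.g., meso link densities seeding micro car-following positions).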
Modular, high power, variable R dynamic electrical load simulator
NASA Technical Reports Server (NTRS)
Joncas, K. P.
1974-01-01
The design of a previously developed basic variable R load simulator was extended to increase its power dissipation and transient handling capabilities. The delivered units satisfy all design requirements and provide a high-power, modular simulation capability uniquely suited to the simulation of complex load responses. In addition to presenting conclusions, recommendations, and pertinent background information, the report covers program accomplishments; describes the simulator's basic circuits, transfer characteristic, protective features, assembly, and specifications; presents the results of simulator evaluation, including burn-in and acceptance testing; provides acceptance test data; and summarizes the monthly progress reports.
Force Measurement on the GLAST Delta II Flight
NASA Technical Reports Server (NTRS)
Gordon, Scott; Kaufman, Daniel
2009-01-01
This viewgraph presentation reviews the interface force measurement at spacecraft separation of GLAST Delta II. The contents include: 1) Flight Force Measurement (FFM) Background; 2) Team Members; 3) GLAST Mission Overview; 4) Methodology Development; 5) Ground Test Validation; 6) Flight Data; 7) Coupled Loads Simulation (VCLA & Reconstruction); 8) Basedrive Simulation; 9) Findings; and 10) Summary and Conclusions.
ForCent Model Development and Testing using the Enriched Background Isotope Study (EBIS) Experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parton, William; Hanson, Paul J; Swanston, Chris
The ForCent forest ecosystem model was developed by making major revisions to the DayCent model, including: (1) adding a humus organic pool, (2) incorporating a detailed root growth model, and (3) including plant phenological growth patterns. Observed plant production and soil respiration data from 1993 to 2000 were used to demonstrate that the ForCent model could accurately simulate ecosystem carbon dynamics for the Oak Ridge National Laboratory deciduous forest. A comparison of ForCent versus observed soil pool 14C signature (Δ14C) data from the Enriched Background Isotope Study 14C experiment (1999-2006) shows that the model correctly simulates the temporal dynamics of the 14C label as it moved from the surface litter and roots into the mineral soil organic matter pools. ForCent model validation was performed by comparing the observed Enriched Background Isotope Study experimental data with simulated live and dead root biomass Δ14C data, and with soil respiration Δ14C (mineral soil, humus layer, leaf litter layer, and total soil respiration) data. Results show that the model correctly simulates the impact of the Enriched Background Isotope Study 14C experimental treatments on soil respiration Δ14C values for the different soil organic matter pools. Model results suggest that a two-pool root growth model correctly represents root carbon dynamics and inputs to the soil. The model fitting process and sensitivity analysis exposed uncertainty in our estimates of the fraction of mineral soil in the slow and passive pools, dissolved organic carbon flux out of the litter layer into the mineral soil, and mixing of the humus layer into the mineral soil layer.
Status of the Simbol-X Background Simulation Activities
NASA Astrophysics Data System (ADS)
Tenzer, C.; Briel, U.; Bulgarelli, A.; Chipaux, R.; Claret, A.; Cusumano, G.; Dell'Orto, E.; Fioretti, V.; Foschini, L.; Hauf, S.; Kendziorra, E.; Kuster, M.; Laurent, P.; Tiengo, A.
2009-05-01
The Simbol-X background simulation group is working towards a simulation based background and mass model which can be used before and during the mission. Using the Geant4 toolkit, a Monte-Carlo code to simulate the detector background of the Simbol-X focal plane instrument has been developed with the aim to optimize the design of the instrument. Achieving an overall low instrument background has direct impact on the sensitivity of Simbol-X and thus will be crucial for the success of the mission. We present results of recent simulation studies concerning the shielding of the detectors with respect to the diffuse cosmic hard X-ray background and to the cosmic-ray proton induced background. Besides estimates of the level and spectral shape of the remaining background expected in the low and high energy detector, also anti-coincidence rates and resulting detector dead time predictions are discussed.
Background simulations for the wide field imager aboard the ATHENA X-ray Observatory
NASA Astrophysics Data System (ADS)
Hauf, Steffen; Kuster, Markus; Hoffmann, Dieter H. H.; Lang, Philipp-Michael; Neff, Stephan; Pia, Maria Grazia; Strüder, Lothar
2012-09-01
The ATHENA X-ray observatory was a European Space Agency project for an L-class mission. ATHENA was to be based upon a simplified IXO design, with the number of instruments and the focal length of the Wolter optics reduced. One of the two instruments, the Wide Field Imager (WFI), was to be a DePFET-based focal plane pixel detector allowing high time- and spatial-resolution spectroscopy in the energy range between 0.1 and 15 keV. In order to fulfill the mission goals a high sensitivity is essential, especially to study faint and extended sources; thus a detailed understanding of the detector background induced by cosmic-ray particles is crucial. During mission design, extensive Monte-Carlo simulations are generally used to estimate the detector background in order to optimize shielding components and software rejection algorithms. The Geant4 toolkit is frequently the tool of choice for this purpose. Alongside validation of the simulation environment with XMM-Newton EPIC-pn and Space Shuttle STS-53 data, we present estimates for the ATHENA WFI cosmic-ray-induced background, including long-term activation, which demonstrate that DePFET-technology-based detectors are able to achieve the required sensitivity.
NASA Technical Reports Server (NTRS)
Milam, Laura J.
1990-01-01
The Cosmic Background Explorer Observatory (COBE) underwent a thermal vacuum thermal balance test in the Space Environment Simulator (SES). This was the largest and most complex test ever conducted at this facility. The 4 x 4 m (13 x 13 ft) spacecraft weighed approx. 2223 kg (4900 lbs) for the test. The test set up included simulator panels for the inboard solar array panels, simulator panels for the flight cowlings, Sun and Earth Sensor stimuli, Thermal Radio Frequency Shield heater stimuli and a cryopanel for thermal control in the Attitude Control System Shunt Dissipator area. The fixturing also included a unique 4.3 m (14 ft) diameter Gaseous Helium Cryopanel which provided a 20 K environment for the calibration of one of the spacecraft's instruments, the Differential Microwave Radiometer. This cryogenic panel caused extra contamination concerns and a special method was developed and written into the test procedure to prevent the high buildup of condensibles on the panel which could have led to backstreaming of the thermal vacuum chamber. The test was completed with a high quality simulated space environment provided to the spacecraft. The test requirements, test set up, and special fixturing are described.
NASA Astrophysics Data System (ADS)
Packard, Corey D.; Viola, Timothy S.; Klein, Mark D.
2017-10-01
The ability to predict spectral electro-optical (EO) signatures for various targets against realistic, cluttered backgrounds is paramount for rigorous signature evaluation. Knowledge of background and target signatures, including plumes, is essential for a variety of scientific and defense-related applications including contrast analysis, camouflage development, automatic target recognition (ATR) algorithm development and scene material classification. The capability to simulate any desired mission scenario with forecast or historical weather is a tremendous asset for defense agencies, serving as a complement to (or substitute for) target and background signature measurement campaigns. In this paper, a systematic process for the physical temperature and visible-through-infrared radiance prediction of several diverse targets in a cluttered natural environment scene is presented. The ability of a virtual airborne sensor platform to detect and differentiate targets from a cluttered background, from a variety of sensor perspectives and across numerous wavelengths in differing atmospheric conditions, is considered. The process described utilizes the thermal and radiance simulation software MuSES and provides a repeatable, accurate approach for analyzing wavelength-dependent background and target (including plume) signatures in multiple band-integrated wavebands (multispectral) or hyperspectrally. The engineering workflow required to combine 3D geometric descriptions, thermal material properties, natural weather boundary conditions, all modes of heat transfer and spectral surface properties is summarized. This procedure includes geometric scene creation, material and optical property attribution, and transient physical temperature prediction. Radiance renderings, based on ray-tracing and the Sandford-Robertson BRDF model, are coupled with MODTRAN for the inclusion of atmospheric effects. This virtual hyperspectral/multispectral radiance prediction methodology has been extensively validated and provides a flexible process for signature evaluation and algorithm development.
NASA Astrophysics Data System (ADS)
Takahashi, Ryuichi; Hamana, Takashi; Shirasaki, Masato; Namikawa, Toshiya; Nishimichi, Takahiro; Osato, Ken; Shiroyama, Kosei
2017-11-01
We present 108 full-sky gravitational lensing simulation data sets generated by performing multiple-lens plane ray-tracing through high-resolution cosmological N-body simulations. The data sets include full-sky convergence and shear maps from redshifts z = 0.05 to 5.3 at intervals of 150 h^-1 Mpc comoving radial distance (corresponding to a redshift interval of Δz ≃ 0.05 in the nearby universe), enabling the construction of a mock shear catalog for an arbitrary source distribution up to z = 5.3. The dark matter halos are identified from the same N-body simulations with enough mass resolution to resolve the host halos of the Sloan Digital Sky Survey (SDSS) CMASS and luminous red galaxies (LRGs). Angular positions and redshifts of the halos are provided by a ray-tracing calculation, enabling the creation of a mock halo catalog to be used for galaxy-galaxy and cluster-galaxy lensing. The simulation also yields maps of gravitational lensing deflections for a source redshift at the last scattering surface, and we provide 108 realizations of lensed cosmic microwave background (CMB) maps in which the post-Born corrections caused by multiple light scattering are included. We present basic statistics of the simulation data, including the angular power spectra of cosmic shear, CMB temperature and polarization anisotropies, galaxy-galaxy lensing signals for halos, and their covariances. The angular power spectra of the cosmic shear and CMB anisotropies agree with theoretical predictions within 5% up to ℓ = 3000 (or at an angular scale θ > 0.5 arcmin). The simulation data sets are generated primarily for the ongoing Subaru Hyper Suprime-Cam survey, but are freely available for download at http://cosmo.phys.hirosaki-u.ac.jp/takahasi/allsky_raytracing/.
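Since the data products include convergence maps and the paper quotes angular power spectra, a hedged sketch of how C_ℓ might be estimated from a small square map under the flat-sky approximation is shown below. The released maps are full-sky and would use spherical-harmonic tools in practice; this FFT-based binned estimator is only illustrative:

```python
import numpy as np

def flat_sky_cl(kappa, pix_arcmin, n_bins=20):
    """Binned flat-sky angular power spectrum of a square convergence map."""
    n = kappa.shape[0]
    pix = np.radians(pix_arcmin / 60.0)          # pixel size in radians
    lx = np.fft.fftfreq(n, d=pix) * 2.0 * np.pi  # multipoles along one axis
    ell2d = np.sqrt(lx[:, None]**2 + lx[None, :]**2)
    # per-mode power with the normalization C_ell = |kappa_l|^2 / map area
    power2d = np.abs(np.fft.fft2(kappa))**2 * pix**2 / n**2
    bins = np.linspace(0.0, ell2d.max(), n_bins + 1)
    idx = np.clip(np.digitize(ell2d.ravel(), bins) - 1, 0, n_bins - 1)
    counts = np.bincount(idx, minlength=n_bins)
    sums = np.bincount(idx, weights=power2d.ravel(), minlength=n_bins)
    centers = 0.5 * (bins[1:] + bins[:-1])
    return centers, sums / np.maximum(counts, 1)
```

A quick sanity check of the normalization: white noise with pixel variance σ² should give a flat spectrum C_ℓ ≈ σ² × (pixel solid angle).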
Cheng, Adam; Donoghue, Aaron; Gilfoyle, Elaine; Eppich, Walter
2012-03-01
To review the essential elements of crisis resource management and provide a resource for instructors by describing how to use simulation-based training to teach crisis resource management principles in pediatric acute care contexts. Data were drawn from a MEDLINE-based literature search. This review is divided into three main sections: Background, Principles of Crisis Resource Management, and Tools and Resources. The background section provides a brief history and definition of crisis resource management. The next section describes the essential elements of crisis resource management, including leadership and followership, communication, teamwork, resource use, and situational awareness. This is followed by a review of evidence supporting the use of simulation-based crisis resource management training in health care. The last section provides the resources necessary to develop crisis resource management training using a simulation-based approach, including a description of how to design pediatric simulation scenarios, how to debrief effectively, and a list of potential assessment tools that instructors can use to evaluate crisis resource management performance during simulation-based training. Crisis resource management principles form the foundation for efficient team functioning and subsequent error reduction in high-stakes environments such as acute care pediatrics. Effective instructor training is required for programs wishing to teach these principles using simulation-based learning. Dissemination and integration of these principles into pediatric critical care practice could have a tremendous impact on patient safety and outcomes.
SEPAC data analysis in support of the environmental interaction program
NASA Technical Reports Server (NTRS)
Lin, Chin S.
1990-01-01
Injections of nonrelativistic electron beams from an isolated equipotential conductor into a uniform background of plasma and neutral gas were simulated using a two-dimensional electrostatic particle code. The ionization effects of spacecraft charging are examined by including interactions of electrons with the neutral gas. The simulations show that the conductor charging potential decreases with increasing neutral background density due to the production of secondary electrons near the conductor surface. In the spacecraft wake, the background electrons accelerated towards the charged spacecraft produced an enhancement of secondary electrons and ions. Simulations run for longer times indicate that the spacecraft potential is further reduced and short-wavelength beam-plasma oscillations appear. The results are applied to explain the spacecraft charging potential measured during the SEPAC experiments from Spacelab 1. A second paper is presented in which a two-dimensional electrostatic particle code was used to study the radial expansion of a nonrelativistic electron beam injected from an isolated equipotential conductor into a background plasma. The simulations indicate that the beam radius is generally proportional to the beam electron gyroradius when the conductor is charged to a large potential. The simulations also suggest that the charge buildup at the beam stagnation point causes the beam radial expansion. From a survey of the simulation results, it is found that the ratio of the beam radius to the beam electron gyroradius increases with the square root of beam density and decreases inversely with beam injection velocity. This dependence is explained in terms of the ratio of the beam electron Debye length to the ambient electron Debye length. These results are most applicable to the SEPAC electron beam injection experiments from Spacelab 1, where high charging potential was observed.
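The scaling reported from the survey of simulations, with the ratio of beam radius to beam electron gyroradius growing as the square root of beam density and falling inversely with injection velocity, can be encoded as a one-line relation. The prefactor `c0` is not given in the abstract and is an assumption:

```python
import math

def beam_radius_ratio(n_b, v_b, c0=1.0):
    """Reported scaling: r_beam / r_gyro ~ c0 * sqrt(n_b) / v_b.

    n_b : beam density (arbitrary units)
    v_b : beam injection velocity (arbitrary units)
    c0  : proportionality constant (hypothetical; not given in the abstract)
    """
    return c0 * math.sqrt(n_b) / v_b
```

The stated physical interpretation is that this ratio tracks the ratio of the beam electron Debye length to the ambient electron Debye length.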
NASA Technical Reports Server (NTRS)
Simpson, W. R.
1994-01-01
An advanced sensor test capability is now operational at the Air Force Arnold Engineering Development Center (AEDC) for calibration and performance characterization of infrared sensors. This facility, known as the 7V, is part of a broad range of test capabilities under development at AEDC to provide complete ground test support to the sensor community for large-aperture surveillance sensors and kinetic kill interceptors. The 7V is a state-of-the-art cryo/vacuum facility providing calibration and mission simulation against space backgrounds. Key features of the facility include high-fidelity scene simulation with precision track accuracy and in-situ target monitoring, diffraction limited optical system, NIST traceable broadband and spectral radiometric calibration, outstanding jitter control, environmental systems for 20 K, high-vacuum, low-background simulation, and an advanced data acquisition system.
NASA Astrophysics Data System (ADS)
Dhanya, M.; Chandrasekar, A.
2016-02-01
The background error covariance structure influences a variational data assimilation system immensely. The simulation of a weather phenomenon like a monsoon depression can hence be influenced by the background correlation information used in the analysis formulation. The Weather Research and Forecasting model data assimilation (WRFDA) system includes an option for formulating multivariate background correlations for its three-dimensional variational (3DVar) system (the cv6 option). The impact of using such a formulation in the simulation of three monsoon depressions over India is investigated in this study. Analysis and forecast fields generated using this option are compared with those obtained using the default formulation for regional background error correlations (cv5) in WRFDA and with a base run without any assimilation. The model rainfall forecasts are compared with rainfall observations from the Tropical Rainfall Measuring Mission (TRMM), and the other model forecast fields are compared with a high-resolution analysis as well as with European Centre for Medium-Range Weather Forecasts (ECMWF) ERA-Interim reanalysis. The results of the study indicate that inclusion of additional correlation information in the background error statistics has a moderate impact on the vertical profiles of relative humidity, moisture convergence, horizontal divergence and the temperature structure at the depression centre at the analysis time of the cv5/cv6 sensitivity experiments. Moderate improvements are seen in two of the three depressions investigated in this study. An improved thermodynamic and moisture structure at the initial time is expected to provide for improved rainfall simulation. The results of the study indicate that the skill scores of accumulated rainfall are somewhat better for the cv6 option than for the cv5 option for at least two of the three depression cases studied, especially at the higher threshold levels. Considering the importance of utilising improved flow-dependent correlation structures for efficient data assimilation, the need for more studies on the impact of background error covariances is obvious.
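For readers wanting to reproduce the cv5/cv6 comparison above, the choice of background error formulation in WRFDA is made through the `cv_options` namelist variable. To the best of our recollection this variable lives in the `&wrfvar7` record, but the record name and accepted values should be checked against the WRFDA documentation for the release in use; the fragment below is a sketch, not a complete namelist:

```fortran
&wrfvar7
   cv_options = 6,   ! 6 = multivariate background error correlations (cv6);
                     ! 5 = default regional formulation (cv5)
/
```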
Monte-Carlo background simulations of present and future detectors in x-ray astronomy
NASA Astrophysics Data System (ADS)
Tenzer, C.; Kendziorra, E.; Santangelo, A.
2008-07-01
Reaching a low-level and well understood internal instrumental background is crucial for the scientific performance of an X-ray detector and, therefore, a main objective of the instrument designers. Monte-Carlo simulations of the physics processes and interactions taking place in a space-based X-ray detector as a result of its orbital environment can be applied to explain the measured background of existing missions. They are thus an excellent tool to predict and optimize the background of future observatories. Weak points of a design and the main sources of the background can be identified and methods to reduce them can be implemented and studied within the simulations. Using the Geant4 Monte-Carlo toolkit, we have created a simulation environment for space-based detectors and we present results of such background simulations for XMM-Newton's EPIC pn-CCD camera. The environment is also currently used to estimate and optimize the background of the future instruments Simbol-X and eRosita.
Simulating southwestern U.S. desert dust influences on supercell thunderstorms
NASA Astrophysics Data System (ADS)
Lerach, David G.; Cotton, William R.
2018-05-01
Three-dimensional numerical simulations were performed to evaluate potential southwestern U.S. dust indirect microphysical and direct radiative impacts on a real severe storms outbreak. Increased solar absorption within the dust plume led to modest increases in pre-storm atmospheric stability at low levels, resulting in weaker convective updrafts and less widespread precipitation. Dust microphysical impacts on convection were minor in comparison, due in part to the lofted dust concentrations being relatively few in number when compared to the background (non-dust) aerosol population. While dust preferentially serving as cloud condensation nuclei (CCN) versus giant CCN had opposing effects on warm rain production, both scenarios resulted in ample supercooled water and subsequent glaciation aloft, yielding larger graupel and hail. Associated latent heating from condensation and freezing contributed little to overall updraft invigoration. With reduced rain production overall, the simulations that included dust effects experienced slightly reduced grid-cumulative precipitation and notably warmer and spatially smaller cold pools. Dust serving as ice nucleating particles did not appear to play a significant role. The presence of dust ultimately reduced the number of supercells produced but allowed for supercell evolution characterized by consistently higher values of relative vertical vorticity within simulated mesocyclones. Dust radiative and microphysical effects were relatively small in magnitude when compared to those from altering the background convective available potential energy and vertical wind shear. It is difficult to generalize such findings from a single event, however, due to a number of case-specific environmental factors. These include the nature of the low-level moisture advection and characteristics of the background aerosol distribution.
Understanding Uncertainties and Biases in Jet Quenching in High-Energy Nucleus-Nucleus Collisions
NASA Astrophysics Data System (ADS)
Heinz, Matthias
2017-09-01
Jets are the collimated streams of particles resulting from hard scattering in the initial state of high-energy collisions. In heavy-ion collisions, jets interact with the quark-gluon plasma (QGP) before freezeout, providing a probe into the internal structure and properties of the QGP. In order to study jets, background must be subtracted from the measured event, potentially introducing a bias. We aim to understand and quantify this subtraction bias. PYTHIA, a library for simulating pure jet events, is used to simulate a model for a signature with one pure jet (a photon) and one quenched jet, where all quenched-particle momenta are reduced by the same fraction. Background for the event is simulated using multiplicity values generated by the TRENTO initial-state model of heavy-ion collisions, fed into a thermal model from which particle types are sampled and a three-dimensional Boltzmann distribution from which particle momenta are sampled. Data from the simulated events are used to train a statistical model, which computes a posterior distribution of the quench factor for a data set. The model was tested first on pure jet events and later on full events including the background. This model will allow for a quantitative determination of biases induced by various methods of background subtraction. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
McGill, M J; Hart, W D; McKay, J A; Spinhirne, J D
1999-10-20
Previous modeling of the performance of spaceborne direct-detection Doppler lidar systems assumed extremely idealized atmospheric models. Here we develop a technique for modeling the performance of these systems in a more realistic atmosphere, based on actual airborne lidar observations. The resulting atmospheric model contains cloud and aerosol variability that is absent in other simulations of spaceborne Doppler lidar instruments. To produce a realistic simulation of daytime performance, we include solar radiance values that are based on actual measurements and are allowed to vary as the viewing scene changes. Simulations are performed for two types of direct-detection Doppler lidar system: the double-edge and the multichannel techniques. Both systems were optimized to measure winds from Rayleigh backscatter at 355 nm. Simulations show that the measurement uncertainty during daytime is degraded by only approximately 10-20% compared with nighttime performance, provided that a proper solar filter is included in the instrument design.
STS-132 crew during their PDRS N-TSK MRM training in the building 16 cupola trainer.
2009-12-22
JSC2009-E-286962 (22 Dec. 2009) --- Astronauts Ken Ham (right background), STS-132 commander; Tony Antonelli (left), pilot; and Mike Good, mission specialist, participate in an exercise in the systems engineering simulator in the Avionics Systems Laboratory at NASA's Johnson Space Center. The facility includes moving scenes of full-sized International Space Station components over a simulated Earth.
STS-132 crew during their PDRS N-TSK MRM training in the building 16 cupola trainer.
2009-12-22
JSC2009-E-286976 (22 Dec. 2009) --- Astronauts Ken Ham (left), STS-132 commander; Tony Antonelli (right background), pilot; and Mike Good, mission specialist, participate in an exercise in the systems engineering simulator in the Avionics Systems Laboratory at NASA's Johnson Space Center. The facility includes moving scenes of full-sized International Space Station components over a simulated Earth.
STS-132 crew during their PDRS N-TSK MRM training in the building 16 cupola trainer.
2009-12-22
JSC2009-E-286972 (22 Dec. 2009) --- Astronauts Ken Ham (right background), STS-132 commander; Tony Antonelli (left), pilot; and Mike Good, mission specialist, participate in an exercise in the systems engineering simulator in the Avionics Systems Laboratory at NASA's Johnson Space Center. The facility includes moving scenes of full-sized International Space Station components over a simulated Earth.
NASA Astrophysics Data System (ADS)
Sadleir, Rosalind J.; Sajib, Saurav Z. K.; Kim, Hyung Joong; Kwon, Oh In; Woo, Eung Je
2013-05-01
MREIT is a new imaging modality that can be used to reconstruct high-resolution conductivity images of the human body. Since conductivity values of cancerous tissues in the breast are significantly higher than those of surrounding normal tissues, breast imaging using MREIT may provide a new noninvasive way of detecting early-stage cancer. In this paper, we present results of experimental and numerical simulation studies of breast MREIT. We built a realistic three-dimensional model of the human breast connected to a simplified model of the chest including the heart, and evaluated the ability of MREIT to detect cancerous anomalies in a background material with electrical properties similar to breast tissue. We performed numerical simulations of various scenarios in breast MREIT, including assessment of the effects of fat inclusions and of noise-related factors such as the amplitude of injected currents, added noise and the number of averages. Phantom results showed that straightforward detection of cancerous anomalies in a background was possible with low currents and few averages. The simulation results showed it should be possible to detect a cancerous anomaly in the breast while restricting the maximal current density in the heart below published levels for nerve excitation.
Drift wave turbulence simulations in LAPD
NASA Astrophysics Data System (ADS)
Popovich, P.; Umansky, M.; Carter, T. A.; Auerbach, D. W.; Friedman, B.; Schaffner, D.; Vincena, S.
2009-11-01
We present numerical simulations of turbulence in LAPD plasmas using the 3D electromagnetic code BOUT (BOUndary Turbulence). BOUT solves a system of fluid moment equations in a general toroidal equilibrium geometry near the plasma boundary. The underlying assumptions for the validity of the fluid model are well satisfied for drift waves in LAPD plasmas (typical plasma parameters n_e ≈ 1×10¹² cm⁻³, T_e ≈ 10 eV, and B ≈ 1 kG), which makes BOUT well suited to simulating LAPD. We have adapted BOUT to the cylindrical geometry of LAPD and have extended the model to include the background flows required for simulations of recent bias-driven rotation experiments. We have successfully verified the code for several linear instabilities, including resistive drift waves, Kelvin-Helmholtz and rotation-driven interchange. We will discuss first nonlinear simulations and quasi-stationary solutions with self-consistent plasma flows and saturated density profiles.
Responses of a constructed plant community to simulated glyphosate and dicamba drift
Background/Questions/Methods As part of its regulation of pesticides, the US Environmental Protection Agency must consider environmental risks, including impacts to nontarget plants exposed to pesticide drift. Normally these risk assessments consider impacts to individual spec...
ERIC Educational Resources Information Center
Downes, Ann
1986-01-01
Provides background information on radio galaxies. Topic areas addressed include: what produces the radio emission; radio telescopes; locating radio galaxies; how distances to radio galaxies are found; physics of radio galaxies; computer simulations of radio galaxies; and the evolution of radio galaxies with cosmic time. (JN)
ForCent model development and testing using the Enriched Background Isotope Study experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parton, W.J.; Hanson, P. J.; Swanston, C.
The ForCent forest ecosystem model was developed by making major revisions to the DayCent model, including: (1) adding a humus organic pool, (2) incorporating a detailed root growth model, and (3) including plant phenological growth patterns. Observed plant production and soil respiration data from 1993 to 2000 were used to demonstrate that the ForCent model could accurately simulate ecosystem carbon dynamics for the Oak Ridge National Laboratory deciduous forest. A comparison of ForCent versus observed soil pool ¹⁴C signature (Δ¹⁴C) data from the Enriched Background Isotope Study ¹⁴C experiment (1999-2006) shows that the model correctly simulates the temporal dynamics of the ¹⁴C label as it moved from the surface litter and roots into the mineral soil organic matter pools. ForCent model validation was performed by comparing the observed Enriched Background Isotope Study experimental data with simulated live and dead root biomass Δ¹⁴C data, and with soil respiration Δ¹⁴C (mineral soil, humus layer, leaf litter layer, and total soil respiration) data. Results show that the model correctly simulates the impact of the Enriched Background Isotope Study ¹⁴C experimental treatments on soil respiration Δ¹⁴C values for the different soil organic matter pools. Model results suggest that a two-pool root growth model correctly represents root carbon dynamics and inputs to the soil. The model fitting process and sensitivity analysis exposed uncertainty in our estimates of the fraction of mineral soil in the slow and passive pools, dissolved organic carbon flux out of the litter layer into the mineral soil, and mixing of the humus layer into the mineral soil layer.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Darafsheh, A; Kassaee, A; Finlay, J
Purpose: The nature of the background visible light observed during fiber-optic dosimetry of proton beams, whether it is due to Cherenkov radiation or not, has recently been debated in the literature. In this work, experimentally and by means of Monte Carlo simulations, we shed light on this problem and investigated the nature of the background visible light observed in optical fibers irradiated with proton beams. Methods: A bare silica optical fiber was embedded in tissue-mimicking phantoms and irradiated with clinical proton beams with energies of 100–225 MeV at the Roberts Proton Therapy Center. Luminescence spectroscopy was performed with a CCD-coupled spectrograph to analyze in detail the emission spectrum of the fiber tip across the visible range of 400–700 nm. Monte Carlo simulation using the FLUKA code was performed to simulate Cherenkov light and ionizing-radiation dose deposition in the fiber. Results: The experimental spectra of the irradiated silica fiber show two distinct peaks at 450 and 650 nm, whose spectral shape is different from that of Cherenkov radiation. We believe these peaks are connected to point defects of silica, including the oxygen-deficiency center (ODC) and the non-bridging oxygen hole center (NBOHC). Monte Carlo simulations confirmed the experimental observation that Cherenkov radiation cannot be solely responsible for such a signal. Conclusion: We showed that Cherenkov radiation is not the dominant visible signal observed in bare optical fibers irradiated with proton beams. We observed two distinct peaks at 450 and 650 nm whose nature is connected with point defects of the silica fiber, including the oxygen-deficiency center and the non-bridging oxygen hole center.
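The conclusion that Cherenkov light cannot dominate here is consistent with a simple threshold estimate: a charged particle radiates Cherenkov light only when its speed exceeds c/n. A sketch of that calculation follows; the refractive index and proton rest mass are assumed values, not taken from the abstract.

```python
import math

def cherenkov_threshold_mev(refractive_index, rest_mass_mev=938.272):
    """Kinetic energy at which a charged particle's speed exceeds c/n,
    the Cherenkov threshold: beta_th = 1/n, T_th = m * (gamma_th - 1).
    Default rest mass is the proton's in MeV (assumed value)."""
    beta = 1.0 / refractive_index
    gamma = 1.0 / math.sqrt(1.0 - beta * beta)
    return rest_mass_mev * (gamma - 1.0)
```

For fused silica (n ≈ 1.46) this gives a threshold near 350 MeV, above the 100–225 MeV clinical range, so the primary protons themselves are below Cherenkov threshold; only secondary electrons can exceed it.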
NASA Astrophysics Data System (ADS)
Packard, Corey D.; Klein, Mark D.; Viola, Timothy S.; Hepokoski, Mark A.
2016-10-01
The ability to predict electro-optical (EO) signatures of diverse targets against cluttered backgrounds is paramount for signature evaluation and/or management. Knowledge of target and background signatures is essential for a variety of defense-related applications. While there is no substitute for measured target and background signatures to determine contrast and detection probability, the capability to simulate any mission scenario with desired environmental conditions is a tremendous asset for defense agencies. In this paper, a systematic process for the thermal and visible-through-infrared simulation of camouflaged human dismounts in cluttered outdoor environments is presented. This process, utilizing the thermal and EO/IR radiance simulation tool TAIThermIR (and MuSES), provides a repeatable and accurate approach for analyzing contrast, signature and detectability of humans in multiple wavebands. The engineering workflow required to combine natural weather boundary conditions and the human thermoregulatory module developed by ThermoAnalytics is summarized. The procedure includes human geometry creation, human segmental physiology description and transient physical temperature prediction using environmental boundary conditions and active thermoregulation. Radiance renderings, which use Sandford-Robertson BRDF optical surface property descriptions and are coupled with MODTRAN for the calculation of atmospheric effects, are demonstrated. Sensor effects such as optical blurring and photon noise can be optionally included, increasing the accuracy of detection probability outputs that accompany each rendering. This virtual evaluation procedure has been extensively validated and provides a flexible evaluation process that minimizes the difficulties inherent in human-subject field testing. Defense applications such as detection probability assessment, camouflage pattern evaluation, conspicuity tests and automatic target recognition are discussed.
NASA Astrophysics Data System (ADS)
Tenerani, Anna; Velli, Marco
2017-07-01
Alfvénic fluctuations in the solar wind display many properties reflecting an ongoing nonlinear cascade, e.g., a well-defined spectrum in frequency, together with some characteristics more commonly associated with the linear propagation of waves from the Sun, such as the variation of fluctuation amplitude with distance, dominated by solar wind expansion effects. Therefore, both nonlinearities and expansion must be included simultaneously in any successful model of solar wind turbulence evolution. Because of the disparate spatial scales involved, direct numerical simulations of turbulence in the solar wind represent an arduous task, especially if one wants to go beyond the incompressible approximation. Indeed, most simulations neglect solar wind expansion effects entirely. Here we develop a numerical model to simulate turbulent fluctuations from the outer corona to 1 au and beyond, including the sub-Alfvénic corona. The accelerating expanding box (AEB) extends the validity of previous expanding box models by taking into account both the acceleration of the solar wind and the inhomogeneity of background density and magnetic field. Our method incorporates a background accelerating wind within a magnetic field that naturally follows the Parker spiral evolution using a two-scale analysis in which the macroscopic spatial effect coupling fluctuations with background gradients becomes a time-dependent coupling term in a homogeneous box. In this paper we describe the AEB model in detail and discuss its main properties, illustrating its validity by studying Alfvén wave propagation across the Alfvén critical point.
Sensitivity analysis for simulating pesticide impacts on honey bee colonies
Background/Question/Methods Regulatory agencies assess risks to honey bees from pesticides through a tiered process that includes predictive modeling with empirical toxicity and chemical data of pesticides as a line of evidence. We evaluate the Varroapop colony model, proposed by...
NASA Technical Reports Server (NTRS)
Ellison, D. C.; Jones, F. C.; Eichler, D.
1983-01-01
Both hydrodynamic calculations (Drury and Volk, 1981, and Axford et al., 1982) and kinetic simulations imply the existence of thermal subshocks in high-Mach-number cosmic-ray-mediated shocks. The injection efficiency of particles from the thermal background into the diffusive shock-acceleration process is determined in part by the sharpness and compression ratio of these subshocks. Results are reported for a Monte Carlo simulation that includes both the back reaction of accelerated particles on the inflowing plasma, producing a smoothing of the shock transition, and the free escape of particles allowing arbitrarily large overall compression ratios in high-Mach-number steady-state shocks. Energy spectra and estimates of the proportion of thermal ions accelerated to high energy are obtained.
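The Monte Carlo approach described above can be illustrated with a far simpler test-particle toy model, in which each shock-crossing cycle gives a fixed fractional energy gain and a fixed escape probability; repeated cycles build up a power-law spectrum. This sketch (all parameter values are illustrative) omits the back reaction of the accelerated particles and the shock smoothing that are central to the paper's simulation.

```python
import random

def dsa_spectrum(n_particles=20000, de_frac=0.1, p_esc=0.1, seed=1):
    """Toy test-particle model of diffusive shock acceleration:
    each crossing cycle multiplies the particle energy by (1 + de_frac),
    and the particle escapes downstream with probability p_esc.
    Returns final energies in units of the injection energy."""
    rng = random.Random(seed)
    energies = []
    for _ in range(n_particles):
        e = 1.0
        while rng.random() > p_esc:   # particle keeps cycling the shock
            e *= 1.0 + de_frac        # first-order Fermi energy gain
        energies.append(e)
    return energies
```

The resulting integral spectrum is approximately a power law with index ln(1 − p_esc)/ln(1 + de_frac), the classic test-particle result.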
NASA Astrophysics Data System (ADS)
Ellison, D. C.; Jones, F. C.; Eichler, D.
1983-08-01
Both hydrodynamic calculations (Drury and Volk, 1981, and Axford et al., 1982) and kinetic simulations imply the existence of thermal subshocks in high-Mach-number cosmic-ray-mediated shocks. The injection efficiency of particles from the thermal background into the diffusive shock-acceleration process is determined in part by the sharpness and compression ratio of these subshocks. Results are reported for a Monte Carlo simulation that includes both the back reaction of accelerated particles on the inflowing plasma, producing a smoothing of the shock transition, and the free escape of particles allowing arbitrarily large overall compression ratios in high-Mach-number steady-state shocks. Energy spectra and estimates of the proportion of thermal ions accelerated to high energy are obtained.
Hydrodynamic Simulation of the Cosmological X-Ray Background
NASA Astrophysics Data System (ADS)
Croft, Rupert A. C.; Di Matteo, Tiziana; Davé, Romeel; Hernquist, Lars; Katz, Neal; Fardal, Mark A.; Weinberg, David H.
2001-08-01
We use a hydrodynamic simulation of an inflationary cold dark matter model with a cosmological constant to predict properties of the extragalactic X-ray background (XRB). We focus on emission from the intergalactic medium (IGM), with particular attention to diffuse emission from warm-hot gas that lies in relatively smooth filamentary structures between galaxies and galaxy clusters. We also include X-rays from point sources associated with galaxies in the simulation, and we make maps of the angular distribution of the emission. Although much of the X-ray luminous gas has a filamentary structure, the filaments are not evident in the simulated maps because of projection effects. In the soft (0.5-2 keV) band, our calculated mean intensity of radiation from intergalactic and cluster gas is 2.3×10⁻¹² erg s⁻¹ cm⁻² deg⁻², 35% of the total soft-band emission. This intensity is compatible at the ~1σ level with estimates of the unresolved soft background intensity from deep ROSAT and Chandra measurements. Only 4% of the hard (2-10 keV) emission is associated with intergalactic gas. Relative to active galactic nuclei flux, the IGM component of the XRB peaks at a lower redshift (median z~0.45) and spans a narrower redshift range, so its clustering makes an important contribution to the angular correlation function of the total emission. The clustering on the scales accessible to our simulation (0.1′-10′) is significant, with an amplitude roughly consistent with an extrapolation of recent ROSAT results to small scales. A cross-correlation analysis of the XRB against nearby galaxies taken from a simulated redshift survey also yields a strong signal from the IGM. Our conclusions about the soft background intensity differ from those of some recent papers that have argued that the expected emission from gas in galaxy, group, and cluster halos would exceed the observed background unless much of the gas is expelled by supernova feedback.
We obtain reasonable compatibility with current observations in a simulation that incorporates cooling, star formation, and only modest feedback. A clear prediction of our model is that the unresolved portion of the soft XRB will remain mostly unresolved even as observations reach deeper point-source sensitivity.
The Python Sky Model: software for simulating the Galactic microwave sky
NASA Astrophysics Data System (ADS)
Thorne, B.; Dunkley, J.; Alonso, D.; Næss, S.
2017-08-01
We present a numerical code to simulate maps of Galactic emission in intensity and polarization at microwave frequencies, aiding in the design of cosmic microwave background experiments. This python code builds on existing efforts to simulate the sky by providing an easy-to-use interface and is based on publicly available data from the WMAP (Wilkinson Microwave Anisotropy Probe) and Planck satellite missions. We simulate synchrotron, thermal dust, free-free and anomalous microwave emission over the whole sky, in addition to the cosmic microwave background, and include a set of alternative prescriptions for the frequency dependence of each component, for example, polarized dust with multiple temperatures and a decorrelation of the signals with frequency, which introduce complexity that is consistent with current data. We also present a new prescription for adding small-scale realizations of these components at resolutions greater than current all-sky measurements. The usefulness of the code is demonstrated by forecasting the impact of varying foreground complexity on the recovered tensor-to-scalar ratio for the LiteBIRD satellite. The code is available at: https://github.com/bthorne93/PySM_public.
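The core idea of codes like this one, scaling template maps of each Galactic component across frequency with a parametrized emission law, can be sketched for a power-law synchrotron component as below. This illustrates the concept only; PySM's actual models and interface are richer (curved spectra, polarization, unit conversions), and the function name and default parameters here are assumptions.

```python
import numpy as np

def scale_synchrotron(template_ref, freqs_ghz, nu_ref_ghz=23.0, beta=-3.0):
    """Scale a synchrotron brightness-temperature template map to other
    frequencies with a power law: T(nu) = T(nu_ref) * (nu / nu_ref)**beta.
    Sketch of the component-scaling idea behind sky-model codes;
    reference frequency and spectral index are illustrative values."""
    template_ref = np.asarray(template_ref, dtype=float)
    return {nu: template_ref * (nu / nu_ref_ghz) ** beta for nu in freqs_ghz}
```

Summing such per-component maps (synchrotron, dust, free-free, AME, CMB) at each observing frequency yields the simulated sky.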
The simulation library of the Belle II software system
NASA Astrophysics Data System (ADS)
Kim, D. Y.; Ritter, M.; Bilka, T.; Bobrov, A.; Casarosa, G.; Chilikin, K.; Ferber, T.; Godang, R.; Jaegle, I.; Kandra, J.; Kodys, P.; Kuhr, T.; Kvasnicka, P.; Nakayama, H.; Piilonen, L.; Pulvermacher, C.; Santelj, L.; Schwenker, B.; Sibidanov, A.; Soloviev, Y.; Starič, M.; Uglov, T.
2017-10-01
SuperKEKB, the next-generation B factory, has been constructed in Japan as an upgrade of KEKB. This brand-new e+ e- collider is expected to deliver a very large data set for the Belle II experiment, 50 times larger than the previous Belle sample. Both the triggered physics event rate and the background event rate will be at least 10 times higher than before, creating a challenging data-taking environment for the Belle II detector. The software system of the Belle II experiment is designed to execute this ambitious plan. A full detector simulation library, which is part of the Belle II software system, is created based on Geant4 and has been tested thoroughly. Recently the library was upgraded to Geant4 version 10.1. The library behaves as expected and is actively used to produce Monte Carlo data sets for various studies. In this paper, we explain the structure of the simulation library and the various interfaces to other packages, including geometry and beam background simulation.
Investigation of soft component in cosmic ray detection
NASA Astrophysics Data System (ADS)
Oláh, László; Varga, Dezső
2017-07-01
Cosmic ray detection is a research area with various applications in tomographic imaging of large objects. In such applications, the background sources that contaminate the cosmic muon signal require a good understanding of their creation processes, as well as reliable simulation frameworks with high predictive power. One of the main background sources is the "soft component", that is, electrons and positrons. In this paper a simulation framework based on GEANT4 has been established to pin down the key features of the soft component. We have found that the electron and positron flux shows a remarkable invariance against various model parameters, including the muon emission altitude and the primary particle energy distribution. The correlation between simultaneously arriving particles has been quantitatively investigated, demonstrating that electrons and positrons tend to arrive within a close distance and with a low relative angle. This feature, which is highly relevant for counting detectors, has been experimentally verified under open sky and at shallow depth underground. The simulation results have been compared to other existing measurements as well as to other simulation programs.
NASA Astrophysics Data System (ADS)
Ringbom, A.
2010-12-01
A detailed knowledge of both the spatial and isotopic distribution of anthropogenic radioxenon is essential in investigations of the performance of the radioxenon part of the IMS, as well as in the development of techniques to discriminate the radioxenon signature of a nuclear explosion from other sources. Further, the production processes in the facilities causing the radioxenon background have to be understood and be compatible with simulations. In this work, several aspects of the observed atmospheric radioxenon background are investigated, including the global distribution as well as the current understanding of the observed isotopic ratios. Analyzed radioxenon data from the IMS, as well as from other measurement stations, are used to create an up-to-date description of the global radioxenon background, including all four CTBT-relevant xenon isotopes (133Xe, 131mXe, 133mXe, and 135Xe). In addition, measured isotopic ratios will be compared to simulations of neutron-induced fission of 235U, and the uncertainties will be discussed. Finally, the impact of the radioxenon background on the detection capability of the IMS will be investigated. This work is a continuation of studies [1,2] that were presented at the International Scientific Studies conference held in Vienna in 2009. [1] A. Ringbom, et al., "Characterization of the global distribution of atmospheric radioxenons", International Scientific Studies Conference on CTBT Verification, 10-12 June 2009. [2] R. D'Amours and A. Ringbom, "A study on the global detection capability of IMS for all CTBT relevant xenon isotopes", International Scientific Studies Conference on CTBT Verification, 10-12 June 2009.
Residents’ perceptions of simulation as a clinical learning approach
Walsh, Catharine M.; Garg, Ankit; Ng, Stella L.; Goyal, Fenny; Grover, Samir C.
2017-01-01
Background Simulation is increasingly being integrated into medical education; however, there is little research into trainees’ perceptions of this learning modality. We elicited trainees’ perceptions of simulation-based learning, to inform how simulation is developed and applied to support training. Methods We conducted an instrumental qualitative case study entailing 36 semi-structured one-hour interviews with 12 residents enrolled in an introductory simulation-based course. Trainees were interviewed at three time points: pre-course, post-course, and 4–6 weeks later. Interview transcripts were analyzed using a qualitative descriptive analytic approach. Results Residents’ perceptions of simulation included: 1) simulation serves pragmatic purposes; 2) simulation provides a safe space; 3) simulation presents perils and pitfalls; and 4) optimal design for simulation: integration and tension. Key findings included residents’ markedly narrow perception of simulation’s capacity to support non-technical skills development or its use beyond introductory learning. Conclusion Trainees’ learning expectations of simulation were restricted. Educators should critically attend to the way they present simulation to learners as, based on theories of problem-framing, trainees’ a priori perceptions may delimit the focus of their learning experiences. If they view simulation as merely a replica of real cases for the purpose of practicing basic skills, they may fail to benefit from the full scope of learning opportunities afforded by simulation. PMID:28344719
MRXCAT: Realistic numerical phantoms for cardiovascular magnetic resonance
2014-01-01
Background Computer simulations are important for validating novel image acquisition and reconstruction strategies. In cardiovascular magnetic resonance (CMR), numerical simulations need to combine anatomical information and the effects of cardiac and/or respiratory motion. To this end, a framework for realistic CMR simulations is proposed and its use for image reconstruction from undersampled data is demonstrated. Methods The extended Cardiac-Torso (XCAT) anatomical phantom framework with various motion options was used as a basis for the numerical phantoms. Different tissue, dynamic contrast and signal models, multiple receiver coils and noise are simulated. Arbitrary trajectories and undersampled acquisition can be selected. The utility of the framework is demonstrated for accelerated cine and first-pass myocardial perfusion imaging using k-t PCA and k-t SPARSE. Results MRXCAT phantoms allow for realistic simulation of CMR including optional cardiac and respiratory motion. Example reconstructions from simulated undersampled k-t parallel imaging demonstrate the feasibility of simulated acquisition and reconstruction using the presented framework. Myocardial blood flow assessment from simulated myocardial perfusion images highlights the suitability of MRXCAT for quantitative post-processing simulation. Conclusion The proposed MRXCAT phantom framework enables versatile and realistic simulations of CMR including breathhold and free-breathing acquisitions. PMID:25204441
The Impact of New Technology on Accounting Education.
ERIC Educational Resources Information Center
Shaoul, Jean
The introduction of computers in the Department of Accounting and Finance at Manchester University is described. General background outlining the increasing need for microcomputers in the accounting curriculum (including financial modelling tools and decision support systems such as linear programming, statistical packages, and simulation) is…
High-Absorptance Radiative Heat Sink
NASA Technical Reports Server (NTRS)
Cafferty, T.
1983-01-01
Absorptance of black-painted open-cell aluminum honeycomb is improved by cutting the honeycomb at an angle, or bias, rather than straight across. This ensures that radiation entering the honeycomb cavities undergoes multiple reflections before it escapes. At each reflection the radiation is attenuated by absorption. Applications include space-background simulators, space radiators, solar absorbers, and passive coolers for terrestrial use.
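The benefit of forcing multiple reflections can be quantified with a simple cavity model: if each bounce absorbs a fraction α of the incident radiation, the effective absorptance after N bounces is 1 − (1 − α)^N. A sketch of this idealized model follows; real biased-cut honeycomb geometry is more complex, and the example numbers are illustrative, not from the article.

```python
def effective_absorptance(surface_absorptance, n_reflections):
    """Effective cavity absorptance after n_reflections bounces,
    assuming the same fraction of the remaining radiation is absorbed
    at each bounce (idealized specular-cavity model)."""
    reflectance = 1.0 - surface_absorptance
    return 1.0 - reflectance ** n_reflections
```

For example, a paint absorptance of 0.9 rises to an effective 0.999 after three reflections, which is why geometry that guarantees extra bounces pays off.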
Crime and Justice: 10 Activities.
ERIC Educational Resources Information Center
Constitutional Rights Foundation, Los Angeles, CA.
This manual contains learning activities to aid secondary teachers in clarifying and enriching the Scholastic materials "Living Law." The format of the manual includes a brief overview, background information, teacher instructions, and a description of each activity. Case studies, simulations, and role-playing activities are provided. Topics…
Pomareda, Víctor; Magrans, Rudys; Jiménez-Soto, Juan M; Martínez, Dani; Tresánchez, Marcel; Burgués, Javier; Palacín, Jordi; Marco, Santiago
2017-04-20
We present the estimation of a likelihood map for the location of the source of a chemical plume dispersed under atmospheric turbulence and uniform wind conditions. The main contribution of this work is to extend previous proposals based on Bayesian inference with binary detections to the use of concentration information, while at the same time being robust against the presence of background chemical noise. To that end, the algorithm builds a background model from robust statistical measurements to assess the posterior probability that a given chemical concentration reading comes from the background or from a source emitting at a distance with a specific release rate. In addition, our algorithm allows multiple mobile gas sensors to be used. Ten realistic simulations and ten real-data experiments are used for evaluation purposes. For the simulations, we have assumed that the sensors are mounted on cars for which navigating toward the source is not among the main tasks. To collect the real dataset, a special arena with induced wind was built, and an autonomous vehicle equipped with several sensors, including a photoionization detector (PID) for sensing chemical concentration, was used. Simulation results show that our algorithm provides a better estimation of the source location, even at the low background levels that favor the performance of the binary version. The improvement is clear for the synthetic data, while for the real data the estimation is only slightly better, probably because our exploration arena cannot provide uniform wind conditions. Finally, an estimate of the computational cost of the algorithmic proposal is presented.
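The core inference described above — scoring candidate source locations by how well a plume model plus a robust background model explains the concentration readings — can be sketched as a grid posterior. This is a minimal illustration assuming a steady Gaussian plume under uniform wind and a Gaussian likelihood; the function names, plume parameters, and MAD-based background estimate are illustrative choices, not the paper's implementation.

```python
import math

def robust_background(readings):
    """Median/MAD estimate of the background concentration distribution."""
    s = sorted(readings)
    n = len(s)
    med = s[n // 2] if n % 2 else 0.5 * (s[n // 2 - 1] + s[n // 2])
    mad = sorted(abs(r - med) for r in readings)
    m = mad[n // 2] if n % 2 else 0.5 * (mad[n // 2 - 1] + mad[n // 2])
    return med, 1.4826 * m + 1e-9  # MAD -> sigma for a Gaussian

def plume_mean(src, sensor, rate, wind=(1.0, 0.0), diff=0.5):
    """Hypothetical time-averaged Gaussian plume concentration (uniform wind)."""
    dx, dy = sensor[0] - src[0], sensor[1] - src[1]
    down = dx * wind[0] + dy * wind[1]      # downwind distance
    if down <= 0:
        return 0.0                          # no plume upwind of the source
    cross = -dx * wind[1] + dy * wind[0]    # crosswind offset
    sigma = math.sqrt(2.0 * diff * down)
    return rate / (2 * math.pi * sigma**2) * math.exp(-cross**2 / (2 * sigma**2))

def log_posterior_map(grid, sensors, readings, rate):
    """Log-posterior over candidate source cells, robust to background noise."""
    mu_b, sd_b = robust_background(readings)
    logp = {}
    for cell in grid:
        lp = 0.0
        for sensor, c in zip(sensors, readings):
            mu = mu_b + plume_mean(cell, sensor, rate)
            lp += -0.5 * ((c - mu) / sd_b) ** 2  # Gaussian log-likelihood
        logp[cell] = lp
    return logp
```

With mobile sensors, each new reading simply contributes one more likelihood term per candidate cell, which is what makes the multi-sensor extension natural.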
STS-135 crew during Rendezvous Training session in Building 16 dome
2011-03-23
JSC2011-E-028144 (23 March 2011) --- NASA astronauts Chris Ferguson (left foreground), STS-135 commander; Doug Hurley (left background), pilot; and Sandy Magnus (left), mission specialist, speak with news media representatives during an exercise in the systems engineering simulator in the Avionics Systems Laboratory at NASA's Johnson Space Center. The facility includes moving scenes of full-sized International Space Station components over a simulated Earth. Photo credit: NASA or National Aeronautics and Space Administration
A Coordinated Initialization Process for the Distributed Space Exploration Simulation
NASA Technical Reports Server (NTRS)
Crues, Edwin Z.; Phillips, Robert G.; Dexter, Dan; Hasan, David
2007-01-01
A viewgraph presentation on the federate initialization process for the Distributed Space Exploration Simulation (DSES) is described. The topics include: 1) Background: DSES; 2) Simulation requirements; 3) Nine Step Initialization; 4) Step 1: Create the Federation; 5) Step 2: Publish and Subscribe; 6) Step 3: Create Object Instances; 7) Step 4: Confirm All Federates Have Joined; 8) Step 5: Achieve initialize Synchronization Point; 9) Step 6: Update Object Instances With Initial Data; 10) Step 7: Wait for Object Reflections; 11) Step 8: Set Up Time Management; 12) Step 9: Achieve startup Synchronization Point; and 13) Conclusions
Optimal Search for an Astrophysical Gravitational-Wave Background
NASA Astrophysics Data System (ADS)
Smith, Rory; Thrane, Eric
2018-04-01
Roughly every 2-10 min, a pair of stellar-mass black holes merge somewhere in the Universe. A small fraction of these mergers are detected as individually resolvable gravitational-wave events by advanced detectors such as LIGO and Virgo. The rest contribute to a stochastic background. We derive the statistically optimal search strategy (producing minimum credible intervals) for a background of unresolved binaries. Our method applies Bayesian parameter estimation to all available data. Using Monte Carlo simulations, we demonstrate that the search is both "safe" and effective: it is not fooled by instrumental artifacts such as glitches and it recovers simulated stochastic signals without bias. Given realistic assumptions, we estimate that the search can detect the binary black hole background with about 1 day of design sensitivity data versus ≈40 months using the traditional cross-correlation search. This framework independently constrains the merger rate and black hole mass distribution, breaking a degeneracy present in the cross-correlation approach. The search provides a unified framework for population studies of compact binaries, which is cast in terms of hyperparameter estimation. We discuss a number of extensions and generalizations, including application to other sources (such as binary neutron stars and continuous-wave sources), simultaneous estimation of a continuous Gaussian background, and applications to pulsar timing.
Digital simulation of hybrid loop operation in RFI backgrounds.
NASA Technical Reports Server (NTRS)
Ziemer, R. E.; Nelson, D. R.
1972-01-01
A digital computer model for Monte Carlo simulation of an imperfect second-order hybrid phase-locked loop (PLL) operating in radio-frequency interference (RFI) and Gaussian noise backgrounds has been developed. Characterization of hybrid loop performance in terms of cycle-slipping statistics and phase-error variance, through computer simulation, indicates that the hybrid loop has more desirable performance characteristics in RFI backgrounds than the conventional PLL or the Costas loop.
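The kind of Monte Carlo characterization described — estimating phase-error variance and cycle-slip counts by running a discrete-time loop model against random noise — can be sketched in a few lines. The loop gains, noise injection point, and slip-counting convention below are illustrative assumptions, not the parameters of the paper's hybrid loop.

```python
import math, random

def pll_phase_error(noise_std, n_steps=5000, dt=1e-3, kp=80.0, ki=1600.0,
                    theta0=0.5, seed=7):
    """One Monte Carlo run of an idealized second-order PLL phase-error model.
    Returns (phase-error variance, cycle-slip count)."""
    rng = random.Random(seed)
    theta, integ = theta0, 0.0  # phase error and loop-filter integrator state
    slips, samples = 0, []
    for _ in range(n_steps):
        noisy = math.sin(theta) + noise_std * rng.gauss(0.0, 1.0)  # detector
        integ += ki * noisy * dt                                    # integral path
        theta -= (kp * noisy + integ) * dt                          # NCO update
        while theta > 2 * math.pi:   # count full-cycle slips
            theta -= 2 * math.pi
            slips += 1
        while theta < -2 * math.pi:
            theta += 2 * math.pi
            slips += 1
        samples.append(theta)
    mean = sum(samples) / len(samples)
    var = sum((s - mean) ** 2 for s in samples) / len(samples)
    return var, slips
```

Averaging the returned statistics over many seeds gives the Monte Carlo estimates of slip rate and phase-error variance as functions of noise level.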
NASA Astrophysics Data System (ADS)
Perez, J. C.; Chandran, B. D. G.
2017-12-01
In this work we present recent results from high-resolution direct numerical simulations and a phenomenological model that describes the radial evolution of reflection-driven Alfvén wave turbulence in the solar atmosphere and the inner solar wind. The simulations are performed inside a narrow magnetic flux tube that models a coronal hole extending from the solar surface through the chromosphere and into the solar corona to approximately 21 solar radii. The simulations include prescribed empirical profiles that account for the inhomogeneities in density, background flow, and the background magnetic field present in coronal holes. Alfvén waves are injected into the solar corona by imposing random, time-dependent velocity and magnetic field fluctuations at the photosphere. The phenomenological model incorporates three important features observed in the simulations: dynamic alignment, weak/strong nonlinear AW-AW interactions, and the splitting of the outward-propagating AWs launched by the Sun into two populations with different characteristic frequencies. Model and simulations are in good agreement and show that, when the key physical parameters are chosen within observational constraints, reflection-driven Alfvén turbulence is a plausible mechanism for the heating and acceleration of the fast solar wind. By flying a virtual Parker Solar Probe (PSP) through the simulations, we also compare the model and simulations with the kind of single-point measurements that PSP will provide.
Development of simulation computer complex specification
NASA Technical Reports Server (NTRS)
1973-01-01
The Training Simulation Computer Complex Study was one of three studies contracted in support of preparations for procurement of a shuttle mission simulator for shuttle crew training. The subject study was concerned with definition of the software loads to be imposed on the computer complex to be associated with the shuttle mission simulator and the development of procurement specifications based on the resulting computer requirements. These procurement specifications cover the computer hardware and system software as well as the data conversion equipment required to interface the computer to the simulator hardware. The development of the necessary hardware and software specifications required the execution of a number of related tasks, which included: (1) simulation software sizing, (2) computer requirements definition, (3) data conversion equipment requirements definition, (4) system software requirements definition, (5) a simulation management plan, (6) a background survey, and (7) preparation of the specifications.
SAR image classification based on CNN in real and simulation datasets
NASA Astrophysics Data System (ADS)
Peng, Lijiang; Liu, Ming; Liu, Xiaohua; Dong, Liquan; Hui, Mei; Zhao, Yuejin
2018-04-01
Convolutional neural networks (CNNs) have achieved great success in image classification tasks. Even in the field of synthetic aperture radar automatic target recognition (SAR-ATR), state-of-the-art results have been obtained by learning deep feature representations on the MSTAR benchmark. However, the raw MSTAR data have a shortcoming for training a SAR-ATR model: the SAR images within each class share highly similar backgrounds, so a CNN would learn hierarchies of features of the backgrounds as well as of the targets. To assess the influence of the background, additional SAR image datasets were created containing simulated SAR images of 10 manufactured targets, such as tanks and fighter aircraft, with backgrounds sampled from the original MSTAR data. These include one dataset in which the backgrounds of each image class correspond to one class of MSTAR target or clutter backgrounds, and another in which each image receives a random background drawn from all MSTAR targets or clutter. In addition, mixed datasets combining MSTAR and the simulated data were created for use in the experiments. The CNN architecture proposed in this paper is trained on all of the datasets mentioned above. The experimental results show that the architecture achieves high performance on all datasets even when the image backgrounds are miscellaneous, indicating that it learns a good representation of the targets despite drastic changes in background.
Issues and progress in determining background ozone and particle concentrations
NASA Astrophysics Data System (ADS)
Pinto, J. P.
2011-12-01
Exposure to ambient ozone is associated with a variety of health outcomes ranging from mild breathing discomfort to mortality. For the purpose of health risk and policy assessments EPA evaluates the anthropogenic increase in ozone above background concentrations and has defined the North American (NA) background concentration of O3 as that which would occur in the U.S. in the absence of anthropogenic emissions of precursors in the U.S., Canada, and Mexico. Monthly average NA background ozone has been used to evaluate health risks, but EPA and state air quality managers must also estimate day specific ozone background levels for high ozone episodes as part of urban scale photochemical modeling efforts to support ozone regulatory programs. The background concentration of O3 is of more concern than other air pollutants because it typically represents a much larger fraction of observed O3 than do the backgrounds of other criteria pollutants (particulate matter (PM), CO, NO2, SO2). NA background cannot be determined directly from ambient monitoring data because of the influence of NA precursor emissions on formation of ozone within NA. Instead, estimates of NA background O3 have been based on GEOS-Chem using simulations in which NA anthropogenic precursor emissions are zeroed out. Thus, modeled NA background O3 includes contributions from natural sources of precursors (including CH4, NMVOCs, NOx, and CO) everywhere in the world, anthropogenic sources of precursors outside of NA, and downward transport of O3 from the stratosphere. Although monitoring data cannot determine NA background directly, measurements by satellites, aircraft, ozonesondes and surface monitors have proved to be highly useful for identifying sources of background O3 and for evaluating the performance of the GEOS-Chem model. 
Model-simulated NA background concentrations are strong functions of location and season, with large day-to-day variability; values increase with elevation, are higher in spring than in summer, and tend to be highest in the Intermountain West during spring. Estimates of annual average NA background, and of other background definitions that have been considered, will be presented. Issues associated with modeling background concentrations for both health-risk assessments and for episodic regulatory air quality programs will be discussed, and proposals for new atmospheric measurements and model improvements needed to quantify background contributions to ozone more accurately will also be presented. The views expressed are those of the author and do not necessarily represent the views or policies of the U.S. Environmental Protection Agency.
NASA Astrophysics Data System (ADS)
Bednar, Earl; Drager, Steven L.
2007-04-01
The objective of quantum information processing is to harness the revolutionary computing capability offered by the paradigm shift of quantum computing to solve classically hard, computationally challenging problems. Problems of interest include rapid image processing, rapid optimization of logistics, protecting information, secure distributed simulation, and massively parallel computation. Currently, one important problem with quantum information processing is that quantum computers are difficult to realize owing to poor scalability and a high incidence of errors. We have therefore supported the development of Quantum eXpress and QuIDD Pro, two quantum computer simulators running on classical computers for the development and testing of new quantum algorithms and processes. This paper examines the different methods used by these two quantum computing simulators. It reviews both simulators, highlighting each simulator's background, interface, and special features. It also demonstrates the implementation of current quantum algorithms on each simulator. It concludes with summary comments on both simulators.
Hydrocarbon-Fueled Rocket Engine Plume Diagnostics: Analytical Developments and Experimental Results
NASA Technical Reports Server (NTRS)
Tejwani, Gopal D.; McVay, Gregory P.; Langford, Lester A.; St. Cyr, William W.
2006-01-01
A viewgraph presentation describing experimental results and analytical developments about plume diagnostics for hydrocarbon-fueled rocket engines is shown. The topics include: 1) SSC Plume Diagnostics Background; 2) Engine Health Monitoring Approach; 3) Rocket Plume Spectroscopy Simulation Code; 4) Spectral Simulation for 10 Atomic Species and for 11 Diatomic Molecular Electronic Bands; 5) "Best" Lines for Plume Diagnostics for Hydrocarbon-Fueled Rocket Engines; 6) Experimental Set Up for the Methane Thruster Test Program and Experimental Results; and 7) Summary and Recommendations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tenerani, Anna; Velli, Marco
Alfvénic fluctuations in the solar wind display many properties reflecting an ongoing nonlinear cascade, e.g., a well-defined spectrum in frequency, together with some characteristics more commonly associated with the linear propagation of waves from the Sun, such as the variation of fluctuation amplitude with distance, dominated by solar wind expansion effects. Therefore, both nonlinearities and expansion must be included simultaneously in any successful model of solar wind turbulence evolution. Because of the disparate spatial scales involved, direct numerical simulations of turbulence in the solar wind represent an arduous task, especially if one wants to go beyond the incompressible approximation. Indeed, most simulations neglect solar wind expansion effects entirely. Here we develop a numerical model to simulate turbulent fluctuations from the outer corona to 1 au and beyond, including the sub-Alfvénic corona. The accelerating expanding box (AEB) extends the validity of previous expanding box models by taking into account both the acceleration of the solar wind and the inhomogeneity of background density and magnetic field. Our method incorporates a background accelerating wind within a magnetic field that naturally follows the Parker spiral evolution using a two-scale analysis in which the macroscopic spatial effect coupling fluctuations with background gradients becomes a time-dependent coupling term in a homogeneous box. In this paper we describe the AEB model in detail and discuss its main properties, illustrating its validity by studying Alfvén wave propagation across the Alfvén critical point.
Johnson, Timothy R; Kuhn, Kristine M
2015-12-01
This paper introduces the ltbayes package for R. This package includes a suite of functions for investigating the posterior distribution of latent traits of item response models. These include functions for simulating realizations from the posterior distribution, profiling the posterior density or likelihood function, calculating posterior modes or means, computing Fisher information functions and observed information, and constructing profile likelihood confidence intervals. Inferences can be based on individual response patterns or on sets of response patterns such as sum scores. Functions are included for several common binary and polytomous item response models, but the package can also be used with user-specified models. The paper presents some background and motivation for the package and includes several detailed examples of its use.
Advanced Multiple Processor Configuration Study. Final Report.
ERIC Educational Resources Information Center
Clymer, S. J.
This summary of a study on multiple processor configurations includes the objectives, background, approach, and results of research undertaken to provide the Air Force with a generalized model of computer processor combinations for use in the evaluation of proposed flight training simulator computational designs. An analysis of a real-time flight…
Acting Out Immunity: A Simulation of a Complicated Concept.
ERIC Educational Resources Information Center
Bealer, Jonathan; Bealer, Virginia
1996-01-01
Presents a lecture and play in which the students themselves become the elements of the immune system. Aims at facilitating student comprehension and retention of the complicated processes associated with the immune system. Includes objectives, outline, background information sources, instructor guide, student narrator guide, extension, and topics…
Presentation slides provide background on model evaluation techniques. Also included in the presentation is an operational evaluation of 2001 Community Multiscale Air Quality (CMAQ) annual simulation, and an evaluation of PM2.5 for the CMAQ air quality forecast (AQF) ...
Speaking Personally--With John "Pathfinder" Lester
ERIC Educational Resources Information Center
Beaubois, Terry
2013-01-01
John Lester is currently the chief learning officer at ReactionGrid, a software company developing 3-D simulations and multiuser virtual world platforms. Lester's background includes working with Linden Lab on Second Life's education activities and neuroscience research. His primary focus is on collaborative learning and instructional…
Wavelet-Bayesian inference of cosmic strings embedded in the cosmic microwave background
NASA Astrophysics Data System (ADS)
McEwen, J. D.; Feeney, S. M.; Peiris, H. V.; Wiaux, Y.; Ringeval, C.; Bouchet, F. R.
2017-12-01
Cosmic strings are a well-motivated extension to the standard cosmological model and could induce a subdominant component in the anisotropies of the cosmic microwave background (CMB), in addition to the standard inflationary component. The detection of strings, while observationally challenging, would provide a direct probe of physics at very high-energy scales. We develop a framework for cosmic string inference from observations of the CMB made over the celestial sphere, performing a Bayesian analysis in wavelet space where the string-induced CMB component has distinct statistical properties to the standard inflationary component. Our wavelet-Bayesian framework provides a principled approach to compute the posterior distribution of the string tension Gμ and the Bayesian evidence ratio comparing the string model to the standard inflationary model. Furthermore, we present a technique to recover an estimate of any string-induced CMB map embedded in observational data. Using Planck-like simulations, we demonstrate the application of our framework and evaluate its performance. The method is sensitive to Gμ ∼ 5 × 10⁻⁷ for Nambu-Goto string simulations that include an integrated Sachs-Wolfe contribution only and do not include any recombination effects, before any parameters of the analysis are optimized. The sensitivity of the method compares favourably with other techniques applied to the same simulations.
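The evidence-ratio computation at the heart of this kind of Bayesian model comparison can be illustrated in one dimension: marginalize the likelihood over a uniform prior on Gμ and compare with the no-string model. This is a deliberately simplified stand-in for the paper's wavelet-space analysis; the additive data model, Gaussian noise, and grid prior are assumptions for illustration only.

```python
import math

def evidence_ratio(data, template, sigma, gmu_grid):
    """Toy Bayes factor: 'string' model (data = gmu * template + Gaussian noise,
    uniform prior on gmu over gmu_grid) versus the noise-only model (gmu = 0).
    Returns Z_string / Z_noise; values > 1 favour the string model."""
    def loglike(gmu):
        return sum(-0.5 * ((d - gmu * t) / sigma) ** 2
                   for d, t in zip(data, template))
    l0 = loglike(0.0)
    # likelihood ratio on the grid, normalized by the noise-only likelihood
    ls = [math.exp(loglike(g) - l0) for g in gmu_grid]
    width = (gmu_grid[-1] - gmu_grid[0]) / (len(gmu_grid) - 1)
    integral = sum(0.5 * (a + b) for a, b in zip(ls, ls[1:])) * width  # trapezoid
    return integral / (gmu_grid[-1] - gmu_grid[0])  # uniform-prior average
```

The same marginalization, carried out over wavelet coefficients with the string and inflationary statistical models, is what yields the posterior on Gμ and the evidence ratio in the full analysis.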
NASA Astrophysics Data System (ADS)
Otsuka, Fumiko; Matsukiyo, Shuichi; Kis, Arpad; Nakanishi, Kento; Hada, Tohru
2018-02-01
Field-aligned diffusion of energetic ions in the Earth's foreshock is investigated by using quasi-linear theory (QLT) and test particle simulation. Non-propagating MHD turbulence in the solar wind rest frame is assumed to be purely transverse with respect to the background field. We use a turbulence model based on a multi-power-law spectrum including an intense peak that corresponds to upstream ULF waves resonantly generated by the field-aligned beam (FAB). The presence of the ULF peak produces a concave shape in the diffusion coefficient when it is plotted versus ion energy. The QLT including the effect of the ULF wave explains the simulation result well when the energy density of the turbulent magnetic field is 1% of that of the background magnetic field and the power-law index of the wave spectrum is less than 2. The numerically obtained e-folding distances for 10 to 32 keV ions match the observational values in the event discussed in the companion paper, which contains an intense ULF peak in the spectra generated by the FAB. Evolution of the power spectrum of the ULF waves when approaching the shock significantly affects the energy dependence of the e-folding distance.
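How a spectral peak imprints a concave energy dependence on the diffusion coefficient can be sketched directly: for each energy, pick the cyclotron-resonant wavenumber, read the model spectrum there, and form D ~ v²/ν. The spectrum shape, normalization, and units below are illustrative assumptions, not the turbulence model fitted in the paper.

```python
import math

def spectrum(k, peak_k=5e-4, peak_amp=30.0, index=1.8):
    """Model power spectrum: a power law plus a log-Gaussian ULF peak
    (shape and parameters are illustrative, not the paper's fit)."""
    power_law = k ** (-index)
    peak = peak_amp * math.exp(-(math.log10(k / peak_k)) ** 2 / 0.05)
    return power_law * (1.0 + peak)

def parallel_diffusion(energy_kev, b_ratio=0.01, omega=1.0, mass_kev=938272.0):
    """Quasi-linear parallel diffusion coefficient for protons (sketch):
    D ~ v**2 / nu, with scattering rate nu ~ omega * b_ratio * k_res * P(k_res)."""
    v = math.sqrt(2.0 * energy_kev / mass_kev) * 3.0e5  # km/s, non-relativistic
    k_res = omega / v                                    # cyclotron resonance, mu = 1
    nu = omega * b_ratio * k_res * spectrum(k_res)
    return v * v / nu
```

Ions whose resonant wavenumber falls on the ULF peak scatter more strongly and thus diffuse less, which is the origin of the concave dip in D(E) around the beam-resonant energies.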
Massive black hole and gas dynamics in galaxy nuclei mergers - I. Numerical implementation
NASA Astrophysics Data System (ADS)
Lupi, Alessandro; Haardt, Francesco; Dotti, Massimo
2015-01-01
Numerical effects are known to plague adaptive mesh refinement (AMR) codes when treating massive particles, e.g. representing massive black holes (MBHs). In an evolving background, they can experience strong, spurious perturbations and then follow unphysical orbits. We study by means of numerical simulations the dynamical evolution of a pair of MBHs in the rapidly and violently evolving gaseous and stellar background that follows a galaxy major merger. We confirm that spurious numerical effects alter the MBH orbits in AMR simulations, and show that numerical issues are ultimately due to a drop in the spatial resolution during the simulation, drastically reducing the accuracy of the gravitational force computation. We therefore propose a new refinement criterion suited for massive particles, able to solve in a fast and precise way for their orbits in highly dynamical backgrounds. The new refinement criterion we designed enforces the region around each massive particle to remain at the maximum resolution allowed, independently of the local gas density. Such maximally resolved regions then follow the MBHs along their orbits, effectively avoiding all spurious effects caused by resolution changes. Our suite of high-resolution AMR hydrodynamic simulations, including different prescriptions for the sub-grid gas physics, shows that the new refinement implementation has the advantage of not altering the physical evolution of the MBHs, accounting for all the non-trivial physical processes taking place in violent dynamical scenarios, such as the final stages of a galaxy major merger.
Development of CANDLES low background HPGe detector and half-life measurement of 180Tam
NASA Astrophysics Data System (ADS)
Chan, W. M.; Kishimoto, T.; Umehara, S.; Matsuoka, K.; Suzuki, K.; Yoshida, S.; Nakajima, K.; Iida, T.; Fushimi, K.; Nomachi, M.; Ogawa, I.; Tamagawa, Y.; Hazama, R.; Takemoto, Y.; Nakatani, N.; Takihira, Y.; Tozawa, M.; Kakubata, H.; Trang, V. T. T.; Ohata, T.; Tetsuno, K.; Maeda, T.; Khai, B. T.; Li, X. L.; Batpurev, T.
2018-01-01
A low background HPGe detector system was developed at the CANDLES Experimental Hall for multipurpose use. Various low background techniques were employed, including a hermetic shield design, radon gas suppression, and background-reduction analysis. A new pulse shape discrimination (PSD) method was specially created for the coaxial Ge detector. Using this PSD method, microphonic noise and background events in the low-energy region below 200 keV can be rejected effectively. Monte Carlo simulation with GEANT4 was performed to obtain the detection efficiency and to study the interaction of gamma-rays with the detector system. For rare-decay measurement, the detector was used to detect the decay of tantalum-180m (180Tam), nature's most stable isomer. Two phases of the tantalum physics run were completed with a total livetime of 358.2 days, with Phase II using an upgraded shield configuration. The world's most stringent half-life limit for 180Tam has been successfully achieved.
Research on cloud background infrared radiation simulation based on fractal and statistical data
NASA Astrophysics Data System (ADS)
Liu, Xingrun; Xu, Qingshan; Li, Xia; Wu, Kaifeng; Dong, Yanbing
2018-02-01
Cloud is an important natural phenomenon, and its radiation seriously interferes with infrared detectors. Based on fractals and statistical data, a method is proposed to realize cloud background simulation, with the cloud infrared radiation data field assigned using satellite radiation data of clouds. A cloud infrared radiation simulation model is established in MATLAB; it can generate cloud background infrared images for different cloud types (low, middle, and high cloud) in different months, spectral bands, and sensor zenith angles.
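One standard way to realize such a fractal cloud field is the diamond-square algorithm, with the resulting field thresholded into cloud/no-cloud and scaled to radiance. The sketch below is a generic illustration under that assumption; the paper's actual fractal construction, thresholds, and the radiance assignment from satellite data are not specified here.

```python
import random

def diamond_square(n, roughness=0.6, seed=42):
    """Generate a (2**n + 1)-square fractal field by the diamond-square
    algorithm, a common basis for synthetic cloud textures."""
    size = 2 ** n + 1
    rng = random.Random(seed)
    g = [[0.0] * size for _ in range(size)]
    for r in (0, size - 1):          # seed the four corners
        for c in (0, size - 1):
            g[r][c] = rng.random()
    step, scale = size - 1, 1.0
    while step > 1:
        half = step // 2
        # diamond step: centers of squares from their four corners
        for r in range(half, size, step):
            for c in range(half, size, step):
                avg = (g[r - half][c - half] + g[r - half][c + half] +
                       g[r + half][c - half] + g[r + half][c + half]) / 4.0
                g[r][c] = avg + (rng.random() - 0.5) * scale
        # square step: edge midpoints from their in-bounds neighbors
        for r in range(0, size, half):
            start = half if (r // half) % 2 == 0 else 0
            for c in range(start, size, step):
                total, count = 0.0, 0
                for dr, dc in ((-half, 0), (half, 0), (0, -half), (0, half)):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < size and 0 <= cc < size:
                        total += g[rr][cc]
                        count += 1
                g[r][c] = total / count + (rng.random() - 0.5) * scale
        step, scale = half, scale * roughness
    return g

def cloud_radiance(field, threshold=0.5, base=80.0, gain=40.0):
    """Map the fractal field to a toy infrared radiance image: cells above
    the threshold are 'cloud' and get radiance scaled from the field value."""
    return [[base + gain * v if v > threshold else 0.0 for v in row]
            for row in field]
```

Varying the roughness parameter changes the cloud texture (lower values give smoother, stratiform-like fields), while the threshold controls the cloud fraction of the scene.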
NASA Technical Reports Server (NTRS)
Aveiro, H. C.; Hysell, D. L.; Caton, R. G.; Groves, K. M.; Klenzing, J.; Pfaff, R. F.; Stoneback, R.; Heelis, R. A.
2012-01-01
A three-dimensional numerical simulation of plasma density irregularities in the postsunset equatorial F region ionosphere leading to equatorial spread F (ESF) is described. The simulation evolves under realistic background conditions including bottomside plasma shear flow and vertical current. It also incorporates C/NOFS satellite data which partially specify the forcing. A combination of generalized Rayleigh-Taylor instability (GRT) and collisional shear instability (CSI) produces growing waveforms with key features that agree with C/NOFS satellite and ALTAIR radar observations in the Pacific sector, including features such as gross morphology and rates of development. The transient response of CSI is consistent with the observation of bottomside waves with wavelengths close to 30 km, whereas the steady state behavior of the combined instability can account for the 100+ km wavelength waves that predominate in the F region.
NASA Astrophysics Data System (ADS)
Li, Qi; Tan, Jonathan C.; Christie, Duncan; Bisbas, Thomas G.; Wu, Benjamin
2018-05-01
We present a series of adaptive mesh refinement hydrodynamic simulations of flat rotation curve galactic gas disks, with a detailed treatment of the interstellar medium (ISM) physics of the atomic to molecular phase transition under the influence of diffuse far-ultraviolet (FUV) radiation fields and cosmic-ray backgrounds. We explore the effects of different FUV intensities, including a model with a radial gradient designed to mimic the Milky Way. The effects of cosmic rays, including radial gradients in their heating and ionization rates, are also explored. The final simulations in this series achieve 4 pc resolution across the ˜20 kpc global disk diameter, with heating and cooling followed down to temperatures of ˜10 K. The disks are evolved for 300 Myr, which is enough time for the ISM to achieve a quasi-statistical equilibrium. In particular, the mass fraction of molecular gas is stabilized by ˜200 Myr. Additional global ISM properties are analyzed. Giant molecular clouds (GMCs) are also identified and the statistical properties of their populations are examined. GMCs are tracked as the disks evolve. GMC collisions, which may be a means of triggering star cluster formation, are counted and their rates are compared with analytic models. Relatively frequent GMC collision rates are seen in these simulations, and their implications for understanding GMC properties, including the driving of internal turbulence, are discussed.
NASA Technical Reports Server (NTRS)
Lipatov, A. S.; Cooper, J F.; Paterson, W. R.; Sittler, E. C., Jr.; Hartle, R. E.; Simpson, David G.
2013-01-01
The hybrid kinetic model supports comprehensive simulation of the interaction between different spatial and energetic elements of the Europa moon-magnetosphere system with respect to a variable upstream magnetic field and flux or density distributions of plasma and energetic ions, electrons, and neutral atoms. This capability is critical for improving the interpretation of the existing Europa flyby measurements from the Galileo Orbiter mission, and for planning flyby and orbital measurements (including the surface and atmospheric compositions) for future missions. The simulations are based on recent models of the atmosphere of Europa (Cassidy et al., 2007; Shematovich et al., 2005). In contrast to previous approaches with MHD simulations, the hybrid model allows us to fully take into account the finite gyroradius effect and electron pressure, and to correctly estimate the ion velocity distribution and the fluxes along the magnetic field (assuming an initial Maxwellian velocity distribution for upstream background ions). Photoionization, electron-impact ionization, charge exchange and collisions between the ions and neutrals are also included in our model. We consider the models with O++ and S++ background plasma, and various betas for background ions and electrons, and pickup electrons. The majority of the O2 atmosphere is thermal, with an extended non-thermal population (Cassidy et al., 2007). In this paper, we discuss two tasks: (1) the plasma wake structure dependence on the parameters of the upstream plasma and Europa's atmosphere (model I, cases (a) and (b) with a homogeneous Jovian magnetosphere field, an inductive magnetic dipole and high oceanic shell conductivity); and (2) estimation of the possible effect of an induced magnetic field arising from oceanic shell conductivity.
This effect was estimated based on the difference between the observed and modeled magnetic fields (model II, case (c) with an inhomogeneous Jovian magnetosphere field, an inductive magnetic dipole and low oceanic shell conductivity).
An Improved Method for Demonstrating Visual Selection by Wild Birds.
ERIC Educational Resources Information Center
Allen, J. A.; And Others
1990-01-01
An activity simulating natural selection is presented in which wild birds are predators, green and brown pastry "baits" are prey, and trays containing colored stones serve as the backgrounds. Two different methods of measuring selection are used to describe the results. The materials and methods, results, and discussion are included. (KR)
Waterborne Disease Case Investigation: Public Health Nursing Simulation.
Alexander, Gina K; Canclini, Sharon B; Fripp, Jon; Fripp, William
2017-01-01
The lack of safe drinking water is a significant public health threat worldwide. Registered nurses assess the physical environment, including the quality of the water supply, and apply environmental health knowledge to reduce environmental exposures. The purpose of this research brief is to describe a waterborne disease simulation for students enrolled in a public health nursing (PHN) course. A total of 157 undergraduate students completed the simulation in teams, using the SBAR (Situation-Background-Assessment-Recommendation) reporting tool. Simulation evaluation consisted of content analysis of the SBAR tools and debriefing notes. Student teams completed the simulation and articulated the implications for PHN practice. Student teams discussed assessment findings and primarily recommended four nursing interventions: health teaching focused on water, sanitation, and hygiene; community organizing; collaboration; and advocacy to ensure a safe water supply. With advanced planning and collaboration with partners, waterborne disease simulation may enhance PHN education. [J Nurs Educ. 2017;56(1):39-42.]. Copyright 2017, SLACK Incorporated.
Influence of impurities on the high temperature conductivity of SrTiO3
NASA Astrophysics Data System (ADS)
Bowes, Preston C.; Baker, Jonathon N.; Harris, Joshua S.; Behrhorst, Brian D.; Irving, Douglas L.
2018-01-01
In studies of high temperature electrical conductivity (HiTEC) of dielectrics, the impurity in the highest concentration is assumed to form a single defect that controls HiTEC. However, carrier concentrations are typically at or below the level of background impurities, and all impurities may complex with native defects. Canonical defect models ignore complex formation and lump defects from multiple impurities into a single effective defect to reduce the number of associated reactions. To evaluate the importance of background impurities and defect complexes on HiTEC, a grand canonical defect model was developed with input from density functional theory calculations using hybrid exchange-correlation functionals. The influence of common background impurities and first-nearest-neighbor complexes with oxygen vacancies (vO) was studied for three doping cases: nominally undoped, donor doped, and acceptor doped SrTiO3. In each case, conductivity depended on the ensemble of impurity defects simulated, with the extent of the dependence governed by the character of the dominant impurity and its tendency to complex with vO. Agreement between simulated and measured conductivity profiles as a function of temperature and oxygen partial pressure improved significantly when background impurities were included in the nominally undoped case. Effects of the simulated impurities were reduced in the Nb and Al doped cases, as neither element formed complexes and both were present in concentrations well exceeding those of all other active impurities. The influence of individual impurities on HiTEC in SrTiO3 is isolated and discussed, motivating further experiments on singly doped SrTiO3.
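The grand-canonical logic of such defect models can be sketched with a toy two-defect system (the energies, charges, and temperature below are invented placeholders, not the paper's DFT inputs): charged-defect concentrations follow Boltzmann statistics in a formation energy that shifts linearly with the Fermi level, and the Fermi level is then fixed by charge neutrality.

```python
import math

kT = 0.12                       # eV, roughly 1400 K
N_SITES = 1.0                   # site density (normalized)

# (charge q, formation energy at E_F = 0); placeholder values standing in
# for a donor (e.g. an oxygen vacancy) and an acceptor-like impurity.
DEFECTS = [(+2, 0.5),
           (-1, 1.2)]

def concentration(q, e_form0, e_fermi):
    """Boltzmann concentration of a defect with charge q at Fermi level e_fermi."""
    return N_SITES * math.exp(-(e_form0 + q * e_fermi) / kT)

def net_charge(e_fermi):
    return sum(q * concentration(q, e0, e_fermi) for q, e0 in DEFECTS)

# net_charge is monotonically decreasing in E_F here, so bisect for neutrality
lo, hi = 0.0, 3.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if net_charge(mid) > 0.0:
        lo = mid
    else:
        hi = mid
e_f = 0.5 * (lo + hi)           # self-consistent Fermi level (eV)
```

The real model solves the same neutrality condition over many impurities, charge states, and vO complexes simultaneously, with DFT-derived formation energies.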
DOE Office of Scientific and Technical Information (OSTI.GOV)
Movahed, M. Sadegh; Khosravi, Shahram, E-mail: m.s.movahed@ipm.ir, E-mail: khosravi@ipm.ir
2011-03-01
In this paper we study the footprint of cosmic strings, as topological defects formed in the very early universe, on the cosmic microwave background radiation. We develop the method of level crossing analysis in the context of the well-known Kaiser-Stebbins phenomenon for exploring the signature of cosmic strings. We simulate a Gaussian map by using the best-fit parameters given by WMAP-7 and then superimpose cosmic string effects on it as incoherent and active fluctuations. In order to investigate the capability of our method to detect cosmic strings for various values of the tension, Gμ, a simulated pure Gaussian map is compared with one including cosmic strings. Based on the level crossing analysis, superimposed cosmic strings with Gμ ≳ 4 × 10⁻⁹ could be detected in a simulated map without instrumental noise at a resolution of R = 1'. In the presence of anticipated instrumental noise the lower bound increases to Gμ ≳ 5.8 × 10⁻⁹.
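The level crossing statistic itself is straightforward to compute; the sketch below is our one-dimensional illustration (not the authors' pipeline), counting threshold up-crossings of a smoothed Gaussian field of the kind a string component would perturb.

```python
import numpy as np

rng = np.random.default_rng(0)

def crossings(field, level):
    """Number of up-crossings of `level` along a 1-D field."""
    above = field > level
    return int(np.count_nonzero(~above[:-1] & above[1:]))

# Smooth Gaussian field: white noise convolved with a Gaussian kernel,
# a stand-in for a CMB map sliced along one direction.
noise = rng.standard_normal(20000)
kernel = np.exp(-0.5 * (np.arange(-30, 31) / 8.0) ** 2)
field = np.convolve(noise, kernel / kernel.sum(), mode="same")
field /= field.std()

n0 = crossings(field, 0.0)     # frequent crossings of the mean level
n2 = crossings(field, 2.0)     # rare crossings of a high threshold
```

A Kaiser-Stebbins step-like component added to `field` would shift these counts away from the Gaussian expectation, which is the detection signal exploited in the paper.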
NASA Astrophysics Data System (ADS)
Ots, Riinu; Young, Dominique E.; Vieno, Massimo; Xu, Lu; Dunmore, Rachel E.; Allan, James D.; Coe, Hugh; Williams, Leah R.; Herndon, Scott C.; Ng, Nga L.; Hamilton, Jacqueline F.; Bergström, Robert; Di Marco, Chiara; Nemitz, Eiko; Mackenzie, Ian A.; Kuenen, Jeroen J. P.; Green, David C.; Reis, Stefan; Heal, Mathew R.
2016-05-01
We present high-resolution (5 km × 5 km) atmospheric chemical transport model (ACTM) simulations of the impact of newly estimated traffic-related emissions on secondary organic aerosol (SOA) formation over the UK for 2012. Our simulations include additional diesel-related intermediate-volatility organic compound (IVOC) emissions derived directly from comprehensive field measurements at an urban background site in London during the 2012 Clean Air for London (ClearfLo) campaign. Our IVOC emissions are added proportionally to VOC emissions, as opposed to proportionally to primary organic aerosol (POA) as has been done by previous ACTM studies seeking to simulate the effects of these missing emissions. Modelled concentrations are evaluated against hourly and daily measurements of organic aerosol (OA) components derived from aerosol mass spectrometer (AMS) measurements also made during the ClearfLo campaign at three sites in the London area. According to the model simulations, diesel-related IVOCs can explain on average ˜ 30 % of the annual SOA in and around London. Furthermore, the 90th percentile of modelled daily SOA concentrations for the whole year is 3.8 µg m⁻³, constituting a notable addition to total particulate matter. More measurements of these precursors (currently not included in official emissions inventories) are recommended. During the period of concurrent measurements, SOA concentrations at the Detling rural background location east of London were greater than at the central London location. The model shows that this was caused by an intense pollution plume with a strong gradient of imported SOA passing over the rural location. This demonstrates the value of modelling for supporting the interpretation of measurements taken at different sites or for short durations.
NASA Astrophysics Data System (ADS)
Jha, V.; Kahre, M. A.
2017-12-01
The Mars atmosphere has low levels of dust during Northern Hemisphere (NH) spring and summer (the non-dusty season) and increased levels during NH autumn and winter (the dusty season). In the absence of regional or global storms, dust devils and local storms maintain a background minimum dust loading during the non-dusty season. While observational surveys and Global Climate Model (GCM) studies suggest that dust devils are likely to be major contributors to the background haze during NH spring and summer, a complete understanding of the relative contribution of dust devils and local dust storms has not yet been achieved. We present preliminary results from an investigation that focuses on the effects of radiatively active water ice clouds on dust lifting processes during these seasons. Water ice clouds are known to affect atmospheric temperatures directly by absorption and emission of thermal infrared radiation and indirectly through dynamical feedbacks. Our goal is to understand how clouds affect the contribution by local (wind stress) dust storms to the background dust haze during NH spring and summer. The primary tool for this work is the NASA Ames Mars GCM, which contains physical parameterizations for a fully interactive dust cycle. Three simulations that included wind stress dust lifting were executed for a period of 5 Martian years: a case that included no cloud formation, a case that included radiatively inert cloud formation and a case that included radiatively active cloud (RAC) formation. Results show that when radiatively active clouds are included, the clouds in the aphelion cloud belt radiatively heat the atmosphere aloft in the tropics (Figure 1). This heating produces a stronger overturning circulation, which in turn produces an enhanced low-level flow in the Hadley cell return branch. The stronger low-level flow drives higher surface stresses and increased dust lifting in those locations. 
We examine how realistic these simulated results are by comparing the spatial pattern of predicted wind stress lifting with a catalog of observed local storms. Better agreement is achieved in the radiatively active cloud case. These results suggest that wind stress lifting may contribute more to maintaining the background dust haze during NH spring and summer than what previous studies have shown.
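A threshold wind-stress lifting parameterization of the kind used in such GCMs can be sketched as follows (the constants are illustrative placeholders, not the NASA Ames values); because lifting is zero below threshold and grows nonlinearly above it, a modest cloud-induced increase in surface stress can produce a disproportionate increase in lifted dust.

```python
# Threshold wind-stress dust lifting: no dust is raised until the surface
# stress exceeds a threshold, after which the flux grows with the excess.
TAU_THRESHOLD = 0.0225   # N m^-2, illustrative lifting threshold
ALPHA = 1.0e-3           # flux per unit stress excess, illustrative

def dust_flux(tau):
    """Lifted dust flux for surface stress tau (illustrative units)."""
    if tau <= TAU_THRESHOLD:
        return 0.0
    return ALPHA * tau * (tau - TAU_THRESHOLD)

# A radiatively active cloud case that strengthens the Hadley return
# branch raises the stress only slightly, but lifting responds nonlinearly:
flux_no_cloud = dust_flux(0.023)
flux_rac = dust_flux(0.028)
```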
Modeling surface backgrounds from radon progeny plate-out
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perumpilly, G.; Guiseppe, V. E.; Snyder, N.
2013-08-08
The next generation of low-background detectors operating deep underground aims for unprecedentedly low levels of radioactive backgrounds. The surface deposition and subsequent implantation of radon progeny in detector materials will be a source of energetic background events. We investigate Monte Carlo and model-based simulations to understand the surface implantation profile of radon progeny. Depending on the material and region of interest of a rare event search, these partial energy depositions can be problematic. Motivated by the use of Ge crystals for the detection of neutrinoless double-beta decay, we wish to understand the detector response of surface backgrounds from radon progeny. We look at the simulation of surface decays using a validated implantation distribution based on nuclear recoils and a realistic surface texture. Results of the simulations and measured α spectra are presented.
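The origin of a degraded surface α spectrum can be illustrated with a toy Monte Carlo (not the validated model of the paper; the implantation depth scale, α energy, and stopping power below are rough assumptions): alphas emitted at shallow depth lose energy on their way out of the surface, producing a low-energy tail below the full-energy line.

```python
import random

random.seed(2)
E0 = 5.3            # MeV, e.g. the 210Po alpha line
DEDX = 1.0e-4       # MeV/nm, effective stopping power (assumed)
MAX_DEPTH = 100.0   # nm, assumed implantation range of the recoil

def escaped_energy():
    """Energy of an alpha leaving the surface, or None if it never exits."""
    depth = random.uniform(0.0, MAX_DEPTH)   # uniform implantation profile
    cos_t = random.uniform(-1.0, 1.0)        # isotropic emission direction
    if cos_t <= 0.0:
        return None                          # emitted inward: not detected
    path = depth / cos_t                     # chord length to the surface
    e = E0 - DEDX * path                     # continuous slowing-down loss
    return e if e > 0.0 else None

energies = [e for e in (escaped_energy() for _ in range(20000)) if e is not None]
frac_degraded = sum(e < E0 - 0.5 for e in energies) / len(energies)
```

Grazing-exit trajectories traverse long paths even from shallow depths, which is why the measured spectrum develops a tail rather than a simple shifted peak.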
ERIC Educational Resources Information Center
Bessent, E. Wailand; And Others
Provided in the manual are background material, problems, and worksheets designed for graduate students involved in a computer assisted instruction (CAI) approach to supervisor training. Included are a faculty handbook for a simulated school in a mythical community, a practice problem to familiarize the student with terminal operation, and eight…
NASA Technical Reports Server (NTRS)
Aikins, Jan
2005-01-01
Contents include the following: General Background and Introduction of Capability Roadmaps. Agency Objective. Strategic Planning Transformation. Advanced Planning Organizational Roles. Public Involvement in Strategic Planning. Strategic Roadmaps and Schedule. Capability Roadmaps and Schedule. Purpose of NRC Review. Capability Roadmap Development (Progress to Date).
The Virtual Brain: a simulator of primate brain network dynamics.
Sanz Leon, Paula; Knock, Stuart A; Woodman, M Marmaduke; Domide, Lia; Mersmann, Jochen; McIntosh, Anthony R; Jirsa, Viktor
2013-01-01
We present The Virtual Brain (TVB), a neuroinformatics platform for full brain network simulations using biologically realistic connectivity. This simulation environment enables the model-based inference of neurophysiological mechanisms across different brain scales that underlie the generation of macroscopic neuroimaging signals, including functional MRI (fMRI), EEG and MEG. Researchers from different backgrounds can benefit from an integrative software platform including a supporting framework for data management (generation, organization, storage, integration and sharing) and a simulation core written in Python. TVB allows the reproduction and evaluation of personalized configurations of the brain by using individual subject data. This personalization facilitates an exploration of the consequences of pathological changes in the system, permitting investigation of potential ways to counteract such unfavorable processes. The architecture of TVB supports interaction with MATLAB packages, for example, the well-known Brain Connectivity Toolbox. TVB can be used in a client-server configuration, such that it can be remotely accessed through the Internet thanks to its web-based HTML5, JS, and WebGL graphical user interface. TVB is also accessible as a standalone cross-platform Python library and application, and users can interact with the scientific core through the scripting interface IDLE, enabling easy modeling, development and debugging of the scientific kernel. This second interface makes TVB extensible by combining it with other libraries and modules developed by the Python scientific community. In this article, we describe the theoretical background and foundations that led to the development of TVB, the architecture and features of its major software components as well as potential neuroscience applications.
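The kind of large-scale network model TVB integrates can be caricatured in a few lines of Python (this is generic code, not TVB's actual API): each region is reduced to a simple phase oscillator, coupled through a weighted connectivity matrix standing in for a tractography-derived connectome.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 8                                        # number of brain regions
W = rng.random((N, N)) * (1 - np.eye(N))     # placeholder connectome weights
W /= W.sum(axis=1, keepdims=True)            # row-normalized in-strengths

def simulate(steps=5000, dt=0.01, k=0.5, omega=1.0):
    """Kuramoto phase oscillators as a minimal regional dynamics model."""
    theta = rng.uniform(0, 2 * np.pi, N)
    for _ in range(steps):
        # coupling[i] = sum_j W[i, j] * sin(theta[j] - theta[i])
        coupling = (W * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
        theta = theta + dt * (omega + k * coupling)
    return theta

theta = simulate()
# Order parameter: 1 = fully synchronized network, ~0 = incoherent
r = abs(np.exp(1j * theta).mean())
```

TVB's actual neural-mass models are richer (multiple state variables, conduction delays, stochastic inputs), but the structure, local dynamics driven through a connectivity matrix, is the same.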
Preparing for ICESat-2: Simulated Geolocated Photon Data for Cryospheric Data Products
NASA Astrophysics Data System (ADS)
Harbeck, K.; Neumann, T.; Lee, J.; Hancock, D.; Brenner, A. C.; Markus, T.
2017-12-01
ICESat-2 will carry NASA's next-generation laser altimeter, ATLAS (Advanced Topographic Laser Altimeter System), which is designed to measure changes in ice sheet height, sea ice freeboard, and vegetation canopy height. There is a critical need for data that simulate what certain ICESat-2 science data products will "look like" post-launch in order to aid the data product development process. There are several sources for simulated photon-counting lidar data, including data from NASA's MABEL (Multiple Altimeter Beam Experimental Lidar) instrument, and M-ATLAS (MABEL data that have been scaled geometrically and radiometrically to be more similar to those expected from ATLAS). From these sources, we are able to develop simulated granules of the geolocated photon cloud product, also referred to as ATL03. These simulated ATL03 granules can be further processed into the upper-level data products that report ice sheet height, sea ice freeboard, and vegetation canopy height. For ice sheet height (ATL06) and sea ice height (ATL07) simulations, both MABEL and M-ATLAS data products are used. M-ATLAS data use ATLAS engineering design cases for signal and background noise rates over certain surface types, and also provide large vertical windows of data for more accurate calculations of atmospheric background rates. MABEL data give a more accurate representation of background noise rates over areas of water (i.e., melt ponds, crevasses or sea ice leads) versus land or solid ice. Through a variety of data manipulation procedures, we provide a product that mimics the appearance and parameter characterization of ATL03 data granules.
There are three primary goals for generating this simulated ATL03 dataset: (1) allowing end users to become familiar with using the large photon cloud datasets that will be the primary science data product from ICESat-2, (2) the process ensures that ATL03 data can flow seamlessly through upper-level science data product algorithms, and (3) the process ensures parameter traceability through ATL03 and upper-level data products. We will present a summary of how simulated data products are generated, the cryospheric data product applications for this simulated data (specifically ice sheet height and sea ice freeboard), and where these simulated datasets are available to the ICESat-2 data user community.
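A toy version of such a simulated photon cloud is easy to construct (the rates, window size, and flat surface below are illustrative assumptions, not ATLAS design values): Gaussian-distributed surface-return photons embedded in a uniform background of solar-noise photons, from which even a coarse histogram recovers the surface.

```python
import numpy as np

rng = np.random.default_rng(4)
N_SHOTS = 1000
WINDOW = (-50.0, 50.0)        # m, vertical telemetry window
SURFACE_H = 0.0               # m, flat surface for simplicity
SIGNAL_PER_SHOT = 3.0         # mean signal photons per laser shot
BG_PER_SHOT = 2.0             # mean background photons per shot in window

heights = []
for _ in range(N_SHOTS):
    n_sig = rng.poisson(SIGNAL_PER_SHOT)
    heights.extend(rng.normal(SURFACE_H, 0.1, n_sig))   # surface return
    n_bg = rng.poisson(BG_PER_SHOT)
    heights.extend(rng.uniform(*WINDOW, n_bg))          # solar background
heights = np.asarray(heights)

# A coarse histogram-based surface finder recovers the surface bin
hist, edges = np.histogram(heights, bins=200, range=WINDOW)
surface_est = 0.5 * (edges[np.argmax(hist)] + edges[np.argmax(hist) + 1])
```

Varying the background rate by surface type (water vs. solid ice, as MABEL characterizes) is then just a matter of making `BG_PER_SHOT` along-track dependent.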
Benchmarking shielding simulations for an accelerator-driven spallation neutron source
Cherkashyna, Nataliia; Di Julio, Douglas D.; Panzner, Tobias; ...
2015-08-09
The shielding at an accelerator-driven spallation neutron facility plays a critical role in the performance of the neutron scattering instruments, the overall safety, and the total cost of the facility. Accurate simulation of shielding components is thus key for the design of upcoming facilities, such as the European Spallation Source (ESS), currently under construction in Lund, Sweden. In this paper, we present a comparative study between the measured and the simulated neutron background at the Swiss Spallation Neutron Source (SINQ), at the Paul Scherrer Institute (PSI), Villigen, Switzerland. The measurements were carried out at several positions along the SINQ monolith wall with the neutron dosimeter WENDI-2, which has a well-characterized response up to 5 GeV. The simulations were performed using the Monte Carlo radiation transport code Geant4, and include a complete transport from the proton beam to the measurement locations in a single calculation. Agreement between measurements and simulations is within about a factor of 2 for the points where the measured radiation dose is above the background level, which is a satisfactory result for simulations spanning many energy regimes, different physics processes and transport through several meters of shielding materials. The neutrons contributing to the radiation field emanating from the monolith were confirmed to originate from neutrons with energies above 1 MeV in the target region. The current work validates Geant4 as being well suited for deep-shielding calculations at accelerator-based spallation sources. We also extrapolate what the simulated flux levels might imply for short (several tens of meters) instruments at ESS.
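At the back-of-envelope level, deep-shielding transmission is governed by material attenuation lengths; the sketch below (rough, assumed length scales for illustration, in no way a substitute for the Geant4 transport) shows why several meters of steel and concrete suppress the high-energy neutron flux by many orders of magnitude.

```python
import math

# Layered shield described as (name, thickness in m, attenuation length in m).
# The attenuation lengths are coarse assumed scales for ~100 MeV neutrons.
LAYERS = [("steel", 2.0, 0.20),
          ("concrete", 3.0, 0.40)]

def transmitted_fraction(layers):
    """Exponential attenuation through a stack of shielding layers."""
    mean_free_paths = sum(thickness / lam for _name, thickness, lam in layers)
    return math.exp(-mean_free_paths)

f = transmitted_fraction(LAYERS)   # ~17.5 mean free paths in total
```

Full Monte Carlo transport is still required because secondary production, spectral hardening, and streaming paths all break this simple exponential picture, which is precisely what the WENDI-2 benchmark tests.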
NASA Technical Reports Server (NTRS)
Yim, John T.; Burt, Jonathan M.
2015-01-01
The background gas in a vacuum facility for electric propulsion ground testing is examined in detail through a series of cold flow simulations using a direct simulation Monte Carlo (DSMC) code. The focus here is on the background gas itself, its structure and characteristics, rather than assessing its interaction and impact on thruster operation. The background gas, which is often incorrectly characterized as uniform, is found to have a notable velocity within a test facility. The gas velocity has an impact on the proper measurement of pressure and the calculation of ingestion flux to a thruster. There are also considerations for best practices for tests that involve the introduction of supplemental gas flows to artificially increase the background pressure. All of these effects need to be accounted for to properly characterize the operation of electric propulsion thrusters across different ground test vacuum facilities.
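The effect of a drifting, non-uniform background gas on ingestion can be quantified with standard kinetic theory (our illustration, not the paper's DSMC code): the one-sided flux of a drifting Maxwellian onto an inlet exceeds the stationary-gas estimate n·v̄/4 that a uniform-gas assumption would give.

```python
import math

def ingestion_flux(n, temp_k, mass_kg, drift_ms):
    """One-sided number flux of a drifting Maxwellian onto a plane (m^-2 s^-1)."""
    k_b = 1.380649e-23
    v_mp = math.sqrt(2.0 * k_b * temp_k / mass_kg)   # most probable speed
    s = drift_ms / v_mp                               # speed ratio
    return (n * v_mp / (2.0 * math.sqrt(math.pi))) * (
        math.exp(-s * s) + math.sqrt(math.pi) * s * (1.0 + math.erf(s)))

M_XE = 2.18e-25        # kg, xenon atom
n = 1.0e18             # m^-3, illustrative facility background density
flux_static = ingestion_flux(n, 300.0, M_XE, 0.0)     # uniform-gas estimate
flux_drift = ingestion_flux(n, 300.0, M_XE, 100.0)    # 100 m/s facility flow
```

For xenon at 300 K the mean thermal speed is only a couple of hundred m/s, so a facility flow of order 100 m/s is a leading-order correction, not a perturbation, to the ingestion estimate.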
Libin, Alexander; Lauderdale, Manon; Millo, Yuri; Shamloo, Christine; Spencer, Rachel; Green, Brad; Donnellan, Joyce; Wellesley, Christine; Groah, Suzanne
2010-04-01
Simulation- and video game-based role-playing techniques have been proven effective in changing behavior and enhancing positive decision making in a variety of professional settings, including education, the military, and health care. Although the need for developing assessment frameworks for learning outcomes has been clearly defined, there is a significant gap between the variety of existing multimedia-based instruction and technology-mediated learning systems and the number of reliable assessment algorithms. This study, based on a mixed methodology research design, aims to develop an embedded assessment algorithm, a Knowledge Assessment Module (NOTE), to capture both user interaction with the educational tool and knowledge gained from the training. The study is regarded as the first step in developing an assessment framework for a multimedia educational tool for health care professionals, Anatomy of Care (AOC), that utilizes Virtual Experience Immersive Learning Simulation (VEILS) technology. Ninety health care personnel of various backgrounds took part in online AOC training, choosing from five possible scenarios presenting difficult situations of everyday care. The results suggest that although the simulation-based training tool demonstrated partial effectiveness in improving learners' decision-making capacity, a differential learner-oriented approach might be more effective and capable of synchronizing educational efforts with identifiable relevant individual factors such as sociobehavioral profile and professional background.
Spatiotemporal models for the simulation of infrared backgrounds
NASA Astrophysics Data System (ADS)
Wilkes, Don M.; Cadzow, James A.; Peters, R. Alan, II; Li, Xingkang
1992-09-01
It is highly desirable for designers of automatic target recognizers (ATRs) to be able to test their algorithms on targets superimposed on a wide variety of background imagery. Background imagery in the infrared spectrum is expensive to gather from real sources; consequently, there is a need for accurate models for producing synthetic IR background imagery. We have developed a model for such imagery that will do the following: given a real infrared background image, generate another image, distinctly different from the one given, that has the same general visual characteristics as well as the first- and second-order statistics of the original image. The proposed model consists of a finite impulse response (FIR) kernel convolved with an excitation function, with histogram modification applied to the final solution. A procedure for deriving the FIR kernel using a signal enhancement algorithm has been developed, and the histogram modification step is a simple memoryless nonlinear mapping that imposes the first-order statistics of the original image onto the synthetic one; thus the overall model is a linear system cascaded with a memoryless nonlinearity. It has been found that the excitation function relates to the placement of features in the image, the FIR kernel controls the sharpness of the edges and the global spectrum of the image, and the histogram controls the basic coloration of the image. A drawback to this method of simulating IR backgrounds is that a database of actual background images must be collected in order to produce accurate FIR and histogram models. If this database must include images of all types of backgrounds obtained at all times of the day and all times of the year, the size of the database would be prohibitive. In this paper we propose improvements to the model described above that enable time-dependent modeling of the IR background.
This approach can greatly reduce the number of actual IR backgrounds that are required to produce a sufficiently accurate mathematical model for synthesizing a similar IR background for different times of the day. Original and synthetic IR backgrounds will be presented. Previous research in simulating IR backgrounds was performed by Strenzwilk, et al., Botkin, et al., and Rapp. The most recent work of Strenzwilk, et al. was based on the use of one-dimensional ARMA models for synthesizing the images. Their results were able to retain the global statistical and spectral behavior of the original image, but the synthetic image was not visually very similar to the original. The research presented in this paper is the result of an attempt to improve upon their results, and represents a significant improvement in quality over previously obtained results.
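The core of the synthesis model described above fits in a few lines (a self-contained sketch in which the "real" reference image is itself synthetic, and the FIR kernel is a simple Gaussian rather than one derived by signal enhancement): an excitation field is passed through a separable FIR filter, and histogram matching then imposes the reference image's first-order statistics.

```python
import numpy as np

rng = np.random.default_rng(5)

def histogram_match(synthetic, reference):
    """Memoryless nonlinearity: map synthetic pixel ranks onto reference values."""
    order = np.argsort(synthetic, axis=None)
    matched = np.empty_like(synthetic).reshape(-1)
    matched[order] = np.sort(reference, axis=None)
    return matched.reshape(synthetic.shape)

# Stand-in "real" IR background and a separable 2-D smoothing FIR kernel
original = rng.gamma(shape=4.0, scale=10.0, size=(64, 64))
kernel_1d = np.exp(-0.5 * (np.arange(-4, 5) / 2.0) ** 2)
kernel_1d /= kernel_1d.sum()

# Linear system: white-noise excitation filtered along rows, then columns
excitation = rng.standard_normal((64, 64))
filtered = np.apply_along_axis(
    lambda r: np.convolve(r, kernel_1d, mode="same"), 1, excitation)
filtered = np.apply_along_axis(
    lambda c: np.convolve(c, kernel_1d, mode="same"), 0, filtered)
synthetic = histogram_match(filtered, original)
```

By construction the synthetic image has exactly the original's pixel-value distribution, while the kernel fixes the spatial correlation (second-order) structure, the cascade of a linear system with a memoryless nonlinearity described in the abstract.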
NASA Astrophysics Data System (ADS)
Odaka, Hirokazu; Asai, Makoto; Hagino, Kouichi; Koi, Tatsumi; Madejski, Greg; Mizuno, Tsunefumi; Ohno, Masanori; Saito, Shinya; Sato, Tamotsu; Wright, Dennis H.; Enoto, Teruaki; Fukazawa, Yasushi; Hayashi, Katsuhiro; Kataoka, Jun; Katsuta, Junichiro; Kawaharada, Madoka; Kobayashi, Shogo B.; Kokubun, Motohide; Laurent, Philippe; Lebrun, Francois; Limousin, Olivier; Maier, Daniel; Makishima, Kazuo; Mimura, Taketo; Miyake, Katsuma; Mori, Kunishiro; Murakami, Hiroaki; Nakamori, Takeshi; Nakano, Toshio; Nakazawa, Kazuhiro; Noda, Hirofumi; Ohta, Masayuki; Ozaki, Masanobu; Sato, Goro; Sato, Rie; Tajima, Hiroyasu; Takahashi, Hiromitsu; Takahashi, Tadayuki; Takeda, Shin'ichiro; Tanaka, Takaaki; Tanaka, Yasuyuki; Terada, Yukikatsu; Uchiyama, Hideki; Uchiyama, Yasunobu; Watanabe, Shin; Yamaoka, Kazutaka; Yasuda, Tetsuya; Yatsu, Yoichi; Yuasa, Takayuki; Zoglauer, Andreas
2018-05-01
Hard X-ray astronomical observatories in orbit suffer from a significant amount of background due to radioactivation induced by cosmic-ray protons and/or geomagnetically trapped protons. Within the framework of a full Monte Carlo simulation, we present modeling of in-orbit instrumental background which is dominated by radioactivation. To reduce the computation time required by straightforward simulations of delayed emissions from activated isotopes, we insert a semi-analytical calculation that converts production probabilities of radioactive isotopes by interaction of the primary protons into decay rates at measurement time of all secondary isotopes. Therefore, our simulation method is separated into three steps: (1) simulation of isotope production, (2) semi-analytical conversion to decay rates, and (3) simulation of decays of the isotopes at measurement time. This method is verified by a simple setup that has a CdTe semiconductor detector, and shows a 100-fold improvement in efficiency over the straightforward simulation. To demonstrate its experimental performance, the simulation framework was tested against data measured with a CdTe sensor in the Hard X-ray Imager onboard the Hitomi X-ray Astronomy Satellite, which was put into a low Earth orbit with an altitude of 570 km and an inclination of 31°, and thus experienced a large amount of irradiation from geomagnetically trapped protons during its passages through the South Atlantic Anomaly. The simulation is able to treat full histories of the proton irradiation and multiple measurement windows. The simulation results agree very well with the measured data, showing that the measured background is well described by the combination of proton-induced radioactivation of the CdTe detector itself and thick Bi4Ge3O12 scintillator shields, leakage of cosmic X-ray background and albedo gamma-ray radiation, and emissions from naturally contaminated isotopes in the detector system.
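The semi-analytical middle step can be illustrated for the simplest case of one isotope and one irradiation window (our simplification with invented numbers, not the paper's multi-isotope, multi-window scheme): constant production during irradiation builds activity toward saturation, which then decays freely until the measurement.

```python
import math

def activity_at_measurement(p_rate, half_life_s, t_irr_s, t_cool_s):
    """Activity (decays/s) at measurement time after one irradiation window.

    Solves dN/dt = P - lambda*N during irradiation, then free decay.
    """
    lam = math.log(2.0) / half_life_s
    n_end = (p_rate / lam) * (1.0 - math.exp(-lam * t_irr_s))  # buildup
    n_meas = n_end * math.exp(-lam * t_cool_s)                 # cooling
    return lam * n_meas                                        # A = lambda * N

# Illustrative numbers: a ~600 s SAA passage, a 1-hour half-life isotope,
# measured 30 minutes after the passage.
a = activity_at_measurement(p_rate=100.0, half_life_s=3600.0,
                            t_irr_s=600.0, t_cool_s=1800.0)
```

The paper's scheme sums such closed-form contributions over every isotope in the decay chains and every irradiation window of the orbit history, which is what removes the need to simulate the delayed emission step by step.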
Soffientini, Chiara D; De Bernardi, Elisabetta; Casati, Rosangela; Baselli, Giuseppe; Zito, Felicia
2017-01-01
Design, realization, scan, and characterization of a phantom for PET Automatic Segmentation (PET-AS) assessment are presented. Radioactive zeolites immersed in a radioactive heterogeneous background simulate realistic wall-less lesions with known irregular shape and known homogeneous or heterogeneous internal activity. Three different zeolite families were evaluated in terms of radioactive uptake homogeneity, necessary to define activity and contour ground truth. Heterogeneous lesions were simulated by the perfect matching of two portions of a broken zeolite, soaked in two different ¹⁸F-FDG radioactive solutions. Heterogeneous backgrounds were obtained with tissue paper balls and sponge pieces immersed into radioactive solutions. Natural clinoptilolite proved to be the most suitable zeolite for the construction of artificial objects mimicking homogeneous and heterogeneous uptakes in ¹⁸F-FDG PET lesions. Heterogeneous backgrounds showed a coefficient of variation equal to 269% and 443% of that of a uniform radioactive solution. The assembled phantom included eight lesions with volumes ranging from 1.86 to 7.24 ml and lesion-to-background contrasts ranging from 4.8:1 to 21.7:1. A novel phantom for the evaluation of PET-AS algorithms was developed. It is provided with both reference contours and activity ground truth, and it covers a wide range of volumes and lesion-to-background contrasts. The dataset is open to the community of PET-AS developers and users. © 2016 American Association of Physicists in Medicine.
A collision scheme for hybrid fluid-particle simulation of plasmas
NASA Astrophysics Data System (ADS)
Nguyen, Christine; Lim, Chul-Hyun; Verboncoeur, John
2006-10-01
Desorption phenomena at the wall of a tokamak can lead to the introduction of impurities at the edge of a thermonuclear plasma. In particular, the use of carbon as a constituent of the tokamak wall, as planned for ITER, requires the study of carbon and hydrocarbon transport in the plasma, including an understanding of collisional interaction with the plasma. These collisions can produce new hydrocarbons, hydrogen, secondary electrons and so on. Computational modeling is a primary tool for studying these phenomena. XOOPIC [1] and OOPD1 are widely used computer modeling tools for the simulation of plasmas. Both are particle-type codes. Particle simulation gives more kinetic information than fluid simulation, but requires more computation time. To reduce this disadvantage, hybrid simulation has been developed and applied to the modeling of collisions. Present particle simulation tools such as XOOPIC and OOPD1 employ a Monte Carlo model for the collisions between particle species and a neutral background gas defined by its temperature and pressure. In fluid-particle hybrid plasma models, collisions include combinations of particle and fluid interactions categorized by projectile-target pairing: particle-particle, particle-fluid, and fluid-fluid. For verification of this hybrid collision scheme, we compare simulation results to analytic solutions for classical plasma models. [1] Verboncoeur et al., Comput. Phys. Comm. 87, 199 (1995).
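The Monte Carlo collision step against a neutral background gas described above is commonly implemented by sampling a per-step collision probability from the gas density and cross-section. A minimal sketch under that assumption (this is not the XOOPIC/OOPD1 code; the function name and the numerical values are illustrative only):

```python
import math
import random

def collides(speed, sigma, n_gas, dt, rng=random):
    """Monte Carlo test: does a particle with |v| = speed collide with the
    neutral background (density n_gas, cross-section sigma) during dt?

    Collision probability over one step: P = 1 - exp(-n_gas * sigma * speed * dt).
    """
    p = 1.0 - math.exp(-n_gas * sigma * speed * dt)
    return rng.random() < p

# Example: estimate the collision fraction over one step and compare it with
# the analytic probability (illustrative SI-like values).
random.seed(1)
n_gas, sigma, speed, dt = 3.3e20, 5e-20, 1e5, 1e-8
p_exact = 1.0 - math.exp(-n_gas * sigma * speed * dt)
hits = sum(collides(speed, sigma, n_gas, dt) for _ in range(100_000))
p_mc = hits / 100_000
```

In a full code the colliding particles would then be scattered and any reaction products (e.g. new hydrocarbons or secondary electrons) added to the particle lists.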
Apollo 16 astronauts in Apollo Command Module Mission Simulator
NASA Technical Reports Server (NTRS)
1972-01-01
Astronaut Thomas K. Mattingly II, command module pilot of the Apollo 16 lunar landing mission, participates in extravehicular activity (EVA) training in bldg 5 at the Manned Spacecraft Center (MSC). In the right background is Astronaut Charles M. Duke Jr., lunar module pilot. They are inside the Apollo Command Module Mission Simulator (31046); Mattingly (right foreground) and Duke (right background) in the Apollo Command Module Mission Simulator for EVA simulation and training. Astronaut John W. Young, commander, can be seen in the left background (31047).
Joucla, Sébastien; Franconville, Romain; Pippow, Andreas; Kloppenburg, Peter; Pouzat, Christophe
2013-08-01
Calcium imaging has become a routine technique in neuroscience for subcellular to network level investigations. The fast progress in the development of new indicators and imaging techniques calls for dedicated, reliable analysis methods. In particular, efficient and quantitative background fluorescence subtraction routines would be beneficial to most of the calcium imaging research field. A background-subtracted fluorescence transient estimation method that does not require any independent background measurement is therefore developed. This method is based on a fluorescence model fitted to single-trial data using a classical nonlinear regression approach. The model includes an appropriate probabilistic description of the acquisition system's noise, leading to accurate confidence intervals on all quantities of interest (background fluorescence, normalized background-subtracted fluorescence time course) when background fluorescence is homogeneous. An automatic procedure detecting background inhomogeneities inside the region of interest is also developed and is shown to be efficient on simulated data. The implementation and performance of the proposed method on experimental recordings from the mouse hypothalamus are presented in detail. This method, which applies to both single-cell and bulk-stained tissue recordings, should help improve the statistical comparison of fluorescence calcium signals between experiments and studies. Copyright © 2013 Elsevier Ltd. All rights reserved.
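The single-trial regression idea above can be illustrated on synthetic data. This sketch fits a simple mono-exponential model F(t) = B + A·exp(-t/τ) by grid-searching τ and solving for B and A by linear least squares; the model, the Gaussian noise stand-in, and the fitting strategy are illustrative assumptions, not the paper's full probabilistic noise model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic single-trial transient: background B plus a decaying response,
# with additive noise standing in for the camera noise model.
B_true, A_true, tau_true = 100.0, 40.0, 0.5
t = np.linspace(0.0, 3.0, 300)
f = B_true + A_true * np.exp(-t / tau_true) + rng.normal(0.0, 1.0, t.size)

# Fit F(t) = B + A*exp(-t/tau): grid-search tau, solve B and A linearly.
best = None
for tau in np.linspace(0.05, 2.0, 400):
    X = np.column_stack([np.ones_like(t), np.exp(-t / tau)])
    coef, res, *_ = np.linalg.lstsq(X, f, rcond=None)
    sse = float(res[0]) if res.size else float(((X @ coef - f) ** 2).sum())
    if best is None or sse < best[0]:
        best = (sse, coef[0], coef[1], tau)

_, B_fit, A_fit, tau_fit = best
dff = (f - B_fit) / B_fit      # background-subtracted, normalized transient
```

The last line mirrors the paper's quantity of interest: a normalized transient obtained without any independent background measurement.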
Combat Simulation Using Breach Computer Language
1979-09-01
…simulation and weapon system analysis computer language. Two types of models were constructed: a stochastic duel and a dynamic engagement model. The … duel model validates the BREACH approach by comparing results with mathematical solutions. The dynamic model shows the capability of the BREACH…
NASA Astrophysics Data System (ADS)
Fiore, A. M.; Lin, M.; Cooper, O. R.; Horowitz, L. W.; Naik, V.; Levy, H.; Langford, A. O.; Johnson, B. J.; Oltmans, S. J.; Senff, C. J.
2011-12-01
As the National Ambient Air Quality Standard (NAAQS) for ozone (O3) is lowered, it pushes closer to policy-relevant background levels (O3 concentrations that would exist in the absence of North American anthropogenic emissions), making attainment more difficult with local controls. We quantify the Asian and stratospheric components of this North American background, with a primary focus on the western United States. Prior work has identified this region as a hotspot for deep stratospheric intrusions in spring. We conduct global simulations at 200 km and 50 km horizontal resolution with the GFDL AM3 model, including a stratospheric O3 tracer and two sensitivity simulations with anthropogenic emissions from Asia and North America turned off. The model is evaluated with a suite of in situ and satellite measurements during the NOAA CalNex campaign (May-June 2010). The model reproduces the principal features in the observed surface to near-tropopause distribution of O3 along the California coast, including its latitudinal variation and the development of regional high-O3 episodes. Four deep tropopause folds are diagnosed, and we find that the remnants of these stratospheric intrusions are transported to the surface in Southern California and the western U.S. Rocky Mountains, contributing 10-30 ppbv positive anomalies relative to the simulated campaign-mean stratospheric component in the model surface layer. We further examine the contribution of the North American background, including its stratospheric and Asian components, to the entire distribution of observed maximum daily 8-hour average (MDA8) O3 at 12 high-elevation CASTNet sites in the Mountain West. We find that the stratospheric O3 tracer constitutes 50% of the North American background, and can enhance surface MDA8 O3 by 20 ppbv when observed surface O3 is in the range of 60-80 ppbv.
Our analysis highlights the potential for natural sources such as deep stratospheric intrusions to contribute to high surface O3 episodes in the western U.S., representing a major challenge if the NAAQS were to be tightened. We further demonstrate the potential for using satellite (AIRS and OMI) measurements of total column O3 to develop space-based criteria to define these exceptional events in support of regional air quality management.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Warburton, Thomas Karl
2017-01-01
The Deep Underground Neutrino Experiment (DUNE) is a next-generation neutrino experiment which will be built at the Sanford Underground Research Facility (SURF), and will receive a wide-band neutrino beam from Fermilab, 1300 km away. At this baseline DUNE will be able to study many of the properties of neutrino mixing, including the neutrino mass hierarchy and the value of the CP-violating complex phase (δCP). DUNE will utilise Liquid Argon Time Projection Chamber (LArTPC) technology, and the Far Detector (FD) will consist of four modules, each containing 17.1 kt of LAr with a fiducial mass of around 10 kt. Each of these FD modules represents around an order of magnitude increase in size when compared to existing LArTPC experiments. The 35 ton detector is the first DUNE prototype for the single-phase LAr design of the FD. There were two running periods, one from November 2013 to February 2014, and a second from November 2015 to March 2016. During the second running period, a system of TPCs was installed and cosmic-ray data were collected. A method of particle identification was developed using simulations, though this was not applied to the data due to the higher than expected noise level. A new method of determining the interaction time of a track, using the effects of longitudinal diffusion, was developed using the cosmic-ray data. A camera system was also installed in the detector for monitoring purposes and to look for high-voltage breakdowns. Simulations concerning the muon-induced background rate to nucleon decay are performed, following the incorporation of the MUon Simulations UNderground (MUSUN) generator into the DUNE software framework. A series of cuts based on Monte Carlo truth information is developed, designed to reject simulated background events whilst preserving simulated signal events in the n → K⁺ + e⁻ decay channel.
No background events survive the application of these cuts in a sample of 2 × 10⁹ muons, representing 401.6 years of detector live time. This corresponds to an annual background rate of < 0.44 events·Mt⁻¹·year⁻¹ at 90% confidence, using a fiducial mass of 13.8 kt.
The Development and Comparison of Molecular Dynamics Simulation and Monte Carlo Simulation
NASA Astrophysics Data System (ADS)
Chen, Jundong
2018-03-01
Molecular dynamics is an interdisciplinary technique that combines physics, mathematics and chemistry. It is a computer simulation method and a powerful tool for studying condensed matter systems. The technique yields not only the trajectories of the atoms but also the microscopic details of the atomic motion. By studying the numerical integration algorithms used in molecular dynamics simulation, we can analyze the microstructure, the motion of particles and their relationship to macroscopic material behavior, and more conveniently study the relationship between interatomic interactions and macroscopic properties. Monte Carlo simulation, like molecular dynamics, is a tool for studying the nature of molecules and particles at the microscopic scale. In this paper, the theoretical background of computer numerical simulation is introduced, and the specific numerical integration methods are summarized, including the Verlet, leap-frog and velocity Verlet methods. The method and principle of Monte Carlo simulation are also introduced. Finally, the similarities and differences between Monte Carlo simulation and molecular dynamics simulation are discussed.
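Of the integrators named above, velocity Verlet is the easiest to state compactly: a half-step velocity kick, a full position drift, a force re-evaluation, and a second half-step kick. A minimal sketch (the harmonic-oscillator test case and the function signature are illustrative, not drawn from the paper):

```python
import math

def velocity_verlet(x, v, accel, dt, n_steps):
    """Velocity Verlet integration: kick-drift-kick splitting, which is
    time-reversible and has good long-term energy behavior."""
    a = accel(x)
    for _ in range(n_steps):
        v_half = v + 0.5 * dt * a      # half-step velocity kick
        x = x + dt * v_half            # full-step position drift
        a = accel(x)                   # force at the new position
        v = v_half + 0.5 * dt * a      # second half-step kick
    return x, v

# Example: harmonic oscillator (a = -x), integrated over one period T = 2*pi.
x0, v0, dt = 1.0, 0.0, 0.001
x1, v1 = velocity_verlet(x0, v0, lambda x: -x, dt, round(2 * math.pi / dt))
```

After one period the trajectory returns close to its starting point and the total energy 0.5·(x² + v²) is conserved to O(dt²), which is the property that makes this integrator a standard choice in molecular dynamics.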
NASA Astrophysics Data System (ADS)
Lu, Wei; Sun, Jianfeng; Hou, Peipei; Xu, Qian; Xi, Yueli; Zhou, Yu; Zhu, Funan; Liu, Liren
2017-08-01
Performance of satellite laser communications between GEO and LEO satellites can be degraded by background light noise appearing in the field of view due to sunlight, planets and some comets. Such influences should be studied on a ground testing platform before space application. In this paper, we introduce a simulator that reproduces realistic background light noise in the space environment during laser-beam data exchange between two distant satellites. The simulator can reproduce not only the effect of a multi-wavelength spectrum, but also adjustable field-of-view angles, a large range of adjustable optical power, and adjustable deflection speeds of the light noise in the space environment. We integrate these functions into a small, compact device for easily mobile use. Software control via personal computer is also provided to adjust these functions arbitrarily.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reckinger, Scott James; Livescu, Daniel; Vasilyev, Oleg V.
A comprehensive numerical methodology has been developed that handles the challenges introduced by the compressible nature of Rayleigh-Taylor instability (RTI) systems, which include sharp interfacial density gradients on strongly stratified background states, acoustic wave generation and removal at computational boundaries, and stratification-dependent vorticity production. The computational framework is used to simulate two-dimensional single-mode RTI to extreme late times for a wide range of flow compressibility and variable-density effects. The results show that flow compressibility acts to reduce the growth of RTI for low Atwood numbers, as predicted from linear stability analysis.
Atmospheric effects in multispectral remote sensor data
NASA Technical Reports Server (NTRS)
Turner, R. E.
1975-01-01
The problem of radiometric variations in multispectral remote sensing data which occur as a result of a change in geometric and environmental factors is studied. The case of spatially varying atmospheres is considered and the effect of atmospheric scattering is analyzed for realistic conditions. Emphasis is placed upon a simulation of LANDSAT spectral data for agricultural investigations over the United States. The effect of the target-background interaction is thoroughly analyzed in terms of various atmospheric states, geometric parameters, and target-background materials. Results clearly demonstrate that variable atmospheres can alter the classification accuracy and that the presence of various backgrounds can change the effective target radiance by a significant amount. A failure to include these effects in multispectral data analysis will result in a decrease in the classification accuracy.
USDA-ARS?s Scientific Manuscript database
Background: Accurate determination of food-borne pathogen serotype and genotype information is important for disease surveillance and outbreak source tracking. E. coli serotype O157:H7 and non-O157 of Shiga toxin-producing E. coli (STEC) serogroups, including O26, O45, O103, O111, O121, O145 (top ...
A New Unsteady Model for Dense Cloud Cavitation in Cryogenic Fluids
NASA Technical Reports Server (NTRS)
Hosangadi, Ashvin; Ahuja, Vineet
2005-01-01
Contents include the following: background on thermal effects in cavitation; physical properties of hydrogen; multi-phase cavitation with thermal effect; solution procedure; cavitation model overview; cavitation source terms; new cavitation model; source term for bubble growth; one-equation LES model; unsteady ogive simulations (liquid nitrogen); unsteady incompressible flow in a pipe; time-averaged cavity length for the NACA15 flowfield.
Texture segregation, surface representation and figure-ground separation.
Grossberg, S; Pessoa, L
1998-09-01
A widespread view is that most texture segregation can be accounted for by differences in the spatial frequency content of texture regions. Evidence from both psychophysical and physiological studies indicates, however, that beyond these early filtering stages, there are stages of 3-D boundary segmentation and surface representation that are used to segregate textures. Chromatic segregation of element-arrangement patterns, as studied by Beck and colleagues, cannot be completely explained by the filtering mechanisms previously employed to account for achromatic segregation. An element-arrangement pattern is composed of two types of elements that are arranged differently in different image regions (e.g. vertically on top and diagonally on the bottom). FACADE theory mechanisms that have previously been used to explain data about 3-D vision and figure-ground separation are here used to simulate chromatic texture segregation data, including data with equiluminant elements on dark or light homogeneous backgrounds, or backgrounds composed of vertical and horizontal dark or light stripes, or horizontal notched stripes. These data include the fact that segregation of patterns composed of red and blue squares decreases with increasing luminance of the interspaces. Asymmetric segregation properties under 3-D viewing conditions with the equiluminant elements close or far are also simulated. Two key model properties are a spatial impenetrability property that inhibits boundary grouping across regions with non-collinear texture elements and a boundary-surface consistency property that uses feedback between boundary and surface representations to eliminate spurious boundary groupings and separate figures from their backgrounds.
A dual-waveband dynamic IR scene projector based on DMD
NASA Astrophysics Data System (ADS)
Hu, Yu; Zheng, Ya-wei; Gao, Jiao-bo; Sun, Ke-feng; Li, Jun-na; Zhang, Lei; Zhang, Fang
2016-10-01
Infrared scene simulation systems can simulate various objects and backgrounds to perform dynamic tests and evaluate EO detection systems in hardware-in-the-loop testing. The basic structure of a dual-waveband dynamic IR scene projector is introduced in this paper. The system's core device is an IR Digital Micro-mirror Device (DMD), and the radiation source is a miniature high-temperature IR plane blackbody. An IR collimation optical system whose transmission range includes 3-5 μm and 8-12 μm is designed as the projection optical system. Scene simulation software was developed with Visual C++ and Vega software tools, and a software flow chart is presented. The parameters and testing results of the system are given, and the system was applied with satisfactory performance in IR imaging simulation testing.
Electron backscattering simulation in Geant4
NASA Astrophysics Data System (ADS)
Dondero, Paolo; Mantero, Alfonso; Ivanchencko, Vladimir; Lotti, Simone; Mineo, Teresa; Fioretti, Valentina
2018-06-01
The backscattering of electrons is a key phenomenon in several physics applications, ranging from medical therapy to space, including AREMBES, the new ESA simulation framework for radiation background effects. The importance of properly reproducing this complex interaction has grown considerably in recent years, and the Geant4 Monte Carlo simulation toolkit, recently upgraded to version 10.3, is able to comply with the AREMBES requirements in a wide energy range. In this study a validation of the Geant4 electron backscattering models is performed with respect to several experimental data sets. In addition, a selection of the most recent validation results on electron scattering processes is presented. Our analysis shows good agreement between simulations and data from several experiments, confirming the Geant4 electron backscattering models to be robust and reliable down to a few tens of electronvolts.
Can We Predict CME Deflections Based on Solar Magnetic Field Configuration Alone?
NASA Astrophysics Data System (ADS)
Kay, C.; Opher, M.; Evans, R. M.
2013-12-01
Accurate space weather forecasting requires knowledge of the trajectory of coronal mass ejections (CMEs), including predicting CME deflections close to the Sun and through interplanetary space. Deflections of CMEs occur due to variations in the background magnetic field or solar wind speed, magnetic reconnection, and interactions with other CMEs. Using our newly developed model of CME deflections due to gradients in the background solar magnetic field, ForeCAT (Kay et al. 2013), we explore the questions: (a) do all simulated CMEs ultimately deflect to the minimum in the background solar magnetic field? (b) does the majority of the deflection occur in the lower corona below 4 Rs? ForeCAT does not include temporal variations in the magnetic field of active regions (ARs), spatial variations in the background solar wind speed, magnetic reconnection, or interactions with other CMEs. Therefore we focus on the effects of the steady state solar magnetic field. We explore two different Carrington Rotations (CRs): CR 2029 (April-May 2005) and CR 2077 (November-December 2008). Little is known about how the density and magnetic field fall with distance in the lower corona. We consider four density models derived from observations (Chen 1996, Mann et al. 2003, Guhathakurta et al. 2006, Leblanc et al. 1996) and two magnetic field models (PFSS and a scaled model). ForeCAT includes drag resulting from both CME propagation and deflection through the background solar wind. We vary the drag coefficient to explore the effect of drag on the deflection at 1 AU.
Twelve tips for a successful interprofessional team-based high-fidelity simulation education session
Bould, M. Dylan; Layat Burn, Carine; Reeves, Scott
2014-01-01
Simulation-based education allows experiential learning without risk to patients. Interprofessional education aims to provide opportunities to different professions for learning how to work effectively together. Interprofessional simulation-based education presents many challenges, including the logistics of setting up the session and providing effective feedback to participants with different backgrounds and mental models. This paper aims to provide educators with a series of practical and pedagogical tips for designing, implementing, assessing, and evaluating a successful interprofessional team-based simulation session. The paper is organized in the sequence that an educator might use in developing an interprofessional simulation-based education session. Collectively, this paper provides guidance from determining interprofessional learning objectives and curricular design to program evaluation. With a better understanding of the concepts and pedagogical methods underlying interprofessional education and simulation, educators will be able to create conditions for a unique educational experience where individuals learn with and from other specialties and professions in a controlled, safe environment. PMID:25023765
NASA Astrophysics Data System (ADS)
Riquelme, Mario A.; Quataert, Eliot; Verscharen, Daniel
2015-02-01
We use particle-in-cell simulations to study the nonlinear evolution of ion velocity space instabilities in an idealized problem in which a background velocity shear continuously amplifies the magnetic field. We simulate the astrophysically relevant regime where the shear timescale is long compared to the ion cyclotron period, and the plasma beta is β ~ 1-100. The background field amplification in our calculation is meant to mimic processes such as turbulent fluctuations or MHD-scale instabilities. The field amplification continuously drives a pressure anisotropy with p⊥ > p∥ and the plasma becomes unstable to the mirror and ion cyclotron instabilities. In all cases, the nonlinear state is dominated by the mirror instability, not the ion cyclotron instability, and the plasma pressure anisotropy saturates near the threshold for the linear mirror instability. The magnetic field fluctuations initially undergo exponential growth but saturate in a secular phase in which the fluctuations grow on the same timescale as the background magnetic field (with δB ~ 0.3 ⟨B⟩ in the secular phase). At early times, the ion magnetic moment is well-conserved, but once the fluctuation amplitudes exceed δB ~ 0.1 ⟨B⟩, the magnetic moment is no longer conserved but instead changes on a timescale comparable to that of the mean magnetic field. We discuss the implications of our results for low-collisionality astrophysical plasmas, including the near-Earth solar wind and low-luminosity accretion disks around black holes.
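The saturation near the linear mirror threshold mentioned above can be stated compactly with the standard bi-Maxwellian criterion, anisotropy p⊥/p∥ - 1 > 1/β⊥. A minimal sketch of that threshold check (cold-electron limit assumed; the field strength and anisotropy values are illustrative, not taken from the simulations):

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability (SI)

def mirror_unstable(p_perp, p_par, B):
    """Bi-Maxwellian mirror-instability criterion (cold electrons):
    unstable when p_perp/p_par - 1 > 1/beta_perp,
    with beta_perp = 2*mu0*p_perp/B**2."""
    beta_perp = 2.0 * MU0 * p_perp / B ** 2
    return (p_perp / p_par - 1.0) > 1.0 / beta_perp

# Example: choose p_perp so that beta_perp = 10; then the threshold anisotropy
# is 10%, so a 20% anisotropy is unstable and a 5% anisotropy is not.
B = 1e-8  # tesla
p_perp = 10.0 * B ** 2 / (2.0 * MU0)
hot = mirror_unstable(p_perp, p_perp / 1.2, B)    # 20% anisotropy
mild = mirror_unstable(p_perp, p_perp / 1.05, B)  # 5% anisotropy
```

At the high betas quoted in the abstract the threshold anisotropy 1/β⊥ is small, which is why even slow shear-driven field amplification pushes the plasma to the mirror boundary.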
Enhanced backgrounds in scene rendering with GTSIMS
NASA Astrophysics Data System (ADS)
Prussing, Keith F.; Pierson, Oliver; Cordell, Chris; Stewart, John; Nielson, Kevin
2018-05-01
A core component of modeling visible and infrared sensor responses is the ability to faithfully recreate background noise and clutter in a synthetic image. Most tracking and detection algorithms use a combination of signal-to-noise or clutter-to-noise ratios to determine if a signature is of interest. A primary source of clutter is the background that defines the environment in which a target is placed. Over the past few years, the Electro-Optical Systems Laboratory (EOSL) at the Georgia Tech Research Institute has made significant improvements to its in-house simulation framework GTSIMS. First, we have expanded our terrain models to include the effects of terrain orientation on emission and reflection. Second, we have included the ability to model dynamic reflections with full BRDF support. Third, we have added the ability to render physically accurate cirrus clouds. And finally, we have updated the overall rendering procedure to reduce the time necessary to generate a single frame by taking advantage of hardware acceleration. Here, we present the updates to GTSIMS to better predict clutter and noise due to non-uniform backgrounds. Specifically, we show how the addition of clouds, terrain, and improved non-uniform sky rendering improves our ability to represent clutter during scene generation.
Dark and background response stability for the Landsat 8 Thermal Infrared Sensor
Vanderwerff, Kelly; Montanaro, Matthew
2012-01-01
The Thermal Infrared Sensor (TIRS) is a pushbroom sensor that will be a part of the Landsat Data Continuity Mission (LDCM), a joint mission between NASA and the USGS. The TIRS instrument will continue to collect the thermal infrared data that are currently being collected by the Thematic Mapper and the Enhanced Thematic Mapper Plus on Landsats 5 and 7, respectively. One of the key requirements of the new sensor is that the dark and background response be stable to ensure proper data continuity from the legacy Landsat instruments. Pre-launch testing of the instrument has recently been completed at the NASA Goddard Space Flight Center (GSFC), which included calibration collects that mimic those that will be performed on orbit. These collects include images of a cold plate meant to simulate the deep-space calibration source as viewed by the instrument in flight. The data from these collects give insight into the stability of the instrument's dark and background response, as well as factors that may cause these responses to vary. This paper quantifies the measured background and dark response of TIRS as well as its stability.
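Stability metrics of the kind quantified above reduce to simple statistics over a stack of dark frames: the frame-to-frame drift of the mean background level and the per-pixel temporal noise. A hedged sketch on synthetic data (the array sizes, dark levels, and noise model are illustrative only, not TIRS values):

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic "deep-space" dark collect: 50 frames of a 10x20 detector with a
# fixed per-pixel dark level plus Gaussian read noise (illustrative numbers).
dark_level = rng.normal(100.0, 5.0, (10, 20))        # per-pixel dark response (DN)
frames = dark_level + rng.normal(0.0, 2.0, (50, 10, 20))

frame_means = frames.mean(axis=(1, 2))               # per-frame background level
drift = frame_means.max() - frame_means.min()        # peak-to-peak drift (DN)
temporal_std = frames.std(axis=0).mean()             # mean per-pixel temporal noise
```

A stable instrument shows a drift that is small compared to the temporal noise floor; a trend in `frame_means` would instead flag a varying dark or background response.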
Simulation of Carbon Production from Material Surfaces in Fusion Devices
NASA Astrophysics Data System (ADS)
Marian, J.; Verboncoeur, J.
2005-10-01
Impurity production at carbon surfaces by plasma bombardment is a key issue for fusion devices as modest amounts can lead to excessive radiative power loss and/or hydrogenic D-T fuel dilution. Here results of molecular dynamics (MD) simulations of physical and chemical sputtering of hydrocarbons are presented for models of graphite and amorphous carbon, the latter formed by continuous D-T impingement in conditions that mimic fusion devices. The results represent more extensive simulations than we reported last year, including incident energies in the 30-300 eV range for a variety of incident angles that yield a number of different hydrocarbon molecules. The calculated low-energy yields clarify the uncertainty in the complex chemical sputtering rate since chemical bonding and hard-core repulsion are both included in the interatomic potential. Also modeled is hydrocarbon break-up by electron-impact collisions and transport near the surface. Finally, edge transport simulations illustrate the sensitivity of the edge plasma properties arising from moderate changes in the carbon content. The models will provide the impurity background for the TEMPEST kinetic edge code.
NASA Astrophysics Data System (ADS)
Gruneisen, Mark T.; Sickmiller, Brett A.; Flanagan, Michael B.; Black, James P.; Stoltenberg, Kurt E.; Duchane, Alexander W.
2016-02-01
Spatial filtering is an important technique for reducing sky background noise in a satellite quantum key distribution downlink receiver. Atmospheric turbulence limits the extent to which spatial filtering can reduce sky noise without introducing signal losses. Using atmospheric propagation and compensation simulations, the potential benefit of adaptive optics (AO) to secure key generation (SKG) is quantified. Simulations are performed assuming optical propagation from a low-Earth-orbit satellite to a terrestrial receiver that includes AO. Higher-order AO correction is modeled assuming a Shack-Hartmann wavefront sensor and a continuous-face-sheet deformable mirror. The effects of atmospheric turbulence, tracking, and higher-order AO on the photon capture efficiency are simulated using statistical representations of turbulence and a time-domain wave-optics hardware emulator. SKG rates are calculated for a decoy-state protocol as a function of the receiver field of view for various strengths of turbulence, sky radiances, and pointing angles. The results show that at fields of view smaller than those discussed by others, AO technologies can enhance SKG rates in daylight and enable SKG where it would otherwise be prohibited as a consequence of background optical noise and signal loss due to propagation and turbulence effects.
A novel background field removal method for MRI using projection onto dipole fields (PDF).
Liu, Tian; Khalidov, Ildar; de Rochefort, Ludovic; Spincemaille, Pascal; Liu, Jing; Tsiouris, A John; Wang, Yi
2011-11-01
For optimal image quality in susceptibility-weighted imaging and accurate quantification of susceptibility, it is necessary to isolate the local field generated by local magnetic sources (such as iron) from the background field that arises from imperfect shimming and variations in magnetic susceptibility of surrounding tissues (including air). Previous background removal techniques have limited effectiveness depending on the accuracy of model assumptions or information input. In this article, we report an observation that the magnetic field for a dipole outside a given region of interest (ROI) is approximately orthogonal to the magnetic field of a dipole inside the ROI. Accordingly, we propose a nonparametric background field removal technique based on projection onto dipole fields (PDF). In this PDF technique, the background field inside an ROI is decomposed into a field originating from dipoles outside the ROI using the projection theorem in Hilbert space. This novel PDF background removal technique was validated on a numerical simulation and a phantom experiment and was applied in human brain imaging, demonstrating substantial improvement in background field removal compared with the commonly used high-pass filtering method. Copyright © 2011 John Wiley & Sons, Ltd.
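The projection step of the PDF technique can be illustrated on a toy grid: fit the field measured inside the ROI with the field of dipoles known to lie outside it, and keep the residual as the local field. This sketch uses a single exterior basis dipole and illustrative positions and amplitudes; the actual method fits a whole distribution of exterior dipoles via the projection theorem:

```python
import numpy as np

def dipole_bz(points, r0):
    """z-component of a unit dipole field at `points`, source at r0
    (physical constants dropped; only the kernel shape matters here)."""
    d = points - r0
    r2 = (d ** 2).sum(axis=-1)
    return (3.0 * d[..., 2] ** 2 / r2 - 1.0) / (4.0 * np.pi * r2 ** 1.5)

# ROI: a 4x4x4 block of voxel centers at integer coordinates 2..5.
zz, yy, xx = np.mgrid[2:6, 2:6, 2:6]
roi_pts = np.stack([xx, yy, zz], axis=-1).astype(float).reshape(-1, 3)

# "Measured" field: background from a strong dipole OUTSIDE the ROI plus a
# weak local source INSIDE it (positions and amplitudes illustrative).
bg_kernel = dipole_bz(roi_pts, np.array([-3.0, 4.0, 4.0]))    # exterior source
local_field = 0.05 * dipole_bz(roi_pts, np.array([3.5, 3.5, 3.5]))
field = 5.0 * bg_kernel + local_field

# PDF step: project the measured field onto the exterior dipole field (least
# squares). Because interior and exterior dipole fields are approximately
# orthogonal over the ROI, the projection captures the background and the
# residual retains the local field.
c, *_ = np.linalg.lstsq(bg_kernel[:, None], field, rcond=None)
background = c[0] * bg_kernel
local_est = field - background
```

The fitted coefficient recovers the exterior-source amplitude and the residual approximates the local field, which is the article's central observation about the near-orthogonality of interior and exterior dipole fields.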
Mitigation strategies against radiation-induced background for space astronomy missions
NASA Astrophysics Data System (ADS)
Davis, C. S. W.; Hall, D.; Keelan, J.; O'Farrell, J.; Leese, M.; Holland, A.
2018-01-01
The Advanced Telescope for High ENergy Astrophysics (ATHENA) mission is a major upcoming space-based X-ray observatory due to be launched in 2028 by ESA, with the purpose of mapping the early universe and observing black holes. Background radiation is expected to constitute a large fraction of the total system noise in the Wide Field Imager (WFI) instrument on ATHENA, and designing an effective system to reduce the background radiation impacting the WFI will be crucial for maximising its sensitivity. Significant background sources are expected to include high energy protons, X-ray fluorescence lines, 'knock-on' electrons and Compton electrons. Given the variety of background sources, multiple shielding methods may be required to achieve maximum sensitivity in the WFI. These techniques may also be of great interest for use in future space-based X-ray experiments. Simulations have been developed to model the effect of a graded-Z shield on the X-ray fluorescence background. In addition, the effect of a 90 nm optical blocking filter on the secondary electron background has been investigated and shown to modify the requirements of any secondary electron shielding that is to be used.
Multispectral Terrain Background Simulation Techniques For Use In Airborne Sensor Evaluation
NASA Astrophysics Data System (ADS)
Weinberg, Michael; Wohlers, Ronald; Conant, John; Powers, Edward
1988-08-01
A background simulation code developed at Aerodyne Research, Inc., called AERIE is designed to reflect the major sources of clutter that are of concern to staring and scanning sensors of the type being considered for various airborne threat warning (both aircraft and missiles) sensors. The code is a first principles model that could be used to produce a consistent image of the terrain for various spectral bands, i.e., provide the proper scene correlation both spectrally and spatially. The code utilizes both topographic and cultural features to model terrain, typically from DMA data, with a statistical overlay of the critical underlying surface properties (reflectance, emittance, and thermal factors) to simulate the resulting texture in the scene. Strong solar scattering from water surfaces is included with allowance for wind driven surface roughness. Clouds can be superimposed on the scene using physical cloud models and an analytical representation of the reflectivity obtained from scattering off spherical particles. The scene generator is augmented by collateral codes that allow for the generation of images at finer resolution. These codes provide interpolation of the basic DMA databases using fractal procedures that preserve the high frequency power spectral density behavior of the original scene. Scenes are presented illustrating variations in altitude, radiance, resolution, material, thermal factors, and emissivities. The basic models utilized for simulation of the various scene components and various "engineering level" approximations are incorporated to reduce the computational complexity of the simulation.
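The fractal interpolation used by the collateral codes can be illustrated with a one-dimensional midpoint-displacement sketch: each refinement level inserts midpoints perturbed by noise whose amplitude decays geometrically, approximating the power-law spectral behavior of terrain. The Hurst exponent and noise amplitude below are illustrative assumptions, not AERIE parameters:

```python
import numpy as np

def fractal_refine(profile, levels, hurst=0.8, sigma=1.0, seed=0):
    """Refine a 1-D terrain profile by midpoint displacement: each level
    inserts midpoints perturbed by Gaussian noise whose standard deviation
    shrinks by 2**(-hurst) per level, which preserves an approximate
    power-law high-frequency power spectral density."""
    rng = np.random.default_rng(seed)
    z = np.asarray(profile, float)
    for lv in range(levels):
        mid = 0.5 * (z[:-1] + z[1:])
        mid += rng.normal(0.0, sigma * 2.0 ** (-hurst * (lv + 1)), mid.size)
        out = np.empty(2 * z.size - 1)
        out[0::2] = z          # original samples are kept unchanged
        out[1::2] = mid        # displaced midpoints fill the gaps
        z = out
    return z
```

The original (coarse-grid) samples survive every refinement level, so the interpolated profile stays pinned to the source database values, mirroring how a DMA elevation grid would be densified.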
The Mpi-M Aerosol Climatology (MAC)
NASA Astrophysics Data System (ADS)
Kinne, S.
2014-12-01
Monthly gridded global data-sets for aerosol optical properties (AOD, SSA and g) and for aerosol microphysical properties (CCN and IN) offer a (less complex) alternate path to include aerosol radiative effects and aerosol impacts on cloud-microphysics in global simulations. Based on merging AERONET sun-/sky-photometer data onto background maps provided by AeroCom phase 1 modeling output, the MPI-M Aerosol Climatology (MAC) version 1 was developed and applied in IPCC simulations with ECHAM and as an ancillary data-set in satellite-based global data-sets. An updated version 2 of this climatology will be presented, now applying central values from the more recent AeroCom phase 2 modeling and utilizing the better global coverage of trusted sun-photometer data, including statistics from the Maritime Aerosol Network (MAN). Applications include spatial distributions of estimates for aerosol direct and aerosol indirect radiative effects.
Marker-Assisted Introgression in Backcross Breeding Programs
Visscher, P. M.; Haley, C. S.; Thompson, R.
1996-01-01
The efficiency of marker-assisted introgression in backcross populations derived from inbred lines was investigated by simulation. Background genotypes were simulated assuming that a genetic model of many genes of small effects in coupling phase explains the observed breed difference and variance in backcross populations. Markers were efficient in introgression backcross programs for simultaneously introgressing an allele and selecting for the desired genomic background. Using a marker spacing of 10-20 cM gave an advantage of one to two backcross generations of selection relative to random or phenotypic selection. When the position of the gene to be introgressed is uncertain, for example because its position was estimated from a trait gene mapping experiment, a chromosome segment should be introgressed that is likely to include the allele of interest. Even for relatively precisely mapped quantitative trait loci, flanking markers or marker haplotypes should cover ~10-20 cM around the estimated position of the gene, to ensure that the allele frequency does not decline in later backcross generations. PMID:8978075
Multifluid Simulations of the Global Solar Wind Including Pickup Ions and Turbulence Modeling
NASA Technical Reports Server (NTRS)
Goldstein, Melvyn L.; Usmanov, A. V.
2011-01-01
I will describe a three-dimensional magnetohydrodynamic model of the solar wind that takes into account turbulent heating of the wind by velocity and magnetic fluctuations as well as a variety of effects produced by interstellar pickup protons. The interstellar pickup protons are treated in the model as one fluid and the protons and electrons are treated together as a second fluid. The model equations include a Reynolds decomposition of the plasma velocity and magnetic field into mean and fluctuating quantities, as well as energy transfer from interstellar pickup protons to solar wind protons that results in the deceleration of the solar wind. The model is used to simulate the global steady-state structure of the solar wind in the region from 0.3 to 100 AU. The simulation assumes that the background magnetic field on the Sun is either a dipole (aligned or tilted with respect to the solar rotation axis) or one that is deduced from solar magnetograms.
Space Transportation Propulsion Systems
NASA Technical Reports Server (NTRS)
Liou, Meng-Sing; Stewart, Mark E.; Suresh, Ambady; Owen, A. Karl
2001-01-01
This report outlines the Space Transportation Propulsion Systems for the NPSS (Numerical Propulsion System Simulation) program. Topics include: 1) a review of Engine/Inlet Coupling Work; 2) Background/Organization of Space Transportation Initiative; 3) Synergy between High Performance Computing and Communications Program (HPCCP) and Advanced Space Transportation Program (ASTP); 4) Status of Space Transportation Effort, including planned deliverables for FY01-FY06, FY00 accomplishments (HPCCP Funded) and FY01 Major Milestones (HPCCP and ASTP); and 5) a review of current technical efforts, including a review of the Rocket-Based Combined-Cycle (RBCC), Scope of Work, RBCC Concept Aerodynamic Analysis and RBCC Concept Multidisciplinary Analysis.
Quantum chemistry simulation on quantum computers: theories and experiments.
Lu, Dawei; Xu, Boruo; Xu, Nanyang; Li, Zhaokai; Chen, Hongwei; Peng, Xinhua; Xu, Ruixue; Du, Jiangfeng
2012-07-14
It has been claimed that quantum computers can mimic quantum systems efficiently, with polynomially scaling resources. Traditionally, such simulations are carried out numerically on classical computers, which are inevitably confronted with exponential growth in required resources as the size of the quantum system increases. Quantum computers avoid this problem, and thus provide a possible solution for large quantum systems. In this paper, we first discuss the ideas of quantum simulation, the background of quantum simulators, their categories, and the development in both theories and experiments. We then present a brief introduction to quantum chemistry evaluated via classical computers, followed by typical procedures of quantum simulation towards quantum chemistry. Reviewed are not only theoretical proposals but also proof-of-principle experimental implementations, via a small quantum computer, which include the evaluation of the static molecular eigenenergy and the simulation of chemical reaction dynamics. Although the experimental development is still behind the theory, we give prospects and suggestions for future experiments. We anticipate that in the near future quantum simulation will become a powerful tool for quantum chemistry, outperforming classical computations.
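The core idea behind such quantum simulation, decomposing exp(-iHt) into a product of implementable factors, can be checked classically for a single qubit. This first-order Trotter sketch uses a toy two-level Hamiltonian, not one of the molecular Hamiltonians from the paper:

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], complex)
Z = np.array([[1, 0], [0, -1]], complex)
a, b, t = 0.5, 0.3, 1.0                  # toy noncommuting Hamiltonian H = aX + bZ

def exp_pauli(P, theta):
    """exp(-i*theta*P) for any operator with P @ P = I (a unit-length
    combination of Paulis), via Euler's formula."""
    return np.cos(theta) * np.eye(2) - 1j * np.sin(theta) * P

H = a * X + b * Z
c = np.hypot(a, b)
exact = exp_pauli(H / c, c * t)          # exact evolution exp(-iHt)

def trotter(n):
    """First-order Trotterization: (e^{-iaXt/n} e^{-ibZt/n})^n."""
    step = exp_pauli(X, a * t / n) @ exp_pauli(Z, b * t / n)
    return np.linalg.matrix_power(step, n)

errs = [np.linalg.norm(trotter(n) - exact) for n in (1, 4, 16)]
```

The error shrinks roughly as 1/n, the expected first-order Trotter scaling; on a quantum device each factor would be a native single-qubit rotation.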
NASA Astrophysics Data System (ADS)
Kuik, F.; Lauer, A.; von Schneidemesser, E.; Butler, T. M.
2016-12-01
Many European cities continue to struggle with exceedances of NO2 limit values at measurement sites near roads, with a large share of the concentrations attributed to traffic emissions. In this study, we explore how urban air quality can be improved with different traffic measures using the example of the Berlin-Brandenburg region. In order to simulate urban background air quality we use the Weather Research and Forecasting model with chemistry (WRF-Chem) at a horizontal resolution of 1 km. We use emission input data at a horizontal resolution of 1 km obtained by downscaling TNO-MACC III emissions based on local proxy data including population and traffic densities. In addition, we use a statistical approach combining the simulated urban background concentrations with information on traffic densities to estimate NO2 at street level. This helps assess whether the emission scenarios studied here can lead to significant reductions in NO2 concentrations at street level. The emission scenarios in this study represent a range of scenarios in which car traffic is replaced with bicycle traffic. Part of this study was an initial discussion phase with stakeholders, including policy makers and NGOs. The discussions have shown that the different stakeholders are interested in a scientific assessment of the impact of replacing car traffic with bicycle traffic in the Berlin-Brandenburg urban area. Local policy makers responsible for city planning and implementing traffic measures can make best use of scientific modeling results if input data and scenarios are as realistic as possible. For these reasons, the scenarios cover very idealized optimistic ("all passenger cars are replaced by bicycles") and pessimistic ("all cyclists are replaced by cars") scenarios to explore the sensitivity of simulated urban background air quality to these changes, as well as additional scenarios based on city-specific data to analyze more realistic situations.
Of particular interest is how these impact street level NO2 concentrations.
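The downscaling step, distributing coarse TNO-MACC III emissions onto the 1 km grid in proportion to proxy data, amounts to a mass-conserving redistribution; a minimal sketch with made-up proxy values:

```python
import numpy as np

def downscale(e_coarse, proxy):
    """Redistribute one coarse-cell emission total onto fine grid cells in
    proportion to a proxy field (e.g. population or traffic density),
    conserving the coarse-cell total."""
    return e_coarse * proxy / proxy.sum()
```

Applied per coarse cell, per species, and per sector, this yields a fine-resolution emission field whose aggregated totals reproduce the coarse inventory, which is the property that makes proxy-based downscaling defensible.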
Nonlinear Dynamics of the Cosmic Neutrino Background
NASA Astrophysics Data System (ADS)
Inman, Derek
At least two of the three neutrino species are known to be massive, but their exact masses are currently unknown. Cosmic neutrinos decoupled from the rest of the primordial plasma early on when the Universe was over a billion times hotter than it is today. These relic particles, which have cooled and are now non-relativistic, constitute the Cosmic Neutrino Background and permeate the Universe. While they are not observable directly, their presence can be inferred by measuring the suppression of the matter power spectrum. This suppression is a linear effect caused by the large thermal velocities of neutrinos, which prevent them from collapsing gravitationally on small scales. Unfortunately, it is difficult to measure because of degeneracies with other cosmological parameters and biases arising from the fact that we typically observe point-like galaxies rather than a continuous matter field. It is therefore important to look for new effects beyond linear suppression that may be more sensitive to neutrinos. This thesis contributes to the understanding of the nonlinear dynamics of the cosmological neutrino background in the following ways: (i) the development of a new injection scheme for neutrinos in cosmological N-body simulations which circumvents many issues associated with simulating neutrinos at large redshifts, (ii) the numerical study of the relative velocity field between cold dark matter and neutrinos including its reconstruction from density fields, (iii) the theoretical description of neutrinos as a dispersive fluid and its use in modelling the nonlinear evolution of the neutrino density power spectrum, (iv) the derivation of the dipole correlation function using linear response which allows for the Fermi-Dirac velocity distribution to be properly included, and (v) the numerical study and detection of the dipole correlation function in the TianNu simulation.
Taken as a whole, this thesis is a comprehensive study of neutrino density and velocity fields that may lead to a new technique for constraining neutrino properties via the dipole correlation function.
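The injection scheme in item (i) requires drawing thermal velocities from the relativistic Fermi-Dirac distribution. A minimal inverse-CDF sampling sketch, with illustrative present-day temperature and mass values rather than the thesis's actual setup:

```python
import numpy as np

def sample_fd_speeds(n, T_nu=1.95, m_nu=0.05, seed=0):
    """Draw neutrino speeds by inverse-CDF sampling of the relativistic
    Fermi-Dirac momentum distribution f(p) ∝ p^2 / (exp(p/kT) + 1), then
    convert to a non-relativistic speed v ≈ (pc / mc^2) * c.
    T_nu is in kelvin, m_nu in eV; the returned speeds are in km/s."""
    rng = np.random.default_rng(seed)
    x = np.linspace(1e-4, 20.0, 4000)         # momentum grid in units of kT
    pdf = x**2 / (np.exp(x) + 1.0)
    cdf = np.cumsum(pdf)
    cdf /= cdf[-1]
    x_samp = np.interp(rng.uniform(size=n), cdf, x)
    kT_over_mc2 = 8.617e-5 * T_nu / m_nu       # (kT in eV) / (mc^2 in eV)
    return x_samp * kT_over_mc2 * 2.998e5      # speed in km/s
```

With a mean Fermi-Dirac momentum of about 3.15 kT, a 0.05 eV neutrino at T = 1.95 K comes out with a mean thermal speed of roughly 3000 km/s, consistent with the standard estimate of ~160 (eV/m) km/s at z = 0; these large velocities are what suppress small-scale clustering.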
Building an Open-source Simulation Platform of Acoustic Radiation Force-based Breast Elastography
Wang, Yu; Peng, Bo; Jiang, Jingfeng
2017-01-01
Ultrasound-based elastography including strain elastography (SE), acoustic radiation force Impulse (ARFI) imaging, point shear wave elastography (pSWE) and supersonic shear imaging (SSI) have been used to differentiate breast tumors among other clinical applications. The objective of this study is to extend a previously published virtual simulation platform built for ultrasound quasi-static breast elastography toward acoustic radiation force-based breast elastography. Consequently, the extended virtual breast elastography simulation platform can be used to validate image pixels with known underlying soft tissue properties (i.e. “ground truth”) in complex, heterogeneous media, enhancing confidence in elastographic image interpretations. The proposed virtual breast elastography system inherited four key components from the previously published virtual simulation platform: an ultrasound simulator (Field II), a mesh generator (Tetgen), a finite element solver (FEBio) and a visualization and data processing package (VTK). Using a simple message passing mechanism, functionalities have now been extended to acoustic radiation force-based elastography simulations. Examples involving three different numerical breast models with increasing complexity – one uniform model, one simple inclusion model and one virtual complex breast model derived from magnetic resonance imaging data, were used to demonstrate capabilities of this extended virtual platform. Overall, simulation results were compared with the published results. In the uniform model, the estimated shear wave speed (SWS) values were within 4% compared to the predetermined SWS values. In the simple inclusion and the complex breast models, SWS values of all hard inclusions in soft backgrounds were slightly underestimated, similar to what has been reported. 
The elastic contrast values and visual observation show that ARFI images have higher spatial resolution, while SSI images can provide higher inclusion-to-background contrast. In summary, our initial results were consistent with our expectations and what have been reported in the literature. The proposed (open-source) simulation platform can serve as a single gateway to perform many elastographic simulations in a transparent manner, thereby promoting collaborative developments. PMID:28075330
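The shear wave speed values compared above are typically obtained by a time-of-flight estimate: the lag of the cross-correlation peak between displacement traces tracked at two lateral positions. A minimal sketch with synthetic traces (the pulse shape, spacing, and sampling rate are illustrative, not values from the platform):

```python
import numpy as np

# Synthetic tracked displacements at two lateral positions: the same
# shear-wave pulse arrives later at the farther position.
fs = 10_000.0                  # tracking pulse-repetition frequency, Hz
t = np.arange(0, 0.02, 1 / fs)
true_sws = 2.0                 # m/s, plausible for a soft tissue background
dx = 4e-3                      # 4 mm between tracking locations
pulse = lambda t0: np.exp(-((t - t0) ** 2) / (2 * (5e-4) ** 2))
u1, u2 = pulse(0.004), pulse(0.004 + dx / true_sws)

# Time-of-flight estimate: lag of the cross-correlation peak.
lag = np.argmax(np.correlate(u2, u1, mode="full")) - (t.size - 1)
sws_est = dx / (lag / fs)
```

In a full simulation the traces come from Field II speckle tracking of the FEBio-computed displacements, and noise and dispersion bias the lag estimate, which is one way the slight SWS underestimation in stiff inclusions can arise.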
Building an open-source simulation platform of acoustic radiation force-based breast elastography
NASA Astrophysics Data System (ADS)
Wang, Yu; Peng, Bo; Jiang, Jingfeng
2017-03-01
Ultrasound-based elastography including strain elastography, acoustic radiation force impulse (ARFI) imaging, point shear wave elastography and supersonic shear imaging (SSI) have been used to differentiate breast tumors among other clinical applications. The objective of this study is to extend a previously published virtual simulation platform built for ultrasound quasi-static breast elastography toward acoustic radiation force-based breast elastography. Consequently, the extended virtual breast elastography simulation platform can be used to validate image pixels with known underlying soft tissue properties (i.e. ‘ground truth’) in complex, heterogeneous media, enhancing confidence in elastographic image interpretations. The proposed virtual breast elastography system inherited four key components from the previously published virtual simulation platform: an ultrasound simulator (Field II), a mesh generator (Tetgen), a finite element solver (FEBio) and a visualization and data processing package (VTK). Using a simple message passing mechanism, functionalities have now been extended to acoustic radiation force-based elastography simulations. Examples involving three different numerical breast models with increasing complexity—one uniform model, one simple inclusion model and one virtual complex breast model derived from magnetic resonance imaging data, were used to demonstrate capabilities of this extended virtual platform. Overall, simulation results were compared with the published results. In the uniform model, the estimated shear wave speed (SWS) values were within 4% compared to the predetermined SWS values. In the simple inclusion and the complex breast models, SWS values of all hard inclusions in soft backgrounds were slightly underestimated, similar to what has been reported. The elastic contrast values and visual observation show that ARFI images have higher spatial resolution, while SSI images can provide higher inclusion-to-background contrast. 
In summary, our initial results were consistent with our expectations and what have been reported in the literature. The proposed (open-source) simulation platform can serve as a single gateway to perform many elastographic simulations in a transparent manner, thereby promoting collaborative developments.
Dynamic thermal signature prediction for real-time scene generation
NASA Astrophysics Data System (ADS)
Christie, Chad L.; Gouthas, Efthimios (Themie); Williams, Owen M.; Swierkowski, Leszek
2013-05-01
At DSTO, a real-time scene generation framework, VIRSuite, has been developed in recent years, within which trials data are predominantly used for modelling the radiometric properties of the simulated objects. Since in many cases the data are insufficient, a physics-based simulator capable of predicting the infrared signatures of objects and their backgrounds has been developed as a new VIRSuite module. It includes transient heat conduction within the materials, and boundary conditions that take into account the heat fluxes due to solar radiation, wind convection and radiative transfer. In this paper, an overview is presented, covering both the steady-state and transient performance.
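The transient conduction with solar, convective, and radiative boundary fluxes described above can be sketched with an explicit finite-difference scheme; the material constants, fluxes, and linearized radiation coefficient below are illustrative assumptions, not VIRSuite values:

```python
import numpy as np

def surface_temperature(q_solar=600.0, h=10.0, t_air=300.0, hours=6.0):
    """Explicit 1-D transient conduction through a slab whose surface flux
    balances absorbed solar radiation, wind convection, and linearized
    radiative loss; the deep boundary is held at ambient temperature."""
    k, rho, cp = 0.8, 1800.0, 900.0          # conductivity, density, heat cap.
    rad = 5.0                                 # linearized radiation coeff, W/m2K
    nz, dz = 20, 0.01
    alpha = k / (rho * cp)
    dt = 0.4 * dz**2 / alpha                  # FTCS-stable time step
    T = np.full(nz, t_air)
    for _ in range(int(hours * 3600 / dt)):
        Tn = T.copy()
        Tn[1:-1] = T[1:-1] + alpha * dt / dz**2 * (T[2:] - 2*T[1:-1] + T[:-2])
        # Surface node: net external flux minus conduction into the slab.
        q_net = q_solar - (h + rad) * (T[0] - t_air)
        Tn[0] = T[0] + dt / (rho * cp * dz) * (q_net - k * (T[0] - T[1]) / dz)
        Tn[-1] = t_air
        T = Tn
    return T
```

Driving such a scheme with diurnal solar and wind histories yields the time-varying surface temperatures that a signature predictor converts to in-band radiance for scene generation.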
DOE Office of Scientific and Technical Information (OSTI.GOV)
Acciarri, R.; Adams, C.; An, R.
Here, we present several studies of convolutional neural networks applied to data coming from the MicroBooNE detector, a liquid argon time projection chamber (LArTPC). The algorithms studied include the classification of single particle images, the localization of single particle and neutrino interactions in an image, and the detection of a simulated neutrino event overlaid with cosmic ray backgrounds taken from real detector data. These studies demonstrate the potential of convolutional neural networks for particle identification or event detection on simulated neutrino interactions. Lastly, we also address technical issues that arise when applying this technique to data from a large LArTPC at or near ground level.
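As a minimal illustration of the forward pass of such a classifier (not the MicroBooNE networks, which are far deeper and trained on large LArTPC images), a single convolution layer with ReLU, global average pooling, and a softmax readout can be written in plain NumPy:

```python
import numpy as np

def conv2d(img, kern):
    """Valid 2-D convolution (cross-correlation, as in most CNN layers)."""
    kh, kw = kern.shape
    out = np.empty((img.shape[0] - kh + 1, img.shape[1] - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kern)
    return out

def classify(img, kernels, w, b):
    """One conv layer + ReLU + global average pool + linear + softmax,
    returning class probabilities for a single-channel image."""
    feats = np.array([conv2d(img, k).clip(min=0).mean() for k in kernels])
    logits = w @ feats + b
    e = np.exp(logits - logits.max())
    return e / e.sum()
```

For particle identification the classes would be particle species (and the kernels learned, not random); the softmax output is what gets thresholded when separating simulated neutrino interactions from cosmic-ray background.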
Patel, Radha V; Chudow, Melissa; Vo, Teresa T; Serag-Bolos, Erini S
The purpose of this study was to evaluate students' knowledge and perceptions of the clinical application of pharmacogenetics through a simulation activity and to assess communication of pharmacogenetic-guided treatment recommendations utilizing standardized patients. Third-year students in the four-year doctor of pharmacy (PharmD) program at University of South Florida College of Pharmacy completed a pharmacogenetics simulation involving a patient case review, interpretation of pharmacogenetic test results, completion of a situation, background, assessment, recommendation (SBAR) note with drug therapy recommendations, and patient counseling. Voluntary assessments were completed before and after the simulation, which included demographics, knowledge, and perceptions of students' ability to interpret and communicate pharmacogenetic results. Response rates for the pre- and post-simulation assessments were 109 (98%) and 104 (94%), respectively. Correct responses in application-type questions improved after the simulation (74%) compared to before the simulation (44%, p < 0.01). Responses to perception questions shifted towards "strongly agree" or "agree" after the simulation (p < 0.01). The simulation gave students an opportunity to apply pharmacogenetics knowledge and allowed them to gain an appreciation of pharmacists' roles within the pharmacogenetics field. Copyright © 2017 Elsevier Inc. All rights reserved.
Experiments on Electron-Plasma Vortex Motion Driven by a Background Vorticity Gradient.
NASA Astrophysics Data System (ADS)
Kabantsev, A. A.; Driscoll, C. F.
2000-10-01
The interaction of self-trapped vortices with a background vorticity gradient plays an important role in 2D hydrodynamics, including various aspects of relaxation and self-organization of 2D turbulence. In the present experiments, electron plasma columns with monotonically decreasing density profiles provide a vorticity background with (negative) shear in the rotational flow. Clumps of extra electrons are then retrograde vortices, rotating against the background shear; and regions with a deficit of electrons (holes) are prograde vortices. Theory predicts that clumps move up the background gradient, and holes move down the gradient, with velocities which depend differently on the ratio of the vortex trapping length to vortex radius, l/r_v. The present experiments show quantitative agreement with recent theory and simulations (D.A. Schecter and D.H.E. Dubin, Phys. Rev. Lett. 83, 2191 (1999)) for the accessible regime of 0.2 < l/r_v < 2. The experiments also show that moving clumps leave a spiral density wake, and that instability of these wakes results in a large number of long-lived holes.
NASA Technical Reports Server (NTRS)
Yamauchi, M.
1994-01-01
A two-dimensional numerical simulation of finite-amplitude magnetohydrodynamic (MHD) magnetosonic waves is performed under a finite-velocity background convection condition. Isothermal cases are considered for simplicity. External dissipation is introduced by assuming that the field-aligned currents are generated in proportion to the accumulated charges. The simulation results are as follows: Paired field-aligned currents are found from the simulated waves. The flow directions of these field-aligned currents depend on the angle between the background convection and the wave normal, and hence two pairs of field-aligned currents are found from a bowed wave if we look at the overall structure. The majority of these field-aligned currents are closed within each pair rather than between two wings. These features are not observed under slow background convection. The result could be applied to the cusp current system and the substorm current system.
NASA Astrophysics Data System (ADS)
Choomlucksana, Juthamas; Doolen, Toni L.
2017-11-01
The use of collaborative activities and simulation sessions in engineering education has been explored previously. However, few studies have investigated the relationship of these types of teaching innovations with other learner characteristics, such as self-efficacy and background knowledge. This study explored the effects of collaborative activities and simulation sessions on learning and the relationships between self-efficacy beliefs, background knowledge, and learning. Data were collected from two different terms in an upper division engineering course entitled Lean Manufacturing Systems Engineering. Findings indicated that the impact of collaborative activities and simulation sessions appears to be different, depending on the concepts being taught. Simulation sessions were found to have a significant effect on self-efficacy beliefs, and background knowledge had a mixed effect on learning. Overall the results of this study highlight the complex set of relationships between classroom innovations, learner characteristics, and learning.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruschin, Mark, E-mail: Mark.Ruschin@sunnybrook.ca; Chin, Lee; Ravi, Ananth
Purpose: To develop a multipurpose gel-based breast phantom consisting of a simulated tumor with realistic imaging properties in CT, ultrasound and MRI, or a postsurgical cavity on CT. Applications for the phantom include: deformable image registration (DIR) quality assurance (QA), autosegmentation validation, and localization testing and training for minimally invasive image-guided procedures such as those involving catheter or needle insertion. Methods: A thermoplastic mask of a typical breast patient lying supine was generated and then filled to make an array of phantoms. The background simulated breast tissue consisted of 32.4 g each of ballistic gelatin (BG) powder and Metamusil™ (MM) dissolved in 800 ml of water. Simulated tumors were added using the following recipe: 12 g of barium sulfate (1.4% v/v) plus 0.00014 g copper sulfate plus 0.7 g of MM plus 7.2 g of BG, all dissolved in 75 ml of water. The phantom was evaluated quantitatively in CT by comparing Hounsfield units (HUs) with actual breast tissue. For ultrasound and MRI, the phantoms were assessed based on subjective image quality and signal-difference to noise ratio (SDNR), respectively. The stiffness of the phantom was evaluated based on ultrasound elastography measurements to yield an average Young's modulus. In addition, subjective tactile assessment of the phantom was performed under needle insertion. Results: The simulated breast tissue had a mean background value of 24 HU on CT imaging, which more closely resembles fibroglandular tissue (40 HU) as opposed to adipose (−100 HU). The tumor had a mean CT number of 45 HU, which yielded a qualitatively realistic image contrast relative to the background either as an intact tumor or postsurgical cavity. The tumor appeared qualitatively realistic on ultrasound images, exhibiting hypoechoic characteristics compared to background. On MRI, the tumor exhibited a SDNR of 3.7. The average Young's modulus was computed to be 15.8 ± 0.7 kPa (1 SD).
Conclusions: We have developed a process to efficiently and inexpensively produce multipurpose breast phantoms containing simulated tumors visible on CT, ultrasound, and MRI. The phantoms have been evaluated for image quality and elasticity and can serve as a medium for DIR QA, autosegmentation QA, and training for minimally invasive procedures.
Elbert, Yevgeniy; Burkom, Howard S
2009-11-20
This paper discusses further advances in making robust predictions with the Holt-Winters forecasts for a variety of syndromic time series behaviors and introduces a control-chart detection approach based on these forecasts. Using three collections of time series data, we compare biosurveillance alerting methods with quantified measures of forecast agreement, signal sensitivity, and time-to-detect. The study presents practical rules for initialization and parameterization of biosurveillance time series. Several outbreak scenarios are used for detection comparison. We derive an alerting algorithm from forecasts using Holt-Winters-generalized smoothing for prospective application to daily syndromic time series. The derived algorithm is compared with simple control-chart adaptations and to more computationally intensive regression modeling methods. The comparisons are conducted on background data from both authentic and simulated data streams. Both types of background data include time series that vary widely by both mean value and cyclic or seasonal behavior. Plausible, simulated signals are added to the background data for detection performance testing at signal strengths calculated to be neither too easy nor too hard to separate the compared methods. Results show that both the sensitivity and the timeliness of the Holt-Winters-based algorithm proved to be comparable or superior to that of the more traditional prediction methods used for syndromic surveillance.
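The derived alerting scheme (forecast with Holt-Winters smoothing, then flag residuals that exceed a control limit) can be sketched as follows; the smoothing constants, the two-season initialization rule, and the threshold k are plausible choices, not the paper's tuned values:

```python
import numpy as np

def hw_alerts(y, season=7, alpha=0.4, beta=0.0, gamma=0.15, k=4.0):
    """Additive Holt-Winters one-step-ahead forecasts with a control-chart
    rule: alert when the residual exceeds k times a running residual
    standard deviation. The level, trend, and day-of-week indices are
    initialized from the first two seasons, a common practical rule."""
    level = y[:season].mean()
    trend = (y[season:2*season].mean() - y[:season].mean()) / season
    seas = y[:season] - level
    resid, alerts = [], []
    for t in range(2 * season, len(y)):
        fcast = level + trend + seas[t % season]
        err = y[t] - fcast
        sigma = np.std(resid[-8*season:]) if len(resid) >= season else None
        if sigma and err > k * sigma:
            alerts.append(t)
        else:
            resid.append(err)   # noise estimate is updated on non-alert days
        last = level
        level = alpha * (y[t] - seas[t % season]) + (1 - alpha) * (level + trend)
        trend = beta * (level - last) + (1 - beta) * trend
        seas[t % season] = gamma * (y[t] - level) + (1 - gamma) * seas[t % season]
    return alerts
```

On a synthetic daily series with a weekly cycle and an injected outbreak spike, the rule flags the spike while keeping false alarms rare; excluding alert days from the noise estimate keeps the control limit from inflating during an event.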
Studies of Tenuous Planetary Atmospheres
NASA Technical Reports Server (NTRS)
Combi, Michael R.
1998-01-01
The final report includes an overall project overview as well as scientific background summaries of dust and sodium in comets, and tenuous atmospheres of Jupiter's natural satellites. Progress and continuing work related to dust coma and tenuous atmospheric studies are presented. Also included are published articles written during the course of the report period. These are entitled: (1) On Europa's Magnetospheric Interaction: An MHD Simulation; (2) Dust-Gas Interrelations in Comets: Observations and Theory; and (3) Io's Plasma Environment During the Galileo Flyby: Global Three Dimensional MHD Modeling with Adaptive Mesh Refinement.
NASA Astrophysics Data System (ADS)
Chen, Xin; Xing, Pei; Luo, Yong; Zhao, Zongci; Nie, Suping; Huang, Jianbin; Wang, Shaowu; Tian, Qinhua
2015-04-01
A new dataset of annual mean surface temperature over North America in the recent 500 years has been constructed by applying an optimal interpolation (OI) algorithm. In total, 149 series were screened from the International Tree Ring Data Bank (ITRDB), including 69 maximum latewood density (MXD) and 80 tree ring width (TRW) chronologies. The simulated annual mean surface temperature derives from the past1000 experiment results of the Community Climate System Model version 4 (CCSM4). Unlike existing research applying data assimilation approaches to General Circulation Model (GCM) simulations, the errors of both the climate model simulation and the tree ring reconstruction were considered, with a view to combining the two parts in an optimal way. Variance matching (VM) was employed to calibrate the tree ring chronologies against CRUTEM4v, and the corresponding errors were estimated through a leave-one-out process. The background error covariance matrix was estimated statistically from samples of simulation results in a running 30-year window; in practice, it was calculated locally within the scanning range (2000 km in this research). Thus, the merging process proceeded with a time-varying local gain matrix. The merging method (MM) was tested by two kinds of experiments, and the results indicated that the standard deviation of errors can be reduced to about 0.3 degree centigrade lower than tree ring reconstructions and 0.5 degree centigrade lower than the model simulation. Obvious decadal variability can be identified in the MM results, including the evident cooling (0.10 degree per decade) in the 1940s-60s, where the model simulation exhibits a weak increasing trend (0.05 degree per decade) instead.
The MM results revealed a compromise spatial pattern of the linear trend of surface temperature during a typical period (1601-1800 AD) of the Little Ice Age, which basically accorded with the phase transitions of the Pacific decadal oscillation (PDO) and the Atlantic multi-decadal oscillation (AMO). Empirical orthogonal function and power spectrum analyses demonstrated that, compared with the pure CCSM4 simulations, MM significantly improved the decadal variability of the gridded temperature in North America by merging the temperature-sensitive tree ring records.
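The local optimal-interpolation update at the heart of such a merging scheme can be sketched in a few lines. The toy grid, covariance values, and anomaly numbers below are illustrative assumptions, not the paper's actual configuration:

```python
import numpy as np

def oi_merge(xb, y, H, B, R):
    """One optimal-interpolation (OI) update: blend a model background xb
    with observations y, weighted by the background (B) and observation (R)
    error covariance matrices. H maps model space to observation space."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)  # local gain matrix
    return xb + K @ (y - H @ xb)

# Toy example: 3 grid cells, tree-ring sites observing cells 0 and 2.
xb = np.array([0.2, 0.1, -0.3])   # model temperature anomalies (deg C)
y = np.array([0.5, -0.1])         # calibrated tree-ring anomalies
H = np.array([[1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0]])
B = 0.25 * np.exp(-np.abs(np.subtract.outer(np.arange(3), np.arange(3))))
R = 0.09 * np.eye(2)              # reconstruction error variance
xa = oi_merge(xb, y, H, B, R)
```

Because the observation error R is nonzero, the analysis at an observed cell falls between the model background and the proxy value, with the weights set by the two error estimates.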
Passive Turbulence Generating Grid Arrangements in a Turbine Cascade Wind Tunnel
2015-01-01
Turbine Cascade Wind Tunnels (CWT) are... closed-loop CWT. Turbine cascade facilities are used to simulate turbine operating conditions for the study of flow phenomena such as boundary layer... A CWT test section inlet must have uniform flowfield properties. The inlet conditions of interest upstream of the cascade include velocity and
NASA Astrophysics Data System (ADS)
Menelaou, K.; Yau, M. K.; Martinez, Y.
2014-09-01
Some aspects of the problem of secondary eyewall formation (SEF) are investigated with the aid of an idealized model. A series of experiments are conducted, starting with a strong annular vortex embedded in a quiescent background flow and forced by the sustained heating associated with a spiral rainband (control experiment). Following this, two experiments are configured to assess the impact of vertical wind shear (VWS) on the SEF process. The importance of the boundary layer force imbalance is finally investigated in a number of simulations in which surface and boundary layer physics are included. From the control experiment, it is found that in the absence of background environmental flow, the sustained latent heating associated with a spiral rainband can form a secondary eyewall even in the absence of a frictional boundary layer. The presence of VWS acts negatively on the SEF process by disrupting the organization of the potential vorticity induced by the rainband. When boundary layer physics is included, some similarities with previous studies are seen, but there is no SEF. These results suggest that the boundary layer most likely contributes to, rather than initiates, a secondary eyewall. This article was corrected on 10 OCT 2014. See the end of the full text for details.
Allowing for Slow Evolution of Background Plasma in the 3D FDTD Plasma, Sheath, and Antenna Model
NASA Astrophysics Data System (ADS)
Smithe, David; Jenkins, Thomas; King, Jake
2015-11-01
We are working to include a slow-time evolution capability for what have previously been static background plasma parameters in the 3D finite-difference time-domain (FDTD) plasma and sheath model used to model ICRF antennas in fusion plasmas. A key aspect of this is SOL-density time-evolution driven by ponderomotive rarefaction from the strong fields in the vicinity of the antenna. We demonstrate and benchmark a Scalar Ponderomotive Potential method, based on local field amplitudes, which is included in the 3D simulation, and present a more advanced Tensor Ponderomotive Potential approach, which we hope to employ in the future and which should improve the physical fidelity in the highly anisotropic environment of the SOL. Finally, we demonstrate and benchmark slow-time (non-linear) evolution of the RF sheath, and include realistic collisional effects from the neutral gas. Support from US DOE Grants DE-FC02-08ER54953 and DE-FG02-09ER55006.
Demonstration of theoretical and experimental simulations in fiber optics course
NASA Astrophysics Data System (ADS)
Yao, Tianfu; Wang, Xiaolin; Shi, Jianhua; Lei, Bing; Liu, Wei; Wang, Wei; Hu, Haojun
2017-08-01
The "Fiber optics" course plays a supporting role in the curriculum of optics and photonics at both undergraduate and postgraduate levels. Moreover, the course can be treated as compulsory for students specializing in fiber-related fields such as fiber communication, fiber sensing, and fiber light sources. The content of fiber optics requires knowledge of geometrical and physical optics as background, including basic optical theory and practical fiber components. Thus, to help students comprehend the relatively abundant and complex content, it is necessary to investigate novel teaching methods to assist the classic lectures. In this paper, we introduce a multidimensional pattern in fiber-optics teaching involving theoretical and laboratory simulations. First, the theoretical simulations are demonstrated based on the self-developed software named "FB tool", which can be installed both on smart phones with the Android operating system and on personal computers. FB tool covers fundamental calculations relating to transverse modes, fiber lasers, nonlinearities, and so on. Compared with commercial software such as COMSOL, it shows high accuracy at high speed. Then the laboratory simulations are designed, including fiber coupling, erbium-doped fiber amplifiers, fiber components, and so on. The simulations not only support students in understanding the basic knowledge of the course, but also provide opportunities to develop creative projects in fiber optics.
NASA Technical Reports Server (NTRS)
Winglee, Robert M.
1991-01-01
The objective was to conduct large scale simulations of electron beams injected into space. The study of the active injection of electron beams from spacecraft is important, as it provides valuable insight into the plasma beam interactions and the development of current systems in the ionosphere. However, the beam injection itself is not simple, being constrained by the ability of the spacecraft to draw current from the ambient plasma. The generation of these return currents is dependent on several factors, including the density of the ambient plasma relative to the beam density, the presence of neutrals around the spacecraft, the configuration of the spacecraft, and the motion of the spacecraft through the plasma. Two dimensional (three velocity) particle simulations with collisional processes included are used to show how these different and often coupled processes can be used to enhance beam propagation from the spacecraft. To understand the radial expansion mechanism of an electron beam injected from a highly charged spacecraft, two dimensional particle-in-cell simulations were conducted for a high density electron beam injected parallel to magnetic fields from an isolated equipotential conductor into a cold background plasma. The simulations indicate that charge build-up at the beam stagnation point causes the beam to expand radially to the beam electron gyroradius.
Mechanistic and quantitative insight into cell surface targeted molecular imaging agent design.
Zhang, Liang; Bhatnagar, Sumit; Deschenes, Emily; Thurber, Greg M
2016-05-05
Molecular imaging agent design involves simultaneously optimizing multiple probe properties. While several desired characteristics are straightforward, including high affinity and low non-specific background signal, in practice there are quantitative trade-offs between these properties. These include plasma clearance, where fast clearance lowers background signal but can reduce target uptake, and binding, where high affinity compounds sometimes suffer from lower stability or increased non-specific interactions. Further complicating probe development, many of the optimal parameters vary depending on both target tissue and imaging agent properties, making empirical approaches or previous experience difficult to translate. Here, we focus on low molecular weight compounds targeting extracellular receptors, which have some of the highest contrast values for imaging agents. We use a mechanistic approach to provide a quantitative framework for weighing trade-offs between molecules. Our results show that specific target uptake is well-described by quantitative simulations for a variety of targeting agents, whereas non-specific background signal is more difficult to predict. Two in vitro experimental methods for estimating background signal in vivo are compared - non-specific cellular uptake and plasma protein binding. Together, these data provide a quantitative method to guide probe design and focus animal work for more cost-effective and time-efficient development of molecular imaging agents.
Mixture-Tuned, Clutter Matched Filter for Remote Detection of Subpixel Spectral Signals
NASA Technical Reports Server (NTRS)
Thompson, David R.; Mandrake, Lukas; Green, Robert O.
2013-01-01
Mapping localized spectral features in large images demands sensitive and robust detection algorithms. Two aspects of large images that can harm matched-filter detection performance are addressed simultaneously. First, multimodal backgrounds may thwart the typical Gaussian model. Second, outlier features can trigger false detections from large projections onto the target vector. Two state-of-the-art approaches are combined that independently address outlier false positives and multimodal backgrounds. The background clustering models multimodal backgrounds, and the mixture tuned matched filter (MT-MF) addresses outliers. Combining the two methods captures significant additional performance benefits. The resulting mixture tuned clutter matched filter (MT-CMF) shows effective performance on simulated and airborne datasets. The classical MNF transform was applied, followed by k-means clustering. Then, each cluster's mean, covariance, and the corresponding eigenvalues were estimated. This yields a cluster-specific matched filter estimate as well as a cluster-specific feasibility score to flag outlier false positives. The technology described is a proof of concept that may be employed in future target detection and mapping applications for remote imaging spectrometers. It is of most direct relevance to JPL proposals for airborne and orbital hyperspectral instruments. Applications include subpixel target detection in hyperspectral scenes for military surveillance. Earth science applications include mineralogical mapping, species discrimination for ecosystem health monitoring, and land use classification.
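The cluster-then-filter idea can be illustrated with a minimal sketch: cluster the scene with k-means, then score each pixel with a matched filter built from its own cluster's mean and covariance. The deterministic initialization, regularization constant, and synthetic two-cluster scene are assumptions for illustration, and the sketch omits the MNF transform and the mixture-tuned feasibility score of the full MT-CMF:

```python
import numpy as np

def cluster_matched_filter(pixels, target, k=2, iters=20):
    """Toy clutter-matched filter: per-cluster background statistics give
    each pixel a matched-filter score (1.0 for a pure target pixel)."""
    centers = pixels[:k].astype(float)          # simple deterministic init
    for _ in range(iters):                      # plain k-means
        labels = np.argmin(((pixels[:, None] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([pixels[labels == j].mean(0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    scores = np.empty(len(pixels))
    for j in range(k):
        idx = labels == j
        mu = pixels[idx].mean(0)
        cov = np.cov(pixels[idx].T) + 1e-6 * np.eye(pixels.shape[1])
        w = np.linalg.solve(cov, target - mu)   # matched-filter direction
        w /= (target - mu) @ w                  # normalize: pure target -> 1
        scores[idx] = (pixels[idx] - mu) @ w
    return scores

# Two-cluster synthetic background plus one pure target pixel at the end:
rng = np.random.default_rng(1)
bg = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(10, 0.5, (50, 2))])
pixels = np.vstack([bg[:1], bg[50:51], bg[1:50], bg[51:], [[5.0, -5.0]]])
scores = cluster_matched_filter(pixels, np.array([5.0, -5.0]))
```

Because each filter is normalized against its own cluster's statistics, background pixels score near zero while the target pixel scores near one, regardless of which background mode it sits on.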
Radiation Design of Ion Mass Spectrometers
NASA Technical Reports Server (NTRS)
Sittler, Ed; Cooper, John; Christian, Eric; Moore, Tom; Sturner, Steve; Paschalidis, Nick
2011-01-01
In the harsh radiation environment of Jupiter, and with the JUpiter ICy moon Explorer (JUICE) mission including two Europa flybys where local intensities are approximately 150 krad/month behind 100 mils of Al shielding, background from penetrating radiation can be a serious issue for detectors inside an Ion Mass Spectrometer (IMS). This is especially important for minor ion detection designs. The detectors of choice for time-of-flight (TOF) designs are microchannel plates (MCP), and some designs may include solid state detectors (SSD). The standard approach is to use shielding designs so that background event rates are low enough that, first, the detector maximum rates and lifetimes are not exceeded and, second, the more stringent requirement is met that the desired measurement can successfully be made (i.e., the desired signal is sufficiently greater than background noise after background subtraction). GEANT codes are typically used along with various electronic techniques, but such designs need to know how the detectors will respond to the simulated primary and secondary radiations produced within the instrument. We will present some preliminary measurements of the response of MCPs to energetic electrons (20 keV to 1400 keV) using a Miniature TOF (MTOF) device and the High Energy Facility at Goddard Space Flight Center, which has a Van de Graaff accelerator.
Improving Factor Score Estimation Through the Use of Observed Background Characteristics
Curran, Patrick J.; Cole, Veronica; Bauer, Daniel J.; Hussong, Andrea M.; Gottfredson, Nisha
2016-01-01
A challenge facing nearly all studies in the psychological sciences is how to best combine multiple items into a valid and reliable score to be used in subsequent modelling. The most ubiquitous method is to compute a mean of items, but more contemporary approaches use various forms of latent score estimation. Regardless of approach, outside of large-scale testing applications, scoring models rarely include background characteristics to improve score quality. The current paper used a Monte Carlo simulation design to study score quality for different psychometric models that did and did not include covariates across levels of sample size, number of items, and degree of measurement invariance. The inclusion of covariates improved score quality for nearly all design factors, and in no case did the covariates degrade score quality relative to not considering the influences at all. Results suggest that the inclusion of observed covariates can improve factor score estimation. PMID:28757790
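The finding above can be illustrated with a deliberately simplified Monte Carlo sketch: generate items from a one-factor model in which an observed covariate influences the latent score, then estimate scores with and without the covariate among the predictors. The dimensions, loadings, and scoring-by-regression shortcut are all hypothetical, not the study's actual psychometric design:

```python
import numpy as np

def score_quality(include_covariate, n=2000, seed=7):
    """Correlation between estimated and true factor scores, with or
    without an observed background covariate among the predictors.
    (Hypothetical one-factor setup, not the paper's simulation design.)"""
    rng = np.random.default_rng(seed)
    z = rng.normal(size=n)                          # observed covariate
    eta = 0.6 * z + rng.normal(scale=0.8, size=n)   # true factor score
    items = 0.7 * eta[:, None] + rng.normal(size=(n, 4))  # 4 indicators
    X = np.column_stack([items, z]) if include_covariate else items
    X = np.column_stack([np.ones(n), X])            # add intercept
    beta, *_ = np.linalg.lstsq(X, eta, rcond=None)  # oracle scoring weights
    return np.corrcoef(X @ beta, eta)[0, 1]
```

Running `score_quality(True)` versus `score_quality(False)` on the same data shows the covariate-augmented scores correlating more strongly with the true factor, mirroring the paper's conclusion that covariates improve score quality.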
The Dynamics of Helium and its Impact on the Upper Thermosphere
NASA Astrophysics Data System (ADS)
Sutton, E. K.; Thayer, J. P.; Wang, W.; Solomon, S. C.; Schmidt, F.
2015-12-01
The TIE-GCM was recently augmented to include helium and argon, two approximately inert species that can be used as tracers of dynamics in the thermosphere. The former is treated as a major species due to its large abundance near the upper boundary, and the effects of exospheric transport are included in order to simulate realistic seasonal and latitudinal helium distributions. The latter is treated as a classical minor species, imparting no forces on the background atmosphere. In this study, we examine the interplay of the various dynamical terms, i.e., background circulation and molecular and eddy diffusion, as they drive departures from the distributions that would be expected under assumptions of diffusive equilibrium. As this has implications for the formulation of all semi-empirical thermospheric models, we use this understanding to identify the conditions under which helium can significantly affect nowcasts and forecasts of neutral density.
MacKinnon, Ralph; Humphries, Christopher
2015-01-01
Background: Technology-enhanced simulation is well-established in healthcare teaching curricula, including those regarding wilderness medicine. Compellingly, the evidence base for the value of this educational modality to improve learner competencies and patient outcomes is increasing. Aims: The aim was to systematically review the characteristics of technology-enhanced simulation presented in the wilderness medicine literature to date. The secondary aim was to explore how this technology has been used and whether its use has been associated with improved learner or patient outcomes. Methods: EMBASE and MEDLINE were systematically searched from 1946 to 2014 for articles on the provision of technology-enhanced simulation to teach wilderness medicine. Working independently, the team evaluated the information on the criteria of learners, setting, instructional design, content, and outcomes. Results: From a pool of 37 articles, 11 publications were eligible for systematic review. The majority of learners in the included publications were medical students, settings included both indoors and outdoors, and the main clinical content focus was initial trauma management, with some studies including leadership skills. The most prevalent instructional design components were clinical variation and cognitive interactivity, with learner satisfaction as the main outcome. Conclusions: The results confirm that the current provision of wilderness medicine education utilizing technology-enhanced simulation is aligned with instructional design characteristics that have been used to achieve effective learning. Future research should aim to demonstrate the translation of learning into the clinical field to produce improved learner outcomes and, ultimately, improved patient outcomes. PMID:26824012
A brief simulation intervention increasing basic science and clinical knowledge.
Sheakley, Maria L; Gilbert, Gregory E; Leighton, Kim; Hall, Maureen; Callender, Diana; Pederson, David
2016-01-01
Background The United States Medical Licensing Examination (USMLE) is increasing clinical content on the Step 1 exam; thus, inclusion of clinical applications within the basic science curriculum is crucial. Including simulation activities during basic science years bridges the knowledge gap between basic science content and clinical application. Purpose To evaluate the effects of a one-off, 1-hour cardiovascular simulation intervention on a summative assessment after adjusting for relevant demographic and academic predictors. Methods This study was a non-randomized study using historical controls to evaluate curricular change. The control group received lecture (n_l = 515) and the intervention group received lecture plus a simulation exercise (n_l+s = 1,066). Assessment included summative exam questions (n = 4) that were scored as pass/fail (≥75%). USMLE-style assessment questions were identical for both cohorts. Descriptive statistics for variables are presented and odds of passage calculated using logistic regression. Results Undergraduate grade point ratio, MCAT-BS, MCAT-PS, age, attendance at an academic review program, and gender were significant predictors of summative exam passage. Students receiving the intervention were significantly more likely to pass the summative exam than students receiving lecture only (P=0.0003). Discussion Simulation plus lecture increases short-term understanding as tested by a written exam. A longitudinal study is needed to assess the effect of a brief simulation intervention on long-term retention of clinical concepts in a basic science curriculum.
Generative technique for dynamic infrared image sequences
NASA Astrophysics Data System (ADS)
Zhang, Qian; Cao, Zhiguo; Zhang, Tianxu
2001-09-01
The generative technique for dynamic infrared image sequences is discussed in this paper. Because an infrared sensor differs from a CCD camera in its imaging mechanism, generating an image by receiving the infrared radiation of the scene (including target and background), the infrared imaging sensor is deeply affected by atmospheric radiation, environmental radiation, and the attenuation of radiation during atmospheric transfer. Therefore, this paper first analyzes the imaging influence of these radiations and provides the corresponding radiation calculation formulas, analyzing the passive scene and the active scene separately. The calculation methods for the passive scene are then provided, and the functions of the scene model, the atmospheric transmission model, and the material physical attribute databases are explained. Secondly, based on the infrared imaging model, the design idea, implementation approach, and software framework of the infrared image sequence simulation software on an SGI workstation are introduced. Guided by the ideas above, the third part of the paper presents an example of simulated infrared image sequences, using the sea and sky as background, a warship as target, and an aircraft as the viewpoint. Finally, the simulation is evaluated synthetically and an improvement scheme is presented.
Surgical simulation: a urological perspective.
Wignall, Geoffrey R; Denstedt, John D; Preminger, Glenn M; Cadeddu, Jeffrey A; Pearle, Margaret S; Sweet, Robert M; McDougall, Elspeth M
2008-05-01
Surgical education is changing rapidly as several factors including budget constraints and medicolegal concerns limit opportunities for urological trainees. New methods of skills training such as low fidelity bench trainers and virtual reality simulators offer new avenues for surgical education. In addition, surgical simulation has the potential to allow practicing surgeons to develop new skills and maintain those they already possess. We provide a review of the background, current status and future directions of surgical simulators as they pertain to urology. We performed a literature review and an overview of surgical simulation in urology. Surgical simulators are in various stages of development and validation. Several simulators have undergone extensive validation studies and are in use in surgical curricula. While virtual reality simulators offer the potential to more closely mimic reality and present entire operations, low fidelity simulators remain useful in skills training, particularly for novices and junior trainees. Surgical simulation remains in its infancy. However, the potential to shorten learning curves for difficult techniques and practice surgery without risk to patients continues to drive the development of increasingly more advanced and realistic models. Surgical simulation is an exciting area of surgical education. The future is bright as advancements in computing and graphical capabilities offer new innovations in simulator technology. Simulators must continue to undergo rigorous validation studies to ensure that time spent by trainees on bench trainers and virtual reality simulators will translate into improved surgical skills in the operating room.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meziane, M.; Eichwald, O.; Ducasse, O.
The present paper is devoted to the 2D simulation of an Atmospheric Corona Discharge Reactor (ACDR) involving 10 pins powered by a DC high voltage and positioned 7 mm above a grounded metallic plane. The corona reactor is periodically crossed by thin mono-filamentary streamers with a natural repetition frequency of some tens of kHz. The simulation involves the electro-dynamic, chemical kinetic, and neutral gas hydrodynamic phenomena that influence the kinetics of the chemical species transformation. Each discharge stage (including the primary and the secondary streamer development and the resulting thermal shock) lasts about one hundred nanoseconds, while the post-discharge stages occurring between two successive discharge phases last one hundred microseconds. The ACDR is crossed by a lateral air flow including 400 ppm of NO. During the considered time scale of 10 ms, one hundred discharge/post-discharge cycles are simulated. The simulation involves the radical formation and thermal exchange between the discharges and the background gas. The results show how the successive discharges activate the flow gas and how the induced turbulence phenomena affect the redistribution of the thermal energy and the chemical kinetics inside the ACDR.
Health-Related Benefits of Attaining the 8-Hr Ozone Standard
Hubbell, Bryan J.; Hallberg, Aaron; McCubbin, Donald R.; Post, Ellen
2005-01-01
During the 2000–2002 time period, between 36 and 56% of ozone monitors each year in the United States failed to meet the current ozone standard of 80 ppb for the fourth highest maximum 8-hr ozone concentration. We estimated the health benefits of attaining the ozone standard at these monitors using the U.S. Environmental Protection Agency’s Environmental Benefits Mapping and Analysis Program. We used health impact functions based on published epidemiologic studies, and valuation functions derived from the economics literature. The estimated health benefits for 2000 and 2001 are similar in magnitude, whereas the results for 2002 are roughly twice that of each of the prior 2 years. The simple average of health impacts across the 3 years includes reductions of 800 premature deaths, 4,500 hospital and emergency department admissions, 900,000 school absences, and > 1 million minor restricted activity days. The simple average of benefits (including premature mortality) across the 3 years is $5.7 billion [90% confidence interval (CI), 0.6–15.0] for the quadratic rollback simulation method and $4.9 billion (90% CI, 0.5–14.0) for the proportional rollback simulation method. Results are sensitive to the form of the standard and to assumptions about background ozone levels. If the form of the standard is based on the first highest maximum 8-hr concentration, impacts are increased by a factor of 2–3. Increasing the assumed hourly background from zero to 40 ppb reduced impacts by 30 and 60% for the proportional and quadratic attainment simulation methods, respectively. PMID:15626651
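The proportional rollback method mentioned above can be sketched simply: every hourly concentration's excess over the assumed background is scaled by one factor so the peak just meets the standard (quadratic rollback instead applies larger reductions at higher concentrations and is not shown). The hourly values, the use of the peak hour as the design value, and the 40 ppb background below are hypothetical simplifications:

```python
def proportional_rollback(hourly, standard, background):
    """Scale the above-background part of each hourly concentration by a
    single factor so the design value just meets the standard.
    Simplification: the design value is taken as the peak hour."""
    design = max(hourly)
    if design <= standard:
        return list(hourly)                  # already in attainment
    f = (standard - background) / (design - background)
    return [background + f * (c - background) for c in hourly]

# Hypothetical hourly ozone (ppb), 80 ppb standard, 40 ppb assumed background:
rolled = proportional_rollback([60, 95, 120, 80], standard=80, background=40)
```

The sketch also shows why results are sensitive to the assumed background: changing `background` changes the scaling factor and therefore the size of the simulated reductions at every hour.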
Stochastic simulation and analysis of biomolecular reaction networks
Frazier, John M; Chushak, Yaroslav; Foy, Brent
2009-01-01
Background In recent years, several stochastic simulation algorithms have been developed to generate Monte Carlo trajectories that describe the time evolution of the behavior of biomolecular reaction networks. However, the effects of various stochastic simulation and data analysis conditions on the observed dynamics of complex biomolecular reaction networks have not received much attention. In order to investigate these issues, we employed a software package developed in our group, called Biomolecular Network Simulator (BNS), to simulate and analyze the behavior of such systems. The behavior of a hypothetical two-gene in vitro transcription-translation reaction network is investigated using the Gillespie exact stochastic algorithm to illustrate some of the factors that influence the analysis and interpretation of these data. Results Specific issues affecting the analysis and interpretation of simulation data are investigated, including: (1) the effect of time interval on data presentation and time-weighted averaging of molecule numbers, (2) the effect of time averaging interval on reaction rate analysis, (3) the effect of the number of simulations on the precision of model predictions, and (4) the implications of stochastic simulations for optimization procedures. Conclusion The two main factors affecting the analysis of stochastic simulations are: (1) the selection of time intervals to compute or average state variables and (2) the number of simulations generated to evaluate the system behavior. PMID:19534796
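The Gillespie exact algorithm used by the study can be sketched for a toy birth-death network; this is an illustrative stdlib reimplementation, not the BNS package, and the rate constants are arbitrary:

```python
import math
import random

def gillespie(x, rates, stoich, t_end, seed=42):
    """Gillespie's exact SSA: draw an exponential waiting time from the
    total propensity, then pick one reaction with probability proportional
    to its propensity, and apply its stoichiometric change."""
    random.seed(seed)
    t, traj = 0.0, [(0.0, x)]
    while t < t_end:
        a = [r(x) for r in rates]              # current propensities
        a0 = sum(a)
        if a0 == 0:
            break                              # no reaction can fire
        t += random.expovariate(a0)            # exponential waiting time
        u, pick = random.random() * a0, 0
        while pick < len(a) - 1 and u > a[pick]:
            u -= a[pick]
            pick += 1
        x += stoich[pick]
        traj.append((t, x))
    return traj

# Toy birth-death network: 0 -> X at k1 = 5.0, X -> 0 at k2*x with k2 = 0.5,
# so the stationary mean copy number is k1 / k2 = 10.
traj = gillespie(0, [lambda x: 5.0, lambda x: 0.5 * x], [+1, -1], t_end=200.0)
```

Time-weighted averaging of the piecewise-constant trajectory, one of the analysis issues the paper raises, recovers a mean near the stationary value of 10; a naive unweighted average over events would be biased by the uneven waiting times.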
Simulating Issue Networks in Small Classes using the World Wide Web.
ERIC Educational Resources Information Center
Josefson, Jim; Casey, Kelly
2000-01-01
Provides background information on simulations and active learning. Discusses the use of simulations in political science courses. Describes a simulation exercise where students performed specific institutional role playing, simulating the workings of a single congressional issue network, based on the reauthorization of the Endangered Species Act.…
An experimental method for the assessment of color simulation tools.
Lillo, Julio; Alvaro, Leticia; Moreira, Humberto
2014-07-22
The Simulcheck method for evaluating the accuracy of color simulation tools in relation to dichromats is described and used to test three color simulation tools: Variantor, Coblis, and Vischeck. A total of 10 dichromats (five protanopes, five deuteranopes) and 10 normal trichromats participated in the current study. Simulcheck includes two psychophysical tasks: the Pseudoachromatic Stimuli Identification task and the Minimum Achromatic Contrast task. The Pseudoachromatic Stimuli Identification task allows determination of the two chromatic angles (h_uv values) that generate a minimum response in the yellow–blue opponent mechanism and, consequently, pseudoachromatic stimuli (greens or reds). The Minimum Achromatic Contrast task requires the selection of the gray background that produces minimum contrast (near zero change in the achromatic mechanism) for each pseudoachromatic stimulus selected in the previous task (L_R values). Results showed important differences in the colorimetric transformations performed by the three evaluated simulation tools and their accuracy levels. Vischeck simulation accurately implemented the algorithm of Brettel, Viénot, and Mollon (1997). Only Vischeck appeared accurate (similarity in h_uv and L_R values between real and simulated dichromats) and, consequently, could render reliable color selections. It is concluded that Simulcheck is a consistent method because it provided an equivalent pattern of results for h_uv and L_R values irrespective of the stimulus set used to evaluate a simulation tool. Simulcheck was also considered valid because real dichromats provided expected h_uv and L_R values when performing the two psychophysical tasks included in this method. © 2014 ARVO.
Air shower simulation for background estimation in muon tomography of volcanoes
NASA Astrophysics Data System (ADS)
Béné, S.; Boivin, P.; Busato, E.; Cârloganu, C.; Combaret, C.; Dupieux, P.; Fehr, F.; Gay, P.; Labazuy, P.; Laktineh, I.; Lénat, J.-F.; Miallier, D.; Mirabito, L.; Niess, V.; Portal, A.; Vulpescu, B.
2013-01-01
One of the main sources of background for the radiography of volcanoes using atmospheric muons comes from the accidental coincidences produced in the muon telescopes by charged particles belonging to the air shower generated by the primary cosmic ray. In order to quantify this background effect, Monte Carlo simulations of the showers and of the detector are developed by the TOMUVOL collaboration. As a first step, the atmospheric showers were simulated and investigated using two Monte Carlo packages, CORSIKA and GEANT4. We compared the results provided by the two programs for the muonic component of vertical proton-induced showers at three energies: 1, 10 and 100 TeV. We found that the spatial distribution and energy spectrum of the muons were in good agreement for the two codes.
Overview of the CHarring Ablator Response (CHAR) Code
NASA Technical Reports Server (NTRS)
Amar, Adam J.; Oliver, A. Brandon; Kirk, Benjamin S.; Salazar, Giovanni; Droba, Justin
2016-01-01
An overview of the capabilities of the CHarring Ablator Response (CHAR) code is presented. CHAR is a one-, two-, and three-dimensional unstructured continuous Galerkin finite-element heat conduction and ablation solver with both direct and inverse modes. Additionally, CHAR includes a coupled linear thermoelastic solver for determination of internal stresses induced from the temperature field and surface loading. Background on the development process, governing equations, material models, discretization techniques, and numerical methods is provided. Special focus is put on the available boundary conditions including thermochemical ablation and contact interfaces, and example simulations are included. Finally, a discussion of ongoing development efforts is presented.
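As a purely illustrative counterpart to the heat conduction equation CHAR solves, here is a minimal explicit finite-difference sketch of 1-D transient conduction with fixed-temperature walls. CHAR itself is an unstructured continuous Galerkin finite-element solver with ablation and thermoelastic coupling; none of that is represented here, and the slab dimensions and material values are hypothetical:

```python
import numpy as np

def heat1d(T0, alpha, dx, dt, steps, T_left, T_right):
    """Explicit finite-difference march of dT/dt = alpha * d2T/dx2 with
    Dirichlet walls. Stable only if alpha*dt/dx**2 <= 1/2."""
    r = alpha * dt / dx ** 2
    assert r <= 0.5, "explicit stability limit violated"
    T = np.array(T0, dtype=float)
    for _ in range(steps):
        T[0], T[-1] = T_left, T_right            # fixed-temperature walls
        T[1:-1] += r * (T[2:] - 2 * T[1:-1] + T[:-2])
    return T

# Hypothetical 10 cm slab initially at 300 K; hot wall at 1000 K on the left:
T = heat1d([300.0] * 21, alpha=1e-5, dx=0.005, dt=1.0, steps=2000,
           T_left=1000.0, T_right=300.0)
```

After many diffusion times the profile relaxes toward the linear steady state between the two wall temperatures; an implicit or finite-element scheme like CHAR's removes the explicit time-step restriction asserted above.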
GRACKLE: a chemistry and cooling library for astrophysics
NASA Astrophysics Data System (ADS)
Smith, Britton D.; Bryan, Greg L.; Glover, Simon C. O.; Goldbaum, Nathan J.; Turk, Matthew J.; Regan, John; Wise, John H.; Schive, Hsi-Yu; Abel, Tom; Emerick, Andrew; O'Shea, Brian W.; Anninos, Peter; Hummels, Cameron B.; Khochfar, Sadegh
2017-04-01
We present the GRACKLE chemistry and cooling library for astrophysical simulations and models. GRACKLE provides a treatment of non-equilibrium primordial chemistry and cooling for H, D and He species, including H2 formation on dust grains; tabulated primordial and metal cooling; multiple ultraviolet background models; and support for radiation transfer and arbitrary heat sources. The library has an easily implementable interface for simulation codes written in C, C++ and FORTRAN as well as a PYTHON interface with added convenience functions for semi-analytical models. As an open-source project, GRACKLE provides a community resource for accessing and disseminating astrochemical data and numerical methods. We present the full details of the core functionality, the simulation and PYTHON interfaces, testing infrastructure, performance and range of applicability. GRACKLE is a fully open-source project and new contributions are welcome.
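Conceptually, a tabulated-cooling call interpolates a cooling rate from a table and turns it into a cooling time. The sketch below illustrates that idea only; it is not the GRACKLE API (whose actual C/Python interfaces are documented with the library), and the table values are made up:

```python
import bisect
import math

# Illustrative cooling table: log10 T (K) vs log10 Lambda (erg cm^3/s).
# These numbers are invented for the sketch, not GRACKLE data.
log_T = [4.0, 5.0, 6.0, 7.0]
log_lam = [-22.0, -21.3, -21.8, -22.5]

def cooling_rate(T):
    """Linear interpolation of log Lambda in log T."""
    x = math.log10(T)
    i = min(max(bisect.bisect_right(log_T, x) - 1, 0), len(log_T) - 2)
    f = (x - log_T[i]) / (log_T[i + 1] - log_T[i])
    return 10.0 ** (log_lam[i] + f * (log_lam[i + 1] - log_lam[i]))

K_BOLTZ = 1.380649e-16   # erg/K
n = 1.0                  # number density, cm^-3
T = 1.0e5                # gas temperature, K
# Cooling time ~ thermal energy / volumetric cooling rate.
t_cool = 1.5 * n * K_BOLTZ * T / (n * n * cooling_rate(T))   # seconds
```

A simulation code would evaluate this per cell each timestep; GRACKLE additionally handles the non-equilibrium chemistry, UV background, and dust terms summarized above.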
Manufacturing of ArF chromeless hard shifter for 65-nm technology
NASA Astrophysics Data System (ADS)
Park, Keun-Taek; Dieu, Laurent; Hughes, Greg P.; Green, Kent G.; Croffie, Ebo H.; Taravade, Kunal N.
2003-12-01
For logic design, the chromeless phase-shift mask is one of the possible solutions for defining small geometries with low MEF (mask enhancement factor) at the 65nm node. Many dedicated studies have examined PCO (Phase Chrome Off-axis) mask technology, and several design approaches, including grating backgrounds and chrome patches (or chrome shields), have been proposed for applying PCO to line/space and contact patterns. In this paper, we studied the feasibility of the grating design for line and contact patterns. The grating pattern design was provided by the EM simulation software (TEMPEST) and by aerial image simulation software. AIMS measurements with high-NA annular illumination were performed, and resist images were taken of the designed patterns at different focus settings. Simulation and AIMS results are compared with wafer-printed performance to verify process consistency.
Pisano, E D; Zong, S; Hemminger, B M; DeLuca, M; Johnston, R E; Muller, K; Braeuning, M P; Pizer, S M
1998-11-01
The purpose of this project was to determine whether Contrast Limited Adaptive Histogram Equalization (CLAHE) improves detection of simulated spiculations in dense mammograms. Lines simulating the appearance of spiculations, a common marker of malignancy when visualized with masses, were embedded in dense mammograms digitized at 50 micron pixels, 12 bits deep. Film images with no CLAHE applied were compared to film images with nine different combinations of clip levels and region sizes applied. A simulated spiculation was embedded in a background of dense breast tissue, with the orientation of the spiculation varied. The key variables involved in each trial included the orientation of the spiculation, contrast level of the spiculation and the CLAHE settings applied to the image. Combining the 10 CLAHE conditions, 4 contrast levels and 4 orientations gave 160 combinations. The trials were constructed by pairing 160 combinations of key variables with 40 backgrounds. Twenty student observers were asked to detect the orientation of the spiculation in the image. There was a statistically significant improvement in detection performance for spiculations with CLAHE over unenhanced images when the region size was set at 32 with a clip level of 2, and when the region size was set at 32 with a clip level of 4. The selected CLAHE settings should be tested in the clinic with digital mammograms to determine whether detection of spiculations associated with masses detected at mammography can be improved.
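The clip-level mechanism at the heart of CLAHE can be sketched on a single tile: the histogram is clipped at a multiple of the flat-histogram height and the excess counts are redistributed before equalization, which limits contrast amplification in near-uniform regions. This is a schematic single-tile version (no tile grid or bilinear interpolation), run on a synthetic "dense tissue" tile rather than the authors' digitized films:

```python
import random

def clipped_equalize(tile, levels=256, clip=4.0):
    """Histogram-equalize one tile with a clip limit (core step of CLAHE).
    Counts above clip * flat-histogram height are redistributed uniformly."""
    n = len(tile)
    h = [0] * levels
    for v in tile:
        h[v] += 1
    limit = clip * n / levels          # clip level relative to a flat histogram
    excess = 0.0
    for i in range(levels):
        if h[i] > limit:
            excess += h[i] - limit
            h[i] = limit
    h = [c + excess / levels for c in h]   # redistribute clipped counts
    cdf, run = [], 0.0
    for c in h:
        run += c
        cdf.append(run)
    return [int(round((levels - 1) * cdf[v] / n)) for v in tile]

random.seed(0)
# Synthetic low-contrast tile: a narrow band of gray values, 64 pixels.
tile = [random.randint(118, 138) for _ in range(64)]
enhanced = clipped_equalize(tile, clip=4.0)
```

Even with clipping, the narrow input range is stretched over a much wider output range, which is the intended enhancement; the clip level controls how aggressively that happens.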
The Instagram: A Novel Sounding Technique for Enhanced HF Propagation Advice
2010-05-01
precautions are necessary before such a scheme is attempted, and an ultimate aim might be to have the technique as close to subliminal as possible...waveform on reception has been reduced to subliminal levels. Figure 4 PSD plot of the weighted instagram waveform as received on a...wideband system with simulated background noise level included. Although the signal appears subliminal to other users it can still be extracted with
2011-02-18
environmental interferents selected for this study included dolomitic limestone (Lime, NIST Standard Reference Materials, Catalog No. SRM 88b) and ovalbumin...emission lines due solely to substrates or interferents can be ignored. As in previous studies by our group, the background-corrected peak ...calculated by adding the intensi- ties of the emission lines at 486 and 656 nm); the summed intensities were normalized to the total peak intensity of the
1997-12-19
Resource Consultants Inc. (RCI) Science Applications InternatT Corp (SAIC) Veda Inc. Virtual Space Devices (VSD) 1.1 Background The Land Warrior...network. The VICs included: • VIC Alpha - a fully immersive Dismounted Soldier System developed by Veda under a STRICOM applied research effort...consists of the Dismounted Soldier System (DSS), which is characterized as follows: • Developed by Veda under a STRICOM applied research effort
Implementation of input command shaping to reduce vibration in flexible space structures
NASA Technical Reports Server (NTRS)
Chang, Kenneth W.; Seering, Warren P.; Rappole, B. Whitney
1992-01-01
Viewgraphs on implementation of input command shaping to reduce vibration in flexible space structures are presented. Goals of the research are to explore theory of input command shaping to find an efficient algorithm for flexible space structures; to characterize Middeck Active Control Experiment (MACE) test article; and to implement input shaper on the MACE structure and interpret results. Background on input shaping, simulation results, experimental results, and future work are included.
[PM₂.₅ Background Concentration at Different Directions in Beijing in 2013].
Li, Yun-ting; Cheng, Niam-liang; Zhang, Da-wei; Sun, Rui-wen; Dong, Xin; Sun, Nai-di; Chen, Chen
2015-12-01
The PM₂.₅ background concentration at different directions in Beijing in 2013 was analyzed by combining mathematical statistics, physical identification and numerical simulation (CMAQ 4.7.1) with monitoring data from six PM₂.₅ auto-monitoring sites and five meteorological sites in 2013. Results showed that the background concentrations of PM₂.₅ at the northwest, northeast, eastern, southeast, southern and southwest boundary sites in Beijing were between 40.3 and 85.3 µg · m⁻³. From lowest to highest, the PM₂.₅ background concentrations at the different sites were: Miyun reservoir, Badaling, Donggaocun, Yufa, Yongledian and Liulihe. The background concentration of PM₂.₅ was lowest under north winds, next lowest under west winds, and significantly higher under south and east winds; the calculated PM₂.₅ background average concentrations were 6.5-27.9, 22.4-73.4, 67.2-91.7 and 40.7-116.1 µg · m⁻³ under the different wind directions, respectively. The simulated PM₂.₅ background concentration showed a clear north-south gradient, and the surrounding area had a notable effect on the spatial distribution of the PM₂.₅ background concentration in Beijing in 2013.
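The mathematical-statistics step of separating background levels by wind direction amounts to binning readings into wind sectors and aggregating. A minimal sketch with synthetic values (not the Beijing monitoring data):

```python
def sector(deg):
    """Map a wind direction in degrees to one of four coarse sectors."""
    names = ["north", "east", "south", "west"]
    return names[int(((deg + 45.0) % 360.0) // 90.0)]

# (wind direction in degrees, PM2.5 in ug/m3) -- synthetic illustrative pairs.
readings = [
    (350.0, 18.0), (10.0, 25.0), (80.0, 70.0), (95.0, 88.0),
    (170.0, 95.0), (185.0, 110.0), (265.0, 33.0), (280.0, 41.0),
]

by_sector = {}
for deg, pm in readings:
    by_sector.setdefault(sector(deg), []).append(pm)

# Mean concentration per wind sector.
means = {s: sum(v) / len(v) for s, v in by_sector.items()}
```

The study's actual background estimates additionally use physical identification and chemistry-transport simulation; this shows only the directional-stratification idea.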
Convolutional neural networks applied to neutrino events in a liquid argon time projection chamber
NASA Astrophysics Data System (ADS)
Acciarri, R.; Adams, C.; An, R.; Asaadi, J.; Auger, M.; Bagby, L.; Baller, B.; Barr, G.; Bass, M.; Bay, F.; Bishai, M.; Blake, A.; Bolton, T.; Bugel, L.; Camilleri, L.; Caratelli, D.; Carls, B.; Castillo Fernandez, R.; Cavanna, F.; Chen, H.; Church, E.; Cianci, D.; Collin, G. H.; Conrad, J. M.; Convery, M.; Crespo-Anadón, J. I.; Del Tutto, M.; Devitt, D.; Dytman, S.; Eberly, B.; Ereditato, A.; Escudero Sanchez, L.; Esquivel, J.; Fleming, B. T.; Foreman, W.; Furmanski, A. P.; Garvey, G. T.; Genty, V.; Goeldi, D.; Gollapinni, S.; Graf, N.; Gramellini, E.; Greenlee, H.; Grosso, R.; Guenette, R.; Hackenburg, A.; Hamilton, P.; Hen, O.; Hewes, J.; Hill, C.; Ho, J.; Horton-Smith, G.; James, C.; de Vries, J. Jan; Jen, C.-M.; Jiang, L.; Johnson, R. A.; Jones, B. J. P.; Joshi, J.; Jostlein, H.; Kaleko, D.; Karagiorgi, G.; Ketchum, W.; Kirby, B.; Kirby, M.; Kobilarcik, T.; Kreslo, I.; Laube, A.; Li, Y.; Lister, A.; Littlejohn, B. R.; Lockwitz, S.; Lorca, D.; Louis, W. C.; Luethi, M.; Lundberg, B.; Luo, X.; Marchionni, A.; Mariani, C.; Marshall, J.; Martinez Caicedo, D. A.; Meddage, V.; Miceli, T.; Mills, G. B.; Moon, J.; Mooney, M.; Moore, C. D.; Mousseau, J.; Murrells, R.; Naples, D.; Nienaber, P.; Nowak, J.; Palamara, O.; Paolone, V.; Papavassiliou, V.; Pate, S. F.; Pavlovic, Z.; Porzio, D.; Pulliam, G.; Qian, X.; Raaf, J. L.; Rafique, A.; Rochester, L.; von Rohr, C. Rudolf; Russell, B.; Schmitz, D. W.; Schukraft, A.; Seligman, W.; Shaevitz, M. H.; Sinclair, J.; Snider, E. L.; Soderberg, M.; Söldner-Rembold, S.; Soleti, S. R.; Spentzouris, P.; Spitz, J.; St. John, J.; Strauss, T.; Szelc, A. M.; Tagg, N.; Terao, K.; Thomson, M.; Toups, M.; Tsai, Y.-T.; Tufanli, S.; Usher, T.; Van de Water, R. G.; Viren, B.; Weber, M.; Weston, J.; Wickremasinghe, D. A.; Wolbers, S.; Wongjirad, T.; Woodruff, K.; Yang, T.; Zeller, G. P.; Zennamo, J.; Zhang, C.
2017-03-01
We present several studies of convolutional neural networks applied to data coming from the MicroBooNE detector, a liquid argon time projection chamber (LArTPC). The algorithms studied include the classification of single particle images, the localization of single particle and neutrino interactions in an image, and the detection of a simulated neutrino event overlaid with cosmic ray backgrounds taken from real detector data. These studies demonstrate the potential of convolutional neural networks for particle identification or event detection on simulated neutrino interactions. We also address technical issues that arise when applying this technique to data from a large LArTPC at or near ground level.
SUBGR: A Program to Generate Subgroup Data for the Subgroup Resonance Self-Shielding Calculation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Kang Seog
2016-06-06
The Subgroup Data Generation (SUBGR) program generates subgroup data, including levels and weights, from the resonance self-shielded cross section table as a function of background cross section. Depending on the nuclide and the energy range, these subgroup data can be generated by (a) the narrow resonance approximation, (b) pointwise flux calculations for homogeneous media, and (c) pointwise flux calculations for heterogeneous lattice cells. The latter two options are performed by the AMPX module IRFFACTOR. These subgroup data are to be used in the Consortium for Advanced Simulation of Light Water Reactors (CASL) neutronic simulator MPACT, for which the primary resonance self-shielding method is the subgroup method.
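The subgroup representation SUBGR produces can be sketched as follows: a few discrete cross-section levels and weights reproduce the self-shielded cross section as a function of the background cross section. The levels and weights below are made-up illustrative numbers, not SUBGR output:

```python
# Subgroup levels (barns) and weights; weights sum to 1.
levels = [5.0, 50.0, 500.0]
weights = [0.6, 0.3, 0.1]

def shielded_xs(sigma_b):
    """Effective cross section with a narrow-resonance-type flux weight,
    phi_i ~ sigma_b / (sigma_i + sigma_b), for background xs sigma_b."""
    num = sum(w * s * sigma_b / (s + sigma_b) for w, s in zip(weights, levels))
    den = sum(w * sigma_b / (s + sigma_b) for w, s in zip(weights, levels))
    return num / den

# Infinite dilution recovers the weighted average sum(w_i * sigma_i) = 68 b;
# a small background cross section gives a strongly self-shielded value.
infinite_dilution = shielded_xs(1.0e9)
strongly_shielded = shielded_xs(1.0)
```

SUBGR's job is the inverse problem: choosing levels and weights so that this functional form matches the self-shielded cross-section table over the range of background cross sections of interest.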
Expected Backgrounds of the BetaCage, an Ultra-sensitive Screener for Surface Contamination
NASA Astrophysics Data System (ADS)
Wang, Boqian; Bunker, Raymond; Schnee, Richard; Bowles, Michael; Kos, Marek; Ahmed, Zeeshan; Golwala, Sunil; Nelson, Robert; Grant, Darren
2013-04-01
Material screening for low-energy betas and alphas is necessary for rare-event-search experiments, such as dark matter and neutrinoless double-beta decay searches where surface radiocontamination has become a significant background. The BetaCage, a gaseous neon time-projection chamber, has been proposed as a screener for emitters of low-energy betas and alphas to which existing screening facilities are insufficiently sensitive. The expected sensitivity is 0.1 betas / (keV m^2 day) and 0.1 alphas / (m^2 day). Expected backgrounds are dominated by Compton scattering of external photons in the sample to be screened; radioassays and simulations indicate backgrounds from detector materials and radon daughters should be subdominant. We will report on details of the background simulations and the detector design that allows discrimination to reach these sensitivity levels.
Surface alpha backgrounds from plate-out of radon progeny
NASA Astrophysics Data System (ADS)
Perumpilly, Gopakumar; Guiseppe, Vincente
2012-03-01
Low-background detectors operating underground aim for unprecedented low levels of radioactive backgrounds. Although the radioactive decays of airborne radon (particularly Rn-222) and its subsequent daughters present in an experiment are potential backgrounds, more troublesome is the deposition of radon daughters on detector materials. Exposure to radon at any stage of assembly of an experiment can result in surface contamination by daughters supported by the long half life (22 y) of Pb-210 on sensitive locations of a detector. We have developed a model of the radon progeny implantation using Geant4 simulations based on the low energy nuclear recoil process. We explore the alpha decays from implanted progeny on a Ge crystal as potential backgrounds for a neutrinoless double-beta decay experiment. Results of the simulations validated with alpha spectrum measurement of plate-out samples will be presented.
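The persistence of this surface background follows directly from the 22-year half-life quoted above: Pb-210 deposited during assembly supports the Po-210 alpha emitter for decades. A back-of-envelope sketch:

```python
import math

# Surviving fraction of implanted Pb-210 after t years, N(t)/N0 = 2^(-t/T1/2).
T_HALF_PB210 = 22.0   # years, as quoted in the abstract

def surviving_fraction(t_years, t_half=T_HALF_PB210):
    return math.exp(-math.log(2.0) * t_years / t_half)

after_one_halflife = surviving_fraction(22.0)   # 0.5 by definition
after_ten_years = surviving_fraction(10.0)      # ~73% still present
```

In other words, a ten-year-old exposure still contributes nearly three quarters of its initial supported alpha rate, which is why plate-out control during assembly matters so much.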
Casutt, Gianclaudio; Theill, Nathan; Martin, Mike; Keller, Martin; Jäncke, Lutz
2014-01-01
Background: Age-related cognitive decline is often associated with unsafe driving behavior. We hypothesized that 10 active training sessions in a driving simulator increase cognitive and on-road driving performance. In addition, driving simulator training should outperform cognitive training. Methods: Ninety-one healthy active drivers (62–87 years) were randomly assigned to one of three groups: (1) a driving simulator training group, (2) an attention training group (vigilance and selective attention), or (3) a control group. The main outcome variables were on-road driving and cognitive performance. Seventy-seven participants (85%) completed the training and were included in the analyses. Training gains were analyzed using a multiple regression analysis with planned orthogonal comparisons. Results: The driving simulator-training group showed an improvement in on-road driving performance compared to the attention-training group. In addition, both training groups increased cognitive performance compared to the control group. Conclusion: Driving simulator training offers the potential to enhance driving skills in older drivers. Compared to the attention training, the simulator training seems to be a more powerful program for increasing older drivers' safety on the road. PMID:24860497
NASA Astrophysics Data System (ADS)
Miyake, Yohei; Usui, Hideyuki
It is necessary to predict the nature of spacecraft-plasma interactions in extreme plasma conditions such as the near-Sun environment. The spacecraft environment immersed in the solar corona is characterized by a small Debye length, due to dense (7000 /cc) plasmas, and a large photo-/secondary-electron emission current from the spacecraft surfaces, which lead to distinctive spacecraft-plasma interactions [1,2,3]. In the present study, the electromagnetic field perturbation around the Solar Probe Plus (SPP) spacecraft is examined using our original EM-PIC (electromagnetic particle-in-cell) plasma simulation code, EMSES. In the simulations, we consider the SPP spacecraft at perihelion (0.04 AU from the Sun) and include important physical effects such as spacecraft charging, photoelectron and secondary-electron emission, solar wind plasma flow including the effect of spacecraft orbital velocity, and the presence of a background magnetic field. Our preliminary results show that both photoelectrons and secondary electrons from the spacecraft are magnetized on a spatial scale of several meters and undergo drift motion due to the presence of the background convection electric field. This effect leads to non-axisymmetric distributions of the electron density and the resultant electric potential near the spacecraft. Our simulations predict that a strong (~100 mV/m) spurious electric field can be observed by the probe measurement on the spacecraft due to such a non-axisymmetric effect. We also confirm that the large photo-/secondary-electron current alters the magnetic field intensity around the spacecraft, but the field variation is much smaller than the background magnetic field magnitude (a few nT compared to a few µT). [1] Ergun et al., Phys. Plasmas, 17, 072903, 2010. [2] Guillemant et al., Ann. Geophys., 30, 1075-1092, 2012. [3] Guillemant et al., IEEE Trans. Plasma Sci., 41, 3338-3348, 2013.
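The drift motion of the emitted electrons in the background convection electric field is the standard E x B drift, v = (E x B) / |B|^2, with speed E/B for perpendicular fields. A quick vector check with representative (assumed) magnitudes, not actual EMSES inputs:

```python
def cross(a, b):
    """Cross product of two 3-vectors given as tuples."""
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

E = (0.0, 0.01, 0.0)     # convection electric field, V/m (assumed magnitude)
B = (0.0, 0.0, 2.0e-6)   # background magnetic field, T (a few microtesla)

B2 = sum(c * c for c in B)
v_drift = tuple(c / B2 for c in cross(E, B))   # v = (E x B) / |B|^2
speed = sum(c * c for c in v_drift) ** 0.5     # equals E/B here (5 km/s)
```

Because the drift is perpendicular to both fields, emitted-electron clouds are displaced to one side of the spacecraft, which is the qualitative origin of the non-axisymmetric density and potential structures described above.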
Virtual wayfinding using simulated prosthetic vision in gaze-locked viewing.
Wang, Lin; Yang, Liancheng; Dagnelie, Gislin
2008-11-01
To assess virtual maze navigation performance with simulated prosthetic vision in gaze-locked viewing, under the conditions of varying luminance contrast, background noise, and phosphene dropout. Four normally sighted subjects performed virtual maze navigation using simulated prosthetic vision in gaze-locked viewing, under five conditions of luminance contrast, background noise, and phosphene dropout. Navigation performance was measured as the time required to traverse a 10-room maze using a game controller, and the number of errors made during the trip. Navigation performance time (1) became stable after 6 to 10 trials, (2) remained similar on average at luminance contrast of 68% and 16% but had greater variation at 16%, (3) was not significantly affected by background noise, and (4) increased by 40% when 30% of phosphenes were removed. Navigation performance time and number of errors were significantly and positively correlated. Assuming that the simulated gaze-locked viewing conditions are extended to implant wearers, such prosthetic vision can be helpful for wayfinding in simple mobility tasks, though phosphene dropout may interfere with performance.
Limits of detection and decision. Part 4
NASA Astrophysics Data System (ADS)
Voigtman, E.
2008-02-01
Probability density functions (PDFs) have been derived for a number of commonly used limit of detection definitions, including several variants of the Relative Standard Deviation of the Background-Background Equivalent Concentration (RSDB-BEC) method, for a simple linear chemical measurement system (CMS) having homoscedastic, Gaussian measurement noise and using ordinary least squares (OLS) processing. All of these detection limit definitions serve as both decision and detection limits, thereby implicitly resulting in 50% rates of Type 2 errors. It has been demonstrated that these are closely related to Currie decision limits, if the coverage factor, k, is properly defined, and that all of the PDFs are scaled reciprocals of noncentral t variates. All of the detection limits have well-defined upper and lower limits, thereby resulting in finite moments and confidence limits, and the problem of estimating the noncentrality parameter has been addressed. As in Parts 1-3, extensive Monte Carlo simulations were performed and all the simulation results were found to be in excellent agreement with the derived theoretical expressions. Specific recommendations for harmonization of detection limit methodology have also been made.
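A stripped-down Monte Carlo in the spirit of those simulations: for a blank measured with homoscedastic Gaussian noise and known standard deviation, a Currie-type decision limit L_C = k·σ with k = 1.645 should yield a Type 1 error rate of about 5%. This idealized sketch omits the calibration-curve (OLS) and noncentral-t aspects treated in the paper:

```python
import random

random.seed(7)
sigma = 1.0        # known homoscedastic noise level
k = 1.645          # coverage factor for a nominal 5% Type 1 error rate
trials = 200_000

# Count blank measurements falsely declared "detected".
false_positives = sum(
    1 for _ in range(trials) if random.gauss(0.0, sigma) > k * sigma
)
type1_rate = false_positives / trials   # should be close to 0.05
```

When σ must instead be estimated from few replicates, the relevant distribution becomes a (noncentral) t rather than Gaussian, which is precisely why the paper's coverage factor k must be defined carefully.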
NASA Astrophysics Data System (ADS)
Aurisano, A.; Backhouse, C.; Hatcher, R.; Mayer, N.; Musser, J.; Patterson, R.; Schroeter, R.; Sousa, A.
2015-12-01
The NOνA experiment is a two-detector, long-baseline neutrino experiment operating in the recently upgraded NuMI muon neutrino beam. Simulating neutrino interactions and backgrounds requires many steps including: the simulation of the neutrino beam flux using FLUKA and the FLUGG interface; cosmic ray generation using CRY; neutrino interaction modeling using GENIE; and a simulation of the energy deposited in the detector using GEANT4. To shorten generation time, the modeling of detector-specific aspects, such as photon transport, detector and electronics noise, and readout electronics, employs custom, parameterized simulation applications. We will describe the NOνA simulation chain, and present details on the techniques used in modeling photon transport near the ends of cells, and in developing a novel data-driven noise simulation. Due to the high intensity of the NuMI beam, the Near Detector samples a high rate of muons originating in the surrounding rock. In addition, due to its location on the surface at Ash River, MN, the Far Detector collects a large rate (~140 kHz) of cosmic muons. We will discuss the methods used in NOνA for overlaying rock muons and cosmic ray muons with simulated neutrino interactions and show how realistically the final simulation reproduces the preliminary NOνA data.
Aurisano, A.; Backhouse, C.; Hatcher, R.; ...
2015-12-23
The NOνA experiment is a two-detector, long-baseline neutrino experiment operating in the recently upgraded NuMI muon neutrino beam. Simulating neutrino interactions and backgrounds requires many steps including: the simulation of the neutrino beam flux using FLUKA and the FLUGG interface, cosmic ray generation using CRY, neutrino interaction modeling using GENIE, and a simulation of the energy deposited in the detector using GEANT4. To shorten generation time, the modeling of detector-specific aspects, such as photon transport, detector and electronics noise, and readout electronics, employs custom, parameterized simulation applications. We will describe the NOνA simulation chain, and present details on the techniques used in modeling photon transport near the ends of cells, and in developing a novel data-driven noise simulation. Due to the high intensity of the NuMI beam, the Near Detector samples a high rate of muons originating in the surrounding rock. In addition, due to its location on the surface at Ash River, MN, the Far Detector collects a large rate (~140 kHz) of cosmic muons. Furthermore, we will discuss the methods used in NOνA for overlaying rock muons and cosmic ray muons with simulated neutrino interactions and show how realistically the final simulation reproduces the preliminary NOνA data.
Background studies of high energy γ rays from (n,γ) reactions in the CANDLES experiment
NASA Astrophysics Data System (ADS)
Nakajima, K.; Iida, T.; Akutagawa, K.; Batpurev, T.; Chan, W. M.; Dokaku, F.; Fushimi, K.; Kakubata, H.; Kanagawa, K.; Katagiri, S.; Kawasaki, K.; Khai, B. T.; Kino, H.; Kinoshita, E.; Kishimoto, T.; Hazama, R.; Hiraoka, H.; Hiyama, T.; Ishikawa, M.; Li, X.; Maeda, T.; Matsuoka, K.; Moser, M.; Nomachi, M.; Ogawa, I.; Ohata, T.; Sato, H.; Shamoto, K.; Shimada, M.; Shokati, M.; Takahashi, N.; Takemoto, Y.; Takihira, Y.; Tamagawa, Y.; Tozawa, M.; Teranishi, K.; Tetsuno, K.; Trang, V. T. T.; Tsuzuki, M.; Umehara, S.; Wang, W.; Yoshida, S.; Yotsunaga, N.
2018-07-01
High-energy γ rays of several MeV produced by (n,γ) reactions can be a troublesome background for low-background measurements in underground laboratories, such as double beta decay experiments. In the CANDLES project, which aimed to observe the neutrino-less double beta decay of 48Ca, γ rays caused by (n,γ) reactions were found to be the most significant background. The profile of the background was studied by measurements with a neutron source and by a simulation with a validity check of the neutron processes in Geant4. The observed spectrum of γ rays from (n,γ) reactions was well reproduced by the simulated spectra, which originated from the surrounding rock and a detector tank made of stainless steel. The environmental neutron flux was derived from the observed event rate of γ rays from (n,γ) reactions using the simulation. The thermal and non-thermal neutron fluxes were found to be (1.3 ± 0.6) × 10⁻⁶ cm⁻²s⁻¹ and (1.1 ± 0.5) × 10⁻⁵ cm⁻²s⁻¹, respectively. It is necessary to install an additional shield to reduce the background from (n,γ) reactions to the required level.
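The final step described above, converting an observed (n,γ) event rate into an environmental neutron flux via a simulation-derived detection efficiency, reduces to a one-line rate equation. All numbers below are illustrative assumptions, not the CANDLES values:

```python
# Observed (n,gamma) event rate in the detector, counts/s (assumed).
rate = 2.0e-3
# Probability that a neutron crossing the boundary surface produces a
# detected (n,gamma) event, taken from the Geant4 simulation (assumed).
efficiency = 0.01
# Effective boundary surface area, cm^2 (assumed).
area = 1.5e5

# Environmental neutron flux, neutrons / (cm^2 s).
flux = rate / (efficiency * area)
```

In practice the thermal and non-thermal components are extracted separately, each with its own simulated efficiency, which is how the two flux values quoted in the abstract arise.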
Luna: What Did We Learn and What Should We Expect?
NASA Technical Reports Server (NTRS)
Wallace, William T.
2009-01-01
This presentation reviews the space program's background prior to lunar exploration and highlights the Apollo program and lessons learned from lunar exploration. The possible exposures and difficulties attributed to lunar dust are described, including obscured vision, clogged equipment, coated surfaces, and inhalation, among others. A lunar dust simulant is proposed to support preliminary studies. Lunar dust is constantly activated by meteorite impacts, UV radiation and elements of the solar wind; this active dust could produce reactive species. Methods of deactivation must be determined before new lunar missions, but first we must understand how to reactivate dust on Earth. Activation methods tested and described here include crushing/grinding and UV activation. Grinding time has a direct effect on the amount of hydroxyl radicals produced upon addition of ground quartz to a solution. An increase in hydroxyl production was also seen for a lunar simulant with increased grinding.
Fully kinetic simulations of magnetic reconnection in partially ionised gases
NASA Astrophysics Data System (ADS)
Innocenti, M. E.; Jiang, W.; Lapenta, G.; Markidis, S.
2016-12-01
Magnetic reconnection has been explored for decades as a way to convert magnetic energy into kinetic energy and heat and to accelerate particles in environments as different as the solar surface, planetary magnetospheres, the solar wind, accretion disks and laboratory plasmas. When studying reconnection via simulations, it is usually assumed that the plasma is fully ionised, as is indeed the case in many of the above-mentioned environments. There are, however, exceptions, the most notable being the lower solar atmosphere. Small ionisation fractions are also registered in the warm neutral interstellar medium, in dense interstellar clouds, in protostellar and protoplanetary accretion disks, in tokamak edge plasmas and in ad-hoc laboratory experiments [1]. We study here how magnetic reconnection is modified by the presence of a neutral background, i.e. when the majority of the gas is not ionised. The ionised plasma is simulated with the fully kinetic Particle-In-Cell (PIC) code iPic3D [2]. Collisions with the neutral background are introduced via a Monte Carlo plug-in. The standard Monte Carlo procedure [3] is employed to account for elastic, excitation and ionisation electron-neutral collisions, as well as for elastic scattering and charge-exchange ion-neutral collisions. Collisions with the background introduce resistivity in an otherwise collisionless plasma and modify the particle distribution functions: particles (and ions at a faster rate) tend to thermalise to the background. To pinpoint the consequences of this, we compare reconnection simulations with and without the background. References [1] E. E. Lawrence et al. Physical Review Letters, 110(1):015001, 2013. [2] S. Markidis et al. Mathematics and Computers in Simulation, 80(7):1509-1519, 2010. [3] K. Nanbu. IEEE Transactions on Plasma Science, 28(3):971-990, 2000.
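The core of such a Monte Carlo collision plug-in is the per-timestep collision test: each particle collides with the neutral background with probability P = 1 - exp(-n_g σ v Δt). A minimal sketch with assumed illustrative values (constant cross section, single speed), not the parameters of the Nanbu scheme [3]:

```python
import math
import random

random.seed(3)
n_gas = 1.0e20    # neutral background density, m^-3 (assumed)
sigma = 1.0e-19   # collision cross section, m^2 (assumed constant)
dt = 1.0e-9       # PIC time step, s
v = 1.0e6         # particle speed, m/s

# Per-step collision probability for one particle.
p_coll = 1.0 - math.exp(-n_gas * sigma * v * dt)

# Apply the test to an ensemble of particles, as the plug-in would per step.
n_particles = 100_000
collided = sum(1 for _ in range(n_particles) if random.random() < p_coll)
frac = collided / n_particles   # converges to p_coll for large ensembles
```

A full implementation evaluates σ(v) per particle, splits the probability among elastic, excitation, ionisation and charge-exchange channels, and then updates the colliding particle's velocity; this sketch shows only the sampling step.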
Radiation noise in a high sensitivity star sensor
NASA Technical Reports Server (NTRS)
Parkinson, J. B.; Gordon, E.
1972-01-01
An extremely accurate attitude determination system was developed for space applications. This system uses a high sensitivity star sensor in which the photomultiplier tube is subject to noise generated by space radiation. The space-radiation-induced noise arises from trapped electrons, solar protons and other ionizing radiation, as well as from the dim star background. The solar activity, and hence the electron and proton environments, are predicted through the end of the twentieth century. The available data for the response of the phototube to proton, electron, gamma ray, and bremsstrahlung radiation are reviewed and new experimental data are presented. A simulation was developed which represents the characteristics of the effect of radiation on the star sensor, including the non-stationarity of the backgrounds.
NASA Technical Reports Server (NTRS)
Lopez Ortega, Alejandro; Mikellides, Ioannis G.
2015-01-01
Hall2De is a first-principles, 2-D axisymmetric code that solves the equations of motion for ions, electrons, and neutrals on a magnetic-field-aligned grid. The computational domain downstream of the acceleration channel exit plane is large enough to include the cathode boundary self-consistently. In this paper, we present results from numerical simulations of the H6 laboratory thruster with an internally mounted cathode, with the aim of highlighting the importance of properly accounting for the interactions between the ion beam and the cathode plume. The anomalous transport of electrons across magnetic field lines in Hall2De is modelled using an anomalous collision frequency, νanom, yielding νanom approximately equal to ωce (i.e., the electron cyclotron frequency) in the plume. We first show that restricting the anomalous collision frequency to only those regions where the current density of ions is large does not alter the plasma discharge in the Hall thruster, as long as the interaction between the ion beam and the cathode plume is captured properly in the computational domain. This implies that the boundary conditions must be placed sufficiently far away as not to interfere with the electron transport in this region. These simulation results suggest that electron transport across magnetic field lines occurs largely inside the beam and may be driven by the interactions between beam ions and electrons. A second finding that underscores the importance of including the cathode plume in numerical simulations concerns the ion acoustic turbulence (IAT), now known to occur in the vicinity of the cathode exit. We have included in the Hall2De simulations a model of the IAT-driven anomalous collision frequency based on Sagdeev's model for saturation of the ion-acoustic instability.
This implementation has allowed us to achieve excellent agreement with experimental measurements in the near plume obtained during operation of the H6 thruster at nominal conditions (300 V, 20 A) and a chamber background pressure of approximately 1.5 × 10⁻⁵ Torr. In addition, the numerical results obtained with the latter approach exhibit less sensitivity to background pressure than previous attempts at explaining the features of the plasma properties in the near plume.
Automated mixed traffic vehicle control and scheduling study
NASA Technical Reports Server (NTRS)
Peng, T. K. C.; Chon, K.
1976-01-01
The operation and the expected performance of a proposed automatic guideway transit system which uses low speed automated mixed traffic vehicles (AMTVs) were analyzed. Vehicle scheduling and headway control policies were evaluated with a transit system simulation model. The effect of mixed traffic interference on the average vehicle speed was examined with a vehicle pedestrian interface model. Control parameters regulating vehicle speed were evaluated for safe stopping and passenger comfort. Some preliminary data on the cost and operation of an experimental AMTV system are included. These data were the result of a separate task conducted at JPL, and were included as background information.
Review of Monte Carlo simulations for backgrounds from radioactivity
NASA Astrophysics Data System (ADS)
Selvi, Marco
2013-08-01
For all experiments dealing with rare event searches (neutrino, dark matter, neutrino-less double-beta decay), the reduction of the radioactive background is one of the most important and difficult tasks. There are basically two types of background: electron recoils and nuclear recoils. The electron recoil background comes mostly from gamma rays produced in radioactive decay. The nuclear recoil background comes from neutrons from spontaneous fission, (α, n) reactions and muon-induced interactions (spallation, photo-nuclear and hadronic interactions). The external gammas and neutrons from muons and the laboratory environment can be reduced by operating the detector in a deep underground laboratory and by placing active or passive shield materials around the detector. The radioactivity of the detector materials also contributes to the background; in order to reduce it, a careful screening campaign is mandatory to select highly radio-pure materials. In this review I present the status of current Monte Carlo simulations aimed at estimating and reproducing the background induced by the gamma and neutron radioactivity of the materials and shield of rare-event-search experiments. For the electromagnetic background, a good level of agreement between data and MC simulation has been reached by the XENON100 and EDELWEISS experiments using the GEANT4 toolkit. For the neutron background, a comparison between the yields of neutrons from spontaneous fission and (α, n) reactions obtained with two dedicated software packages, SOURCES-4A and the one developed by Mei-Zhang-Hime, shows good overall agreement, with total yields within a factor of 2 of each other. The energy spectra from SOURCES-4A are in general smoother, while those from MZH present sharp peaks. The neutron propagation through various materials has been studied with two MC codes, GEANT4 and MCNPX, showing reasonably good agreement, with discrepancies within 50%.
NASA Astrophysics Data System (ADS)
Justus, Christopher
2005-04-01
In this study, we simulated top-antitop (tt-bar) quark events at the Compact Muon Solenoid (CMS), an experiment presently being constructed at the Large Hadron Collider in Geneva, Switzerland. The tt-bar process is an important background for Higgs events. We used a chain of software to simulate and reconstruct processes that will occur inside the detector. CMKIN was used to generate and store Monte Carlo events. OSCAR, a GEANT4-based CMS detector simulator, was used to simulate the CMS detector and how particles would interact with it. Next, we used ORCA to simulate the response of the readout electronics at CMS. Last, we used the Jet/MET ROOT maker to create ROOT files of jets and missing energy. We are now using this software analysis chain to complete a systematic study of initial state radiation at hadron colliders. This study is essential because tt-bar is the main background for the Higgs boson and these processes are extremely sensitive to initial state radiation. Results of our initial state radiation study will be presented. We started this study at the new LHC Physics Center (LPC) located at Fermi National Accelerator Laboratory, and we are now completing the study at the University of Rochester.
Spacecraft applications of advanced global positioning system technology
NASA Technical Reports Server (NTRS)
1988-01-01
This is the final report on the Texas Instruments Incorporated (TI) simulations study of Spacecraft Application of Advanced Global Positioning System (GPS) Technology. This work was conducted for the NASA Johnson Space Center (JSC) under contract NAS9-17781. GPS, in addition to its baselined capability as a highly accurate spacecraft navigation system, can provide traffic control, attitude control, structural control, and uniform time base. In Phase 1 of this program, another contractor investigated the potential of GPS in these four areas and compared GPS to other techniques. This contract was for the Phase 2 effort, to study the performance of GPS for these spacecraft applications through computer simulations. TI had previously developed simulation programs for GPS differential navigation and attitude measurement. These programs were adapted for these specific spacecraft applications. In addition, TI has extensive expertise in the design and production of advanced GPS receivers, including space-qualified GPS receivers. We have drawn on this background to augment the simulation results in the system level overview, which is Section 2 of this report.
Multi-model comparison of the volcanic sulfate deposition from the 1815 eruption of Mt. Tambora
NASA Astrophysics Data System (ADS)
Marshall, Lauren; Schmidt, Anja; Toohey, Matthew; Carslaw, Ken S.; Mann, Graham W.; Sigl, Michael; Khodri, Myriam; Timmreck, Claudia; Zanchettin, Davide; Ball, William T.; Bekki, Slimane; Brooke, James S. A.; Dhomse, Sandip; Johnson, Colin; Lamarque, Jean-Francois; LeGrande, Allegra N.; Mills, Michael J.; Niemeier, Ulrike; Pope, James O.; Poulain, Virginie; Robock, Alan; Rozanov, Eugene; Stenke, Andrea; Sukhodolov, Timofei; Tilmes, Simone; Tsigaridis, Kostas; Tummon, Fiona
2018-02-01
The eruption of Mt. Tambora in 1815 was the largest volcanic eruption of the past 500 years. The eruption had significant climatic impacts, leading to the 1816 "year without a summer", and remains a valuable event from which to understand the climatic effects of large stratospheric volcanic sulfur dioxide injections. The eruption also resulted in one of the strongest and most easily identifiable volcanic sulfate signals in polar ice cores, which are widely used to reconstruct the timing and atmospheric sulfate loading of past eruptions. As part of the Model Intercomparison Project on the climatic response to Volcanic forcing (VolMIP), five state-of-the-art global aerosol models simulated this eruption. We analyse both simulated background (no Tambora) and volcanic (with Tambora) sulfate deposition to polar regions and compare to ice core records. The models simulate overall similar patterns of background sulfate deposition, although there are differences in regional details and magnitude. However, the volcanic sulfate deposition varies considerably between the models, with differences in timing, spatial pattern and magnitude. Mean simulated deposited sulfate on Antarctica ranges from 19 to 264 kg km⁻², and on Greenland from 31 to 194 kg km⁻², as compared to mean ice-core-derived estimates of roughly 50 kg km⁻² for both Greenland and Antarctica. The ratio of the hemispheric atmospheric sulfate aerosol burden after the eruption to the average ice sheet deposited sulfate varies between models by up to a factor of 15. Sources of this inter-model variability include differences in both the formation and the transport of sulfate aerosol. Our results suggest that deriving relationships between sulfate deposited on ice sheets and atmospheric sulfate burdens from model simulations may be associated with greater uncertainties than previously thought.
Compact configurations within small evolving groups of galaxies
NASA Astrophysics Data System (ADS)
Mamon, G. A.
Small virialized groups of galaxies are evolved with a gravitational N-body code in which the galaxies and a diffuse background are treated as single particles with attached mass and luminosity profiles, which enables the estimation of parameters such as internal energies, half-mass radii, and the softened potential energies of interaction. The numerical treatment includes mergers, collisional stripping, tidal limitation by the mean field of the background (evaluated using a combination of instantaneous and impulsive formulations), galaxy heating from collisions, and background heating from dynamical friction. The groups start out either as dense as the groups in Hickson's (1982) catalog or as loose as those in Turner and Gott's (1976a) catalog, and they are simulated many times (usually 20) with different initial positions and velocities. Dense groups of galaxies with massive dark haloes coalesce into a single galaxy and lose their compact group appearance in approximately 3 group half-mass crossing times, while dense groups of galaxies without massive haloes survive the merger instability for 15 half-mass crossing times (in a more massive background that keeps the total group mass the same).
Galindo-Murillo, Rodrigo; Roe, Daniel R.; Cheatham, Thomas E.
2014-01-01
Background The structure and dynamics of DNA are critically related to its function. Molecular dynamics (MD) simulations augment experiment by providing detailed information about the atomic motions. However, to date the simulations have not been long enough for convergence of the dynamics and structural properties of DNA. Methods MD simulations performed with AMBER using the ff99SB force field with the parmbsc0 modifications, including ensembles of independent simulations, were compared to long-timescale MD performed with the specialized Anton MD engine on the B-DNA structure d(GCACGAACGAACGAACGC). To assess convergence, the decay of the average RMSD values over longer and longer time intervals was evaluated, in addition to assessing convergence of the dynamics via the Kullback-Leibler divergence of principal component projection histograms. Results These MD simulations (including one of the longest simulations of DNA published to date, at ~44 μs) surprisingly suggest that the structure and dynamics of the DNA helix, neglecting the terminal base pairs, are essentially fully converged on the ~1-5 μs timescale. Conclusions We can now reproducibly converge the structure and dynamics of B-DNA helices, omitting the terminal base pairs, on the μs timescale with both the AMBER and CHARMM C36 nucleic acid force fields. Results from independent ensembles of simulations starting from different initial conditions, when aggregated, match the results from long-timescale simulations on the specialized Anton MD engine. General Significance With access to large-scale GPU resources or the specialized MD engine Anton, it is possible for a variety of molecular systems to reproducibly and reliably converge the conformational ensemble of sampled structures. PMID:25219455
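The convergence test described in this abstract compares histograms of principal-component projections from independent simulation ensembles; converged ensembles give a Kullback-Leibler divergence near zero. A minimal NumPy sketch of that comparison follows (the function names, binning, and symmetrization are illustrative choices, not the authors' implementation):

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Discrete KL divergence D(P||Q) between two normalized histograms."""
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

def pc_histogram_kl(proj_a, proj_b, bins=50):
    """Symmetrized KL divergence between histograms of two sets of
    principal-component projections, built on a shared bin range."""
    lo = min(proj_a.min(), proj_b.min())
    hi = max(proj_a.max(), proj_b.max())
    pa, _ = np.histogram(proj_a, bins=bins, range=(lo, hi))
    pb, _ = np.histogram(proj_b, bins=bins, range=(lo, hi))
    pa, pb = pa.astype(float), pb.astype(float)
    return 0.5 * (kl_divergence(pa, pb) + kl_divergence(pb, pa))
```

Two ensembles sampling the same distribution give a value near zero, while ensembles stuck in different conformational basins give a large value.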
Interfacing MCNPX and McStas for simulation of neutron transport
NASA Astrophysics Data System (ADS)
Klinkby, Esben; Lauritzen, Bent; Nonbøl, Erik; Kjær Willendrup, Peter; Filges, Uwe; Wohlmuther, Michael; Gallmeier, Franz X.
2013-02-01
Simulations of target-moderator-reflector system at spallation sources are conventionally carried out using Monte Carlo codes such as MCNPX (Waters et al., 2007 [1]) or FLUKA (Battistoni et al., 2007; Ferrari et al., 2005 [2,3]) whereas simulations of neutron transport from the moderator and the instrument response are performed by neutron ray tracing codes such as McStas (Lefmann and Nielsen, 1999; Willendrup et al., 2004, 2011a,b [4-7]). The coupling between the two simulation suites typically consists of providing analytical fits of MCNPX neutron spectra to McStas. This method is generally successful but has limitations, as it e.g. does not allow for re-entry of neutrons into the MCNPX regime. Previous work to resolve such shortcomings includes the introduction of McStas inspired supermirrors in MCNPX. In the present paper different approaches to interface MCNPX and McStas are presented and applied to a simple test case. The direct coupling between MCNPX and McStas allows for more accurate simulations of e.g. complex moderator geometries, backgrounds, interference between beam-lines as well as shielding requirements along the neutron guides.
Monte Carlo simulation for background study of geophysical inspection with cosmic-ray muons
NASA Astrophysics Data System (ADS)
Nishiyama, Ryuichi; Taketa, Akimichi; Miyamoto, Seigo; Kasahara, Katsuaki
2016-08-01
Several attempts have been made to obtain a radiographic image of the interior of volcanoes using cosmic-ray muons (muography). Muography is expected to resolve highly heterogeneous density profiles near the surface of volcanoes. However, several prior works have failed to make clear observations due to contamination by background noise. The background contamination leads to an overestimation of the muon flux and consequently a significant underestimation of the density in the target mountains. To investigate the origin of the background noise, we performed a Monte Carlo simulation. The main components of the background noise in muography are found to be low-energy protons, electrons and muons, in the case of detectors without particle identification and with energy thresholds below 1 GeV. This result was confirmed by comparison with actual observations using nuclear emulsions. This result will be useful for detector design in future work, and in addition some previous muography studies should be reviewed from the viewpoint of background contamination.
Bivalves: From individual to population modelling
NASA Astrophysics Data System (ADS)
Saraiva, S.; van der Meer, J.; Kooijman, S. A. L. M.; Ruardij, P.
2014-11-01
An individual-based population model for bivalves was designed, built and tested in a 0D approach to simulate the population dynamics of a mussel bed located in an intertidal area. The processes at the individual level were simulated following the dynamic energy budget theory, whereas initial egg mortality, background mortality, food competition, and predation (including cannibalism) were additional population processes. Model properties were studied through the analysis of theoretical scenarios and by simulation of different mortality parameter combinations in a realistic setup, imposing environmental measurements. Realism criteria were applied to narrow down the possible combinations of parameter values. Field observations obtained in a long-term, multi-station monitoring program were compared with the model scenarios. The realistically selected modeling scenarios were able to reasonably reproduce the timing of some peaks in individual abundances in the mussel bed and its size distribution, but the number of individuals was not well predicted. The results suggest that mortality in the early life stages (egg and larva) plays an important role in population dynamics, whether through initial egg mortality, larval dispersion, settlement failure or shrimp predation. Future steps include coupling the population model with a hydrodynamic and biogeochemical model to improve the simulation of egg/larva dispersion, settlement probability and food transport, and also to simulate the feedback of the organisms' activity on the water column properties, which will improve the characterization of food quantity and quality.
Solernou, Albert; Hanson, Benjamin S; Richardson, Robin A; Welch, Robert; Read, Daniel J; Harlen, Oliver G; Harris, Sarah A
2018-03-01
Fluctuating Finite Element Analysis (FFEA) is a software package designed to perform continuum mechanics simulations of proteins and other globular macromolecules. It combines conventional finite element methods with stochastic thermal noise, and is appropriate for simulations of large proteins and protein complexes at the mesoscale (length-scales in the range of 5 nm to 1 μm), where there is currently a paucity of modelling tools. It requires 3D volumetric information as input, which can be low resolution structural information such as cryo-electron tomography (cryo-ET) maps or much higher resolution atomistic co-ordinates from which volumetric information can be extracted. In this article we introduce our open source software package for performing FFEA simulations which we have released under a GPLv3 license. The software package includes a C++ implementation of FFEA, together with tools to assist the user to set up the system from Electron Microscopy Data Bank (EMDB) or Protein Data Bank (PDB) data files. We also provide a PyMOL plugin to perform basic visualisation and additional Python tools for the analysis of FFEA simulation trajectories. This manuscript provides a basic background to the FFEA method, describing the implementation of the core mechanical model and how intermolecular interactions and the solvent environment are included within this framework. We provide prospective FFEA users with a practical overview of how to set up an FFEA simulation with reference to our publicly available online tutorials and manuals that accompany this first release of the package.
New software to model energy dispersive X-ray diffraction in polycrystalline materials
NASA Astrophysics Data System (ADS)
Ghammraoui, B.; Tabary, J.; Pouget, S.; Paulus, C.; Moulin, V.; Verger, L.; Duvauchelle, Ph.
2012-02-01
Detection of illicit materials, such as explosives or drugs, within mixed samples is a major issue, both for general security and as part of forensic analyses. In this paper, we describe a new code simulating energy dispersive X-ray diffraction patterns in polycrystalline materials. This program, SinFullscat, models diffraction of any object in any diffractometer system taking all physical phenomena, including amorphous background, into account. Many system parameters can be tuned: geometry, collimators (slit and cylindrical), sample properties, X-ray source and detector energy resolution. Good agreement between simulations and experimental data was obtained. Simulations using explosive materials indicated that parameters such as the diffraction angle or the energy resolution of the detector have a significant impact on the diffraction signature of the material inspected. This software will be a convenient tool to test many diffractometer configurations, providing information on the one that best restores the spectral diffraction signature of the materials of interest.
Effect of Simulation on Undergraduate Nursing Students' Knowledge of Nursing Ethics Principles.
Donnelly, Mary Broderick; Horsley, Trisha Leann; Adams, William H; Gallagher, Peggy; Zibricky, C Dawn
2017-12-01
Background Undergraduate nursing education standards include the acquisition of knowledge of ethics principles, and the prevalence of health-care ethical dilemmas mandates that nursing students study ethics. However, little research has been published to support best practices for teaching/learning ethics principles. Purpose This study sought to determine whether participation in an ethics consultation simulation increased nursing students' knowledge of nursing ethics principles compared to students who were taught ethics principles in the traditional didactic format. Methods This quasi-experimental study utilized a pre-test/post-test design with randomized assignment of students at three universities into control and experimental groups. Results Nursing students' knowledge of nursing ethics principles significantly improved from pre-test to post-test (p = .002); however, there was no significant difference between the experimental and control groups' knowledge scores (p = .13). Conclusion Further research into the use of simulation to teach ethics principles is indicated.
NASA Technical Reports Server (NTRS)
Comstock, James R., Jr.; Ghatas, Rania W.; Consiglio, Maria C.; Chamberlain, James P.; Hoffler, Keith D.
2016-01-01
This study evaluated the effects of Communications Delays and Winds on Air Traffic Controller ratings of acceptability of horizontal miss distances (HMDs) for encounters between UAS and manned aircraft in a simulation of the Dallas-Ft. Worth East-side airspace. Fourteen encounters per hour were staged in the presence of moderate background traffic. Seven recently retired controllers with experience at DFW served as subjects. Guidance provided to the UAS pilots for maintaining a given HMD was provided by information from self-separation algorithms displayed on the Multi-Aircraft Simulation System. Winds tested did not affect the acceptability ratings. Communications delays tested included 0, 400, 1200, and 1800 msec. For longer communications delays, there were changes in strategy and communications flow that were observed and reported by the controllers. The aim of this work is to provide useful information for guiding future rules and regulations applicable to flying UAS in the NAS.
Faunus: An object oriented framework for molecular simulation
Lund, Mikael; Trulsson, Martin; Persson, Björn
2008-01-01
Background We present a C++ class library for Monte Carlo simulation of molecular systems, including proteins in solution. The design is generic and highly modular, enabling multiple developers to easily implement additional features. The statistical mechanical methods are documented by extensive use of code comments that are subsequently collected to automatically build a web-based manual. Results We show how an object oriented design can be used to create an intuitively appealing coding framework for molecular simulation. This is exemplified in a minimalistic C++ program that can calculate protein protonation states. We further discuss performance issues related to high-level coding abstraction. Conclusion C++ and the Standard Template Library (STL) provide a high-performance platform for generic molecular modeling. Automatic generation of code documentation from inline comments has proven particularly useful in that no separate manual needs to be maintained. PMID:18241331
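Faunus itself is a C++ library; as an illustration of the kind of modular, object-oriented design this abstract describes (an interchangeable energy model plugged into a generic Monte Carlo driver), here is a hedged Python sketch. All names are hypothetical and unrelated to the actual Faunus API:

```python
import math
import random

class HarmonicEnergy:
    """Pluggable energy term: any object exposing energy(x) can be used,
    which is the modularity the object-oriented design enables."""
    def __init__(self, k=1.0):
        self.k = k

    def energy(self, x):
        return 0.5 * self.k * x * x

def metropolis(model, steps=50000, beta=1.0, step_size=1.0, seed=1):
    """Minimal Metropolis Monte Carlo over a single 1D coordinate."""
    rng = random.Random(seed)
    x, e = 0.0, model.energy(0.0)
    samples = []
    for _ in range(steps):
        x_new = x + rng.uniform(-step_size, step_size)
        e_new = model.energy(x_new)
        # Accept downhill moves always, uphill moves with Boltzmann probability.
        if e_new <= e or rng.random() < math.exp(-beta * (e_new - e)):
            x, e = x_new, e_new
        samples.append(x)
    return samples
```

Swapping in a different energy class changes the physics without touching the sampler, which is the design point the abstract makes for its C++ framework.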
Particle-in-Cell Modeling of Magnetron Sputtering Devices
NASA Astrophysics Data System (ADS)
Cary, John R.; Jenkins, T. G.; Crossette, N.; Stoltz, Peter H.; McGugan, J. M.
2017-10-01
In magnetron sputtering devices, ions arising from the interaction of magnetically trapped electrons with neutral background gas are accelerated via a negative voltage bias to strike a target cathode. Neutral atoms ejected from the target by such collisions then condense on neighboring material surfaces to form a thin coating of target material; a variety of industrial applications which require thin surface coatings are enabled by this plasma vapor deposition technique. In this poster we discuss efforts to simulate various magnetron sputtering devices using the Vorpal PIC code in 2D axisymmetric cylindrical geometry. Field solves are fully self-consistent, and discrete models for sputtering, secondary electron emission, and Monte Carlo collisions are included in the simulations. In addition, the simulated device can be coupled to an external feedback circuit. Erosion/deposition profiles and steady-state plasma parameters are obtained, and modifications due to self consistency are seen. Computational performance issues are also discussed.
Statistical simulations of the dust foreground to cosmic microwave background polarization
NASA Astrophysics Data System (ADS)
Vansyngel, F.; Boulanger, F.; Ghosh, T.; Wandelt, B.; Aumont, J.; Bracco, A.; Levrier, F.; Martin, P. G.; Montier, L.
2017-07-01
The characterization of the dust polarization foreground to the cosmic microwave background (CMB) is a necessary step toward the detection of the B-mode signal associated with primordial gravitational waves. We present a method to simulate maps of polarized dust emission on the sphere that is similar to the approach used for CMB anisotropies. This method builds on the understanding of Galactic polarization stemming from the analysis of Planck data. It relates the dust polarization sky to the structure of the Galactic magnetic field and its coupling with interstellar matter and turbulence. The Galactic magnetic field is modeled as a superposition of a mean uniform field and a Gaussian random (turbulent) component with a power-law power spectrum of exponent αM. The integration along the line of sight carried out to compute Stokes maps is approximated by a sum over a small number of emitting layers with different realizations of the random component of the magnetic field. The model parameters are constrained to fit the power spectra of dust polarization EE, BB, and TE measured using Planck data. We find that the slopes of the E and B power spectra of dust polarization are matched for αM = -2.5, an exponent close to that measured for total dust intensity but larger than the Kolmogorov exponent - 11/3. The model allows us to compute multiple realizations of the Stokes Q and U maps for different realizations of the random component of the magnetic field, and to quantify the variance of dust polarization spectra for any given sky area outside of the Galactic plane. The simulations reproduce the scaling relation between the dust polarization power and the mean total dust intensity including the observed dispersion around the mean relation. We also propose a method to carry out multifrequency simulations, including the decorrelation measured recently by Planck, using a given covariance matrix of the polarization maps. 
These simulations are well suited to optimize component separation methods and to quantify the confidence with which the dust and CMB B-modes can be separated in present and future experiments. We also provide an astrophysical perspective on our phenomenological modeling of the dust polarization spectra.
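The turbulent component of the magnetic field in the model above is a Gaussian random field with a power-law power spectrum of exponent αM. A minimal NumPy sketch of how such a field can be realized on a flat 2D grid follows (the paper works on the sphere and with vector fields; this flat-sky scalar toy is only illustrative):

```python
import numpy as np

def gaussian_random_field(n, alpha=-2.5, seed=0):
    """2D Gaussian random field with power-law power spectrum P(k) ~ k**alpha,
    built by filtering white noise in Fourier space and normalized to unit std."""
    rng = np.random.default_rng(seed)
    kx = np.fft.fftfreq(n)[:, None]
    ky = np.fft.fftfreq(n)[None, :]
    k = np.sqrt(kx**2 + ky**2)
    k[0, 0] = 1.0                      # avoid division by zero at the zero mode
    amplitude = k ** (alpha / 2.0)     # sqrt of the power spectrum
    amplitude[0, 0] = 0.0              # zero out the mean (DC) mode
    noise = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    field = np.fft.ifft2(amplitude * noise).real
    return field / field.std()
```

Averaging the binned 2D power spectrum of the result recovers a slope close to the requested alpha, which is how such realizations are checked in practice.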
NASA Astrophysics Data System (ADS)
Cusumano, Salvatore J.; Fiorino, Steven T.; Bartell, Richard J.; Krizo, Matthew J.; Bailey, William F.; Beauchamp, Rebecca L.; Marciniak, Michael A.
2011-01-01
The Air Force Institute of Technology's Center for Directed Energy developed the High Energy Laser End-to-End Operational Simulation (HELEEOS) model in part to quantify the performance variability in laser propagation created by the natural environment during dynamic engagements. As such, HELEEOS includes a fast-calculating, first principles, worldwide surface-to-100 km, atmospheric propagation, and characterization package. This package enables the creation of profiles of temperature, pressure, water vapor content, optical turbulence, atmospheric particulates, and hydrometeors as they relate to line-by-line layer transmission, path, and background radiance at wavelengths from the ultraviolet to radio frequencies. In the current paper an example of a unique high fidelity simulation of a bistatic, time-varying five band multispectral remote observation of energy delivered on a distant and receding test object is presented for noncloudy conditions with aerosols. The multispectral example emphasizes atmospheric effects using HELEEOS, the interaction of the energy and the test object, the observed reflectance, and subsequent hot spot generated. A model of a sensor suite located on the surface is included to collect the diffuse reflected in-band laser radiation and the emitted radiance of the hot spot in four separate and spatially offset midwave infrared and longwave infrared bands. Particular care is taken in modeling the bidirectional reflectance distribution function of the delivered energy/target interaction to account for both the coupling of energy into the test object and the changes in reflectance as a function of temperature. The architecture supports any platform-target-observer geometry, geographic location, season, and time of day, and it provides for correct contributions of the sky-earth background. The simulation accurately models the thermal response, kinetics, turbulence, base disturbance, diffraction, and signal-to-noise ratios.
LENS: μLENS Simulations, Analysis, and Results
NASA Astrophysics Data System (ADS)
Rasco, Charles
2013-04-01
Simulations of the Low-Energy Neutrino Spectrometer prototype, μLENS, have been performed in order to benchmark the first measurements of the μLENS detector at the Kimballton Underground Research Facility (KURF). μLENS is a 6×6×6-cell scintillation lattice filled with a linear alkylbenzene based scintillator. We have performed simulations of μLENS using the GEANT4 toolkit. We have measured various radioactive sources, LEDs, and the environmental background radiation at KURF using up to 96 PMTs with a simplified data acquisition system of QDCs and TDCs. In this talk we demonstrate our understanding of the light propagation and compare simulation results with μLENS detector measurements of the various radioactive sources, LEDs, and the environmental background radiation.
René de Cotret, Laurent P; Siwick, Bradley J
2017-07-01
The general problem of background subtraction in ultrafast electron powder diffraction (UEPD) is presented, with a focus on diffraction patterns obtained from materials of moderately complex structure, which contain many overlapping peaks and effectively no scattering-vector regions that can be considered exclusively background. We compare the performance of background subtraction algorithms based on discrete and dual-tree complex wavelet transforms (DTCWT) when applied to simulated UEPD data on the M1-R phase transition in VO2 with a time-varying background. We find that the DTCWT approach is capable of extracting intensities that are accurate to better than 2% across the whole range of scattering vectors simulated, effectively independent of delay time. A Python package is available.
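The DTCWT algorithm itself is not reproduced here. As a hedged stand-in for the general idea (estimating a slowly varying baseline underneath sharp diffraction peaks and subtracting it), here is a simple SNIP-style iterative-minimum baseline estimator in plain NumPy; it is a different, much cruder technique than the wavelet approach the paper evaluates:

```python
import numpy as np

def snip_baseline(y, iterations=40):
    """SNIP-style baseline estimate: iteratively clip peaks by replacing each
    point with the minimum of itself and the average of its symmetric
    neighbours at a growing window half-width m."""
    base = y.astype(float).copy()
    for m in range(1, iterations + 1):
        shifted = 0.5 * (np.roll(base, m) + np.roll(base, -m))
        # leave the edges untouched to avoid wrap-around artifacts
        shifted[:m] = base[:m]
        shifted[-m:] = base[-m:]
        base = np.minimum(base, shifted)
    return base
```

Subtracting the returned baseline from the pattern leaves the sharp peaks approximately intact while removing the smooth background, the same goal the wavelet methods pursue with better accuracy guarantees.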
Simulating glories and cloudbows in color.
Gedzelman, Stanley D
2003-01-20
Glories and cloudbows are simulated in color by use of the Mie scattering theory of light upwelling from small-droplet clouds of finite optical thickness embedded in a Rayleigh-scattering atmosphere. Glories are generally more distinct for clouds with droplet radii up to approximately 10 μm. As droplet radius increases, the glory shrinks and becomes less prominent, whereas the cloudbow becomes more distinct and eventually colorful. Cloudbows typically consist of a broad, almost white band with a slightly orange outer edge and a dark inner band. Multiple light and dark bands that are related to supernumerary rainbows first appear inside the cloudbow as droplet radius increases above approximately 10 μm and gradually become more prominent when all droplets are the same size. Bright glories with multiple rings and high color purity are simulated when all droplets are the same size and every light beam is scattered just once. Color purity decreases and outer rings fade as the range of droplet sizes widens and when skylight, reflected light from the ground or background, and multiply scattered light from the cloud are included. Consequently, the brightest and most colorful glories and bows are seen when the observer is near a cloud or a rain swath with optical thickness of approximately 0.25 that consists of uniform-sized drops and when a dark or shaded background lies a short distance behind the cloud.
A Systems Approach to Designing Effective Clinical Trials Using Simulations
Fusaro, Vincent A.; Patil, Prasad; Chi, Chih-Lin; Contant, Charles F.; Tonellato, Peter J.
2013-01-01
Background Pharmacogenetics in warfarin clinical trials has failed to show a significant benefit compared to standard clinical therapy. This study demonstrates a computational framework to systematically evaluate pre-clinical trial design of target population, pharmacogenetic algorithms, and dosing protocols to optimize primary outcomes. Methods and Results We programmatically created an end-to-end framework that systematically evaluates warfarin clinical trial designs. The framework includes options to create a patient population, multiple dosing strategies including genetic-based and non-genetic clinical-based, multiple dose adjustment protocols, pharmacokinetic/pharmacodynamic (PK/PD) modeling and international normalized ratio (INR) prediction, as well as various types of outcome measures. We validated the framework by conducting 1,000 simulations of the CoumaGen clinical trial primary endpoints. The simulation predicted a mean time in therapeutic range (TTR) of 70.6% and 72.2% (P = 0.47) in the standard and pharmacogenetic arms, respectively. Then, we evaluated another dosing protocol under the same original conditions and found a significant difference in TTR between the pharmacogenetic and standard arms (78.8% vs. 73.8%; P = 0.0065). Conclusions We demonstrate that this simulation framework is useful in the pre-clinical assessment phase to study and evaluate design options and provide evidence to optimize the clinical trial for patient efficacy and reduced risk. PMID:23261867
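The authors' framework is far richer than can be shown here; a toy Monte Carlo in the same spirit (all parameters hypothetical, with daily INR values assumed to fluctuate around a target) illustrates how a per-patient time-in-therapeutic-range endpoint can be simulated and compared between two dosing protocols:

```python
import numpy as np

def simulate_ttr(n_patients, inr_sd, seed=0, target=2.5, days=90):
    """Toy trial arm: each patient's daily INR is drawn around the target with
    protocol-dependent spread inr_sd; TTR is the fraction of days with INR in
    the therapeutic range [2, 3]. Purely illustrative, not the paper's model."""
    rng = np.random.default_rng(seed)
    inr = target + rng.normal(0.0, inr_sd, size=(n_patients, days))
    in_range = (inr >= 2.0) & (inr <= 3.0)
    return in_range.mean(axis=1)  # per-patient TTR

# A tighter dosing protocol (smaller INR spread) should yield a higher mean TTR.
ttr_standard = simulate_ttr(1000, inr_sd=0.6)
ttr_tight = simulate_ttr(1000, inr_sd=0.4)
```

Comparing the two arrays of per-patient TTR values is the toy analogue of the trial-arm comparison the framework automates at scale.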
Monte-Carlo Simulations of the Suzaku-XRS Residual Background Spectrum
NASA Technical Reports Server (NTRS)
Perinati, E.; Kilbourne, Caroline Anne; Colasanti, L.; Lotti, S.; Macculi, C.; Piro, L.; Mineo, T.; Mitsuda, K.; Bonardi, A.; Santangelo, A.
2012-01-01
Cryogenic micro-calorimeters are suitable for detecting small amounts of energy deposited by electromagnetic and nuclear interactions, which makes them attractive in a variety of applications on the ground and in space. The only X-ray microcalorimeter that has operated in orbit to date is the X-Ray Spectrometer on board the Japanese Suzaku satellite. We discuss the analysis of the components of its residual background spectrum with the support of Monte-Carlo simulations.
Smith, D G; Baranski, J V; Thompson, M M; Abel, S M
2003-01-01
A total of twenty-five subjects were cloistered for a period of 70 hours, five at a time, in a hyperbaric chamber modified to simulate the conditions aboard the International Space Station (ISS). A recording of 72 dBA background noise from the ISS service module was used to simulate noise conditions on the ISS. Two groups experienced the background noise throughout the experiment, two other groups experienced the noise only during the day, and one control group was cloistered in a quiet environment. All subjects completed a battery of cognitive tests nine times throughout the experiment. The data showed little or no effect of noise on reasoning, perceptual decision-making, memory, vigilance, mood, or subjective indices of fatigue. Our results suggest that the level of noise on the space station should not affect cognitive performance, at least over a period of several days.
Convolutional neural networks applied to neutrino events in a liquid argon time projection chamber
Acciarri, R.; Adams, C.; An, R.; ...
2017-03-14
Here, we present several studies of convolutional neural networks applied to data coming from the MicroBooNE detector, a liquid argon time projection chamber (LArTPC). The algorithms studied include the classification of single particle images, the localization of single particle and neutrino interactions in an image, and the detection of a simulated neutrino event overlaid with cosmic ray backgrounds taken from real detector data. These studies demonstrate the potential of convolutional neural networks for particle identification or event detection on simulated neutrino interactions. Lastly, we also address technical issues that arise when applying this technique to data from a large LArTPC at or near ground level.
Experimental and Numerical Study of Drift Alfvén Waves in LAPD
NASA Astrophysics Data System (ADS)
Friedman, Brett; Popovich, P.; Carter, T. A.; Auerbach, D.; Schaffner, D.
2009-11-01
We present a study of drift Alfvén waves in linear geometry using experiments in the Large Plasma Device (LAPD) at UCLA and simulations from the Boundary Turbulence code (BOUT). BOUT solves the 3D time evolution of plasma parameters and turbulence using Braginskii fluid equations. First, we present a verification study of linear drift Alfvén wave physics in BOUT, which has been modified to simulate the cylindrical geometry of LAPD. Second, we present measurements of density and magnetic field fluctuations in the LAPD plasma and the correlation of these fluctuations as a function of plasma parameters, including strength of the background field and discharge current. We also compare the measurements to nonlinear BOUT calculations using experimental LAPD profiles.
NASA Technical Reports Server (NTRS)
Lin, Yuh-Lang; Kaplan, Michael L.
1992-01-01
Work performed during the report period is summarized. The first numerical experiment, performed on the North Carolina Supercomputer Center's CRAY-YMP machine during the second half of FY92, involved a 36 hour simulation of the CCOPE case study. This first coarse-mesh simulation employed the GMASS model with a 178 x 108 x 32 matrix of grid points spaced approximately 24 km apart. The initial data consisted of the global 2.5 x 2.5 degree analyses as well as all available North American rawinsonde data valid at 0000 UTC 11 July 1981. Highly smoothed LFM-derived terrain data were utilized so as to determine the mesoscale response of the three-dimensional atmosphere to weak terrain forcing prior to including the observed highly complex terrain of the northern Rocky Mountain region. It was felt that the model should be run with a spectrum of terrain geometries, ranging from observed complex terrain to no terrain at all, to determine how crucial the terrain was in forcing the mesoscale phenomena. Neither convection nor stratiform (stable) precipitation was allowed in this simulation, so that their relative importance could be determined by their inclusion in forthcoming simulations. A full suite of planetary boundary layer forcing was allowed in the simulation, including surface sensible and latent heat fluxes employing the Blackadar PBL formulation. The details of this simulation, which in many ways could be considered the control simulation, including the important synoptic-scale, meso-alpha scale, and meso-beta scale circulations, are described. These results are compared to the observations diagnosed by Koch and his colleagues as well as hypotheses set forth in the project proposal for terrain influences upon the jet stream and their role in the generation of mesoscale wave phenomena.
The fundamental goal of the analyses is to discriminate among background geostrophic adjustment, terrain influences, and shearing instability in the initiation and maintenance of mesoscale internal wave phenomena. Based upon these findings, FY93 plans are discussed. A review of linear theory and theoretical modeling of a geostrophic zonal wind anomaly is included.
THE CENTRAL SLOPE OF DARK MATTER CORES IN DWARF GALAXIES: SIMULATIONS VERSUS THINGS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oh, Se-Heon; De Blok, W. J. G.; Brook, Chris
2011-07-15
We make a direct comparison of the derived dark matter (DM) distributions between hydrodynamical simulations of dwarf galaxies assuming a ΛCDM cosmology and the observed dwarf galaxy sample from the THINGS survey in terms of (1) the rotation curve shape and (2) the logarithmic inner density slope α of the mass density profiles. The simulations, which include the effect of baryonic feedback processes such as gas cooling, star formation, cosmic UV background heating, and, most importantly, physically motivated gas outflows driven by supernovae, form bulgeless galaxies with DM cores. We show that the stellar and baryonic mass is similar to that inferred from photometric and kinematic methods for galaxies of similar circular velocity. Analyzing the simulations in exactly the same way as the observational sample allows us to address directly the so-called cusp/core problem in the ΛCDM model. We show that the rotation curves of the simulated dwarf galaxies rise less steeply than cold dark matter rotation curves and are consistent with those of the THINGS dwarf galaxies. The mean logarithmic inner density slope α of the simulated galaxies' DM density profiles is ≈ -0.4 ± 0.1, in good agreement with α = -0.29 ± 0.07 for the THINGS dwarf galaxies. The effect of non-circular motions is not significant enough to affect the results. This confirms that the baryonic feedback processes included in the simulations are able to efficiently flatten the initial cusps with α ≈ -1.0 to -1.5 predicted by DM-only simulations and to induce DM halos with a central mass distribution similar to that observed in nearby dwarf galaxies.
NASA Astrophysics Data System (ADS)
Sever, G.; Collis, S. M.; Ghate, V. P.
2017-12-01
Three-dimensional numerical experiments are performed to explore the mechanical and thermal impacts of Graciosa Island on the sampling of oceanic airflow and cloud evolution. Ideal and real configurations of flow and terrain are planned using high-resolution, large-eddy resolving (Δ < 100 m) simulations. Ideal configurations include model initializations with idealized dry and moist temperature and wind profiles to capture flow features over an island-like topography. Real configurations will use observations from different climatological background states over the Eastern North Atlantic Atmospheric Radiation Measurement (ENA-ARM) site on Graciosa Island. Initial small-domain large-eddy simulations (LES) of dry airflow produce cold-pool formation upstream of an idealized two-kilometer island, with von Kármán-like vortices propagating downstream. Although the peak height of Graciosa is less than half a kilometer, the Azores island chain has a mountain over 2 km, which may lead to more complex flow patterns when simulations are extended to a larger domain. Preliminary idealized low-resolution moist simulations indicate that the cloud field is impacted by the presence of the island. Longer simulations performed to capture the diurnal evolution of the island boundary layer show distinct land/sea-breeze formation under quiescent flow conditions. Further numerical experiments are planned to extend the moist simulations to include realistic atmospheric profiles and observations of surface fluxes coupled with radiative effects. This work is intended to produce a useful simulation framework, coupled with instruments, to guide airborne and ground sampling strategies during the ACE-ENA field campaign, which aims to better characterize marine boundary layer clouds.
An Overview of Landing Gear Dynamics
NASA Technical Reports Server (NTRS)
Pritchard, Jocelyn I.
1999-01-01
One of the problems facing the aircraft community is landing gear dynamics, especially shimmy and brake-induced vibration. Shimmy and brake-induced vibrations can lead to accidents due to excessive wear and shortened life of gear parts, and they contribute to pilot and passenger discomfort. To increase understanding of these problems, a literature survey was performed. The major focus is on work from the last ten years; some older publications are included to show the longevity of the problem and the background established by earlier researchers. The literature survey covers analyses, testing, modeling, and simulation of aircraft landing gear, as well as experimental validation and characterization of shimmy and brake-induced vibration of aircraft landing gear. The paper presents an overview of the problem, background information, and a history of landing gear dynamics problems and solutions. Based on the survey, an assessment of and recommendations for the most critically needed enhancements to the state of the art are presented. The status of Langley work contributing to this activity is also given.
LC-MSsim – a simulation software for liquid chromatography mass spectrometry data
Schulz-Trieglaff, Ole; Pfeifer, Nico; Gröpl, Clemens; Kohlbacher, Oliver; Reinert, Knut
2008-01-01
Background: Mass spectrometry coupled to liquid chromatography (LC-MS) is commonly used to analyze the protein content of biological samples in large-scale studies. The data resulting from an LC-MS experiment are huge, highly complex and noisy. Accordingly, they have sparked new developments in bioinformatics, especially in the fields of algorithm development, statistics and software engineering. In a quantitative label-free mass spectrometry experiment, crucial steps are the detection of peptide features in the mass spectra and the alignment of samples by correcting for shifts in retention time. At the moment, it is difficult to compare the plethora of algorithms for these tasks: curated benchmark data exist only for peptide identification algorithms, and there are no data that represent a ground truth for the evaluation of feature detection, alignment and filtering algorithms. Results: We present LC-MSsim, a simulation software for LC-ESI-MS experiments. It simulates ESI spectra at the MS level. It reads a list of proteins from a FASTA file and digests the protein mixture using a user-defined enzyme. The software creates an LC-MS data set using a predictor for the retention time of the peptides and a model for the peak shapes and elution profiles of the mass spectral peaks. It also offers the possibility to add contaminants and to change the background noise level, and it includes a model for the detectability of peptides in mass spectra. After the simulation, LC-MSsim writes the simulated data to mzData, a public XML format. The software also stores the positions (monoisotopic m/z and retention time) and ion counts of the simulated ions in separate files. Conclusion: LC-MSsim generates simulated LC-MS data sets and incorporates models for peak shapes and contaminations. Algorithm developers can match the results of feature detection and alignment algorithms against the simulated ion lists, and meaningful error rates can be computed.
We anticipate that LC-MSsim will be useful to the wider community to perform benchmark studies and comparisons between computational tools. PMID:18842122
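The in-silico digestion step described above can be sketched with the standard trypsin rule (cleave after K or R, but not before P). This is a generic illustration under that assumption, not LC-MSsim's actual API, and the sequence is a made-up example.

```python
# Minimal trypsin digestion sketch: cleave after K or R unless the next
# residue is P; the final fragment (if any) is kept as-is.
def trypsin_digest(sequence):
    peptides, start = [], 0
    for i, aa in enumerate(sequence):
        if aa in "KR" and (i + 1 == len(sequence) or sequence[i + 1] != "P"):
            peptides.append(sequence[start:i + 1])
            start = i + 1
    if start < len(sequence):
        peptides.append(sequence[start:])
    return peptides

fragments = trypsin_digest("MKWVTFISLLLLFSSAYSRGV")
# Cleaves after K (position 2) and after R (position 19).
```

Each resulting peptide would then be assigned a predicted retention time and an elution profile to build the simulated LC-MS map.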
NASA Technical Reports Server (NTRS)
Mullally, Fergal
2017-01-01
We present an automated method of identifying background eclipsing binaries masquerading as planet candidates in the Kepler planet candidate catalogs. We codify the manual vetting process for Kepler Objects of Interest (KOIs) described in Bryson et al. (2013) with a series of measurements and tests that can be performed algorithmically. We compare our automated results with a sample of manually vetted KOIs from the catalog of Burke et al. (2014) and find excellent agreement. We test the performance on a set of simulated transits and find our algorithm correctly identifies simulated false positives approximately 50% of the time, and correctly identifies 99% of simulated planet candidates.
The local nanohertz gravitational-wave landscape from supermassive black hole binaries
NASA Astrophysics Data System (ADS)
Mingarelli, Chiara M. F.; Lazio, T. Joseph W.; Sesana, Alberto; Greene, Jenny E.; Ellis, Justin A.; Ma, Chung-Pei; Croft, Steve; Burke-Spolaor, Sarah; Taylor, Stephen R.
2017-12-01
Supermassive black hole binary systems form in galaxy mergers and reside in galactic nuclei with large and poorly constrained concentrations of gas and stars. These systems emit nanohertz gravitational waves that will be detectable by pulsar timing arrays. Here we estimate the properties of the local nanohertz gravitational-wave landscape that includes individual supermassive black hole binaries emitting continuous gravitational waves and the gravitational-wave background that they generate. Using the 2 Micron All-Sky Survey, together with galaxy merger rates from the Illustris simulation project, we find that there are on average 91 ± 7 continuous nanohertz gravitational-wave sources, and 7 ± 2 binaries that will never merge, within 225 Mpc. These local unresolved gravitational-wave sources can generate a departure from an isotropic gravitational-wave background at a level of about 20 per cent, and if the cosmic gravitational-wave background can be successfully isolated, gravitational waves from at least one local supermassive black hole binary could be detected in 10 years with pulsar timing arrays.
Prospects for searching the η→e+e- rare decay at the CSR
NASA Astrophysics Data System (ADS)
Ji, Chang-Sheng; Shao, Ming; Zhang, Hui; Chen, Hong-Fang; Zhang, Yi-Fei
2013-04-01
We study the possibility of searching for the η→e+e- rare decay at the Cooling Storage Ring (CSR) in Lanzhou. The main features of the proposed Internal Target Experiment (ITE) and External Target Facility (ETF) are included in the Monte Carlo simulation. Both the beam conditions at the CSR and the major physics backgrounds are carefully taken into account. We conclude that the ITE is more suitable for such a study, due to its better detector acceptance and higher beam density. At the maximum designed luminosity (10³⁴ cm⁻² s⁻¹), an η→e+e- event can be collected every ~400 seconds at the CSR. With a mass resolution of 1 MeV, the expected signal-to-background (S/B) ratio is around 1.
On gravitational chirality as the genesis of astrophysical jets
NASA Astrophysics Data System (ADS)
Tucker, R. W.; Walton, T. J.
2017-02-01
It has been suggested that single and double jets observed emanating from certain astrophysical objects may have a purely gravitational origin. We discuss new classes of plane-fronted and pulsed gravitational wave solutions to the equation for perturbations of Ricci-flat spacetimes around Minkowski metrics, as models for the genesis of such phenomena. These solutions are classified in terms of their chirality and generate a family of non-stationary spacetime metrics. Particular members of these families are used as backgrounds in analysing time-like solutions to the geodesic equation for test particles. They are found numerically to exhibit both single and double jet-like features with dimensionless aspect ratios suggesting that it may be profitable to include such backgrounds in simulations of astrophysical jet dynamics from rotating accretion discs involving electromagnetic fields.
Extending the Li&Ma method to include PSF information
NASA Astrophysics Data System (ADS)
Nievas-Rosillo, M.; Contreras, J. L.
2016-02-01
The so-called Li & Ma formula is still the most frequently used method for estimating the significance of observations carried out by Imaging Atmospheric Cherenkov Telescopes. In this work, a straightforward extension of the method for point sources that profits from the good imaging capabilities of current instruments is proposed. It is based on a likelihood ratio under the assumptions of a well-known PSF and a smooth background. Its performance is tested with Monte Carlo simulations based on real observations, and its sensitivity is compared to standard methods that do not incorporate PSF information. The gain in significance attributable to the inclusion of the PSF is around 10% and can be boosted if a background model is assumed or a finer binning is used.
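For reference, the baseline on/off significance that the PSF-aware likelihood ratio extends is Li & Ma's Eq. 17, which can be computed as follows; the example counts are illustrative, not taken from any observation in the paper.

```python
import math

def li_ma_significance(n_on, n_off, alpha):
    """Li & Ma (1983), Eq. 17: significance of an on-source excess.

    n_on  : counts in the on-source region
    n_off : counts in the off-source (background) region
    alpha : ratio of on-source to off-source exposure
    """
    n_tot = n_on + n_off
    term_on = n_on * math.log((1.0 + alpha) / alpha * n_on / n_tot)
    term_off = n_off * math.log((1.0 + alpha) * n_off / n_tot)
    return math.sqrt(2.0 * (term_on + term_off))

# Illustrative counts: 120 on-source, 300 off-source, alpha = 0.2
significance = li_ma_significance(120, 300, 0.2)  # ~6 sigma
```

The PSF extension replaces these two global counts with a pixel-wise likelihood that weights events by the expected point-source shape.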
Evolving a Neural Olfactorimotor System in Virtual and Real Olfactory Environments
Rhodes, Paul A.; Anderson, Todd O.
2012-01-01
To provide a platform for studying simulated olfactory circuitry in context, we have integrated a simulated neural olfactorimotor system with a virtual world that simulates both computational fluid dynamics and a robotic agent capable of exploring the simulated plumes. A number of the elements which we developed for this purpose have not, to our knowledge, been previously assembled into an integrated system, including: control of a simulated agent by a neural olfactorimotor system; continuous interaction between the simulated robot and the virtual plume; the inclusion of multiple distinct odorant plumes and background odor; the systematic use of artificial evolution driven by olfactorimotor performance (e.g., time to locate a plume source) to specify parameter values; and the incorporation of the realities of an imperfect physical robot using a hybrid model in which a physical robot encounters a simulated plume. We close by describing ongoing work toward engineering a high-dimensional, reversible, low-power electronic olfactory sensor which will allow olfactorimotor neural circuitry evolved in the virtual world to control an autonomous olfactory robot in the physical world. The platform described here is intended to better test theories of olfactory circuit function, as well as to provide robust odor source localization in realistic environments. PMID:23112772
2016-11-08
Entrance to the Heroes and Legends attraction at the Kennedy Space Center Visitor Complex is by way of a sweeping ramp designed to simulate a journey to the stars by way of the "Rocket Garden." The new facility includes the U.S. Astronaut Hall of Fame and looks back to the pioneering efforts of Mercury, Gemini and Apollo. It sets the stage by providing the background and context for space exploration and the legendary men and women who pioneered the nation's journey into space.
Accommodating complexity and human behaviors in decision analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Backus, George A.; Siirola, John Daniel; Schoenwald, David Alan
2007-11-01
This is the final report for an LDRD effort to address human behavior in decision support systems. One sister LDRD effort reports the extension of this work to include actual human choices and additional simulation analyses. Another provides the background for this effort and the programmatic directions for future work. This specific effort considered the feasibility of five aspects of model development required for analysis viability. To avoid the use of classified information, healthcare decisions and the system embedding them served as the illustrative example for assessment.
The Effect of AGN Heating on the Low-redshift Lyα Forest
NASA Astrophysics Data System (ADS)
Gurvich, Alex; Burkhart, Blakesley; Bird, Simeon
2017-02-01
We investigate the effects of AGN heating and the ultraviolet background on the low-redshift Lyα forest column density distribution (CDD) using the Illustris simulation. We show that Illustris reproduces observations at z = 0.1 in the column density range 10^12.5-10^13.5 cm^-2, relevant for the "photon underproduction crisis." We attribute this to the inclusion of AGN feedback, which changes the gas distribution so as to mimic the effect of extra photons, as well as the use of the Faucher-Giguère ultraviolet background, which is more ionizing at z = 0.1 than the Haardt & Madau background previously considered. We show that the difference between simulations run with smoothed particle hydrodynamics and simulations using a moving mesh is small in this column density range but can be more significant at larger column densities. We further consider the effect of supernova feedback, Voigt profile fitting, and finite resolution, all of which we show to have little influence on the CDD. Finally, we identify a discrepancy between our simulations and observations at column densities 10^14-10^16 cm^-2, where Illustris produces too few absorbers, which suggests the AGN feedback model should be further refined. Since the "photon underproduction crisis" primarily affects lower column density systems, we conclude that AGN feedback and standard ionizing background models can resolve the crisis.
Ground Level Ozone Regional Background Characteristics In North-west Pacific Rim
NASA Astrophysics Data System (ADS)
Chiang, C.; Fan, J.; Chang, J. S.
2007-12-01
Understanding the regional background characteristics of ground-level ozone is essential for understanding the contribution of long-range transport of pollutants from the Asian mainland to air quality in downwind areas. To understand this characteristic in the northwest Pacific Rim, we conducted a coupled study using ozone observations from regional background stations and 3-D regional-scale chemical transport model simulations. We used O3, CO, wind speed and wind direction data from two regional background stations and "other stations" over a ten-year period, and organized several numerical experiments to simulate one spring month in 2003 to obtain a deeper understanding. The so-called "other stations" had been designated as background stations under various governmental auspices, but we found them to be often under the strong influence of local pollution sources, with strong diurnal or slightly longer time variations. We found that the Yonagunijima station (24.74 N, 123.02 E) and the Heng-Chuen station (21.96 N, 120.78 E), about 400 km apart, have almost the same ozone time series pattern. For these two stations in 2003, the correlation coefficient (R²) for annual observed ozone concentrations is about 0.64; in springtime it is about 0.7, and over the one-month simulation period it is about 0.76. These two stations show very little small-scale variation in any of the variables studied; all variations are associated with large-scale circulation changes. This is especially so at the Yonagunijima station. Using a 3-D regional-scale chemical transport model for the East Asia region, including contributions from Asian continental outflow and neighboring island pollution areas, we found that the Yonagunijima and Heng-Chuen stations are indeed free of pollutants from all neighboring areas, keeping in mind that pollutants from the Taiwan area are never far away.
Ozone concentrations at these two stations are dominated by synoptic-scale weather patterns, with diffuse pollutant contributions from distant sources. When the weather system brings in air masses from the low latitudes of the western Pacific Ocean, ozone concentrations are about 10-20 ppb. When the China high-pressure system moves eastward, with the accompanying Asian continental outflow plume, ozone concentrations are about 65-80 ppb.
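The station-to-station agreement quoted above is a squared Pearson correlation between the two ozone time series, which can be computed as in this minimal sketch; the sample values below are made up for illustration, not actual station data.

```python
# Squared Pearson correlation (R^2) between two time series.
import statistics

def r_squared(x, y):
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov * cov / (var_x * var_y)

# Hypothetical hourly ozone values (ppb) at two stations
station_a = [32.0, 45.0, 58.0, 40.0, 25.0, 70.0]
station_b = [30.0, 48.0, 55.0, 42.0, 28.0, 66.0]
r2 = r_squared(station_a, station_b)
```

An R² near 1 over a full year, as found for Yonagunijima and Heng-Chuen, indicates the two records are driven by the same large-scale circulation.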
Simulation of background from low-level tritium and radon emanation in the KATRIN spectrometers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leiber, B.; KATRIN Collaboration
The KArlsruhe TRItium Neutrino (KATRIN) experiment is a large-scale experiment for the model-independent determination of the mass of electron anti-neutrinos with a sensitivity of 200 meV/c². It investigates the kinematics of electrons from tritium beta decay close to the endpoint of the energy spectrum at 18.6 keV. To achieve a good signal-to-background ratio at the endpoint, a low background rate below 10⁻² counts per second is required. The KATRIN setup thus consists of a high-luminosity windowless gaseous tritium source (WGTS), a magnetic electron transport system with differential and cryogenic pumping for tritium retention, and electrostatic retarding spectrometers (pre-spectrometer and main spectrometer) for energy analysis, followed by a segmented detector system for counting transmitted beta-electrons. A major source of background comes from magnetically trapped electrons in the main spectrometer (vacuum vessel: 1240 m³, 10⁻¹¹ mbar) produced by nuclear decays in the magnetic flux tube of the spectrometer. Major contributions are expected from short-lived radon isotopes and tritium. Primary electrons originating from these decays can be trapped for hours, until they have lost almost all their energy through inelastic scattering on residual gas particles. Depending on the initial energy of the primary electron, up to hundreds of low-energy secondary electrons can be produced. Leaving the spectrometer, these electrons contribute to the background rate. This contribution describes results from simulations of the various background sources. Decays of ²¹⁹Rn emanating from the main vacuum pump, and tritium from the WGTS that reaches the spectrometers, are expected to account for most of the background. As a result of the radon alpha decay, electrons are emitted through various processes, such as shake-off, internal conversion and Auger de-excitation.
The corresponding simulations were done using the KASSIOPEIA framework, which has been developed for the KATRIN experiment for low-energy electron tracking, field calculation and detector simulation. The results of the simulations have been used to optimize the design parameters of the vacuum system with regard to radon emanation and tritium pumping, in order to meet the stringent requirements of the neutrino mass measurement.
Accurate Modeling of the Terrestrial Gamma-Ray Background for Homeland Security Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sandness, Gerald A.; Schweppe, John E.; Hensley, Walter K.
2009-10-24
The Pacific Northwest National Laboratory has developed computer models to simulate the use of radiation portal monitors to screen vehicles and cargo for the presence of illicit radioactive material. The gamma radiation emitted by the vehicles or cargo containers must often be measured in the presence of a relatively large gamma-ray background, mainly due to the presence of potassium, uranium, and thorium (and progeny isotopes) in the soil and surrounding building materials. This large background is often a significant limit on detection sensitivity for items of interest and must be modeled accurately for analyzing homeland security situations. Calculations of the expected gamma-ray emission from a disk of soil and asphalt were made using the Monte Carlo transport code MCNP and were compared to measurements made at a seaport with a high-purity germanium detector. Analysis revealed that the energy spectrum of the measured background could not be reproduced unless the model included gamma rays coming from the ground out to distances of at least 300 m. The contribution from beyond about 50 m was primarily due to gamma rays that scattered in the air before entering the detectors rather than passing directly from the ground to the detectors. These skyshine gamma rays contribute tens of percent to the total gamma-ray spectrum, primarily at energies below a few hundred keV. The techniques developed to efficiently calculate the contributions from a large soil disk and a large air volume in a Monte Carlo simulation are described, and the implications of skyshine for portal monitoring applications are discussed.
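As a toy illustration of why the direct (unscattered) ground contribution falls off with distance, leaving air-scattered skyshine to dominate beyond roughly 50 m, a simple point-kernel annulus integration can be sketched. This is emphatically not the MCNP model: the geometry is idealized, and the air attenuation coefficient is an assumed round number for photons of a few hundred keV.

```python
# Toy point-kernel sketch: relative direct (unscattered) flux at a detector
# height h above a uniform soil disk, integrated over annular rings with
# 1/(4*pi*d^2) geometry and exponential attenuation in air (mu assumed).
import math

def annulus_contribution(r_inner, r_outer, h=1.0, mu=0.01, steps=1000):
    total = 0.0
    dr = (r_outer - r_inner) / steps
    for i in range(steps):
        r = r_inner + (i + 0.5) * dr          # midpoint radius of the ring
        d = math.sqrt(r * r + h * h)          # source-to-detector distance
        total += (2.0 * math.pi * r * dr      # ring area element
                  * math.exp(-mu * d)         # attenuation along the path
                  / (4.0 * math.pi * d * d))  # inverse-square geometry
    return total

near = annulus_contribution(0.0, 50.0)    # ground within 50 m
far = annulus_contribution(50.0, 300.0)   # ground from 50 m to 300 m
```

The direct term from beyond 50 m is much smaller than the near-field term, so the measured far-field contribution reported above must come mostly from photons scattered in the air volume.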
Virtual reality simulator training of laparoscopic cholecystectomies - a systematic review.
Ikonen, T S; Antikainen, T; Silvennoinen, M; Isojärvi, J; Mäkinen, E; Scheinin, T M
2012-01-01
Simulators are widely used in occupations where practice in authentic environments would involve high human or economic risks. Surgical procedures can be simulated by increasingly complex and expensive techniques. This review gives an update on computer-based virtual reality (VR) simulators in training for laparoscopic cholecystectomies. Randomised or controlled trials and the latest systematic reviews were systematically searched in the leading databases (Medline, Cochrane, Embase) and reviewed. Twelve randomised trials involving simulators were identified and analysed, as well as four controlled studies; furthermore, seven studies comparing black boxes and simulators were included. The results indicated any kind of simulator training (black box, VR) to be beneficial at novice level. After VR training, novice surgeons seemed able to perform their first live cholecystectomies with fewer errors, and in one trial the positive effect remained during the first ten cholecystectomies. No clinical follow-up data were found. Optimal learning requires skills training to be conducted as part of a systematic training program. No data on the cost-benefit of simulators were found; the price of a VR simulator begins at EUR 60,000. Theoretical background on learning and limited research data support the use of simulators in the early phases of surgical training. The cost of buying and using simulators is justified if the risk of injuries and complications to patients can be reduced. Developing surgical skills requires repeated training, and achieving optimal learning requires a validated training program.
PREDICTION METRICS FOR CHEMICAL DETECTION IN LONG-WAVE INFRARED HYPERSPECTRAL IMAGERY
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chilton, M.; Walsh, S.J.; Daly, D.S.
2009-01-01
Natural and man-made chemical processes generate gaseous plumes that may be detected by hyperspectral imaging, which produces a matrix of spectra affected by the chemical constituents of the plume, the atmosphere, the bounding background surface and instrument noise. A physics-based model of observed radiance shows that high chemical absorbance and low background emissivity result in a larger chemical signature. Using simulated hyperspectral imagery, this study investigated two metrics that exploit this relationship. The objective was to explore how well the chosen metrics predicted when a chemical would be more easily detected against one background type than another. The two predictor metrics correctly rank-ordered the backgrounds for about 94% of the chemicals tested, as compared to the background rank orders from Whitened Matched Filtering (a detection algorithm) of the simulated spectra. These results suggest that the metrics provide a reasonable summary of how the background emissivity and chemical absorbance interact to produce the at-sensor chemical signal, and that similarly effective predictors accounting for more general physical conditions may be derived.
Geant4 Developments for the Radon Electric Dipole Moment Search at TRIUMF
NASA Astrophysics Data System (ADS)
Rand, E. T.; Bangay, J. C.; Bianco, L.; Dunlop, R.; Finlay, P.; Garrett, P. E.; Leach, K. G.; Phillips, A. A.; Sumithrarachchi, C. S.; Svensson, C. E.; Wong, J.
2011-09-01
An experiment is being developed at TRIUMF to search for a time-reversal violating electric dipole moment (EDM) in odd-A isotopes of Rn. Extensive simulations of the experiment are being performed with GEANT4 to study the backgrounds and sensitivity of the proposed measurement technique involving the detection of γ rays emitted following the β decay of polarized Rn nuclei. GEANT4 developments for the RnEDM experiment include both realistic modelling of the detector geometry and full tracking of the radioactive β, γ, internal conversion, and x-ray processes, including the γ-ray angular distributions essential for measuring an atomic EDM.
NASA Astrophysics Data System (ADS)
Cathala, Thierry; Douchin, Nicolas; Latger, Jean; Caillault, Karine; Fauqueux, Sandrine; Huet, Thierry; Lubarre, Luc; Malherbe, Claire; Rosier, Bernard; Simoneau, Pierre
2009-05-01
The SE-WORKBENCH workshop, also called CHORALE (French acronym for "simulated Optronic Acoustic Radar battlefield"), is used by the French DGA (MoD) and several other Defense organizations and companies all around the world to perform multi-sensor simulations. CHORALE enables the user to create virtual and realistic multispectral 3D scenes that may contain several types of target, and then to generate the physical signal received by a sensor, typically an IR sensor. The SE-WORKBENCH can be used either as a collection of software modules through dedicated GUIs or as an API made of a large number of specialized toolkits. The SE-WORKBENCH is made of several functional blocks: one for geometrically and physically modeling the terrain and the targets, one for building the simulation scenario, and one for rendering the synthetic environment, in both real and non-real time. Among the modules that the modeling block is composed of, SE-ATMOSPHERE is used to simulate the atmospheric conditions of a synthetic environment and to integrate the impact of these conditions on a scene. This software product generates a physical atmosphere that can be exploited by the SE-WORKBENCH tools generating spectral images. It relies on several external radiative transfer models, such as MODTRAN V4.2 in the current version. MATISSE [4,5] is a background scene generator developed for the computation of natural background spectral radiance images and useful atmospheric radiative quantities (radiance and transmission along a line of sight, local illumination, solar irradiance ...). Backgrounds include atmosphere, low- and high-altitude clouds, sea and land. A particular characteristic of the code is its ability to take into account atmospheric spatial variability (temperatures, mixing ratios, etc.) along each line of sight. An Application Programming Interface (API) is included to facilitate its use in conjunction with external codes.
MATISSE is currently considered as a new external radiative transfer model to be integrated in SE-ATMOSPHERE as a complement to MODTRAN. Compared to the latter, which is used as a monolithic whole, MATISSE can be used step by step and modularly as an API: this avoids pre-computing large atmospheric parameter tables, as is currently done with MODTRAN. The use of MATISSE will also enable a real coupling between the ray-tracing process of the SE-WORKBENCH and the radiative transfer model of MATISSE. This will improve the link between a general atmospheric model and a specific 3D terrain. The paper will demonstrate the advantages for the SE-WORKBENCH of using MATISSE as a new atmospheric code, but also for computing the radiative properties of the sea surface.
Background of SAM atom-fraction profiles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ernst, Frank
Atom-fraction profiles acquired by SAM (scanning Auger microprobe) have important applications, e.g. in the context of alloy surface engineering by infusion of carbon or nitrogen through the alloy surface. However, such profiles often exhibit an artifact in the form of a background whose level anti-correlates with the local atom fraction. This article presents a theory explaining this phenomenon as a consequence of the way in which random noise in the spectrum propagates into the discretized differentiated spectrum that is used for quantification. The resulting model of “energy channel statistics” leads to a useful semi-quantitative background reduction procedure, which is validated by applying it to simulated data. Subsequently, the procedure is applied to an example of experimental SAM data. The analysis leads to conclusions regarding optimum experimental acquisition conditions. The proposed method of background reduction is based on general principles and should be useful for a broad variety of applications. Highlights: • Atom-fraction-depth profiles of carbon measured by scanning Auger microprobe. • Strong background that varies with the local carbon concentration. • Correction needed, e.g. for quantitative comparison with simulations. • Quantitative theory explains the background. • Provides a background-removal strategy and practical advice for acquisition.
High-efficiency and low-background multi-segmented proportional gas counter for β-decay spectroscopy
NASA Astrophysics Data System (ADS)
Mukai, M.; Hirayama, Y.; Watanabe, Y. X.; Schury, P.; Jung, H. S.; Ahmed, M.; Haba, H.; Ishiyama, H.; Jeong, S. C.; Kakiguchi, Y.; Kimura, S.; Moon, J. Y.; Oyaizu, M.; Ozawa, A.; Park, J. H.; Ueno, H.; Wada, M.; Miyatake, H.
2018-03-01
A multi-segmented proportional gas counter (MSPGC) with high detection efficiency and a low background event rate has been developed for β-decay spectroscopy. The MSPGC consists of two cylindrically aligned layers of 16 counters (32 counters in total). Each counter has a long active length and a small trapezoidal cross-section, and the total solid angle of the 32 counters is 80% of 4π. β-rays are distinguished from background events, including cosmic rays, by analyzing the hit patterns of the independent counters. The deduced intrinsic detection efficiency of each counter was almost 100%. The measured background event rate was 0.11 counts per second using the combination of veto counters for cosmic rays and lead block shields for background γ-rays. The MSPGC was applied to measure the β-decay half-lives of 198Ir and 199mPt. The evaluated half-lives of T1/2 = 9.8(7) s and 12.4(7) s for 198Ir and 199mPt, respectively, were in agreement with previously reported values. The estimated absolute detection efficiency of the MSPGC from GEANT4 simulations was consistent with the efficiency evaluated from the β-γ spectroscopy of 199Pt, saturating at approximately 60% for Qβ > 4 MeV.
Estimation of channel parameters and background irradiance for free-space optical link.
Khatoon, Afsana; Cowley, William G; Letzepis, Nick; Giggenbach, Dirk
2013-05-10
Free-space optical communication can experience severe fading due to optical scintillation in long-range links. Channel estimation is also corrupted by background and electrical noise. Accurate estimation of channel parameters and scintillation index (SI) depends on complete removal of the background irradiance. In this paper, we propose three different methods, the minimum-value (MV), mean-power (MP), and maximum-likelihood (ML) based methods, to remove the background irradiance from channel samples. The MV and MP methods do not require knowledge of the scintillation distribution. While the ML-based method assumes gamma-gamma scintillation, it can be easily modified to accommodate other distributions. Each estimator's performance is evaluated from low- to high-SI regimes using simulation data as well as experimental measurements. The MV and MP methods have much lower complexity than the ML-based method. However, the ML-based method shows better SI and background-irradiance estimation performance.
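The minimum-value idea can be illustrated with a toy fading model: in a deep fade the received signal power approaches zero, so the smallest channel sample approximates the background offset, and subtracting it corrects the scintillation-index estimate. The lognormal fading model, the background level, and the signal power below are illustrative assumptions, not the paper's parameters or its MP/ML formulations.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
background = 3.0                                     # true background irradiance (arbitrary units)
fades = rng.lognormal(mean=0.0, sigma=0.6, size=n)   # multiplicative scintillation
signal = 5.0
samples = signal * fades + background

# Minimum-value (MV) method: the darkest sample approximates the background,
# because the fading factor occasionally drives the signal term near zero.
bg_mv = samples.min()

def scintillation_index(x):
    # SI is normalized intensity variance: var(I) / mean(I)^2
    return x.var() / x.mean() ** 2

si_raw = scintillation_index(samples)            # biased low by the background offset
si_mv = scintillation_index(samples - bg_mv)     # after background removal
si_true = scintillation_index(signal * fades)
```

The uncorrected SI is biased low because the constant background inflates the mean without adding variance; the MV-corrected estimate lands much closer to the true value.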
Alpha particles diffusion due to charge changes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clauser, C. F., E-mail: cesar.clauser@ib.edu.ar; Farengo, R.
2015-12-15
The diffusion of alpha particles due to charge changes in a magnetized plasma is studied. Analytical calculations and numerical simulations are employed to show that this process can be very important in the pedestal-edge-SOL regions. This is the first study that presents clear evidence of the importance of atomic processes on the diffusion of alpha particles. A simple 1D model that includes inelastic collisions with plasma species, “cold” neutrals, and partially ionized species was employed. The code, which follows the exact particle orbits and includes the effect of inelastic collisions via a Monte Carlo type random process, runs on a graphics processor unit (GPU). The analytical and numerical results show excellent agreement when a uniform background (plasma and cold species) is assumed. The simulations also show that the gradients in the density of the plasma and cold species, which are large and opposite in the edge region, produce an inward flux of alpha particles. Calculations of the alpha particle flux reaching the walls or divertor plates should include these processes.
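A minimal sketch of the underlying mechanism (not the authors' orbit-following code): each charge-change event re-centers the gyro-orbit, so the guiding center takes a random step of order the gyroradius, and the accumulated displacement is diffusive. The step size and event count are arbitrary illustrative values.

```python
import numpy as np

rng = np.random.default_rng(2)
n_particles, n_events = 5000, 200
rho = 0.05   # gyroradius scale, arbitrary units (assumed)

# Each inelastic collision changes the charge state and re-centers the
# gyro-orbit: model the guiding-center displacement per event as a random
# step of order the gyroradius.
steps = rng.normal(scale=rho, size=(n_particles, n_events))
x = np.cumsum(steps, axis=1)

# Diffusive scaling: mean-square displacement grows linearly with the
# number of charge-change events, <x^2> ~ N * rho^2.
msd = (x ** 2).mean(axis=0)
```

A density gradient in the neutral or plasma background makes the event rate position-dependent, which is how the 1D model produces a net inward flux rather than symmetric diffusion.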
Two- and three-dimensional turbine blade row flow field simulations
NASA Technical Reports Server (NTRS)
Buggeln, R. C.; Briley, W. R.; Mcdonald, H.; Shamroth, S. J.; Weinberg, B. C.
1987-01-01
Work performed in the numerical simulation of turbine passage flows via a Navier-Stokes approach is discussed. Both laminar and turbulent simulations in both two and three dimensions are discussed. An outline of the approach, background, and an overview of the results are given.
[Lake eutrophication modeling considering climatic factor changes: a review].
Su, Jie-Qiong; Wang, Xuan; Yang, Zhi-Feng
2012-11-01
Climatic factors are considered key factors affecting the trophic status and its evolution in most lakes. Against the background of global climate change, incorporating the variations of climatic factors into lake eutrophication models could provide solid technical support for analyzing the trophic evolution trend of a lake and for decision-making in lake environment management. This paper analyzed the effects of climatic factors such as air temperature, precipitation, sunlight, and atmosphere on lake eutrophication, and summarized the research results on lake eutrophication modeling considering climatic factor changes, including modeling based on statistical analysis, ecological dynamic analysis, system analysis, and intelligent algorithms. Prospective approaches to improve the accuracy of lake eutrophication modeling with consideration of climatic factor changes were put forward, including 1) strengthening the analysis of the mechanisms by which climatic factor changes affect lake trophic status, 2) identifying appropriate simulation models to generate scenarios under proper temporal and spatial scales and resolutions, and 3) integrating climatic factor change simulation, hydrodynamic models, ecological simulation, and intelligent algorithms into a general modeling system to achieve accurate prediction of lake eutrophication under climate change.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Churchfield, M. J.; Moriarty, P. J.; Hao, Y.
The focus of this work is the comparison of the dynamic wake meandering model and large-eddy simulation with field data from the Egmond aan Zee offshore wind plant composed of 36 3-MW turbines. The field data include meteorological mast measurements, SCADA information from all turbines, and strain-gauge data from two turbines. The dynamic wake meandering model and large-eddy simulation are means of computing unsteady wind plant aerodynamics, including the important unsteady meandering of wakes as they convect downstream and interact with other turbines and wakes. Both of these models are coupled to a turbine model such that power and mechanical loads of each turbine in the wind plant are computed. We are interested in how accurately different types of waking (e.g., direct versus partial waking) can be modeled, and how the background turbulence level affects these loads. We show that both the dynamic wake meandering model and large-eddy simulation appear to underpredict power and overpredict fatigue loads because of wake effects, but it is unclear that they are really in error. This discrepancy may be caused by wind-direction uncertainty in the field data, which tends to make wake effects appear less pronounced.
NASA Astrophysics Data System (ADS)
Shypailo, R. J.; Ellis, K. J.
2011-05-01
During construction of the whole body counter (WBC) at the Children's Nutrition Research Center (CNRC), efficiency calibration was needed to translate acquired counts of 40K to actual grams of potassium for measurement of total body potassium (TBK) in a diverse subject population. The MCNP Monte Carlo n-particle simulation program was used to describe the WBC (54 detectors plus shielding), test individual detector counting response, and create a series of virtual anthropomorphic phantoms based on national reference anthropometric data. Each phantom included an outer layer of adipose tissue and an inner core of lean tissue. Phantoms were designed for both genders representing ages 3.5 to 18.5 years with body sizes from the 5th to the 95th percentile based on body weight. In addition, a spherical surface source surrounding the WBC was modeled in order to measure the effects of subject mass on room background interference. Individual detector measurements showed good agreement with the MCNP model. The background source model came close to agreement with empirical measurements, but showed a trend deviating from unity with increasing subject size. Results from the MCNP simulation of the CNRC WBC agreed well with empirical measurements using BOMAB phantoms. Individual detector efficiency corrections were used to improve the accuracy of the model. Nonlinear multiple regression efficiency calibration equations were derived for each gender. Room background correction is critical in improving the accuracy of the WBC calibration.
Solernou, Albert
2018-01-01
Fluctuating Finite Element Analysis (FFEA) is a software package designed to perform continuum mechanics simulations of proteins and other globular macromolecules. It combines conventional finite element methods with stochastic thermal noise, and is appropriate for simulations of large proteins and protein complexes at the mesoscale (length-scales in the range of 5 nm to 1 μm), where there is currently a paucity of modelling tools. It requires 3D volumetric information as input, which can be low-resolution structural information such as cryo-electron tomography (cryo-ET) maps or much higher resolution atomistic co-ordinates from which volumetric information can be extracted. In this article we introduce our open source software package for performing FFEA simulations which we have released under a GPLv3 license. The software package includes a C++ implementation of FFEA, together with tools to assist the user to set up the system from Electron Microscopy Data Bank (EMDB) or Protein Data Bank (PDB) data files. We also provide a PyMOL plugin to perform basic visualisation and additional Python tools for the analysis of FFEA simulation trajectories. This manuscript provides a basic background to the FFEA method, describing the implementation of the core mechanical model and how intermolecular interactions and the solvent environment are included within this framework. We provide prospective FFEA users with a practical overview of how to set up an FFEA simulation with reference to our publicly available online tutorials and manuals that accompany this first release of the package. PMID:29570700
NASA Astrophysics Data System (ADS)
Chen, M.; Lemon, C.; Walterscheid, R. L.; Hecht, J. H.; Sazykin, S. Y.; Wolf, R.
2017-12-01
We investigate how neutral winds and particle precipitation affect the simulated development of electric fields, including Sub-Auroral Polarization Streams (SAPS), during the 17 March 2013 storm. Our approach is to use the magnetically and electrically self-consistent Rice Convection Model - Equilibrium (RCM-E) to simulate the inner magnetospheric electric field. We use parameterized rates of whistler-generated electron pitch-angle scattering from Orlova and Shprits [JGR, 2014] that depend on equatorial radial distance, magnetic activity (Kp), and magnetic local time (MLT) outside the simulated plasmasphere. Inside the plasmasphere, parameterized scattering rates due to hiss [Orlova et al., GRL, 2014] are used. Ions are scattered at a fraction of strong pitch-angle scattering, where the fraction is scaled by epsilon, the ratio of the gyroradius to the field-line radius of curvature, when epsilon is greater than 0.1. The electron and proton contributions to the auroral conductance in the RCM-E are calculated using the empirical Robinson et al. [JGR, 1987] and Galand and Richmond [JGR, 2001] equations, respectively. The "background" ionospheric conductance is based on parameters from the International Reference Ionosphere [Bilitza and Reinisch, JASR, 2008] but modified to include the effect of specified ionospheric troughs. Neutral winds are modeled by the empirical Horizontal Wind Model (HWM07) in the RCM-E. We compare simulated precipitating particle energy fluxes and E×B velocities with DMSP observations during the 17 March 2013 storm, with and without the inclusion of neutral winds. Discrepancies between the simulations and observations will aid us in assessing needed improvements to the model.
Adaptive Core Simulation Employing Discrete Inverse Theory - Part II: Numerical Experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abdel-Khalik, Hany S.; Turinsky, Paul J.
2005-07-15
Use of adaptive simulation is intended to improve the fidelity and robustness of important core attribute predictions such as core power distribution, thermal margins, and core reactivity. Adaptive simulation utilizes a selected set of past and current reactor measurements of reactor observables, i.e., in-core instrumentation readings, to adapt the simulation in a meaningful way. The companion paper, ''Adaptive Core Simulation Employing Discrete Inverse Theory - Part I: Theory,'' describes in detail the theoretical background of the proposed adaptive techniques. This paper, Part II, demonstrates several computational experiments conducted to assess the fidelity and robustness of the proposed techniques. The intent is to check the ability of the adapted core simulator model to predict future core observables that are not included in the adaption, or core observables that are recorded at core conditions that differ from those at which adaption is completed. Also, this paper demonstrates successful utilization of an efficient sensitivity analysis approach to calculate the sensitivity information required to perform the adaption for millions of input core parameters. Finally, this paper illustrates a useful application for adaptive simulation - reducing the inconsistencies between two different core simulator code systems, where the multitudes of input data to one code are adjusted to enhance the agreement between both codes for important core attributes, i.e., core reactivity and power distribution. Also demonstrated is the robustness of such an application.
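The adaption step can be sketched as a regularized least-squares update driven by a sensitivity matrix: with far more input parameters than observables, regularization selects a well-behaved adjustment. The dimensions, random matrices, and Tikhonov weight below are illustrative assumptions, not the paper's actual inverse-theory formulation.

```python
import numpy as np

rng = np.random.default_rng(3)
n_obs, n_par = 12, 40                   # few observables, many parameters (illustrative)
J = rng.normal(size=(n_obs, n_par))     # sensitivity matrix dS/dp
r = rng.normal(size=n_obs)              # measured-minus-predicted residual

# Tikhonov-regularized update: minimize ||J dp - r||^2 + alpha ||dp||^2,
# since the under-determined system (n_par > n_obs) has no unique solution.
alpha = 0.1                             # regularization weight (assumed)
dp = np.linalg.solve(J.T @ J + alpha * np.eye(n_par), J.T @ r)

residual_before = np.linalg.norm(r)
residual_after = np.linalg.norm(J @ dp - r)
```

Applying the update dp to the simulator inputs reduces the mismatch with the measured observables while keeping the parameter adjustment small.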
Gao, Lin; Zhang, Tongsheng; Wang, Jue; Stephen, Julia
2014-01-01
When connectivity analysis is carried out for event related EEG and MEG, the presence of strong spatial correlations from spontaneous activity in background may mask the local neuronal evoked activity and lead to spurious connections. In this paper, we hypothesized PCA decomposition could be used to diminish the background activity and further improve the performance of connectivity analysis in event related experiments. The idea was tested using simulation, where we found that for the 306-channel Elekta Neuromag system, the first 4 PCs represent the dominant background activity, and the source connectivity pattern after preprocessing is consistent with the true connectivity pattern designed in the simulation. Improving signal to noise of the evoked responses by discarding the first few PCs demonstrates increased coherences at major physiological frequency bands when removing the first few PCs. Furthermore, the evoked information was maintained after PCA preprocessing. In conclusion, it is demonstrated that the first few PCs represent background activity, and PCA decomposition can be employed to remove it to expose the evoked activity for the channels under investigation. Therefore, PCA can be applied as a preprocessing approach to improve neuronal connectivity analysis for event related data. PMID:22918837
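The preprocessing step described above can be sketched with an SVD: estimate the leading spatial components of the sensor array, project them out, and keep the residual as the cleaned evoked data. The channel count, background component, and evoked signals below are synthetic, not the 306-channel Elekta data.

```python
import numpy as np

def remove_leading_pcs(data, n_remove):
    """Remove the projection onto the first n_remove spatial PCs.

    data: (n_channels, n_times) sensor array.
    """
    data = data - data.mean(axis=1, keepdims=True)   # mean-correct per channel
    U, _, _ = np.linalg.svd(data, full_matrices=False)
    U_bg = U[:, :n_remove]                           # leading spatial components
    return data - U_bg @ (U_bg.T @ data)             # residual after projection

# Synthetic array: one strong background component common to all channels
# plus weak channel-specific evoked activity.
rng = np.random.default_rng(4)
n_ch, n_t = 32, 1000
background = 10.0 * np.outer(rng.normal(size=n_ch), np.sin(np.linspace(0, 20, n_t)))
evoked = rng.normal(scale=0.5, size=(n_ch, n_t))
data = background + evoked

cleaned = remove_leading_pcs(data, n_remove=1)
```

Because the background here is rank-one and much stronger than the evoked activity, removing a single PC suppresses it almost entirely while leaving the evoked variance nearly intact.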
Further Improvement of the RITS Code for Pulsed Neutron Bragg-edge Transmission Imaging
NASA Astrophysics Data System (ADS)
Sato, H.; Watanabe, K.; Kiyokawa, K.; Kiyanagi, R.; Hara, K. Y.; Kamiyama, T.; Furusaka, M.; Shinohara, T.; Kiyanagi, Y.
The RITS code is a unique and powerful tool for whole-spectrum fitting analysis of Bragg-edge transmission data. However, it has had two major problems, and we have proposed methods to overcome them. The first issue is the difference in the crystallite size values between the diffraction and the Bragg-edge analyses. We found that the reason was a different definition of the crystal structure factor. This affects the crystallite size because the crystallite size is deduced from the primary extinction effect, which depends on the crystal structure factor. After the algorithm change, the crystallite sizes obtained by RITS agreed much more closely with those obtained by Rietveld analyses of diffraction data, improving from 155% to 110% of the Rietveld values. The second issue is correction of the effect of background neutrons scattered from a specimen. Through neutron transport simulation studies, we found that the background components consist of forward Bragg scattering, double backward Bragg scattering, and thermal diffuse scattering. RITS with the background correction function developed through these simulation studies could reconstruct various simulated and experimental transmission spectra well, but the refined crystalline microstructural parameters were often distorted. Finally, it was recommended to reduce the background by improving the experimental conditions.
Wong, Lai Fun; Chan, Sally Wai-Chi; Ho, Jasmine Tze Yin; Mordiffi, Siti Zubaidah; Ang, Sophia Bee Leng; Goh, Poh Sun; Ang, Emily Neo Kim
2015-01-01
Background Web-based learning is becoming an increasingly important instructional tool in nursing education. Multimedia advancements offer the potential for creating authentic nursing activities for developing nursing competency in clinical practice. Objective This study aims to describe the design, development, and evaluation of an interactive multimedia Web-based simulation for developing nurses’ competencies in acute nursing care. Methods Authentic nursing activities were developed in a Web-based simulation using a variety of instructional strategies including animation video, multimedia instructional material, virtual patients, and online quizzes. A randomized controlled study was conducted on 67 registered nurses who were recruited from the general ward units of an acute care tertiary hospital. Following a baseline evaluation of all participants’ clinical performance in a simulated clinical setting, the experimental group received 3 hours of Web-based simulation and completed a survey to evaluate their perceptions of the program. All participants were re-tested for their clinical performances using a validated tool. Results The clinical performance posttest scores of the experimental group improved significantly (P<.001) from the pretest scores after the Web-based simulation. In addition, compared to the control group, the experimental group had significantly higher clinical performance posttest scores (P<.001) after controlling the pretest scores. The participants from the experimental group were satisfied with their learning experience and gave positive ratings for the quality of the Web-based simulation. Themes emerging from the comments about the most valuable aspects of the Web-based simulation include relevance to practice, instructional strategies, and fostering problem solving. Conclusions Engaging in authentic nursing activities using interactive multimedia Web-based simulation can enhance nurses’ competencies in acute care. 
Web-based simulations provide a promising educational tool in institutions where large groups of nurses need to be trained in acute nursing care and accessibility to repetitive training is essential for achieving long-term retention of clinical competency. PMID:25583029
Automatic insertion of simulated microcalcification clusters in a software breast phantom
NASA Astrophysics Data System (ADS)
Shankla, Varsha; Pokrajac, David D.; Weinstein, Susan P.; DeLeo, Michael; Tuite, Catherine; Roth, Robyn; Conant, Emily F.; Maidment, Andrew D.; Bakic, Predrag R.
2014-03-01
An automated method has been developed to insert realistic clusters of simulated microcalcifications (MCs) into computer models of breast anatomy. This algorithm has been developed as part of a virtual clinical trial (VCT) software pipeline, which includes the simulation of breast anatomy, mechanical compression, image acquisition, image processing, display and interpretation. An automated insertion method has value in VCTs involving large numbers of images. The insertion method was designed to support various insertion placement strategies, governed by probability distribution functions (pdf). The pdf can be predicated on histological or biological models of tumor growth, or estimated from the locations of actual calcification clusters. To validate the automated insertion method, a 2-AFC observer study was designed to compare two placement strategies, undirected and directed. The undirected strategy could place a MC cluster anywhere within the phantom volume. The directed strategy placed MC clusters within fibroglandular tissue on the assumption that calcifications originate from epithelial breast tissue. Three radiologists were asked to select between two simulated phantom images, one from each placement strategy. Furthermore, questions were posed to probe the rationale behind the observer's selection. The radiologists found the resulting cluster placement to be realistic in 92% of cases, validating the automated insertion method. There was a significant preference for the cluster to be positioned on a background of adipose or mixed adipose/fibroglandular tissues. Based upon these results, this automated lesion placement method will be included in our VCT simulation pipeline.
NASA Technical Reports Server (NTRS)
Rorie, Conrad; Monk, Kevin; Roberts, Zach; Brandt, Summer
2018-01-01
This presentation provides an overview of the primary results from the Unmanned Aircraft Systems (UAS) Integration in the National Airspace System (NAS) Project's second Terminal Operations human-in-the-loop simulation. This talk covers the background of this follow-on experiment, which includes an overview of the first Terminal Operations HITL performed by the project. The primary results include a look at the number and durations of detect and avoid (DAA) alerts issued by the two DAA systems under test. It also includes response time metrics and metrics on the ability of the pilot-in-command (PIC) to maintain sufficient separation. Additional interoperability metrics are included to illustrate how pilots interact with the tower controller. Implications and conclusions are covered at the end.
NASA Astrophysics Data System (ADS)
Roy, N.; Molson, J.; Lemieux, J.-M.; Van Stempvoort, D.; Nowamooz, A.
2016-07-01
Three-dimensional numerical simulations are used to provide insight into the behavior of methane as it migrates from a leaky decommissioned hydrocarbon well into a shallow aquifer. The conceptual model includes gas-phase migration from a leaky well, dissolution into groundwater, advective-dispersive transport and biodegradation of the dissolved methane plume. Gas-phase migration is simulated using the DuMux multiphase simulator, while transport and fate of the dissolved phase is simulated using the BIONAPL/3D reactive transport model. Methane behavior is simulated for two conceptual models: first in a shallow confined aquifer containing a decommissioned leaky well based on a monitored field site near Lindbergh, Alberta, Canada, and secondly on a representative unconfined aquifer based loosely on the Borden, Ontario, field site. The simulations show that the Lindbergh site confined aquifer data are generally consistent with a 2 year methane leak of 2-20 m3/d, assuming anaerobic (sulfate-reducing) methane oxidation and with maximum oxidation rates of 1 × 10-5 to 1 × 10-3 kg/m3/d. Under the highest oxidation rate, dissolved methane decreased from solubility (110 mg/L) to the threshold concentration of 10 mg/L within 5 years. In the unconfined case with the same leakage rate, including both aerobic and anaerobic methane oxidation, the methane plume was less extensive compared to the confined aquifer scenarios. Unconfined aquifers may therefore be less vulnerable to impacts from methane leaks along decommissioned wells. At other potential leakage sites, site-specific data on the natural background geochemistry would be necessary to make reliable predictions on the fate of methane in groundwater.
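The interplay of advection, dispersion, and oxidation described above can be illustrated with the steady-state one-dimensional advection-dispersion-decay solution. All parameter values below are hypothetical round numbers, not the site-calibrated values from the study; only the 110 mg/L solubility and 10 mg/L threshold come from the abstract.

```python
import numpy as np

v = 0.1     # groundwater velocity, m/d (assumed)
D = 0.5     # longitudinal dispersion coefficient, m^2/d (assumed)
lam = 1e-2  # first-order methane oxidation rate, 1/d (assumed)
C0 = 110.0  # dissolved methane at the source, mg/L (solubility, per the abstract)

# Steady-state 1D advection-dispersion-decay gives C(x) = C0 * exp(k * x),
# with k the negative root of D k^2 - v k - lam = 0.
k = (v - np.sqrt(v ** 2 + 4 * D * lam)) / (2 * D)

# Distance at which the plume falls to the 10 mg/L threshold
x_threshold = np.log(10.0 / C0) / k
```

Raising the oxidation rate makes k more negative and shortens the plume, which is the qualitative behavior the simulations report when moving from the lowest to the highest oxidation rate.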
HEP Software Foundation Community White Paper Working Group - Detector Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Apostolakis, J.
A working group on detector simulation was formed as part of the high-energy physics (HEP) Software Foundation's initiative to prepare a Community White Paper that describes the main software challenges and opportunities to be faced in the HEP field over the next decade. The working group met over a period of several months in order to review the current status of the Full and Fast simulation applications of HEP experiments and the improvements that will need to be made in order to meet the goals of future HEP experimental programmes. The scope of the topics covered includes the main components of a HEP simulation application, such as MC truth handling, geometry modeling, particle propagation in materials and fields, physics modeling of the interactions of particles with matter, the treatment of pileup and other backgrounds, as well as signal processing and digitisation. The resulting work programme described in this document focuses on the need to improve both the software performance and the physics of detector simulation. The goals are to increase the accuracy of the physics models and expand their applicability to future physics programmes, while achieving large factors in computing performance gains consistent with projections on available computing resources.
NASA Astrophysics Data System (ADS)
Berendsen, Herman J. C.
2004-06-01
The simulation of physical systems requires a simplified, hierarchical approach which models each level from the atomistic to the macroscopic scale. From quantum mechanics to fluid dynamics, this book systematically treats the broad scope of computer modeling and simulations, describing the fundamental theory behind each level of approximation. Berendsen evaluates each stage in relation to its applications, giving the reader insight into the possibilities and limitations of the models. Practical guidance for applications and sample programs in Python are provided. With a strong emphasis on molecular models in chemistry and biochemistry, this book will be suitable for advanced undergraduate and graduate courses on molecular modeling and simulation within physics, biophysics, physical chemistry and materials science. It will also be a useful reference to all those working in the field. Additional resources for this title, including solutions for instructors and programs, are available online at www.cambridge.org/9780521835275. It is the first book to cover the wide range of modeling and simulations, from the atomistic to the macroscopic scale, in a systematic fashion. Providing a wealth of background material, it does not assume advanced knowledge and is eminently suitable for course use. It contains practical examples and sample programs in Python.
NASA Astrophysics Data System (ADS)
Inochkin, F. M.; Kruglov, S. K.; Bronshtein, I. G.; Kompan, T. A.; Kondratjev, S. V.; Korenev, A. S.; Pukhov, N. F.
2017-06-01
A new method for precise subpixel edge estimation is presented. The principle of the method is iterative image approximation in 2D with subpixel accuracy, matching the simulated and acquired images until an appropriate simulated image is found. A numerical image model is presented consisting of three parts: an edge model, an object and background brightness distribution model, and a lens aberration model including diffraction. The optimal values of the model parameters are determined by means of conjugate-gradient numerical optimization of a merit function corresponding to the L2 distance between the acquired and simulated images. A computationally efficient procedure for the merit function calculation, along with a sufficient gradient approximation, is described. Subpixel-accuracy image simulation is performed in the Fourier domain with theoretically unlimited precision of edge point locations. The method is capable of compensating lens aberrations and obtaining edge information with increased resolution. Experimental verification of the method, using a digital micromirror device to physically simulate an object with known edge geometry, is shown. Experimental results for various high-temperature materials within the temperature range of 1000°C to 2400°C are presented.
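The fitting principle described in this abstract can be sketched in one dimension. The toy example below is a hypothetical analogue, not the authors' 2-D Fourier-domain implementation: it recovers a subpixel edge position by minimizing the L2 distance between simulated and acquired profiles, with a tanh blurred-step edge model assumed for illustration and a fine grid search standing in for the conjugate-gradient optimization.

```python
import numpy as np

def edge_model(x, pos, width=1.5, bg=0.2, fg=0.9):
    """Blurred step edge: background level rising to foreground level."""
    return bg + (fg - bg) * 0.5 * (1.0 + np.tanh((x - pos) / width))

x = np.arange(64, dtype=float)      # pixel grid
true_pos = 31.37                    # true subpixel edge location
rng = np.random.default_rng(0)
acquired = edge_model(x, true_pos) + rng.normal(0.0, 0.002, x.size)

# Minimize the L2 distance between simulated and acquired profiles over
# candidate subpixel edge positions (grid search for simplicity).
candidates = np.arange(25.0, 38.0, 0.01)
errors = [np.sum((edge_model(x, p) - acquired) ** 2) for p in candidates]
estimate = candidates[int(np.argmin(errors))]
print(round(estimate, 2))
```

A pixel-level readout would locate this edge only to about half a pixel; the model fit recovers it to well below a tenth of a pixel, which is the essential gain the method exploits.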
NASA Astrophysics Data System (ADS)
Le, Manh; Ngirmang, Gregory; Orban, Chris; Morrison, John; Chowdhury, Enam; Roquemore, William
2017-10-01
We present two-dimensional particle-in-cell (PIC) simulations that investigate the role of background pressure on the acceleration of electrons from ultra-intense laser interaction at normal incidence with liquid-density ethylene glycol targets. The interaction was simulated at ten different pressures varying from 7.8 mTorr to 26 Torr. We calculated conversion efficiencies from the simulation results and plotted the efficiencies with respect to the background pressure. The results revealed that the laser to >100 keV electron conversion efficiency remained flat around 0.35% from 7.8 mTorr to 1.2 Torr and increased exponentially from 1.2 Torr onward to about 1.47% at 26 Torr. Increasing the background pressure clearly has a dramatic effect on the acceleration of electrons from the target. We explain how electrostatic effects, in particular the neutralization of the target by the background plasma, allow electrons to escape more easily, and how this effect is strengthened at higher densities. This work could facilitate the design of future experiments aimed at increasing laser-to-electron conversion efficiency and generating substantial bursts of electrons with relativistic energies. This research is supported by the Air Force Office of Scientific Research under LRIR Project 17RQCOR504 under the management of Dr. Riq Parra and Dr. Jean-Luc Cambier. Support was also provided by the DOD HPCMP Internship Program.
Flute Instability of Expanding Plasma Cloud
NASA Astrophysics Data System (ADS)
Dudnikova, Galina; Vshivkov, Vitali
2000-10-01
The expansion of plasma against a magnetized background where collisions play no role is a situation common to many plasma phenomena. The character of the interaction between the expanding plasma and the background plasma depends on the ratio of the expansion velocity to the ambient Alfven velocity. If the expansion speed is greater than the background Alfven speed (super-Alfvenic flow), collisionless shock waves are formed in the background plasma. It was originally thought that if the expansion speed is less than the Alfven speed (sub-Alfvenic flow), the interaction of the plasma flows would be laminar in nature. However, the results of laboratory experiments and chemical releases in the magnetosphere have shown the development of a flute instability on the boundary of the expanding plasma (Rayleigh-Taylor instability). Many theoretical and experimental papers have been devoted to the study of the Large Larmor Flute Instability (LLFI) of plasma expanding into a vacuum magnetic field. In the present paper, on the basis of computer simulations of plasma cloud expansion in a magnetized background plasma, the regimes of development and stabilization of the LLFI for super- and sub-Alfvenic plasma flows are investigated. The 2D hybrid numerical model is based on the kinetic Vlasov equation for ions and a hydrodynamic approximation for electrons. The similarity parameters characterizing the regimes of laminar flow are found. Stabilization of the LLFI takes place with the transition from sub- to super-Alfvenic plasma cloud expansion. The results of the comparison between computer simulation and laboratory simulation are described.
Measurement error in time-series analysis: a simulation study comparing modelled and monitored data.
Butland, Barbara K; Armstrong, Ben; Atkinson, Richard W; Wilkinson, Paul; Heal, Mathew R; Doherty, Ruth M; Vieno, Massimo
2013-11-13
Assessing health effects from background exposure to air pollution is often hampered by the sparseness of pollution monitoring networks. However, regional atmospheric chemistry-transport models (CTMs) can provide pollution data with national coverage at fine geographical and temporal resolution. We used statistical simulation to compare the impact on epidemiological time-series analysis of additive measurement error in sparse monitor data as opposed to geographically and temporally complete model data. Statistical simulations were based on a theoretical area of 4 regions each consisting of twenty-five 5 km × 5 km grid-squares. In the context of a 3-year Poisson regression time-series analysis of the association between mortality and a single pollutant, we compared the error impact of using daily grid-specific model data as opposed to daily regional average monitor data. We investigated how this comparison was affected if we changed the number of grids per region containing a monitor. To inform simulations, estimates (e.g. of pollutant means) were obtained from observed monitor data for 2003-2006 for national network sites across the UK and corresponding model data that were generated by the EMEP-WRF CTM. Average within-site correlations between observed monitor and model data were 0.73 and 0.76 for rural and urban daily maximum 8-hour ozone respectively, and 0.67 and 0.61 for rural and urban ln(daily 1-hour maximum NO2). When regional averages were based on 5 or 10 monitors per region, health effect estimates exhibited little bias. However, with only 1 monitor per region, the regression coefficient in our time-series analysis was attenuated by an estimated 6% for urban background ozone, 13% for rural ozone, 29% for urban background ln(NO2) and 38% for rural ln(NO2). For grid-specific model data the corresponding figures were 19%, 22%, 54% and 44% respectively, i.e. similar for rural ln(NO2) but more marked for urban ln(NO2).
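The attenuation this study quantifies (additive classical error biasing a regression coefficient toward zero) can be reproduced with a small simulation. The sketch below is purely illustrative and not the paper's EMEP-WRF-based setup: it fits a minimal Poisson log-linear model by Newton-Raphson to a synthetic 3-year daily series, first with the true exposure and then with an error-prone proxy; all parameter values are assumptions chosen for the demonstration.

```python
import numpy as np

def poisson_fit(X, y, iters=25):
    """Minimal Poisson regression (log link) fitted by Newton-Raphson."""
    beta = np.zeros(X.shape[1])
    beta[0] = np.log(y.mean())      # start the intercept near the data scale
    for _ in range(iters):
        mu = np.exp(X @ beta)
        # Newton step: (X' diag(mu) X)^{-1} X' (y - mu)
        beta = beta + np.linalg.solve(X.T @ (mu[:, None] * X), X.T @ (y - mu))
    return beta

rng = np.random.default_rng(1)
n = 3 * 365                                # a 3-year daily time series
z = rng.gamma(4.0, 5.0, n)                 # "true" daily pollutant level
y = rng.poisson(np.exp(3.0 + 0.01 * z))    # daily deaths; true slope 0.01

b_true = poisson_fit(np.column_stack([np.ones(n), z]), y)

# Error-prone proxy: additive classical error, as from a sparse monitor.
x_err = z + rng.normal(0.0, 10.0, n)
b_err = poisson_fit(np.column_stack([np.ones(n), x_err]), y)

print(b_true[1], b_err[1])    # the proxy-based slope is attenuated
```

With these illustrative parameters the error variance equals the exposure variance, so classical measurement-error theory predicts roughly 50% attenuation of the slope, mirroring the bias mechanism the abstract describes.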
Even if correlations between model and monitor data appear reasonably strong, additive classical measurement error in model data may lead to appreciable bias in health effect estimates. As process-based air pollution models become more widely used in epidemiological time-series analysis, assessments of error impact that include statistical simulation may be useful.
NASA Technical Reports Server (NTRS)
Stern, Boris E.; Svensson, Roland; Begelman, Mitchell C.; Sikora, Marek
1995-01-01
High-energy radiation processes in compact cosmic objects are often expected to have a strongly non-linear behavior. Such behavior is shown, for example, by electron-positron pair cascades and the time evolution of relativistic proton distributions in dense radiation fields. Three independent techniques have been developed to simulate these non-linear problems: the kinetic equation approach; the phase-space density (PSD) Monte Carlo method; and the large-particle (LP) Monte Carlo method. In this paper, we present the latest version of the LP method and compare it with the other methods. The efficiency of the method in treating geometrically complex problems is illustrated by showing results of simulations of 1D, 2D and 3D systems. The method is shown to be powerful enough to treat non-spherical geometries, including such effects as bulk motion of the background plasma, reflection of radiation from cold matter, and anisotropic distributions of radiating particles. It can therefore be applied to simulate high-energy processes in such astrophysical systems as accretion discs with coronae, relativistic jets, pulsar magnetospheres and gamma-ray bursts.
A stochastic Markov chain approach for tennis: Monte Carlo simulation and modeling
NASA Astrophysics Data System (ADS)
Aslam, Kamran
This dissertation describes the computational formulation of probability density functions (pdfs) that facilitate head-to-head match simulations in tennis, along with ranking systems developed from their use. A background on the statistical method used to develop the pdfs, the Monte Carlo method, and the resulting rankings are included, along with a discussion of ranking methods currently being used both in professional sports and in other applications. Using an analytical theory developed by Newton and Keller in [34] that defines a tennis player's probability of winning a game, set, match and single elimination tournament, a computational simulation has been developed in Matlab that allows further modeling not previously possible with the analytical theory alone. Such experimentation consists of the exploration of non-iid effects, considers the varying importance of points in a match, and allows an unlimited number of matches to be simulated between unlikely opponents. The results of these studies have provided pdfs that accurately model an individual tennis player's ability, along with a realistic, fair and mathematically sound platform for ranking them.
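The Newton-Keller game-winning probability referenced above has a closed form that a point-by-point Monte Carlo simulation should reproduce in the iid case. The sketch below is an illustrative Python reconstruction, not the dissertation's Matlab code, for a player who wins each point independently with probability p = 0.55:

```python
import numpy as np

def p_game_analytic(p):
    """Closed form for P(win a game) given P(win a point) = p (iid points)."""
    q = 1.0 - p
    # Win to 0, 15, or 30, plus reaching deuce and winning from deuce.
    return p**4 * (1 + 4*q + 10*q**2) + 20 * p**3 * q**3 * p**2 / (1 - 2*p*q)

def p_game_monte_carlo(p, trials=100_000, seed=2):
    """Estimate the same probability by simulating games point-by-point."""
    rng = np.random.default_rng(seed)
    wins = 0
    for _ in range(trials):
        a = b = 0
        while True:
            if rng.random() < p:
                a += 1
            else:
                b += 1
            if a >= 4 and a - b >= 2:   # first to 4, win by 2 (deuce rule)
                wins += 1
                break
            if b >= 4 and b - a >= 2:
                break
    return wins / trials

p = 0.55
exact = p_game_analytic(p)        # about 0.62
approx = p_game_monte_carlo(p)
print(round(exact, 4), round(approx, 4))
```

Agreement to within Monte Carlo error is the iid baseline; the dissertation's contribution lies precisely in the departures from it, such as non-iid point effects and the varying importance of points.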
Best, Virginia; Keidser, Gitte; Buchholz, Jörg M; Freeston, Katrina
2015-01-01
There is increasing demand in the hearing research community for the creation of laboratory environments that better simulate challenging real-world listening environments. The hope is that the use of such environments for testing will lead to more meaningful assessments of listening ability, and better predictions about the performance of hearing devices. Here we present one approach for simulating a complex acoustic environment in the laboratory, and investigate the effect of transplanting a speech test into such an environment. Speech reception thresholds were measured in a simulated reverberant cafeteria, and in a more typical anechoic laboratory environment containing background speech babble. The participants were 46 listeners varying in age and hearing levels, including 25 hearing-aid wearers who were tested with and without their hearing aids. Reliable SRTs were obtained in the complex environment, but led to different estimates of performance and hearing-aid benefit from those measured in the standard environment. The findings provide a starting point for future efforts to increase the real-world relevance of laboratory-based speech tests.
Best, Virginia; Keidser, Gitte; Buchholz, Jörg M.; Freeston, Katrina
2016-01-01
Objective There is increasing demand in the hearing research community for the creation of laboratory environments that better simulate challenging real-world listening environments. The hope is that the use of such environments for testing will lead to more meaningful assessments of listening ability, and better predictions about the performance of hearing devices. Here we present one approach for simulating a complex acoustic environment in the laboratory, and investigate the effect of transplanting a speech test into such an environment. Design Speech reception thresholds were measured in a simulated reverberant cafeteria, and in a more typical anechoic laboratory environment containing background speech babble. Study Sample The participants were 46 listeners varying in age and hearing levels, including 25 hearing-aid wearers who were tested with and without their hearing aids. Results Reliable SRTs were obtained in the complex environment, but led to different estimates of performance and hearing aid benefit from those measured in the standard environment. Conclusions The findings provide a starting point for future efforts to increase the real-world relevance of laboratory-based speech tests. PMID:25853616
HI and Low Metal Ions at the Intersection of Galaxies and the CGM
NASA Astrophysics Data System (ADS)
Oppenheimer, Benjamin
2017-08-01
Over 1000 COS orbits have revealed a surprisingly complex picture of circumgalactic gas flows surrounding the diversity of galaxies in the evolved Universe. Cosmological hydrodynamic simulations have only begun to confront the vast amount of galaxy formation physics, chemistry, and dynamics revealed in the multi-ion CGM datasets. We propose the next generation of EAGLE zoom simulations, called EAGLE Cosmic Origins, to model HI and low metal ions (C II, Mg II, & Si II) throughout not just the CGM but also within the galaxies themselves. We will employ a novel chemistry solver, CHIMES, to follow the time-dependent ionization, chemistry, and cooling of 157 ionic and molecular species, and include multiple ionization sources from the extra-galactic background, episodic AGN, and star formation. Our aim is to understand the complete baryon cycle of inflows, outflows, and gas recycling, traced over 10 decades of HI column densities, as well as the complex kinematic information encoded in low-ion absorption spectroscopy. This simulation project represents a pilot program for a larger suite of zoom simulations, which will be publicly released and lead to additional publications.
Attitudes and Perception of Baccalaureate Nursing Students toward Educational Simulation
ERIC Educational Resources Information Center
Gharaibeh, Besher; Hweidi, Issa; Al-Smadi, Ahmed
2017-01-01
Background: Simulation can produce highly qualified professionals, however, it can also be perceived as stressful and frustrating by the nursing students. Purposes: This study was to identify the attitudes and perceptions of Jordanian nursing students toward simulation as an educational strategy, to investigate whether certain students'…
Designing, Implementing and Evaluating Preclinical Simulation Lab for Maternity Nursing Course
ERIC Educational Resources Information Center
ALFozan, Haya; El Sayed, Yousria; Habib, Farida
2015-01-01
Background: The opportunity for students to deliver care safely in today's complex health care environment is limited. Simulation allows students to practice skills in a safe environment. Purpose: To assess the students' perception, satisfaction, and learning outcomes after a simulation-based maternity course. Method: A quasi-experimental design…
Applications of Low Density Flow Techniques and Catalytic Recombination at the Johnson Space Center
NASA Technical Reports Server (NTRS)
Scott, Carl D.
2000-01-01
The talk presents a brief background on definitions of catalysis and effects associated with chemically nonequilibrium and low-density flows of aerospace interest. Applications of catalytic recombination on surfaces in dissociated flow are given, including aero heating on reentry spacecraft thermal protection surfaces and the effect of reflected plume flow on pressure distributions associated with the space station. Examples include aero heating predictions for the X-38 test vehicle, the inlet of a proposed gas-sampling probe used in high enthalpy test facilities, and a parabolic body at angle of attack. The effect of accommodation coefficients on thruster-induced pressure distributions is also included. Examples of tools used include simple aero heating formulas based on boundary layer solutions, an engineering approximation that uses axisymmetric viscous shock layer flow to simulate full three-dimensional flow, full computational fluid dynamics, and direct simulation Monte Carlo calculations. Methods of determining catalytic recombination rates in arc jet flow are discussed. An area of catalysis not fully understood is the formation of single-wall carbon nanotubes (SWNT) with gas phase or nano-size metal particles. The Johnson Space Center is making SWNTs using both a laser ablation technique and an electric arc vaporization technique.
RF Models for Plasma-Surface Interactions in VSim
NASA Astrophysics Data System (ADS)
Jenkins, Thomas G.; Smithe, D. N.; Pankin, A. Y.; Roark, C. M.; Zhou, C. D.; Stoltz, P. H.; Kruger, S. E.
2014-10-01
An overview of ongoing enhancements to the Plasma Discharge (PD) module of Tech-X's VSim software tool is presented. A sub-grid kinetic sheath model, developed for the accurate computation of sheath potentials near metal and dielectric-coated walls, enables the physical effects of DC and RF sheath physics to be included in macroscopic-scale plasma simulations that need not explicitly resolve sheath scale lengths. Sheath potential evolution, together with particle behavior near the sheath, can thus be simulated in complex geometries. Generalizations of the model to include sputtering, secondary electron emission, and effects from multiple ion species and background magnetic fields are summarized; related numerical results are also presented. In addition, improved tools for plasma chemistry and IEDF/EEDF visualization and modeling are discussed, as well as our initial efforts toward the development of hybrid fluid/kinetic transition capabilities within VSim. Ultimately, we aim to establish VSimPD as a robust, efficient computational tool for modeling industrial plasma processes. Supported by US DoE SBIR-I/II Award DE-SC0009501.
simulation component models within EnergyPlus and OpenStudio. Prior to working at NREL, Anthony was a member of the building envelope team and developed attic and roof simulation tools. His background is in modeling heat
Incoherent pair generation in a beam-beam interaction simulation
NASA Astrophysics Data System (ADS)
Rimbault, C.; Bambade, P.; Mönig, K.; Schulte, D.
2006-03-01
This paper deals with two topics: the generation of incoherent pairs in two beam-beam simulation programs, GUINEA-PIG and CAIN, and the influence of the International Linear Collider (ILC) beam parameter choices on the background induced by direct hits in the micro vertex detector (VD). One of the processes involved in incoherent pair creation (IPC) is equivalent to a four-fermion interaction and its cross section can be calculated exactly with a dedicated generator, BDK. A comparison of GUINEA-PIG and CAIN results with BDK makes it possible to identify and quantify the uncertainties on IPC background predictions and to benchmark the GUINEA-PIG calculation. Based on this simulation and different VD designs, the five currently suggested ILC beam parameter sets have been compared with regard to the background induced in the VD by direct IPC hits. We emphasize that the high-luminosity set, as it is currently defined, would constrain both the choice of magnetic field and the VD inner layer radius.
Breier, R; Brudanin, V B; Loaiza, P; Piquemal, F; Povinec, P P; Rukhadze, E; Rukhadze, N; Štekl, I
2018-05-21
The main limitation in high-sensitivity HPGe gamma-ray spectrometry has been the detector background, even for detectors placed deep underground. Environmental radionuclides such as 40K and decay products in the 238U and 232Th chains have been identified as the most important radioactive contaminants of the construction parts of HPGe gamma-ray spectrometers. Monte Carlo simulations have shown that the massive inner and outer lead shields are the main contributors to the HPGe-detector background, followed by the aluminum cryostat, copper cold finger, detector holder and the lead ring with FET. The Monte Carlo simulated cosmic-ray background gamma-ray spectrum is about three orders of magnitude lower than the experimental spectrum measured in the Modane underground laboratory (4800 m w.e.), underlining the importance of using radiopure materials for the construction of ultra-low-level HPGe gamma-ray spectrometers.
NASA Astrophysics Data System (ADS)
Achim, Pascal; Generoso, Sylvia; Morin, Mireille; Gross, Philippe; Le Petit, Gilbert; Moulin, Christophe
2016-05-01
Monitoring atmospheric concentrations of radioxenons is relevant to provide evidence of atmospheric or underground nuclear weapon tests. However, when the design of the International Monitoring Network (IMS) of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) was set up, the impact of industrial releases was not perceived. It is now well known that the industrial radioxenon signature can interfere with that of nuclear tests. Therefore, there is a crucial need to characterize atmospheric distributions of radioxenons from industrial sources, the so-called atmospheric background, in the frame of the CTBT. Two years of Xe-133 atmospheric background have been simulated using 2013 and 2014 meteorological data together with the most comprehensive emission inventory of radiopharmaceutical facilities and nuclear power plants to date. Annual average simulated activity concentrations vary from 0.01 mBq/m3 up to above 5 mBq/m3 near major sources. Average measured and simulated concentrations agree at most of the IMS stations, which indicates that the main sources during the time frame are properly captured. The Xe-133 atmospheric background simulated at IMS stations turns out to be a complex combination of sources. The stations most impacted are in Europe and North America and can potentially detect Xe-133 every day. Predicted occurrences of detections of atmospheric Xe-133 show seasonal variations, more accentuated in the Northern Hemisphere, where the maximum occurs in winter. To our knowledge, this study presents the first global maps of the Xe-133 atmospheric background from industrial sources based on two years of simulation and is a first attempt to analyze its composition in terms of origin at IMS stations.
Validation of Aquarius Measurements Using Radiative Transfer Models at L-Band
NASA Technical Reports Server (NTRS)
Dinnat, E.; LeVine, David M.; Abraham, S.; DeMattheis, P.; Utku, C.
2012-01-01
Aquarius/SAC-D was launched in June 2011 by NASA and CONAE (the Argentine space agency). Aquarius includes three L-band (1.4 GHz) radiometers dedicated to measuring sea surface salinity. We report detailed comparisons of Aquarius measurements with radiative transfer model predictions. These comparisons were used as part of the initial assessment of Aquarius data. In particular, they were used successfully to estimate the radiometer calibration bias and stability. Further comparisons are being performed to assess the performance of models in the retrieval algorithm for correcting the effect of sources of geophysical "noise" (e.g. the galactic background, atmospheric attenuation and reflected signal from the Sun). Such corrections are critical in bringing the error in retrieved salinity down to the required 0.2 practical salinity unit (psu) on monthly global maps at 150 km by 150 km resolution. The forward models making up the Aquarius simulator have been very useful for preparatory studies in the years leading to Aquarius' launch. The simulator includes various components to compute the effects of the following processes on the measured signal: 1) emission from Earth surfaces (ocean, land, ice), 2) atmospheric emission and absorption, 3) emission from the Sun, Moon and celestial sky (directly through the antenna sidelobes or after reflection/scattering at the Earth surface), 4) Faraday rotation, and 5) convolution of the scene by the antenna gain patterns. Since the Aquarius radiometers' turn-on in late July 2011, the simulator has been used to perform a first-order validation of the data. This included checking the order of magnitude of the signal over ocean, land and ice surfaces, checking the relative amplitude of the signal at different polarizations, and checking the variation with incidence angle. The comparisons were also used to assess calibration bias and monitor instrument calibration drift. The simulator is also being used in the salinity retrieval.
For example, initial assessments of the salinity retrieved from Aquarius data showed degradation in accuracy at locations where glint from the galactic sky background was important. This was traced to an inaccurate correction for the sky glint. We present comparisons of the simulator predictions to the Aquarius data in order to assess the performance of the models of the various physical processes impacting the measurements, such as the effect of sea surface roughness and the impact of the celestial sky and the Sun emission scattered at the rough ocean surface. We discuss which components of the simulator appear reliable and which ones need improvements. Improved knowledge of the radiative transfer models at L-band will not only lead to better salinity retrieved from Aquarius data, it will also be beneficial for SMOS and the upcoming SMAP mission.
Modeling earthquake magnitudes from injection-induced seismicity on rough faults
NASA Astrophysics Data System (ADS)
Maurer, J.; Dunham, E. M.; Segall, P.
2017-12-01
It is an open question whether perturbations to the in-situ stress field due to fluid injection affect the magnitudes of induced earthquakes. It has been suggested that characteristics such as the total injected fluid volume control the size of induced events (e.g., Baisch et al., 2010; Shapiro et al., 2011). On the other hand, Van der Elst et al. (2016) argue that the size distribution of induced earthquakes follows Gutenberg-Richter, the same as tectonic events. Numerical simulations support the idea that ruptures nucleating inside regions with high shear-to-effective normal stress ratio may not propagate into regions with lower stress (Dieterich et al., 2015; Schmitt et al., 2015), however, these calculations are done on geometrically smooth faults. Fang & Dunham (2013) show that rupture length on geometrically rough faults is variable, but strongly dependent on background shear/effective normal stress. In this study, we use a 2-D elasto-dynamic rupture simulator that includes rough fault geometry and off-fault plasticity (Dunham et al., 2011) to simulate earthquake ruptures under realistic conditions. We consider aggregate results for faults with and without stress perturbations due to fluid injection. We model a uniform far-field background stress (with local perturbations around the fault due to geometry), superimpose a poroelastic stress field in the medium due to injection, and compute the effective stress on the fault as inputs to the rupture simulator. Preliminary results indicate that even minor stress perturbations on the fault due to injection can have a significant impact on the resulting distribution of rupture lengths, but individual results are highly dependent on the details of the local stress perturbations on the fault due to geometric roughness.
Monte Carlo studies of medium-size telescope designs for the Cherenkov Telescope Array
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wood, M. D.; Jogler, T.; Dumm, J.
2015-06-07
In this paper, we present studies for optimizing the next generation of ground-based imaging atmospheric Cherenkov telescopes (IACTs). Results focus on mid-sized telescopes (MSTs) for CTA, detecting very high energy gamma rays in the energy range from a few hundred GeV to a few tens of TeV. We describe a novel, flexible detector Monte Carlo package, FAST (FAst Simulation for imaging air Cherenkov Telescopes), that we use to simulate different array and telescope designs. The simulation is somewhat simplified to allow for efficient exploration over a large telescope design parameter space. We investigate a wide range of telescope performance parameters including optical resolution, camera pixel size, and light collection area. In order to ensure a comparison of the arrays at their maximum sensitivity, we analyze the simulations with the most sensitive techniques used in the field, such as maximum likelihood template reconstruction and boosted decision trees for background rejection. Choosing telescope design parameters representative of the proposed Davies–Cotton (DC) and Schwarzschild–Couder (SC) MST designs, we compare the performance of the arrays by examining the gamma-ray angular resolution and differential point-source sensitivity. We further investigate the array performance under a wide range of conditions, determining the impact of the number of telescopes, telescope separation, night sky background, and geomagnetic field. We find a 30–40% improvement in the gamma-ray angular resolution at all energies when comparing arrays with an equal number of SC and DC telescopes, significantly enhancing point-source sensitivity in the MST energy range. Finally, we attribute the increase in point-source sensitivity to the improved optical point-spread function and smaller pixel size of the SC telescope design.
Theory and simulations of current drive via injection of an electron beam in the ACT-1 device
DOE Office of Scientific and Technical Information (OSTI.GOV)
Okuda, H.; Horton, R.; Ono, M.
1985-02-01
One- and two-dimensional particle simulations of beam-plasma interaction have been carried out in order to understand current drive experiments that use an electron beam injected into the ACT-1 device. Typically, the beam velocity along the magnetic field is V = 10^9 cm/sec while the thermal velocity of the background electrons is v_t = 10^8 cm/sec. The ratio of the beam density to the background density is about 10%, so that a strong beam-plasma instability develops, causing rapid diffusion of beam particles. For both one- and two-dimensional simulations, it is found that a significant amount of beam and background electrons is accelerated considerably beyond the initial beam velocity when the beam density is more than a few percent of the background plasma density. In addition, the electron distribution along the magnetic field has a smooth negative slope, f'(v_parallel) < 0, for v_parallel > 0, extending to v_parallel ≈ 1.5V-2V, which is in sharp contrast to the predictions from quasilinear theory. An estimate of the mean free path for beam electrons due to Coulomb collisions reveals that the beam electrons can propagate a much longer distance than is predicted from quasilinear theory, due to the presence of a high-energy tail. These simulation results agree well with the experimental observations from the ACT-1 device.
Dense Regions in Supersonic Isothermal Turbulence
NASA Astrophysics Data System (ADS)
Robertson, Brant; Goldreich, Peter
2018-02-01
The properties of supersonic isothermal turbulence influence a variety of astrophysical phenomena, including the structure and evolution of star-forming clouds. This work presents a simple model for the structure of dense regions in turbulence in which the density distribution behind isothermal shocks originates from rough hydrostatic balance between the pressure gradient behind the shock and its deceleration from ram pressure applied by the background fluid. Using simulations of supersonic isothermal turbulence and idealized waves moving through a background medium, we show that the structural properties of dense, shocked regions broadly agree with our analytical model. Our work provides a new conceptual picture for describing the dense regions, which complements theoretical efforts to understand the bulk statistical properties of turbulence and attempts to model the more complex features of star-forming clouds like magnetic fields, self-gravity, or radiative properties.
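A minimal reconstruction of the balance argument, in our own notation (the paper's symbols may differ): with isothermal pressure P = ρc_s² and a deceleration a imposed on the shocked layer by the ram pressure of the background fluid, rough hydrostatic balance gives an exponential density profile behind the shock.

```latex
% Sketch of the balance argument (notation ours, not the paper's).
\begin{aligned}
  \frac{dP}{dx} &= -\rho\, a, \qquad P = \rho\, c_s^2 \\
  \Rightarrow\quad \rho(x) &= \rho_0 \exp\!\left(-\frac{a\,x}{c_s^2}\right),
\end{aligned}
```

where x is distance behind the shock front, ρ₀ is the post-shock density, and a is set by the ram pressure of the upstream gas decelerating the layer. The e-folding scale c_s²/a is the analogue of a pressure scale height.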
Integrated infrared detector arrays for low-background astronomy
NASA Technical Reports Server (NTRS)
Mccreight, C. R.
1979-01-01
Existing integrated infrared detector array technology is being evaluated under low-background conditions to determine its applicability in orbiting astronomical applications where extended integration times and photometric accuracy are of interest. Preliminary performance results of a 1 x 20 elements InSb CCD array under simulated astronomical conditions are presented. Using the findings of these tests, improved linear- and area-array technology will be developed for use in NASA programs such as the Shuttle Infrared Telescope Facility. For wavelengths less than 30 microns, extrinsic silicon and intrinsic arrays with CCD readout will be evaluated and improved as required, while multiplexed arrays of Ge:Ga for wavelengths in the range 30 to 120 microns will be developed as fundamental understanding of this material improves. Future efforts will include development of improved drive and readout circuitry, and consideration of alternate multiplexing schemes.
Backgrounds, radiation damage, and spacecraft orbits
NASA Astrophysics Data System (ADS)
Grant, Catherine E.; Miller, Eric D.; Bautz, Mark W.
2017-08-01
The scientific utility of any space-based observatory can be limited by the on-orbit charged particle background and the radiation-induced damage. All existing and proposed missions have had to make choices about orbit selection, trading off the radiation environment against other factors. We present simulations from ESA’s SPace ENVironment Information System (SPENVIS) of the radiation environment for spacecraft in a variety of orbits, from Low Earth Orbit (LEO) at multiple inclinations to High Earth Orbit (HEO) to Earth-Sun L2 orbit. We summarize how different orbits change the charged particle background and the radiation damage to the instrument. We also discuss the limitations of SPENVIS simulations, particularly outside the Earth’s trapped radiation and point to new resources attempting to address those limitations.
Efficient generation of image chips for training deep learning algorithms
NASA Astrophysics Data System (ADS)
Han, Sanghui; Fafard, Alex; Kerekes, John; Gartley, Michael; Ientilucci, Emmett; Savakis, Andreas; Law, Charles; Parhan, Jason; Turek, Matt; Fieldhouse, Keith; Rovito, Todd
2017-05-01
Training deep convolutional networks for satellite or aerial image analysis often requires a large amount of training data. For a more robust algorithm, training data need to have variations not only in the background and target, but also radiometric variations in the image such as shadowing, illumination changes, atmospheric conditions, and imaging platforms with different collection geometry. Data augmentation is a commonly used approach to generating additional training data. However, this approach is often insufficient in accounting for real-world changes in lighting, location, or viewpoint outside of the collection geometry. Alternatively, image simulation can be an efficient way to augment training data that incorporates all these variations, such as changing backgrounds, that may be encountered in real data. The Digital Imaging and Remote Sensing Image Generation (DIRSIG) model is a tool that produces synthetic imagery using a suite of physics-based radiation propagation modules. DIRSIG can simulate images taken from different sensors with variation in collection geometry, spectral response, solar elevation and angle, atmospheric models, target, and background. Simulation of Urban Mobility (SUMO) is a multi-modal traffic simulation tool that explicitly models vehicles moving through a given road network. The output of the SUMO model was incorporated into DIRSIG to generate scenes with moving vehicles. A similar approach, with slight modifications, was used with helicopters as targets. Using the combination of DIRSIG and SUMO, we quickly generated many small images with the target at the center and different backgrounds. The simulations generated images with vehicles and helicopters as targets, and corresponding images without targets. Using parallel computing, 120,000 training images were generated in about an hour. Some preliminary results show an improvement in the deep learning algorithm when real image training data are augmented with the simulated images, especially when obtaining sufficient real data was particularly challenging.
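The parallel chip-generation pattern can be sketched as below. `render_chip` is a hypothetical stand-in for a DIRSIG/SUMO rendering call (the real tools are driven through their own interfaces and scene files); only the parallel fan-out over many seeds is the point here.

```python
# Sketch of parallel training-chip generation. `render_chip` is a
# stand-in for a real rendering call; names and sizes are invented.
from multiprocessing import Pool

import numpy as np

CHIP_SIZE = 64  # chip width/height in pixels

def render_chip(seed):
    """Render one target-centered chip on a randomized background."""
    rng = np.random.default_rng(seed)
    chip = rng.normal(0.3, 0.05, (CHIP_SIZE, CHIP_SIZE))  # background clutter
    # Place a bright "target" at the chip center, as described in the text.
    c = CHIP_SIZE // 2
    chip[c - 4:c + 4, c - 4:c + 4] += 0.5
    return chip

if __name__ == "__main__":
    # Fan the independent renders out over worker processes.
    with Pool() as pool:
        chips = pool.map(render_chip, range(128))
    print(len(chips))
```

Because each chip depends only on its seed, the workload is embarrassingly parallel, which is what makes throughput on the order of 10^5 chips per hour plausible on a modest cluster.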
Educational aspects of molecular simulation
NASA Astrophysics Data System (ADS)
Allen, Michael P.
This article addresses some aspects of teaching simulation methods to undergraduates and graduate students. Simulation is increasingly a cross-disciplinary activity, which means that the students who need to learn about simulation methods may have widely differing backgrounds. Also, they may have a wide range of views on what constitutes an interesting application of simulation methods. Almost always, a successful simulation course includes an element of practical, hands-on activity: a balance always needs to be struck between treating the simulation software as a 'black box', and becoming bogged down in programming issues. With notebook computers becoming widely available, students often wish to take away the programs to run themselves, and access to raw computer power is not the limiting factor that it once was; on the other hand, the software should be portable and, if possible, free. Examples will be drawn from the author's experience in three different contexts. (1) An annual simulation summer school for graduate students, run by the UK CCP5 organization, in which practical sessions are combined with an intensive programme of lectures describing the methodology. (2) A molecular modelling module, given as part of a doctoral training centre in the Life Sciences at Warwick, for students who might not have a first degree in the physical sciences. (3) An undergraduate module in Physics at Warwick, also taken by students from other disciplines, teaching high performance computing, visualization, and scripting in the context of a physical application such as Monte Carlo simulation.
More than Meets the Eye--a Simulation of Natural Selection.
ERIC Educational Resources Information Center
Allen, J. A.; And Others
1987-01-01
Presents experiments that use wild birds as predators, pastry as prey, and colored stones as background to demonstrate natural selection. Describes the activity as a classroom exercise in simulating natural selection. (Author/CW)
NASA Astrophysics Data System (ADS)
Nair, U. S.; Keiser, K.; Wu, Y.; Maskey, M.; Berendes, D.; Glass, P.; Dhakal, A.; Christopher, S. A.
2012-12-01
The Alabama Forestry Commission (AFC) is responsible for wildfire control and prescribed burn management in the state of Alabama. Visibility and air quality degradation resulting from smoke are two pieces of information crucial for this activity. Currently the tools available to AFC are the dispersion index from the National Weather Service and surface smoke concentrations. The former provides broad guidance for prescribed burning activities but does not provide specific information regarding smoke transport, areas affected, or quantification of air quality and visibility degradation. While the NOAA operational air quality guidance includes surface smoke concentrations from existing fire events, it does not account for contributions from background aerosols, which are important for the southeastern region, including Alabama. Also lacking is the quantification of visibility. The University of Alabama in Huntsville has developed a state-of-the-art integrated modeling system to address these concerns. This system is based on the Community Multiscale Air Quality (CMAQ) modeling system; it ingests satellite-derived smoke emissions and assimilates NASA MODIS-derived aerosol optical thickness. In addition, this operational modeling system simulates the impact of potential prescribed burn events based on location information derived from the AFC prescribed burn permit database. A Lagrangian model is used to simulate smoke plumes for the prescribed burn requests. The combined air quality and visibility degradation resulting from these smoke plumes and background aerosols is computed, and the information is made available through a web-based decision support system built from open-source GIS components. This system provides information regarding intersections between highways and other critical facilities such as old-age homes, hospitals, and schools. The system also includes satellite-detected fire locations and other satellite-derived datasets relevant to fire and smoke management.
Verification of Loop Diagnostics
NASA Technical Reports Server (NTRS)
Winebarger, A.; Lionello, R.; Mok, Y.; Linker, J.; Mikic, Z.
2014-01-01
Many different techniques have been used to characterize the plasma in the solar corona: density-sensitive spectral line ratios are used to infer the density, the evolution of coronal structures in different passbands is used to infer the temperature evolution, and the simultaneous intensities measured in multiple passbands are used to determine the emission measure. All these analysis techniques assume that the intensity of the structures can be isolated through background subtraction. In this paper, we use simulated observations from a 3D hydrodynamic simulation of a coronal active region to verify these diagnostics. The density and temperature from the simulation are used to generate images in several passbands and spectral lines. We identify loop structures in the simulated images and calculate the loop background. We then determine the density, temperature and emission measure distribution as a function of time from the observations and compare with the true temperature and density of the loop. We find that the overall characteristics of the temperature, density, and emission measure are recovered by the analysis methods, but the details of the true temperature and density are not. For instance, the emission measure curves calculated from the simulated observations are much broader than the true emission measure distribution, though the average temperature evolution is similar. These differences are due, in part, to inadequate background subtraction, but also indicate a limitation of the analysis methods.
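The background-subtraction step whose adequacy the paper tests can be illustrated on a synthetic cross-loop intensity profile (all numbers below are invented; the paper works on simulated images rather than 1-D cuts). The background is estimated from pixels on either side of the loop and interpolated across its width, and the residual is taken as the loop intensity.

```python
# Toy illustration of loop background subtraction on a 1-D cut.
import numpy as np

# Synthetic cross-loop profile: slowly varying corona + Gaussian loop.
x = np.linspace(0, 20, 201)
background_true = 100.0 + 2.0 * x
loop_true = 50.0 * np.exp(-((x - 10) / 1.5) ** 2)
observed = background_true + loop_true

# Pick "background" pixels well outside the loop and fit a straight line
# through them, interpolating the background under the loop.
mask = (x < 5) | (x > 15)
coeffs = np.polyfit(x[mask], observed[mask], 1)
background_est = np.polyval(coeffs, x)

# The residual is the isolated loop intensity.
loop_est = observed - background_est
peak_error = abs(loop_est.max() - loop_true.max()) / loop_true.max()
```

In this clean toy case the subtraction recovers the loop almost exactly; the paper's point is that on realistic simulated images the background is not this well-behaved, which distorts the inferred density, temperature, and emission measure.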
NASA Astrophysics Data System (ADS)
Yücel, Mete; Bayrak, Ahmet; Yücel, Esra Barlas; Ozben, Cenap S.
2018-02-01
Massive ammonium nitrate (NH4NO3) based explosives buried underground are commonly used in terror attacks. These explosives can be detected using the neutron scattering method, with some limitations. Simulations are very useful tools for designing a detection system for this kind of explosive. Geant4 simulations were used to generate neutrons at 14 MeV energy and track them through scattering off the explosive embedded in soil. Si-PIN photodiodes were used as detector elements in the design for their low cost and the simplicity of their signal readout electronics. Various neutron-to-charged-particle converters were applied to the surface of the photodiodes to increase the detection efficiency. Si-PIN photodiodes coated with 6LiF provided the best result over a certain energy interval. Energy depositions in the silicon detector from all generated secondary particles, including photons, were taken into account to produce a realistic background. Soil humidity, one of the most important parameters limiting detection, was also studied.
Automatising the analysis of stochastic biochemical time-series
2015-01-01
Background: Mathematical and computational modelling of biochemical systems has seen a lot of effort devoted to the definition and implementation of high-performance mechanistic simulation frameworks. Within these frameworks it is possible to analyse complex models under a variety of configurations, eventually selecting the best setting of, e.g., parameters for a target system. Motivation: This operational pipeline relies on the ability to interpret the predictions of a model, often represented as simulation time-series. Thus, an efficient data analysis pipeline is crucial to automatise time-series analyses, bearing in mind that errors in this phase might mislead the modeller's conclusions. Results: For this reason we have developed an intuitive, framework-independent Python tool to automate analyses common to a variety of modelling approaches. These include assessment of useful non-trivial statistics for simulation ensembles, e.g., estimation of master equations. Intuitive and domain-independent batch scripts allow the researcher to automatically prepare reports, thus speeding up the usual model-definition, testing and refinement pipeline. PMID:26051821
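A framework-independent analysis of stochastic simulation ensembles of the kind described might look like the following sketch (the function name and the particular statistics are our choices, not taken from the published tool): each row is one stochastic run, and summary statistics are computed per time point across the ensemble.

```python
# Sketch of ensemble statistics over stochastic simulation time-series.
import numpy as np

def ensemble_stats(trajectories):
    """trajectories: array (n_runs, n_timepoints) for one observed species."""
    traj = np.asarray(trajectories, dtype=float)
    return {
        "mean": traj.mean(axis=0),           # ensemble mean at each time
        "std": traj.std(axis=0, ddof=1),     # spread across runs
        "q05": np.quantile(traj, 0.05, axis=0),
        "q95": np.quantile(traj, 0.95, axis=0),
    }

# Toy ensemble: noisy exponential decays, as a stochastic simulator
# (e.g. an SSA implementation) might emit for a degrading species.
rng = np.random.default_rng(1)
t = np.linspace(0, 5, 50)
runs = np.array(
    [100 * np.exp(-t) + rng.normal(0, 2, t.size) for _ in range(200)]
)
stats = ensemble_stats(runs)
```

Batch scripts in such a tool would loop this over every species and every model configuration and render the quantile bands into a report.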
ISS Radiation Shielding and Acoustic Simulation Using an Immersive Environment
NASA Technical Reports Server (NTRS)
Verhage, Joshua E.; Sandridge, Chris A.; Qualls, Garry D.; Rizzi, Stephen A.
2002-01-01
The International Space Station Environment Simulator (ISSES) is a virtual reality application that uses high-performance computing, graphics, and audio rendering to simulate the radiation and acoustic environments of the International Space Station (ISS). This CAVE application allows the user to maneuver to different locations inside or outside of the ISS and interactively compute and display the radiation dose at a point. The directional dose data is displayed as a color-mapped sphere that indicates the relative levels of radiation from all directions about the center of the sphere. The noise environment is rendered in real time over headphones or speakers and includes non-spatial background noise, such as air-handling equipment, and spatial sounds associated with specific equipment racks, such as compressors or fans. Changes can be made to equipment rack locations that produce changes in both the radiation shielding and system noise. The ISSES application allows for interactive investigation and collaborative trade studies between radiation shielding and noise for crew safety and comfort.
Protein-membrane electrostatic interactions: Application of the Lekner summation technique
NASA Astrophysics Data System (ADS)
Juffer, André H.; Shepherd, Craig M.; Vogel, Hans J.
2001-01-01
A model has been developed to calculate the electrostatic interaction between biomolecules and lipid bilayers. The effect of ionic strength is included by means of explicit ions, while water is described as a background continuum. The bilayer is considered at the atomic level. The Lekner summation technique is employed to calculate the long-range electrostatic interactions. The new method is used, with thermodynamic integration, to estimate the electrostatic contribution to the free energy of binding of sandostatin, a cyclic eight-residue analogue of the peptide hormone somatostatin, to lipid bilayers. Monte Carlo simulation techniques were employed to determine ion distributions and peptide orientations. Both neutral and negatively charged lipid bilayers were used. An error analysis to judge the quality of the computation is also presented. The applicability of the Lekner summation technique in combination with computer simulation models of the adsorption of peptides (and proteins) into the interfacial region of lipid bilayers is discussed.
Multiple Ions Resonant Heating and Acceleration by Alfven/cyclotron Fluctuations in the Solar Wind
NASA Astrophysics Data System (ADS)
Xie, H.; Ofman, L.
2003-12-01
We study the interaction between protons, multiple minor ions (O5+, He++), and a given cyclotron-resonant wave spectrum in coronal hole plasma. One-dimensional hybrid simulations are performed in an initially homogeneous, collisionless, magnetized plasma with waves propagating parallel to the background magnetic field. The self-consistent hybrid simulations are used to study how multiple minor species may affect the resonant interaction between a spectrum of waves and the solar wind protons. The results of the simulations provide a clear picture of wave-particle interaction under various coronal conditions, which can explain 1) how multiple minor ions affect the resonant heating and the temperature anisotropy of the solar wind protons by a given wave spectrum; 2) how energy is distributed and transferred among waves and different ion species; 3) the growth and damping of different beam microinstability modes, including both inward and outward waves; and 4) the formation of a proton double-peak distribution in the solar wind.
Background and imaging simulations for the hard X-ray camera of the MIRAX mission
NASA Astrophysics Data System (ADS)
Castro, M.; Braga, J.; Penacchioni, A.; D'Amico, F.; Sacahui, R.
2016-07-01
We report the results of detailed Monte Carlo simulations of the performance expected both at balloon altitudes and at the probable satellite orbit of a hard X-ray coded-aperture camera being developed for the Monitor e Imageador de RAios X (MIRAX) mission. Based on a thorough mass model of the instrument and detailed specifications of the spectra and angular dependence of the various relevant radiation fields in both the stratospheric and orbital environments, we have used the well-known package GEANT4 to simulate the instrumental background of the camera. We also show simulated images of source fields to be observed and calculate the detailed sensitivity of the instrument in both situations. The results reported here are especially important to researchers in this field, considering that we provide information, not easily found in the literature, on how to prepare input files and calculate crucial instrumental parameters when performing GEANT4 simulations for high-energy astrophysics space experiments.
Creating Interactive Physics Simulations Using the Power of GeoGebra
ERIC Educational Resources Information Center
Walsh, Tom
2017-01-01
I have long incorporated physics simulations in my physics teaching, and truly appreciate those who have made their simulations available to the public. I often would think of an idea for a simulation I would love to be able to use, but with no real programming background I did not know how I could make my own. That was the case until I discovered…
NASA Astrophysics Data System (ADS)
Kulisek, J. A.; Schweppe, J. E.; Stave, S. C.; Bernacki, B. E.; Jordan, D. V.; Stewart, T. N.; Seifert, C. E.; Kernan, W. J.
2015-06-01
Helicopter-mounted gamma-ray detectors can provide law enforcement officials the means to quickly and accurately detect, identify, and locate radiological threats over a wide geographical area. The ability to accurately distinguish radiological threat-generated gamma-ray signatures from background gamma radiation in real time is essential in order to realize this potential. This problem is non-trivial, especially in urban environments for which the background may change very rapidly during flight. This exacerbates the challenge of estimating background due to the poor counting statistics inherent in real-time airborne gamma-ray spectroscopy measurements. To address this challenge, we have developed a new technique for real-time estimation of background gamma radiation from aerial measurements without the need for human analyst intervention. The method can be calibrated using radiation transport simulations along with data from previous flights over areas for which the isotopic composition need not be known. Over the examined measured and simulated data sets, the method generated accurate background estimates even in the presence of a strong, 60Co source. The potential to track large and abrupt changes in background spectral shape and magnitude was demonstrated. The method can be implemented fairly easily in most modern computing languages and environments.
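The paper's method is calibrated from radiation transport simulations and prior flight data and runs without analyst intervention; as a generic illustration of the underlying idea, a measured spectrum can be decomposed over background template spectra so that a source term stands out in the residual. Everything below (channel count, template shapes, weights) is invented for the sketch.

```python
# Toy background-template decomposition of a gamma-ray spectrum.
import numpy as np

rng = np.random.default_rng(2)
n_chan = 64
ch = np.arange(n_chan)

# Hypothetical background templates, e.g. learned from prior flights:
# a continuum-like shape and a natural-background-line-like shape.
templates = np.stack([
    np.exp(-np.linspace(0, 4, n_chan)),
    np.exp(-0.5 * ((ch - 20) / 3.0) ** 2),
])

true_weights = np.array([300.0, 80.0])
background = true_weights @ templates

# A source line sits where no background template has support.
source = 60.0 * np.exp(-0.5 * ((ch - 40) / 2.0) ** 2)
measured = rng.poisson(background + source).astype(float)

# Least-squares fit of the templates to the measured spectrum; the
# residual concentrates the counts the background model cannot explain.
w, *_ = np.linalg.lstsq(templates.T, measured, rcond=None)
residual = measured - w @ templates
```

A real-time airborne system would additionally constrain the weights to evolve smoothly along the flight path and handle the poor counting statistics noted in the abstract; this sketch only shows the template-decomposition core.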
ERIC Educational Resources Information Center
Kukkonen, Jari Ensio; Kärkkäinen, Sirpa; Dillon, Patrick; Keinonen, Tuula
2014-01-01
Research has demonstrated that simulation-based inquiry learning has significant advantages for learning outcomes when properly scaffolded. For successful learning in science with simulation-based inquiry, one needs to ascertain levels of background knowledge so as to support learners in making, evaluating and modifying hypotheses, conducting…
Nursing Students' Nonverbal Reactions to Malodor in Wound Care Simulation
ERIC Educational Resources Information Center
Baker, Gloria Waters
2012-01-01
Background: Wound care is an essential competency which nursing students are expected to acquire. To foster students' competency, nurse educators use high fidelity simulation to expose nursing students to various wound characteristics. Problem: Little is known about how nursing students react to simulated wound characteristics. Malodor is a…
ERIC Educational Resources Information Center
Kay, Gary G.; Michaels, M. Alex; Pakull, Barton
2009-01-01
Background: Psychostimulant treatment may improve simulated driving performance in young adults with attention-deficit/hyperactivity disorder (ADHD). Method: This was a randomized, double-blind, placebo-controlled, crossover study of simulated driving performance with mixed amphetamine salts--extended release (MAS XR) 50 mg/day (Cohort 1) and…
This study analyzes simulated regional-scale ozone burdens both near the surface and aloft, estimates process contributions to these burdens, and calculates the sensitivity of the simulated regional-scale ozone burden to several key model inputs with a particular emphasis on boun...
Exploring the Dynamics of Exoplanetary Systems in a Young Stellar Cluster
NASA Astrophysics Data System (ADS)
Thornton, Jonathan Daniel; Glaser, Joseph Paul; Wall, Joshua Edward
2018-01-01
I describe a dynamical simulation of planetary systems in a young star cluster. One rather arbitrary aspect of cluster simulations is the choice of initial conditions. These are typically chosen from some standard model, such as Plummer or King, or from a “fractal” distribution to try to model young clumpy systems. Here I adopt the approach of realizing an initial cluster model directly from a detailed magnetohydrodynamical model of cluster formation from a 1000-solar-mass interstellar gas cloud, with magnetic fields and radiative and wind feedback from massive stars included self-consistently. The N-body simulation of the stars and planets starts once star formation is largely over and feedback has cleared much of the gas from the region where the newborn stars reside. It continues until the cluster dissolves in the galactic field. Of particular interest is what would happen to the free-floating planets created in the gas cloud simulation. Are they captured by a star or are they ejected from the cluster? This method of building a dynamical cluster simulation directly from the results of a cluster formation model allows us to better understand the evolution of young star clusters and enriches our understanding of extrasolar planet development in them. These simulations were performed within the AMUSE simulation framework, and combine N-body, multiples and background potential code.
NASA Astrophysics Data System (ADS)
Fu, A.; Xue, Y.
2017-12-01
Corn is one of the most important agricultural products in China. Research on simulating corn yields and on the impacts of climate change and agricultural management practices on those yields is important for maintaining stable corn production. After climatic data (daily temperature, precipitation, solar radiation, relative humidity, and wind speed from 1948 to 2010), soil properties, observed corn yields, and farmland management information were collected, corn yields grown in a humid and hot environment (Sichuan province) and a cold and dry environment (Hebei province) in China over the past 63 years were simulated with Daycent, and the results were evaluated against published yield records. The relationships between regional climate change, global warming, and corn yield were analyzed, and the uncertainties of the simulation arising from agricultural management practices were assessed by changing fertilization levels, land fertilizer maintenance, and tillage methods. The results showed that: (1) the Daycent model is capable of simulating corn yields under the different climatic backgrounds in China; (2) when studying the relationship between regional climate change and corn yields, observed and simulated corn yields were found to increase along with regional climate change; and (3) when studying the relationship between global warming and corn yields, corn yields re-simulated after removing the global-warming trend from the original temperature data were found to be lower than before.
Background evaluation for the neutron sources in the Daya Bay experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gu, W. Q.; Cao, G. F.; Chen, X. H.
2016-07-06
Here, we present an evaluation of the background induced by 241Am–13C neutron calibration sources in the Daya Bay reactor neutrino experiment. This background, significant for electron-antineutrino detection at 0.26 ± 0.12 per detector per day on average, was estimated by a Monte Carlo simulation that was benchmarked against a special calibration data set. This dedicated data set also provides the energy spectrum of the background.
Improvement of Accuracy for Background Noise Estimation Method Based on TPE-AE
NASA Astrophysics Data System (ADS)
Itai, Akitoshi; Yasukawa, Hiroshi
This paper proposes a method of background noise estimation based on tensor product expansion with a median and a Monte Carlo simulation. We have previously shown that tensor product expansion with an absolute-error criterion (TPE-AE) is effective for estimating background noise; however, the conventional method does not always estimate the background properly. In this paper, it is shown that the estimation accuracy can be improved by the proposed methods.
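As a simplified illustration of the tensor-product idea (not the authors' exact TPE-AE formulation), a stationary background in a spectrogram can be modeled as a rank-1 outer product u ⊗ v and estimated with alternating updates; using medians in the updates makes the estimate robust to sparse signal components sitting on top of the background.

```python
# Robust rank-1 ("tensor product") background estimation with medians.
import numpy as np

def rank1_background_median(X, n_iter=20):
    """Estimate a rank-1 background u⊗v of a nonnegative spectrogram X."""
    u = np.median(X, axis=1)                     # initial spectral profile
    for _ in range(n_iter):
        v = np.median(X / u[:, None], axis=0)    # robust temporal profile
        u = np.median(X / v[None, :], axis=1)    # robust spectral profile
    return np.outer(u, v)

# Toy data: a true rank-1 background plus a transient narrowband signal.
rng = np.random.default_rng(3)
freqs, frames = 32, 100
u_true = 1.0 + rng.random(freqs)
v_true = 1.0 + 0.2 * rng.random(frames)
background = np.outer(u_true, v_true)

X = background.copy()
X[10, 40:50] += 20.0   # signal the background estimate should ignore

B = rank1_background_median(X)
err = np.abs(B - background).max() / background.max()
```

Because the transient touches only a minority of entries in any row or column, the medians are unaffected and the background is recovered essentially exactly; a mean-based update would be biased upward by the signal.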
Friedly, J.C.; Kent, D.B.; Davis, J.A.
2002-01-01
Reactive transport simulations were conducted to model chemical reactions between metal-EDTA (ethylenediaminetetraacetic acid) complexes during transport in a mildly acidic quartz-sand aquifer. Simulations were compared with the results of small-scale tracer tests wherein nickel-, zinc-, and calcium-EDTA complexes and free EDTA were injected into three distinct chemical zones of a plume of sewage-contaminated groundwater. One zone had a large mass of adsorbed, sewage-derived zinc; one zone had a large mass of adsorbed manganese resulting from mildly reducing conditions created by the sewage plume; and one zone had significantly less adsorbed manganese and negligible zinc background. The chemical model assumed that the dissolution of iron(III) from metal-hydroxypolymer coatings on the aquifer sediments by the metal-EDTA complexes was kinetically restricted. All other reactions, including metal-EDTA complexation, zinc and manganese adsorption, and aluminum hydroxide dissolution, were assumed to reach equilibrium on the time scale of transport; equilibrium constants were either taken from the literature or determined independently in the laboratory. A single iron(III) dissolution rate constant was used to fit the breakthrough curves observed in the zone with negligible zinc background. Simulation results agreed well with the experimental data in all three zones, which included temporal moments derived from breakthrough curves at different distances downgradient from the injections and spatial moments calculated from synoptic samplings conducted at different times. Results show that the tracer cloud was near equilibrium with respect to Fe in the sediment after 11 m of transport in the Zn-contaminated region but remained far from equilibrium in the other two zones. Sensitivity studies showed that the relative rate of iron(III) dissolution by the different metal-EDTA complexes was less important than the fact that these reactions are rate controlled. Results suggest that the published solubility for ferrihydrite reasonably approximates the Fe solubility of the hydroxypolymer coatings on the sediments. Aluminum may be somewhat more soluble than represented by the equilibrium constant for gibbsite, and its dissolution may be rate controlled when reacting with Ca-EDTA complexes.
Modeling of the energetic ion observations in the vicinity of Rhea and Dione
NASA Astrophysics Data System (ADS)
Kotova, Anna; Roussos, Elias; Krupp, Norbert; Dandouras, Iannis
2015-09-01
During several flybys of the Saturnian moons Rhea and Dione by the Cassini spacecraft, the energetic particle detector MIMI/LEMMS measured a significant reduction of energetic ion fluxes (20-300 keV) in their vicinity, caused by the absorption of those ions at the moon surfaces. In order to simulate the observed depletion profiles, we developed an energetic particle tracer that can simulate charged particle trajectories in different models of the Saturnian magnetosphere. This particle tracer uses an adaptive fourth-order Gauss-Runge-Kutta method, and its background magnetospheric model can be varied from a simple dipole to a more complex one that also includes non-dipolar perturbations. The electromagnetic environment of each local moon-magnetosphere interaction region is modeled through a hybrid plasma simulation code. Using this energetic particle tracer, we explore which of these magnetospheric characteristics are most important in shaping the MIMI/LEMMS ion profiles. We also examine whether MIMI/LEMMS responds primarily to protons (as typically assumed in many studies) or also to heavier ions, using calibration information, observations of the energy flux spectrum by the MIMI/CHEMS instrument (also on board Cassini), and different simulation results. Our results show that MIMI/LEMMS indeed measures heavier ions as well. We also found that the wrapping of magnetic field lines, even when it causes local perturbations of only a few percent of the background magnetic field, can cause measurable changes in the spatial and energy distribution of the fluxes measured by MIMI/LEMMS. These results are important for the correct interpretation of MIMI/LEMMS data, and they offer capabilities for precise in-flight cross-calibration of the instruments. In addition, our simulation approach can be employed in similar environments (Titan, Enceladus, the Jovian moons, etc.) for constraining the magnetic topology of their interaction regions and for identifying the composition and charge states of ions at high energies, where the capabilities of available or future instruments may be limited.
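The core of such a tracer can be sketched generically. The code below is our stand-in, not the authors' implementation: parameter values are invented, the field is a pure point dipole, and a textbook fixed-step RK4 integrator replaces their adaptive Gauss-Runge-Kutta scheme. Since the Lorentz force does no work, conservation of particle speed is a convenient correctness check.

```python
# Minimal RK4 tracer for a charged particle in a dipole magnetic field.
import numpy as np

# Effective dipole moment (mu0/4pi folded in), T*m^3; Earth-like scale.
M = np.array([0.0, 0.0, -8.0e15])
Q_OVER_M = 9.58e7  # proton charge-to-mass ratio, C/kg

def b_dipole(r):
    """Point-dipole magnetic field at position r (meters)."""
    rn = np.linalg.norm(r)
    return 3.0 * r * np.dot(M, r) / rn**5 - M / rn**3

def deriv(state):
    """d/dt of state = [x, y, z, vx, vy, vz] with E = 0."""
    r, v = state[:3], state[3:]
    a = Q_OVER_M * np.cross(v, b_dipole(r))  # Lorentz acceleration
    return np.concatenate([v, a])

def rk4_step(state, dt):
    k1 = deriv(state)
    k2 = deriv(state + 0.5 * dt * k1)
    k3 = deriv(state + 0.5 * dt * k2)
    k4 = deriv(state + dt * k3)
    return state + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

# Trace a proton for a few gyrations and check speed conservation.
state = np.concatenate([[7.0e7, 0.0, 0.0], [0.0, 1.0e6, 5.0e5]])
v0 = np.linalg.norm(state[3:])
dt = 1.0e-2  # seconds; small compared with the local gyroperiod
for _ in range(2000):
    state = rk4_step(state, dt)
speed_drift = abs(np.linalg.norm(state[3:]) - v0) / v0
```

An adaptive scheme, as used by the authors, would shrink the step near the moon where the hybrid-code field varies rapidly and relax it in the smooth background dipole.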
Tsai, Chia-Wei; Tipple, Christopher A; Yost, Richard A
2018-04-15
Paper spray ionization (PSI) is an attractive ambient ionization source for mass spectrometry (MS) since it allows the combination of surface sampling and ionization. The minimal sample preparation inherent in this approach greatly reduces the time needed for analysis. However, the ions generated from interfering compounds in the sample and the paper substrate may interfere with the analyte ions. Therefore, the integration of PSI with high-field asymmetric ion mobility spectrometry (FAIMS) is of significant interest since it should reduce the background ions entering the mass analyzer without complicating the analysis or increasing analysis time. Here we demonstrate the integration of PSI with FAIMS/MS and its potential for analysis of samples of forensic interest. In this work, the parameters that can influence the integration, including sampling and ionization by paper spray, the FAIMS separation of analytes from each other and background interferences, and the length of time that a usable signal can be observed for explosives on paper, were evaluated with the integrated system. In the negative ion analysis of 2,4,6-trinitrotoluene (TNT), pentaerythritol tetranitrate (PETN), octahydro-1,3,5,7-tetranitro-1,3,5,7-tetrazocine (HMX), and 1,3,5-trinitroperhydro-1,3,5-triazine (RDX), amounts as low as 1 ng on paper were readily observed. The successful positive ion separation of a set of illicit drugs including heroin, methamphetamine, and cocaine was also achieved. In addition, the positive ion analysis of the chemical warfare agent simulants dimethyl methylphosphonate (DMMP) and diisopropyl methylphosphonate (DIMP) was evaluated. The integration of PSI-FAIMS/MS was demonstrated for the analyses of explosives in negative ion mode and for illicit drugs and CW simulants in positive mode. Paper background ions that could interfere with these analyses were separated by FAIMS. 
The compensation voltage of an ion obtained by FAIMS provided an additional identification parameter to be combined with the mass spectrum for each analyte. Copyright © 2018 John Wiley & Sons, Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Babich, L. P., E-mail: babich@elph.vniief.ru; Bochkov, E. I.; Kutsyk, I. M.
2011-05-15
The mechanism of lightning initiation due to electric field enhancement by the polarization of a conducting channel produced by relativistic runaway electron avalanches, triggered by background cosmic radiation, has been simulated numerically. It is shown that fields at which the start of a lightning leader is possible, even in the absence of precipitation, are locally realized for realistic thundercloud configurations and charges. The computational results agree with in situ observations of penetrating radiation enhancement in thunderclouds.
Simulation of radial expansion of an electron beam injected into a background plasma
NASA Technical Reports Server (NTRS)
Koga, J.; Lin, C. S.
1989-01-01
A 2-D electrostatic particle code was used to study the beam radial expansion of a nonrelativistic electron beam injected from an isolated equipotential conductor into a background plasma. The simulations indicate that the beam radius is generally proportional to the beam electron gyroradius when the conductor is charged to a large potential. The simulations also suggest that the charge buildup at the beam stagnation point causes the beam radial expansion. From a survey of the simulation results, it is found that the ratio of the beam radius to the beam electron gyroradius increases with the square root of beam density and decreases inversely with beam injection velocity. This dependence is explained in terms of the ratio of the beam electron Debye length to the ambient electron Debye length. These results are most applicable to the SEPAC electron beam injection experiments from Spacelab 1, where high charging potential was observed.
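The Debye-length argument above lends itself to a back-of-the-envelope sketch. All plasma parameters below are illustrative assumptions, not values from the SEPAC simulations:

```python
import numpy as np

def debye_length(n_m3, T_eV):
    """Electron Debye length in meters for density n (m^-3) and temperature T (eV)."""
    eps0, e = 8.854e-12, 1.602e-19
    return np.sqrt(eps0 * T_eV / (e * n_m3))

# Illustrative (assumed) parameters, not values from the SEPAC experiments
n_ambient, T_ambient = 1.0e11, 0.2   # ambient plasma: m^-3, eV
n_beam, E_beam = 1.0e12, 1000.0      # beam: m^-3, injection energy in eV

# The beam "temperature" scale is set by the injection energy, so the
# Debye-length ratio scales as sqrt(E_beam * n_ambient / (T_ambient * n_beam))
ratio = debye_length(n_beam, E_beam) / debye_length(n_ambient, T_ambient)
```

With these assumed numbers the ratio evaluates to sqrt(500), about 22, and shifts with the square root of beam density and with injection velocity as the abstract describes.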
NASA Astrophysics Data System (ADS)
Tartakovsky, A.; Brown, A.; Brown, J.
The paper describes the development and evaluation of a suite of advanced algorithms which provide significantly improved capabilities for finding, fixing, and tracking multiple ballistic and flying low observable objects in highly stressing cluttered environments. The algorithms have been developed for use in satellite-based staring and scanning optical surveillance suites for applications including theatre and intercontinental ballistic missile early warning, trajectory prediction, and multi-sensor track handoff for midcourse discrimination and intercept. The functions performed by the algorithms include electronic sensor motion compensation providing sub-pixel stabilization (to 1/100 of a pixel), as well as advanced temporal-spatial clutter estimation and suppression to below sensor noise levels, followed by statistical background modeling and Bayesian multiple-target track-before-detect filtering. The multiple-target tracking is performed in physical world coordinates to allow for multi-sensor fusion, trajectory prediction, and intercept. Output of detected object cues and data visualization are also provided. The algorithms are designed to handle a wide variety of real-world challenges. Imaged scenes may be highly complex and infinitely varied -- the scene background may contain significant celestial, earth limb, or terrestrial clutter. For example, when viewing combined earth limb and terrestrial scenes, a combination of stationary and non-stationary clutter may be present, including cloud formations, varying atmospheric transmittance and reflectance of sunlight and other celestial light sources, aurora, glint off sea surfaces, and varied natural and man-made terrain features. The targets of interest may also appear dim relative to the scene background, rendering much of the existing deployed software useless for optical target detection and tracking. 
Additionally, it may be necessary to detect and track a large number of objects in the threat cloud, and these objects may not always be resolvable in individual data frames. In the present paper, the performance of the developed algorithms is demonstrated using real-world data containing resident space objects observed from the MSX platform, with backgrounds varying from celestial to combined celestial and earth limb, with instances of extremely bright aurora clutter. Simulation results are also presented for parameterized variations in signal-to-clutter levels (down to 1/1000) and signal-to-noise levels (down to 1/6) for simulated targets against real-world terrestrial clutter backgrounds. We also discuss algorithm processing requirements and C++ software processing capabilities from our on-going MDA- and AFRL-sponsored development of an image processing toolkit (iPTK). In the current effort, the iPTK is being developed to a Technology Readiness Level (TRL) of 6 by mid-2010, in preparation for possible integration with STSS-like, SBIRS high-like and SBSS-like surveillance suites.
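The temporal-spatial clutter-suppression step described above can be illustrated with a toy frame stack. A per-pixel median over time is one simple stand-in for the (unspecified) clutter estimator, and every number below is invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic frame stack: stationary clutter + sensor noise + a dim point
# target drifting one pixel per frame (all values are illustrative)
T, H, W = 20, 64, 64
clutter = rng.normal(100.0, 20.0, (H, W))            # static background scene
frames = clutter + rng.normal(0.0, 1.0, (T, H, W))   # per-frame sensor noise
for t in range(T):
    frames[t, 30, 10 + t] += 3.0                     # target well below clutter

# Temporal clutter estimation: per-pixel median over the stack; because the
# moving target occupies each pixel in at most one frame, the median ignores it
background = np.median(frames, axis=0)
residual = frames - background                       # noise plus moving target
```

After subtraction the target, buried under clutter with twenty times its amplitude, stands out at roughly the sensor-noise level, which is the precondition for the track-before-detect stage.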
DOE Office of Scientific and Technical Information (OSTI.GOV)
Le Blanc, Katya Lee; Bower, Gordon Ross; Hill, Rachael Ann
In order to provide a basis for industry adoption of advanced technologies, the Control Room Upgrades Benefits Research Project will investigate the benefits of including advanced technologies as part of control room modernization. This report describes the background, methodology, and research plan for the first in a series of full-scale studies to test the effects of advanced technology in NPP control rooms. This study will test the effect of Advanced Overview Displays in the partner utility's control room simulator.
Ionization and expansion of barium clouds in the ionosphere
NASA Technical Reports Server (NTRS)
Ma, T.-Z.; Schunk, R. W.
1993-01-01
A recently developed 3D envelope model is used here to study the motion of barium clouds released in the ionosphere, including the ionization stage. The ionization and expansion of the barium clouds and the interaction between the clouds and the background ions are investigated using three simulations: a cloud without a directional velocity, a cloud with an initial velocity of 5 km/s across the B field, and a cloud with initial velocity components of 2 km/s both along and across the B field.
2002-03-18
KENNEDY SPACE CENTER, FLA. -- STS-110 Mission Specialist Jerry Ross waits his turn at driving the M-113 armored personnel carrier, part of Terminal Countdown Demonstration Test activities. In the background, right, is Mission Specialist Lee Morin. TCDT includes emergency egress training and a simulated launch countdown, and is held at KSC prior to each Space Shuttle flight. Scheduled for launch April 4, the 11-day mission will feature Shuttle Atlantis docking with the International Space Station (ISS) and delivering the S0 truss, the centerpiece-segment of the primary truss structure that will eventually extend over 300 feet
Traffic Flow Management and Optimization
NASA Technical Reports Server (NTRS)
Rios, Joseph Lucio
2014-01-01
This talk will present an overview of Traffic Flow Management (TFM) research at NASA Ames Research Center. Dr. Rios will focus on his work developing a large-scale, parallel approach to solving traffic flow management problems in the national airspace. In support of this talk, Dr. Rios will provide some background on operational aspects of TFM as well as a discussion of some of the tools needed to perform such work, including a high-fidelity airspace simulator. Current, ongoing research related to TFM data services in the national airspace system and general aviation will also be presented.
Dosanjh, Manjit; Cirilli, Manuela; Navin, Sparsh
2015-01-01
Between 2011 and 2015, the ENTERVISION Marie Curie Initial Training Network has been training 15 young researchers from a variety of backgrounds on topics ranging from in-beam Positron Emission Tomography or Single Particle Tomography techniques, to adaptive treatment planning, optical imaging, Monte Carlo simulations and biological phantom design. This article covers the main research activities, as well as the training scheme implemented by the participating institutes, which included academia, research, and industry. PMID:26697403
KC-135 and Other Microgravity Simulations
NASA Technical Reports Server (NTRS)
2005-01-01
This document represents a summary of medical and scientific evaluations conducted aboard the KC-135 from June 23, 2004 to June 27, 2005. Included is a general overview of KC-135 activities manifested and coordinated by the Human Adaptation and Countermeasures Office. A collection of brief reports that describe tests conducted aboard the KC-135 follows the overview. Principal investigators and test engineers contributed significantly to the content of the report describing their particular experiment or hardware evaluation. This document concludes with an appendix that provides background information concerning the KC-135 and the Reduced-Gravity Program.
Taylor, Stephen R; Simon, Joseph; Sampson, Laura
2017-05-05
We introduce a technique for gravitational-wave analysis, where Gaussian process regression is used to emulate the strain spectrum of a stochastic background by training on population-synthesis simulations. This leads to direct Bayesian inference on astrophysical parameters. For pulsar timing arrays specifically, we interpolate over the parameter space of supermassive black-hole binary environments, including three-body stellar scattering, and evolving orbital eccentricity. We illustrate our approach on mock data, and assess the prospects for inference with data similar to the NANOGrav 9-yr data release.
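A minimal sketch of the emulation idea above, using scikit-learn's Gaussian process regressor on an invented one-parameter "population synthesis" output; the real work interpolates full strain spectra over several binary-environment parameters:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(1)

# Stand-in for population-synthesis runs: map one environment parameter
# (say, a stellar-scattering efficiency) to a log strain amplitude in one
# frequency bin. The functional form is invented purely for illustration.
theta_train = rng.uniform(0.0, 1.0, (30, 1))
log_strain = -15.0 + 0.8 * np.sin(3.0 * theta_train[:, 0])

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.3), alpha=1e-6)
gp.fit(theta_train, log_strain)

# The trained emulator interpolates over parameter space, so a Bayesian
# sampler can query strain predictions (with uncertainty) far faster than
# re-running the population synthesis itself
mu, sigma = gp.predict(np.array([[0.5]]), return_std=True)
```

The emulator's predictive uncertainty can be folded into the likelihood, which is what makes direct Bayesian inference on the astrophysical parameters tractable.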
Enhancement of free tropospheric ozone production by deep convection
NASA Technical Reports Server (NTRS)
Pickering, Kenneth E.; Thompson, Anne M.; Scala, John R.; Tao, Wei-Kuo; Simpson, Joanne
1994-01-01
It is found from model simulations of trace gas and meteorological data from aircraft campaigns that deep convection may enhance the potential for photochemical ozone production in the middle and upper troposphere by up to a factor of 60. Examination of half a dozen individual convective episodes shows that the degree of enhancement is highly variable. Factors affecting enhancement include boundary layer NO(x) mixing ratios and differences in the strength and structure of convective cells, as well as variation in the amount of background pollution already in the free troposphere.
Radiator Enhanced Geothermal System - A Revolutionary Method for Extracting Geothermal Energy
NASA Astrophysics Data System (ADS)
Karimi, S.; Marsh, B. D.; Hilpert, M.
2017-12-01
A new method of extracting geothermal energy, the Radiator Enhanced Geothermal System (RAD-EGS), has been developed. RAD-EGS attempts to mimic natural hydrothermal systems by 1) generating a vertical vane of artificially produced high porosity/permeability material deep in a hot sedimentary aquifer, 2) injecting water at surface temperatures to the bottom of the vane, where the rock is the hottest, and 3) extracting super-heated water at the top of the vane. The novel RAD-EGS differs greatly from the currently available Enhanced Geothermal Systems in vane orientation, determined in the governing local crustal stress field by Shmax and Sl (meaning it is vertical), and in the vane location in a hot sedimentary aquifer, which naturally increases the longevity of the system. In this study, we explore several parameter regimes affecting the water temperature in the extraction well, keeping in mind that the minimum temperature of the extracted water has to be 150 °C in order for a geothermal system to be commercially viable. We used the COMSOL finite element package to simulate coupled heat and fluid transfer within the RAD-EGS model. The following geologic layers, from top to bottom, are accounted for in the model: i) confining upper layer, ii) hot sedimentary aquifer, and iii) underlying basement rock. The vane is placed vertically within the sedimentary aquifer. An injection well and an extraction well are also included in the simulation. We tested the model for a wide range of parameters, including background heat flux, thickness of geologic layers, geometric properties of the vane, diameter and location of the wells, fluid flow within the wells, regional hydraulic gradient, and permeability and porosity of the layers. The results show that among the aforementioned parameters, background heat flux and the depth of vane emplacement are highly significant in determining the level of commercial viability of the geothermal system. 
These results indicate that for the terrains with relatively high background heat flux or for vanes located in relatively deep layers, the RAD-EGS can produce economic geothermal energy for more than 40 years. Moreover, these simulations show that the geothermal vane design with the injection well at the bottom and production well at the top of the vane greatly contributes to the longevity of the system.
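The sensitivity to background heat flux and vane depth can be illustrated with a simple steady conductive geotherm, T(z) = T0 + q*z/k. This one-line model is a sanity check, not the study's COMSOL simulation, and the conductivity and surface temperature below are assumed:

```python
# Steady conductive geotherm: T(z) = T0 + q * z / k. The 150 C commercial
# threshold from the abstract moves kilometers shallower as the background
# heat flux q rises. Conductivity and surface temperature are assumed.
def temperature_at_depth(z_m, q_W_m2, k_W_mK=2.5, T0_C=15.0):
    return T0_C + q_W_m2 * z_m / k_W_mK

def depth_for_150C(q_W_m2, k_W_mK=2.5, T0_C=15.0):
    return (150.0 - T0_C) * k_W_mK / q_W_m2

z_low = depth_for_150C(0.06)    # modest heat flux of 60 mW/m^2
z_high = depth_for_150C(0.09)   # elevated heat flux of 90 mW/m^2
```

With these assumptions, raising the flux from 60 to 90 mW/m^2 moves the 150 °C horizon from about 5.6 km up to about 3.8 km, consistent with the abstract's emphasis on heat flux and emplacement depth.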
NASA Technical Reports Server (NTRS)
Weidenspointner, G.; Harris, M. J.; Sturner, S.; Teegarden, B. J.; Ferguson, C.
2004-01-01
Intense and complex instrumental backgrounds, against which the much smaller signals from celestial sources have to be discerned, are a notorious problem for low and intermediate energy gamma-ray astronomy (approximately 50 keV - 10 MeV). Therefore a detailed qualitative and quantitative understanding of instrumental line and continuum backgrounds is crucial for most stages of gamma-ray astronomy missions, ranging from the design and development of new instrumentation through performance prediction to data reduction. We have developed MGGPOD, a user-friendly suite of Monte Carlo codes built around the widely used GEANT (Version 3.21) package, to simulate ab initio the physical processes relevant for the production of instrumental backgrounds. These include the build-up and delayed decay of radioactive isotopes as well as the prompt de-excitation of excited nuclei, both of which give rise to a plethora of instrumental gamma-ray background lines in addition to continuum backgrounds. The MGGPOD package and documentation are publicly available for download. We demonstrate the capabilities of the MGGPOD suite by modeling high resolution gamma-ray spectra recorded by the Transient Gamma-Ray Spectrometer (TGRS) on board Wind during 1995. The TGRS is a Ge spectrometer operating in the 40 keV to 8 MeV range. Due to its fine energy resolution, these spectra reveal the complex instrumental background in formidable detail, particularly the many prompt and delayed gamma-ray lines. We evaluate the successes and failures of the MGGPOD package in reproducing TGRS data, and provide identifications for the numerous instrumental lines.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gómez, Daniel O.; DeLuca, Edward E.; Mininni, Pablo D.
Recent high-resolution Atmospheric Imaging Assembly/Solar Dynamics Observatory images show evidence of the development of the Kelvin–Helmholtz (KH) instability, as coronal mass ejections (CMEs) expand in the ambient corona. A large-scale magnetic field mostly tangential to the interface is inferred, both on the CME and on the background sides. However, the magnetic field component along the shear flow is not strong enough to quench the instability. There is also observational evidence that the ambient corona is in a turbulent regime, and therefore the criteria for the development of the instability are a priori expected to differ from the laminar case. To study the evolution of the KH instability with a turbulent background, we perform three-dimensional simulations of the incompressible magnetohydrodynamic equations. The instability is driven by a velocity profile tangential to the CME–corona interface, which we simulate through a hyperbolic tangent profile. The turbulent background is generated by the application of a stationary stirring force. We compute the instability growth rate for different values of the turbulence intensity, and find that the role of turbulence is to attenuate the growth. The fact that KH instability is observed sets an upper limit on the correlation length of the coronal background turbulence.
Evaluation of appropriate sensor specifications for space based ballistic missile detection
NASA Astrophysics Data System (ADS)
Schweitzer, Caroline; Stein, Karin; Wendelstein, Norbert
2012-10-01
The detection and tracking of ballistic missiles (BMs) during launch or cloud break using satellite-based electro-optical (EO) sensors is a promising possibility for pre-instructing early warning and fire control radars. However, the successful detection of a BM depends on the applied infrared (IR) channel, as emission and reflection of threat and background vary in different spectral (IR) bands and for different observation scenarios. In addition, the spatial resolution of the satellite-based system also conditions the signal-to-clutter ratio (SCR) and therefore the predictability of the flight path. Generally available satellite images provide data in spectral bands which are suitable for remote sensing applications and earth surface observations. However, in the field of BM early warning, these bands are not of interest, making the simulation of background data essential. The paper focuses on the analysis of IR bands suitable for missile detection by trading off the suppression of background signature against threat signal strength. This comprises a radiometric overview of the background radiation in different spectral bands for different climates and seasons as well as for various cloud types and covers. A brief investigation of the BM signature and its trajectory within a threat scenario is presented. Moreover, the influences on the SCR caused by different observation scenarios and varying spatial resolution are pointed out. The paper also introduces the software used for simulating natural background spectral radiance images, MATISSE ("Advanced Modeling of the Earth for Environment and Scenes Simulation") by ONERA [1].
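The abstract's point that spatial resolution conditions the SCR can be demonstrated with a toy scene. The SCR definition below (mean target excess over background, divided by clutter standard deviation) is one common convention, and all numbers are invented:

```python
import numpy as np

rng = np.random.default_rng(2)

def signal_to_clutter(image, target_mask):
    """SCR = (mean target excess over background) / background std."""
    bg = image[~target_mask]
    return (image[target_mask].mean() - bg.mean()) / bg.std()

# Cluttered background with one unresolved hot source (values are assumed)
scene = rng.normal(50.0, 5.0, (64, 64))
scene[32, 32] += 40.0
mask = np.zeros(scene.shape, bool)
mask[32, 32] = True
scr_fine = signal_to_clutter(scene, mask)

# Degrading spatial resolution 4x spreads the source flux over a 4x4
# footprint while only partially averaging down the clutter, so SCR drops
coarse = scene.reshape(16, 4, 16, 4).mean(axis=(1, 3))
mask_c = np.zeros(coarse.shape, bool)
mask_c[8, 8] = True
scr_coarse = signal_to_clutter(coarse, mask_c)
```

For an unresolved point source, binning n x n pixels divides the target excess by n^2 but the clutter standard deviation only by roughly n, so the SCR falls by about a factor of n, which is why sensor resolution enters the trade-off alongside band selection.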
Data model, dictionaries, and desiderata for biomolecular simulation data indexing and sharing
2014-01-01
Background Few environments have been developed or deployed to widely share biomolecular simulation data or to enable collaborative networks to facilitate data exploration and reuse. As the amount and complexity of data generated by these simulations is dramatically increasing and the methods are being more widely applied, the need for new tools to manage and share this data has become obvious. In this paper we present the results of a process aimed at assessing the needs of the community for data representation standards to guide the implementation of future repositories for biomolecular simulations. Results We introduce a list of common data elements, inspired by previous work, and updated according to feedback from the community collected through a survey and personal interviews. These data elements integrate the concepts for multiple types of computational methods, including quantum chemistry and molecular dynamics. The identified core data elements were organized into a logical model to guide the design of new databases and application programming interfaces. Finally, a set of dictionaries was implemented to be used via SQL queries or locally via a Java API built upon the Apache Lucene text-search engine. Conclusions The model and its associated dictionaries provide a simple yet rich representation of the concepts related to biomolecular simulations, which should guide future developments of repositories and more complex terminologies and ontologies. The model remains extensible through the decomposition of virtual experiments into tasks and parameter sets, and via the use of extended attributes. The benefits of a common logical model for biomolecular simulations were illustrated through various use cases, including data storage, indexing, and presentation. All the models and dictionaries introduced in this paper are available for download at http://ibiomes.chpc.utah.edu/mediawiki/index.php/Downloads. PMID:24484917
NASA Astrophysics Data System (ADS)
Li, Yizhen; McGillicuddy, Dennis J.; Dinniman, Michael S.; Klinck, John M.
2017-02-01
Both remotely sensed and in situ observations in austral summer of early 2012 in the Ross Sea suggest the presence of cold, low-salinity, and high-biomass eddies along the edge of the Ross Ice Shelf (RIS). Satellite measurements include sea surface temperature and ocean color, and shipboard data sets include hydrographic profiles, towed instrumentation, and underway acoustic Doppler current profilers. Idealized model simulations are utilized to examine the processes responsible for ice shelf eddy formation. 3-D model simulations produce similar cold and fresh eddies, although the simulated vertical lenses are quantitatively thinner than observed. Model sensitivity tests show that both basal melting underneath the ice shelf and irregularity of the ice shelf edge facilitate generation of cold and fresh eddies. 2-D model simulations further suggest that both basal melting and downwelling-favorable winds play crucial roles in forming a thick layer of low-salinity water observed along the edge of the RIS. These properties may have been entrained into the observed eddies, whereas that entrainment process was not captured in the specific eddy formation events studied in our 3-D model, which may explain the discrepancy between the simulated and observed eddies, at least in part. Additional sensitivity experiments imply that uncertainties associated with background stratification and wind stress may also explain why the model underestimates the thickness of the low-salinity lens in the eddy interiors. Our study highlights the importance of incorporating accurate wind forcing, basal melting, and ice shelf irregularity for simulating eddy formation near the RIS edge. The processes responsible for generating the high phytoplankton biomass inside these eddies remain to be elucidated.
Mohammed, Yassene; Verhey, Janko F
2005-01-01
Background Laser Interstitial ThermoTherapy (LITT) is a well-established surgical method. The use of LITT is so far limited to homogeneous tissues, e.g. the liver. One of the reasons is the limited capability of existing treatment planning models to calculate the damage zone accurately. Treatment planning in inhomogeneous tissues, especially in regions near main vessels, still poses a challenge. In order to extend the application of LITT to a wider range of anatomical regions, new simulation methods are needed. The model described in this article enables efficient simulation for predicting damaged tissue as a basis for a future laser-surgical planning system. Previously we described the dependency of the model on geometry. In the present paper, which includes two video files, we focus on the methodological, physical and mathematical background of the model. Methods In contrast to previous simulation attempts, our model is based on the finite element method (FEM). We propose the use of LITT in sensitive areas such as the neck region to treat tumours in lymph nodes with dimensions of 0.5 cm – 2 cm in diameter near the carotid artery. Our model is based on calculations describing the light distribution using the diffusion approximation of transport theory; the temperature rise using the bioheat equation, including the effect of microperfusion in tissue, to determine the extent of thermal damage; and the dependency of thermal and optical properties on temperature and injury. Injury is estimated using a damage integral. To check our model we performed a first in vitro experiment on porcine muscle tissue. Results We performed the derivation of the geometry from 3D ultrasound data and show for this geometry the energy distribution, the heat elevation, and the damage zone. Furthermore, we perform a comparison with the in vitro experiment. The calculation shows an error of 5% along the x-axis parallel to the blood vessel. 
Conclusions The FEM technique proposed can overcome limitations of other methods and enables efficient simulation for predicting the damage zone induced by LITT. Our calculations show clearly that major vessels would not be damaged. The damage-zone area/volume calculated from the simulation and the in vitro experiment agree well, and the deviation is small. One of the main reasons for the deviation is the lack of accurate values for the tissue optical properties. This needs to be validated in further experiments. PMID:15631630
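The damage integral mentioned in the Methods is typically an Arrhenius integral. A sketch with classic Henriques-style protein-denaturation coefficients, assumed here for illustration rather than taken from the cited model:

```python
import numpy as np

# Arrhenius damage integral Omega = sum A * exp(-Ea / (R * T(t))) * dt;
# Omega >= 1 is commonly read as irreversible thermal damage. A and Ea
# are classic Henriques-style values, used purely for illustration.
A, Ea, R = 3.1e98, 6.28e5, 8.314           # 1/s, J/mol, J/(mol K)

def damage_integral(T_kelvin, dt_s):
    return float(np.sum(A * np.exp(-Ea / (R * T_kelvin)) * dt_s))

n_steps, dt = 600, 0.1                     # a 60 s exposure in 0.1 s steps
omega_body = damage_integral(np.full(n_steps, 310.15), dt)   # 37 C: negligible
omega_hot = damage_integral(np.full(n_steps, 328.15), dt)    # 55 C: Omega > 1
```

The strong exponential temperature dependence is why an accurate temperature field (and hence accurate optical properties) matters so much: an 18 °C rise turns a negligible integral into irreversible damage within a minute.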
Regional Background Fine Particulate Matter
A modeling system composed of the global model GEOS-Chem providing hourly lateral boundary conditions to the regional model CMAQ was used to calculate the policy-relevant background level of fine particulate matter. Simulations were performed for the full year of 2004 over the d...
Dynamical influences on thermospheric composition: implications for semi-empirical models
NASA Astrophysics Data System (ADS)
Sutton, E. K.; Solomon, S. C.
2014-12-01
The TIE-GCM was recently augmented to include helium and argon, two approximately inert species that can be used as tracers of dynamics in the thermosphere. The former species is treated as a major species due to its large abundance near the upper boundary. The effects of exospheric transport are also included in order to simulate realistic seasonal and latitudinal helium distributions. The latter species is treated as a classical minor species, imparting no forces on the background atmosphere. In this study, we examine the interplay of the various dynamical terms - i.e. background circulation, molecular and eddy diffusion - as they drive departures from the distributions that would be expected under the assumption of diffusive equilibrium. As this has implications for the formulation of all empirical thermospheric models, we use this understanding to address the following questions: (1) how do errors caused by the assumption of diffusive equilibrium manifest within empirical models of the thermosphere? and (2) where and when does an empirical model's output disagree with its underlying datasets due to the inherent limitations of said model's formulation?
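Departures from diffusive equilibrium are measured against each species' individual scale height. A sketch of the diffusive-equilibrium baseline, with assumed round-number temperature, gravity, and base densities:

```python
import numpy as np

# Diffusive equilibrium: above the homopause each species falls off with
# its own scale height H = k*T/(m*g); dynamics appear as departures from
# this profile. Temperature, gravity, and base densities are assumed.
def diffusive_profile(z_km, n0, mass_amu, T_K=1000.0, g=8.7):
    k, amu = 1.381e-23, 1.661e-27
    H_km = k * T_K / (mass_amu * amu * g) / 1e3   # scale height in km
    return n0 * np.exp(-(z_km - z_km[0]) / H_km)

z = np.linspace(300.0, 600.0, 31)                 # altitude grid, km
n_he = diffusive_profile(z, 1.0e12, 4.0)          # helium: H ~ 240 km
n_ar = diffusive_profile(z, 1.0e11, 40.0)         # argon: H ~ 24 km
```

Because argon's scale height is ten times smaller than helium's, even modest vertical winds or eddy mixing produce relative departures that differ sharply between the two tracers, which is what makes the pair diagnostically useful.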
Soil Moisture Active/Passive (SMAP) Forward Brightness Temperature Simulator
NASA Technical Reports Server (NTRS)
Peng, Jinzheng; Peipmeier, Jeffrey; Kim, Edward
2012-01-01
The SMAP is one of four first-tier missions recommended by the US National Research Council's Committee on Earth Science and Applications from Space (Earth Science and Applications from Space: National Imperatives for the Next Decade and Beyond, Space Studies Board, National Academies Press, 2007) [1]. It is to measure global soil moisture and freeze/thaw from space. One of the spaceborne instruments is an L-band radiometer with a shared single feedhorn and parabolic mesh reflector. While the radiometer measures the emission over a footprint of interest, unwanted emissions are also received by the antenna through the antenna sidelobes from the cosmic background and other error sources such as the Sun, the Moon and the galaxy. Their effects need to be considered accurately, and the analysis of the overall performance of the radiometer requires end-to-end performance simulation from Earth emission to antenna brightness temperature, such as the global simulation of L-band brightness temperature over land and sea [2]. To assist with the SMAP radiometer level 1B algorithm development, the SMAP forward brightness temperature simulator was developed by adapting the Aquarius simulator [2] with necessary modifications. This poster presents the current status of the SMAP forward brightness temperature simulator's development, including incorporation of the land microwave emission model and its input datasets, and a simplified atmospheric radiative transfer model. The latest simulation results are also presented to demonstrate the ability to support the SMAP L1B algorithm development.
Realistic Reflections for Marine Environments in Augmented Reality Training Systems
2009-09-01
[Fragmented excerpt; recovered figure captions: "Static Backgrounds. Top: Agua Background. Bottom: Blue Background" and "Figure 27. Ship Textures Used to Generate Reflections". Surviving body text notes that, like virtual simulations, augmented reality trainers can be configured to meet specific training needs and can be restarted and reused for training, and that many of the methods from the Full Reflection shader (wave distortion, blurring, and shadow) were reused for the Physics shader.]
Fuzzy logic and neural network technologies
NASA Technical Reports Server (NTRS)
Villarreal, James A.; Lea, Robert N.; Savely, Robert T.
1992-01-01
Applications of fuzzy logic technologies in NASA projects are reviewed to examine their advantages in the development of neural networks for aerospace and commercial expert systems and control. Examples of fuzzy-logic applications include a 6-DOF spacecraft controller, collision-avoidance systems, and reinforcement-learning techniques. The commercial applications examined include a fuzzy autofocusing system, an air conditioning system, and an automobile transmission application. The practical use of fuzzy logic is set in the theoretical context of artificial neural systems (ANSs) to give the background for an overview of ANS research programs at NASA. The research and application programs include the Network Execution and Training Simulator and faster training algorithms such as the Difference Optimized Training Scheme. The networks are well suited for pattern-recognition applications such as predicting sunspots, controlling posture maintenance, and conducting adaptive diagnoses.
Zhang, J; Feng, J-Y; Ni, Y-L; Wen, Y-J; Niu, Y; Tamba, C L; Yue, C; Song, Q; Zhang, Y-M
2017-06-01
Multilocus genome-wide association studies (GWAS) have become the state-of-the-art procedure to identify quantitative trait nucleotides (QTNs) associated with complex traits. However, implementation of multilocus models in GWAS is still difficult. In this study, we integrated least angle regression with empirical Bayes to perform multilocus GWAS under polygenic background control. We used an algorithm of model transformation that whitened the covariance matrix of the polygenic matrix K and environmental noise. Markers on one chromosome were included simultaneously in a multilocus model and least angle regression was used to select the most potentially associated single-nucleotide polymorphisms (SNPs), whereas the markers on the other chromosomes were used to calculate the kinship matrix as polygenic background control. The selected SNPs in the multilocus model were further tested for their association with the trait by empirical Bayes and a likelihood ratio test. We refer to this method as pLARmEB (polygenic-background-control-based least angle regression plus empirical Bayes). Results from simulation studies showed that pLARmEB was more powerful in QTN detection and more accurate in QTN effect estimation, had a lower false positive rate and required less computing time than the Bayesian hierarchical generalized linear model, efficient mixed model association (EMMA) and least angle regression plus empirical Bayes. pLARmEB, multilocus random-SNP-effect mixed linear model and fast multilocus random-SNP-effect EMMA methods had almost equal power of QTN detection in simulation experiments. However, only pLARmEB identified 48 previously reported genes for 7 flowering-time-related traits in Arabidopsis thaliana.
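The chromosome-wise screening stage of pLARmEB can be sketched with scikit-learn's LARS implementation. The data below are simulated, and the kinship-based polygenic control and the empirical-Bayes / likelihood-ratio re-test of the actual method are omitted:

```python
import numpy as np
from sklearn.linear_model import Lars

rng = np.random.default_rng(3)

# Toy chromosome: 200 individuals x 50 SNPs with two causal loci. Sample
# sizes, allele frequency, and effect sizes are assumed for illustration.
n, p = 200, 50
geno = rng.binomial(2, 0.3, (n, p)).astype(float)
y = 0.8 * geno[:, 10] - 0.6 * geno[:, 30] + rng.normal(0.0, 1.0, n)

# Screening in the spirit of pLARmEB: least angle regression keeps a
# handful of candidate SNPs per chromosome for downstream re-testing
lars = Lars(n_nonzero_coefs=5).fit(geno, y)
candidates = np.flatnonzero(lars.coef_)
```

In the full method the surviving candidates would then be re-examined by empirical Bayes under the whitened polygenic model, so the screening only needs high sensitivity, not a controlled false positive rate.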
Measuring the Largest Angular Scale CMB B-mode Polarization with Galactic Foregrounds on a Cut Sky
NASA Astrophysics Data System (ADS)
Watts, Duncan J.; Larson, David; Marriage, Tobias A.; Abitbol, Maximilian H.; Appel, John W.; Bennett, Charles L.; Chuss, David T.; Eimer, Joseph R.; Essinger-Hileman, Thomas; Miller, Nathan J.; Rostem, Karwan; Wollack, Edward J.
2015-12-01
We consider the effectiveness of foreground cleaning in the recovery of Cosmic Microwave Background (CMB) polarization sourced by gravitational waves for tensor-to-scalar ratios in the range 0 < r < 0.1. Using the planned survey area, frequency bands, and sensitivity of the Cosmology Large Angular Scale Surveyor (CLASS), we simulate maps of Stokes Q and U parameters at 40, 90, 150, and 220 GHz, including realistic models of the CMB, diffuse Galactic thermal dust and synchrotron foregrounds, and Gaussian white noise. We use linear combinations (LCs) of the simulated multifrequency data to obtain maximum likelihood estimates of r, the relative scalar amplitude s, and LC coefficients. We find that for 10,000 simulations of a CLASS-like experiment using only measurements of the reionization peak (ℓ ≤ 23), there is a 95% C.L. upper limit of r < 0.017 in the case of no primordial gravitational waves. For simulations with r = 0.01, we recover at 68% C.L. r = 0.012 (+0.011, −0.006). The reionization peak corresponds to a fraction of the multipole moments probed by CLASS, and simulations including 30 ≤ ℓ ≤ 100 further improve our upper limits to r < 0.008 at 95% C.L. (r = 0.010 ± 0.004 for primordial gravitational waves with r = 0.01). In addition to decreasing the current upper bound on r by an order of magnitude, these foreground-cleaned low-multipole data will achieve a cosmic variance limited measurement of the E-mode polarization's reionization peak.
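The multifrequency linear-combination idea can be illustrated with a minimum-variance (ILC-style) weighting that preserves the frequency-independent CMB while suppressing a foreground with a different spectral shape. The band count, power-law synchrotron SED, and noise level below are toy assumptions, not the CLASS likelihood analysis:

```python
import numpy as np

rng = np.random.default_rng(1)
npix = 10_000
freqs = np.array([40.0, 90.0, 150.0, 220.0])   # GHz, CLASS-like bands
cmb = rng.normal(0.0, 1.0, npix)               # identical in every band (thermodynamic units)
sync = rng.normal(0.0, 1.0, npix)              # synchrotron spatial template
sync_sed = (freqs / 40.0) ** -3.0              # assumed power-law synchrotron SED
maps = cmb[None, :] + sync_sed[:, None] * sync[None, :] \
       + rng.normal(0.0, 0.1, (4, npix))       # multifrequency maps with white noise

a = np.ones(4)                                  # unit CMB response in every band
C = np.cov(maps)                                # 4x4 empirical band-band covariance
w = np.linalg.solve(C, a) / (a @ np.linalg.solve(C, a))  # min-variance weights, w.a = 1
cleaned = w @ maps                              # foreground-suppressed CMB estimate
```

Because the constraint forces w·a = 1, the CMB passes through unchanged while the weights de-emphasize the synchrotron-dominated low-frequency band.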
Minică, Camelia C; Dolan, Conor V; Hottenga, Jouke-Jan; Willemsen, Gonneke; Vink, Jacqueline M; Boomsma, Dorret I
2013-05-01
When phenotypic, but no genotypic, data are available for relatives of participants in genetic association studies, previous research has shown that family-based imputed genotypes can boost the statistical power when included in such studies. Here, using simulations, we compared the performance of two statistical approaches suitable for modeling imputed genotype data: the mixture approach, which involves the full distribution of the imputed genotypes, and the dosage approach, in which the mean of the conditional distribution features as the imputed genotype. Simulations were run by varying sibship size, the size of the phenotypic correlations among siblings, imputation accuracy, and the minor allele frequency of the causal SNP. Furthermore, as imputing sibling data and extending the model to sibships of size two or greater requires modeling the familial covariance matrix, we asked whether model misspecification affects power. Finally, the results obtained via simulations were empirically verified in two datasets, one with a continuous phenotype (height) and one with a dichotomous phenotype (smoking initiation). Across the settings considered, the mixture and the dosage approaches are equally powerful and both produce unbiased parameter estimates. In addition, the likelihood-ratio test in the linear mixed model appears to be robust to the considered misspecification in the background covariance structure, given low to moderate phenotypic correlations among siblings. Empirical results show that including imputed sibling genotypes in association analysis does not always result in a larger test statistic. The actual test statistic may drop in value due to small effect sizes. That is, if the power benefit is small, i.e., the change in the distribution of the test statistic under the alternative is relatively small, the probability of obtaining a smaller test statistic is greater.
As the genetic effects are typically hypothesized to be small, in practice, the decision on whether family-based imputation could be used as a means to increase power should be informed by prior power calculations and by the consideration of the background correlation.
NASA Technical Reports Server (NTRS)
Lauer, M.; Poirier, D. R.; Ghods, M.; Tewari, S. N.; Grugel, R. N.
2017-01-01
Simulations of the directional solidification of two hypoeutectic alloys (Al-7Si and Al-19Cu) and the resulting macrosegregation patterns are presented. The casting geometries include abrupt changes in cross-section from a larger width of 9.5 mm to a narrower 3.2 mm width and then through an expansion back to a width of 9.5 mm. The alloys were chosen as model alloys because they have similar solidification shrinkages, but the effect of Cu on changing the density of the liquid alloy is about an order of magnitude greater than that of Si. The simulations compare well with experimental castings that were directionally solidified in a graphite mold in a Bridgman furnace. In addition to the simulations of directional solidification in graphite molds, some simulations were performed for solidification in an alumina mold. This study showed that the mold must be included in numerical simulations of directional solidification because of its effect on the temperature field and solidification. For the model alloys used in the study, the simulations clearly show the interaction of the convection field with the solidifying alloys to produce a macrosegregation pattern known as "steepling" in sections with a uniform width. Details of the complex convection and segregation patterns at both the contraction and the expansion of the cross-sectional area are revealed by the computer simulations. The convection and solidification through the expansions suggest a possible mechanism for the formation of stray grains. The computer simulations and the experimental castings have been part of ongoing ground-based research with the goal of providing the necessary background for eventual experiments aboard the ISS. For casting practitioners, the results demonstrate that computer simulations should be applied to reveal the interactions among alloy solidification properties, solidification conditions, and mold geometries in determining macrosegregation.
The simulations also suggest the possibility of engineering the mold material to avoid, or mitigate, the effects of thermosolutal convection and macrosegregation by selecting a mold material with suitable thermal properties, especially its thermal conductivity.
Uriev, N B; Kuchin, I V
2007-10-31
A review of the basic theories and models of shear flow of suspensions is presented, together with the results of modeling structured suspensions under flow conditions. The physical backgrounds and conditions of macroscopic discontinuity in the behaviour of highly concentrated systems are analyzed. The use of surfactants and imposed vibration to regulate the rheological properties of suspensions is considered. A review of recent approaches and methods of computer simulation of concentrated suspensions is undertaken, and results of computer simulations of suspensions are presented. The formation and destruction of suspension structure under static and dynamic conditions (including imposed combined shear and orthogonal oscillations) are discussed. The influence of particle interactions, as well as of parameters characterizing the type and intensity of external perturbations, on suspension behaviour is demonstrated.
New developments in the McStas neutron instrument simulation package
NASA Astrophysics Data System (ADS)
Willendrup, P. K.; Knudsen, E. B.; Klinkby, E.; Nielsen, T.; Farhi, E.; Filges, U.; Lefmann, K.
2014-07-01
The McStas neutron ray-tracing software package is a versatile tool for building accurate simulators of neutron scattering instruments at reactors and at short- and long-pulsed spallation sources such as the European Spallation Source. McStas is extensively used for design and optimization of instruments, virtual experiments, data analysis, and user training. McStas was founded as a scientific, open-source collaborative code in 1997. This contribution presents the project at its current state and gives an overview of the main new developments in McStas 2.0 (December 2012) and McStas 2.1 (expected fall 2013), including many new components, component parameter uniformisation, a partial loss of backward compatibility, updated source brilliance descriptions, developments toward new tools and user interfaces, web interfaces, and a new method for estimating beam losses and background from neutron optics.
Hellfeld, D.; Bernstein, A.; Dazeley, S.; ...
2017-01-01
The potential of elastic antineutrino-electron scattering (ν̄e + e⁻ → ν̄e + e⁻) in a Gd-doped water Cherenkov detector to determine the direction of a nuclear reactor antineutrino flux was investigated using the recently proposed WATCHMAN antineutrino experiment as a baseline model. The expected scattering rate was determined assuming a 13 km standoff from a 3.758 GWt light water nuclear reactor. Background was estimated via independent simulations and by appropriately scaling published measurements from similar detectors. Many potential backgrounds were considered, including solar neutrinos, misidentified reactor-based inverse beta decay interactions, cosmogenic radionuclide and water-borne radon decays, and gamma rays from the photomultiplier tubes, detector walls, and surrounding rock. The detector response was modeled using a GEANT4-based simulation package. The results indicate that with the use of low-radioactivity PMTs and sufficient fiducialization, water-borne radon and cosmogenic radionuclides pose the largest threats to sensitivity. The directional sensitivity was then analyzed as a function of radon contamination, detector depth, and detector size. Lastly, the results provide a list of theoretical conditions that, if satisfied in practice, would enable nuclear reactor antineutrino directionality in a Gd-doped water Cherenkov detector approximately 10 km from a large power reactor.
NASA Astrophysics Data System (ADS)
Shen, Huizhong; Tao, Shu
2014-05-01
Global atmospheric emissions of 16 polycyclic aromatic hydrocarbons (PAHs) from 69 major sources were estimated for the period from 1960 to 2030. Regression models and a technology-split method were used to estimate country- and time-specific emission factors, resulting in a new estimate of PAH emission factor variation among countries and over time. PAH emissions in 2007 were spatially resolved to 0.1° × 0.1° grids based on a newly developed global high-resolution fuel combustion inventory (PKU-FUEL-2007). MOZART-4 (the Model for Ozone and Related Chemical Tracers, version 4) was applied to simulate the global tropospheric transport of benzo(a)pyrene (BaP), one of the high-molecular-weight carcinogenic PAHs, at a horizontal resolution of 1.875° (longitude) × 1.8947° (latitude). The reaction with the OH radical, gas/particle partitioning, wet deposition, dry deposition, and dynamic soil/ocean-air exchange of PAHs were considered. The simulation was validated against observations at both background and non-background sites, including the Alert site in the Canadian High Arctic, EMEP sites in Europe, and 254 other urban/rural sites reported in the literature. Key factors affecting the long-range transport of BaP were addressed, and transboundary pollution was discussed.
CMB Polarization B-mode Delensing with SPTpol and Herschel
NASA Astrophysics Data System (ADS)
Manzotti, A.; Story, K. T.; Wu, W. L. K.; Austermann, J. E.; Beall, J. A.; Bender, A. N.; Benson, B. A.; Bleem, L. E.; Bock, J. J.; Carlstrom, J. E.; Chang, C. L.; Chiang, H. C.; Cho, H.-M.; Citron, R.; Conley, A.; Crawford, T. M.; Crites, A. T.; de Haan, T.; Dobbs, M. A.; Dodelson, S.; Everett, W.; Gallicchio, J.; George, E. M.; Gilbert, A.; Halverson, N. W.; Harrington, N.; Henning, J. W.; Hilton, G. C.; Holder, G. P.; Holzapfel, W. L.; Hoover, S.; Hou, Z.; Hrubes, J. D.; Huang, N.; Hubmayr, J.; Irwin, K. D.; Keisler, R.; Knox, L.; Lee, A. T.; Leitch, E. M.; Li, D.; McMahon, J. J.; Meyer, S. S.; Mocanu, L. M.; Natoli, T.; Nibarger, J. P.; Novosad, V.; Padin, S.; Pryke, C.; Reichardt, C. L.; Ruhl, J. E.; Saliwanchik, B. R.; Sayre, J. T.; Schaffer, K. K.; Smecher, G.; Stark, A. A.; Vanderlinde, K.; Vieira, J. D.; Viero, M. P.; Wang, G.; Whitehorn, N.; Yefremenko, V.; Zemcov, M.
2017-09-01
We present a demonstration of delensing the observed cosmic microwave background (CMB) B-mode polarization anisotropy. This process of reducing the gravitational-lensing-generated B-mode component will become increasingly important for improving searches for the B modes produced by primordial gravitational waves. In this work, we delens B-mode maps constructed from multi-frequency SPTpol observations of a 90 deg² patch of sky by subtracting a B-mode template constructed from two inputs: SPTpol E-mode maps and a lensing potential map estimated from the Herschel 500 μm map of the cosmic infrared background. We find that our delensing procedure reduces the measured B-mode power spectrum by 28% in the multipole range 300 < ℓ < 2300; this is shown to be consistent with expectations from simulations and to be robust against systematics. The null hypothesis of no delensing is rejected at 6.9σ. Furthermore, we build and use a suite of realistic simulations to study the general properties of the delensing process and find that the delensing efficiency achieved in this work is limited primarily by the noise in the lensing potential map. We demonstrate the importance of including realistic experimental non-idealities in the delensing forecasts used to inform instrument and survey-strategy planning of upcoming lower-noise experiments, such as CMB-S4.
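The headline numbers have a simple structure: subtracting a template correlated with the true lensing B modes removes roughly a fraction ρ² of the lensing power, which is why template (i.e., potential-map) noise limits the efficiency. The sketch below is a toy one-field illustration, not the SPTpol map-level pipeline; the correlation coefficient ρ = 0.6 is an assumed stand-in for the fidelity of the Herschel-derived potential map:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50_000
b_lens = rng.normal(0.0, 1.0, n)                   # lensing-generated B modes
b_obs = b_lens + rng.normal(0.0, 0.5, n)           # observed B modes with instrument noise

rho = 0.6                                          # assumed template-lensing correlation
template = rho * b_lens + np.sqrt(1 - rho**2) * rng.normal(0.0, 1.0, n)

b_delensed = b_obs - rho * template                # Wiener-like scaling of the template
reduction = 1.0 - np.var(b_delensed) / np.var(b_obs)
# expected reduction = rho^2 * var(b_lens) / var(b_obs) = 0.36 / 1.25, i.e. ~29%
```

Driving the reduction toward the lensing-power fraction requires ρ → 1, which in practice means a lower-noise lensing potential map.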
Experimental task-based optimization of a four-camera variable-pinhole small-animal SPECT system
NASA Astrophysics Data System (ADS)
Hesterman, Jacob Y.; Kupinski, Matthew A.; Furenlid, Lars R.; Wilson, Donald W.
2005-04-01
We have previously utilized lumpy object models and simulated imaging systems in conjunction with the ideal observer to compute figures of merit for hardware optimization. In this paper, we describe the development of the methods and phantoms necessary to validate or experimentally carry out these optimizations. Our study was conducted on a four-camera small-animal SPECT system that employs interchangeable pinhole plates to operate under a variety of pinhole configurations and magnifications (representing optimizable system parameters). We developed a small-animal phantom capable of producing random backgrounds for each image sequence. The task chosen for the study was the detection of a 2-mm-diameter sphere within the phantom-generated random background. A total of 138 projection images were used, half of which included the signal. As our observer, we employed the channelized Hotelling observer (CHO) with Laguerre-Gauss channels. The signal-to-noise ratio (SNR) of this observer was used to compare different system configurations. Results indicate agreement between experimental and simulated data, with higher detectability found for multiple-camera, multiple-pinhole, and high-magnification systems, although mixtures of magnifications often outperformed systems employing a single magnification. This work will serve as a basis for future studies pertaining to system hardware optimization.
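Given channel outputs for signal-present and signal-absent images, the Hotelling SNR used to rank system configurations follows from the mean-difference vector and the average channel covariance. The channel count and per-channel signal amplitudes below are illustrative assumptions, not the Laguerre-Gauss channel responses of the study:

```python
import numpy as np

rng = np.random.default_rng(3)
n_img, n_chan = 2000, 5
delta = np.array([0.8, 0.4, 0.2, 0.1, 0.05])          # assumed mean signal per channel

v_absent = rng.normal(0.0, 1.0, (n_img, n_chan))      # channel outputs, background only
v_present = rng.normal(0.0, 1.0, (n_img, n_chan)) + delta

dbar = v_present.mean(axis=0) - v_absent.mean(axis=0)
S = 0.5 * (np.cov(v_present, rowvar=False)
           + np.cov(v_absent, rowvar=False))          # average intra-class covariance
snr = float(np.sqrt(dbar @ np.linalg.solve(S, dbar))) # channelized Hotelling SNR
```

With the identity covariance assumed here the true SNR is just the Euclidean norm of delta (about 0.92), so the estimate from a finite image sample should land near that value.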
Challenges & Roadmap for Beyond CMOS Computing Simulation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rodrigues, Arun F.; Frank, Michael P.
Simulating HPC systems is a difficult task and the emergence of “Beyond CMOS” architectures and execution models will increase that difficulty. This document presents a “tutorial” on some of the simulation challenges faced by conventional and non-conventional architectures (Section 1) and goals and requirements for simulating Beyond CMOS systems (Section 2). These provide background for proposed short- and long-term roadmaps for simulation efforts at Sandia (Sections 3 and 4). Additionally, a brief explanation of a proof-of-concept integration of a Beyond CMOS architectural simulator is presented (Section 2.3).
Case study on risk evaluation of printed electronics using nanosilver ink.
Kim, Ellen; Lee, Ji Hyun; Kim, Jin Kwon; Lee, Gun Ho; Ahn, Kangho; Park, Jung Duck; Yu, Il Je
2016-01-01
With the ever-increasing development of nanotechnology, our society is surrounded by possible risks related to exposure to manufactured nanomaterials. The consumer market already includes many products that contain silver nanoparticles (AgNPs), including various household products such as yoga mats, cutting boards, running shirts, and socks. There is growing concern over the release of AgNPs in workplaces related to the manufacture and application of nanomaterials. This study investigated the release of AgNPs during the operation of a printed electronics printer. An exposure simulation chamber, a nanoparticle collector, a scanning mobility particle sizer (SMPS), a condensation particle counter (CPC), a dust monitor, and mixed cellulose ester (MCE) filters were connected to measure AgNP exposure levels while operating the printer. A very small amount of AgNPs was released during operation, and the number of AgNPs inside the exposure simulation chamber was lower than the background outside it. In addition, when evaluating the potential risks for consumers and workers using a margin of exposure (MOE) approach with a target MOE of 1000, the operational results far exceeded the target MOE both in this simulation study and in a previous workplace exposure study. The overall results indicate a no-risk concern level for printed electronics using nanosilver ink.
Rapp, Jennifer L.; Reilly, Pamela A.
2017-11-14
Background: The U.S. Geological Survey (USGS), in cooperation with the Virginia Department of Environmental Quality (DEQ), reviewed a previously compiled set of linear regression models to assess their utility in defining the response of the aquatic biological community to streamflow depletion. As part of the 2012 Virginia Healthy Watersheds Initiative (HWI) study conducted by Tetra Tech, Inc., for the U.S. Environmental Protection Agency (EPA) and Virginia DEQ, a database with computed values of 72 hydrologic metrics, or indicators of hydrologic alteration (IHA), 37 fish metrics, and 64 benthic invertebrate metrics was compiled and quality assured. Hydrologic alteration was represented by simulating the streamflow record for a pre-water-withdrawal condition (baseline), without dams or developed land, and comparing it to the simulated recent-flow condition (2008 withdrawal simulation), including dams and an altered landscape, to calculate the percent alteration of flow. Biological samples from existing populations represent a range of alteration in the biological community today. For this study, all 72 IHA metrics, spanning more than 7,272 linear regression models, were considered. This extensive dataset provided the opportunity for hypothesis testing and prioritization of flow-ecology relations that have the potential to explain the effects of hydrologic alteration on biological metrics in Virginia streams.
Students' Development of Representational Competence Through the Sense of Touch
NASA Astrophysics Data System (ADS)
Magana, Alejandra J.; Balachandran, Sadhana
2017-06-01
Electromagnetism is an umbrella encapsulating several different concepts, including electric current, electric fields and forces, and magnetic fields and forces, among other topics. However, a number of past studies have highlighted students' poor conceptual understanding of electromagnetism concepts even after instruction. This study aims to identify novel forms of "hands-on" instruction that can result in representational competence and conceptual gain. Specifically, it aimed to determine whether the use of visuohaptic simulations can affect student representations of electromagnetism-related concepts. The guiding question is: How do visuohaptic simulations influence undergraduate students' representations of electric forces? Participants included nine undergraduate students from science, technology, or engineering backgrounds who participated in a think-aloud procedure while interacting with a visuohaptic simulation. The think-aloud procedure was divided into three stages: a prediction stage, a minimally visual haptic stage, and a visually enhanced haptic stage. The results suggest that students accurately characterized and represented the forces felt around point, line, and ring charges in either the prediction stage, the minimally visual haptic stage, or the visually enhanced haptic stage. Also, some students accurately depicted the three-dimensional nature of the field for each configuration in the two stages that included a tactile mode, with the point charge being the most challenging configuration.
Effects of in-plane magnetic field on the transport of 2D electron vortices in non-uniform plasmas
NASA Astrophysics Data System (ADS)
Angus, Justin; Richardson, Andrew; Schumer, Joseph; Pulsed Power Team
2015-11-01
The formation of electron vortices in current-carrying plasmas is observed in 2D particle-in-cell (PIC) simulations of the plasma-opening switch. In the presence of a background density gradient in Cartesian systems, vortices drift in the direction found by crossing the magnetic field with the background density gradient as a result of the Hall effect. However, most of the 2D simulations where electron vortices are seen and studied only allow for in-plane currents and thus only an out-of-plane magnetic field. Here we present results of numerical simulations of 2D, seeded electron vortices in an inhomogeneous background using the generalized 2D electron-magneto-hydrodynamic model that additionally allows for in-plane components of the magnetic field. By seeding vortices with a varying axial component of the velocity field, so that the vortex becomes a corkscrew, it is found that a pitch angle of around 20 degrees is sufficient to completely prevent the vortex from propagating due to the Hall effect for typical plasma parameters. This work is supported by the NRL Base Program.
Astrophysically Relevant Dipole Studies at WiPAL
NASA Astrophysics Data System (ADS)
Endrizzi, Douglass; Forest, Cary; Wallace, John; WiPAL Team
2015-11-01
A novel terrella experiment is being developed to immerse a dipole magnetic field in the large, unmagnetized, and fully ionized background plasma of WiPAL (Wisconsin Plasma Astrophysics Lab). This allows for a series of related experiments motivated by astrophysical processes, including (1) inward transport of plasma into a magnetosphere with focus on development of Kelvin-Helmholtz instabilities from boundary shear flow; (2) helicity injection and simulation of solar eruptive events via electrical breakdown along dipole field lines; (3) interaction of Coronal Mass Ejection-like flows with a target magnetosphere and dependence on background plasma pressure; (4) production of a centrifugally driven wind to study how dipolar magnetic topology changes as closed field lines open. A prototype has been developed and preliminary results will be presented. An overview of the final design and construction progress will be given. This material is based upon work supported by the NSF Graduate Research Fellowship Program.
An analog retina model for detecting dim moving objects against a bright moving background
NASA Technical Reports Server (NTRS)
Searfus, R. M.; Colvin, M. E.; Eeckman, F. H.; Teeters, J. L.; Axelrod, T. S.
1991-01-01
We are interested in applications that require the ability to track a dim target against a bright, moving background. Since the target signal will be less than or comparable to the variations in the background signal intensity, sophisticated techniques must be employed to detect the target. We present an analog retina model that adapts to the motion of the background in order to enhance targets that have a velocity difference with respect to the background. Computer simulation results and our preliminary concept of an analog 'Z' focal plane implementation are also presented.
Study of new anticoincidence systems design
NASA Astrophysics Data System (ADS)
Chabaud, J.; Laurent, P.; Baronick, J.-P.; Oger, R.; Prévôt, G.
2012-12-01
The scientific performance of future hard X-ray missions will require a very low detector background level. This implies thorough background simulations and efficient background rejection systems, as well as very good knowledge of the detectors to be shielded. We gained experience in these activities by designing and optimizing the active and passive background rejection systems of the Simbol-X and IXO/HXI missions. Considering that this work may naturally be extended to other X-ray missions, we initiated with CNES, in 2010, an R&T project on the study of background rejection systems, whose status is presented in this paper.
Using HexSim to simulate complex species, landscape, and stressor interactions
Background / Question / Methods The use of simulation models in conservation biology, landscape ecology, and other disciplines is increasing. Models are essential tools for researchers who, for example, need to forecast future conditions, weigh competing recovery and mitigation...
Advanced radiometric and interferometric millimeter-wave scene simulations
NASA Technical Reports Server (NTRS)
Hauss, B. I.; Moffa, P. J.; Steele, W. G.; Agravante, H.; Davidheiser, R.; Samec, T.; Young, S. K.
1993-01-01
Smart munitions and weapons utilize various imaging sensors (including passive IR, active and passive millimeter-wave, and visible wavebands) to detect/identify targets at short standoff ranges and in varied terrain backgrounds. In order to design and evaluate these sensors under a variety of conditions, a high-fidelity scene simulation capability is necessary. Such a capability for passive millimeter-wave scene simulation exists at TRW. TRW's Advanced Radiometric Millimeter-Wave Scene Simulation (ARMSS) code is a rigorous, benchmarked, end-to-end passive millimeter-wave scene simulation code for interpreting millimeter-wave data, establishing scene signatures and evaluating sensor performance. In passive millimeter-wave imaging, resolution is limited due to wavelength and aperture size. Where high resolution is required, the utility of passive millimeter-wave imaging is confined to short ranges. Recent developments in interferometry have made possible high resolution applications on military platforms. Interferometry or synthetic aperture radiometry allows the creation of a high resolution image with a sparsely filled aperture. Borrowing from research work in radio astronomy, we have developed and tested at TRW scene reconstruction algorithms that allow the recovery of the scene from a relatively small number of spatial frequency components. In this paper, the TRW modeling capability is described and numerical results are presented.
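The idea of recovering a scene from a small number of spatial-frequency components can be sketched with a masked 2-D Fourier transform: keep only the low-frequency "visibilities" a sparse aperture would sample and invert. This toy is not the TRW/ARMSS reconstruction algorithm; the scene, aperture cutoff, and grid size are invented for illustration:

```python
import numpy as np

scene = np.zeros((32, 32))
scene[10:14, 8:20] = 1.0                      # bright target on an empty background

vis = np.fft.fft2(scene)                      # full spatial-frequency (visibility) plane

# Sparse sampling: retain only spatial frequencies inside a small radius,
# as a synthetic-aperture radiometer with limited baselines might measure.
fy = np.fft.fftfreq(32)[:, None]
fx = np.fft.fftfreq(32)[None, :]
mask = np.hypot(fx, fy) < 0.2                 # keeps roughly 13% of the components
dirty = np.fft.ifft2(vis * mask).real         # reconstructed ("dirty") image

peak = np.unravel_index(dirty.argmax(), dirty.shape)
```

Even with most frequency components discarded, the reconstruction still peaks on the target, which is the basic reason interferometric imaging with a sparsely filled aperture is viable; practical algorithms then deconvolve the residual blur.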
Data-based Considerations in Portal Radiation Monitoring of Cargo Vehicles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weier, Dennis R.; O'Brien, Robert F.; Ely, James H.
2004-07-01
Radiation portal monitoring of cargo vehicles often includes a configuration of four-panel monitors that record gamma and neutron counts from vehicles transporting cargo. As vehicles pass the portal monitors, they generate a count profile over time that can be compared to the average panel background counts obtained just prior to the time the vehicle entered the area of the monitors. Pacific Northwest National Laboratory has accumulated considerable data regarding such background radiation and vehicle profiles from portal installations, as well as in experimental settings using known sources and cargos. Several considerations have a bearing on how alarm thresholds are set in order to maintain sensitivity to radioactive sources while also controlling the rate of false or nuisance alarms to a manageable level. False alarms are statistical anomalies, while nuisance alarms occur due to the presence of naturally occurring radioactive material (NORM) in cargo, for example, kitty litter. Considerations to be discussed include: • Background radiation suppression due to shadow shielding from the vehicle. • The impact of the relative placement of the four panels on alarm decision criteria. • Use of plastic scintillators to separate gamma counts into energy windows. • The utility of using ratio criteria for the energy window counts rather than simply using total window counts. • Detection likelihood for these various decision criteria based on computer-simulated injections of sources into vehicle profiles.
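The gross-count alarm logic, and the shadow-shielding effect that complicates it, can be sketched with Poisson counts. The rates, interval counts, and 4-sigma threshold below are illustrative assumptions, not PNNL's deployed decision criteria:

```python
import numpy as np

rng = np.random.default_rng(6)
bg_rate = 400.0                                 # mean background counts per interval
bg = rng.poisson(bg_rate, 1000)                 # panel background just before the vehicle
threshold = bg.mean() + 4.0 * bg.std()          # ~4-sigma gross-count alarm threshold

# Shadow shielding: an empty trailer suppresses the background it blocks,
# so a clean vehicle's profile actually dips below the ambient level.
clean_profile = rng.poisson(0.9 * bg_rate, 20)

# NORM cargo (e.g., kitty litter) elevates the gross counts well above threshold,
# producing a nuisance alarm rather than a statistical false alarm.
norm_profile = rng.poisson(1.5 * bg_rate, 20)

clean_alarms = int((clean_profile > threshold).sum())
norm_alarms = int((norm_profile > threshold).sum())
```

Energy-window ratio criteria address exactly this weakness of gross counting: NORM and special nuclear material elevate different windows, so ratios can separate them where a total-count threshold cannot.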
Development of a new type of germanium detector for dark matter searches
NASA Astrophysics Data System (ADS)
Wei, Wenzhao
Monte Carlo simulation is an important tool for developing a better understanding of important physical processes. This thesis describes three Monte Carlo simulations used to understand germanium detector response to low-energy nuclear recoils and radiogenic backgrounds for direct dark matter searches. The first simulation is the verification of the Barker-Mei model, a theoretical model for calculating the ionization efficiency of germanium detectors in the energy range of 1-100 keV. Utilizing shape analysis, a bin-to-bin comparison between simulation and experimental data was performed to verify the accuracy of the Barker-Mei model. A percentage difference within 4% was achieved between data and simulation, demonstrating the validity of the Barker-Mei model. The second simulation is a study of a new type of germanium detector for n/gamma discrimination at 77 K using the plasma time difference in pulse shape. Due to their poor time resolution, conventional P-type Point Contact (PPC) and coaxial germanium detectors are not capable of discriminating nuclear recoils from electron recoils. In this thesis, a new idea of using greater detector granularity and the plasma time difference in pulse shape to discriminate nuclear recoils from electron recoils with planar germanium detectors in strings is discussed. The anticipated dark matter sensitivity of this new detector array is shown. The last simulation is a study of a new type of germanium-detector array serving as a PMT screening facility for ultra-low-background dark matter experiments that use liquid xenon as the detector material, such as LUX/LZ and XENON100/XENON1T. A well-shaped germanium detector array and a PMT were simulated to study the detector response to signal and background for a better understanding of the radiogenic gamma rays from PMTs. The detector efficiency and other performance characteristics are presented in this work.
A fast scintillator Compton telescope for medium-energy gamma-ray astronomy
NASA Astrophysics Data System (ADS)
Bloser, Peter F.; Ryan, James M.; Legere, Jason S.; Julien, Manuel; Bancroft, Christopher M.; McConnell, Mark L.; Wallace, Mark; Kippen, R. Marc; Tornga, Shawn
2010-07-01
The field of medium-energy gamma-ray astronomy urgently needs a new mission to build on the success of the COMPTEL instrument on the Compton Gamma Ray Observatory. This mission must achieve sensitivity significantly greater than that of COMPTEL in order to advance the science of relativistic particle accelerators, nuclear astrophysics, and diffuse backgrounds, and bridge the gap between current and future hard X-ray missions and the high-energy Fermi mission. Such an increase in sensitivity can only come about via a dramatic decrease in the instrumental background. We are currently developing a concept for a low-background Compton telescope that employs modern scintillator technology to achieve this increase in sensitivity. Specifically, by employing LaBr3 scintillators for the calorimeter, one can take advantage of the unique speed and resolving power of this material to improve the instrument sensitivity while simultaneously enhancing its spectroscopic and imaging performance. Also, using deuterated organic scintillator in the scattering detector will reduce internal background from neutron capture. We present calibration results from a laboratory prototype of such an instrument, including time-of-flight, energy, and angular resolution, and compare them to simulation results using a detailed Monte Carlo model. We also describe the balloon payload we have built for a test flight of the instrument in the fall of 2010.
Multispectral infrared target detection: phenomenology and modeling
NASA Astrophysics Data System (ADS)
Cederquist, Jack N.; Rogne, Timothy J.; Schwartz, Craig R.
1993-10-01
Many targets of interest provide only very small signature differences from the clutter background. The ability to detect these small-difference targets should be improved by using data that are diverse in space, time, wavelength, or some other observable. Target materials often differ from background materials in the variation of their reflectance or emittance with wavelength. A multispectral sensor is therefore considered as a means to improve detection of small signal targets. If this sensor operates in the thermal infrared, it will not need solar illumination and will be useful at night as well as during the day. An understanding of the phenomenology of the spectral properties of materials and an ability to model and simulate target and clutter signatures is needed to assess potential target detection performance from multispectral infrared sensor data. Spectral variations in material emittance are due to vibrational energy transitions in molecular bonds. The spectral emittances of many materials of interest have been measured. Examples are vegetation, soil, construction and road materials, and paints. A multispectral infrared signature model has been developed which includes target and background temperature and emissivity, sky, sun, cloud and background irradiance, multiple reflection effects, path radiance, and atmospheric attenuation. This model can be used to predict multispectral infrared signatures for small signal targets.
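The signature model's radiometric terms (emitted plus reflected downwelling radiance, attenuated by the path, plus path radiance) can be sketched for a single thermal-IR band; the emissivities, downwelling radiance, transmittance, and path-radiance values below are illustrative assumptions, not values from the paper:

```python
import numpy as np

H = 6.626e-34   # Planck constant, J s
C = 2.998e8     # speed of light, m/s
KB = 1.381e-23  # Boltzmann constant, J/K

def planck(lam, T):
    """Blackbody spectral radiance B(lambda, T) in W / (m^2 sr m)."""
    return 2.0 * H * C**2 / lam**5 / np.expm1(H * C / (lam * KB * T))

def at_sensor_radiance(lam, T, emis, l_down, tau, l_path):
    """Single-band signature: emitted + reflected downwelling radiance,
    attenuated by path transmittance tau, plus path radiance."""
    surface = emis * planck(lam, T) + (1.0 - emis) * l_down
    return tau * surface + l_path

lam = 10e-6  # 10 um (LWIR band)
# Target and background at the same temperature differ only in emissivity,
# so the contrast comes entirely from the spectral emittance difference.
target = at_sensor_radiance(lam, 300.0, 0.95, 1e6, 0.8, 5e5)
paint = at_sensor_radiance(lam, 300.0, 0.85, 1e6, 0.8, 5e5)
print(target > paint)
```

Because the reflected sky term is small at 10 um, the higher-emissivity surface appears brighter even at identical temperature, which is exactly the small-signal spectral contrast a multispectral sensor exploits.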
Background concentrations for high resolution satellite observing systems of methane
NASA Astrophysics Data System (ADS)
Benmergui, J. S.; Propp, A. M.; Turner, A. J.; Wofsy, S. C.
2017-12-01
Emerging satellite technologies promise to measure total column dry-air mole fractions of methane (XCH4) at resolutions on the order of a kilometer. XCH4 is linearly related to regional methane emissions through enhancements in the mixed layer, giving these satellites the ability to constrain emissions at unprecedented resolution. However, XCH4 is also sensitive to variability in transport of upwind concentrations (the "background concentration"). Variations in the background concentration are caused by synoptic scale transport in both the free troposphere and the stratosphere, as well as the rate of methane oxidation. Misspecification of the background concentration is aliased onto retrieved emissions as bias. This work explores several methods of specifying the background concentration for high resolution satellite observations of XCH4. We conduct observing system simulation experiments (OSSEs) that simulate the retrieval of emissions in the Barnett Shale using observations from a 1.33 km resolution XCH4 imaging satellite. We test background concentrations defined (1) from an external continental-scale model, (2) using pixels along the edge of the image as a boundary value, (3) using differences between adjacent pixels, and (4) using differences between the same pixel separated by one hour in time. We measure success using the accuracy of the retrieval, the potential for bias induced by misspecification of the background, and the computational expedience of the method. Pathological scenarios are given to each method.
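Method (2) above, using pixels along the image boundary as the background value, can be sketched on a synthetic XCH4 scene (the scene, plume, and 8x8 grid are invented for illustration):

```python
import numpy as np

def edge_background(xch4):
    """Background estimate: mean of the pixels on the image boundary."""
    top, bottom = xch4[0, :], xch4[-1, :]
    left, right = xch4[1:-1, 0], xch4[1:-1, -1]
    return np.concatenate([top, bottom, left, right]).mean()

def enhancement_map(xch4):
    """Mixed-layer enhancement above the edge-defined background."""
    return xch4 - edge_background(xch4)

rng = np.random.default_rng(0)
scene = 1900.0 + rng.normal(0.0, 1.0, (8, 8))  # ~1900 ppb XCH4 field
scene[3:5, 3:5] += 50.0                        # synthetic emission plume
enh = enhancement_map(scene)
print(enh[3:5, 3:5].mean() > 40)
```

Methods (3) and (4) would instead difference adjacent pixels, or the same pixel one hour apart; each background definition fails under different pathological transport scenarios, which is what the OSSEs probe.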
Impacts of Central American Fires on Ozone Air Quality in Texas
NASA Astrophysics Data System (ADS)
Wang, S. C.; Wang, Y.; Lei, R.; Talbot, R. W.
2016-12-01
Background ozone represents the portion of a day's ozone level that cannot be reduced by local emission controls. One of the important factors causing high background ozone events is wildfires. Satellite observations have documented frequent transport of wildfire smoke from Mexico and Central America to the southern US, particularly Texas, causing haze and exceedances of fine particulate matter standards. However, the impact of those fires on background ozone in Texas is poorly understood. In this study, the effects of Central American fire emissions in spring (Apr-May) from 2000 to 2013 on high background ozone events in Texas are investigated and quantified. We first examine, through back trajectory analysis, whether any high background ozone days in Texas cities such as Houston can be traced back to fire events in Central America. The GEOS-Chem global chemical transport model and its nested-grid version over North America are used to simulate the periods of the selected case studies of Central American fires. Long-range transport of gaseous emissions (NOx, VOCs, and CO) from Central American fires is simulated, and variations in background ozone concentrations in the Texas region due to those fire events are quantified through the difference in model results with and without fire emissions in Central America. Finally, this study connects those fires to high background ozone events and quantifies the contribution of Central American fire emissions to Texas ozone air quality.
Fidelity of Simulation for Pilot Training
1980-12-01
is worthwhile emphasizing at this point that the study is focused on fidelity of simulators for pilot training. It does not consider simulation for...significantly higher cost than low fidelity. Motivation for this study is to obtain background information on the effect of simulator fidelity on ...bottom of the diagram is the recommended approach. In practice, however, it is often the case that emphasis is placed on work in the bottom segment of
Simulated cosmic microwave background maps at 0.5 deg resolution: Basic results
NASA Technical Reports Server (NTRS)
Hinshaw, G.; Bennett, C. L.; Kogut, A.
1995-01-01
We have simulated full-sky maps of the cosmic microwave background (CMB) anisotropy expected from cold dark matter (CDM) models at 0.5 deg and 1.0 deg angular resolution. Statistical properties of the maps are presented as a function of sky coverage, angular resolution, and instrument noise, and the implications of these results for observability of the Doppler peak are discussed. The rms fluctuations in a map are not a particularly robust probe of the existence of a Doppler peak; however, a full correlation analysis can provide reasonable sensitivity. We find that sensitivity to the Doppler peak depends primarily on the fraction of sky covered, and only secondarily on the angular resolution and noise level. Color plates of the simulated maps are presented to illustrate the anisotropies.
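The statistical point above, that a map's rms mixes signal and instrument-noise power while a correlation analysis at nonzero lag rejects white noise, can be illustrated with a toy 1-D sequence (purely a statistics illustration, not a CMB simulation):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
# Correlated "sky" signal: unit variance, correlation 0.5 at lag 1.
white = rng.normal(0.0, 1.0, n + 1)
signal = (white[1:] + white[:-1]) / np.sqrt(2.0)
noise = rng.normal(0.0, 1.0, n)   # white instrument noise
m = signal + noise                # the observed "map"

rms2 = m.var()                    # mixes signal and noise power (~2.0)
c1 = np.mean(m[1:] * m[:-1])      # lag-1 correlation: noise drops out (~0.5)
print(round(rms2, 1), round(c1, 1))
```

The rms alone cannot separate the two variance contributions, whereas the lag-1 statistic recovers the signal covariance regardless of the (white) noise level.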
Comparison of N2O Emissions from Soils at Three Temperate Agricultural Sites
NASA Technical Reports Server (NTRS)
Frolking, S. E.; Moiser, A. R.; Ojima, D. S.; Li, C.; Parton, W. J.; Potter, C. S.; Priesack, E.; Stenger, R.; Haberbosch, C.; Dorsch, P.;
1997-01-01
Nitrous oxide (N2O) flux simulations by four models were compared with year-round field measurements from five temperate agricultural sites in three countries. The field sites included an unfertilized, semi-arid rangeland with low N2O fluxes in eastern Colorado, USA; two fertilizer treatments (urea and nitrate) on a fertilized grass ley cut for silage in Scotland; and two fertilized, cultivated crop fields in Germany where N2O loss during the winter was quite high. The models used were daily trace gas versions of the CENTURY model, DNDC, ExpertN, and the NASA-Ames version of the CASA model. These models included similar components (soil physics, decomposition, plant growth, and nitrogen transformations), but in some cases used very different algorithms for these processes. All models generated similar results for the general cycling of nitrogen through the agro-ecosystems, but simulated nitrogen trace gas fluxes were quite different. In most cases the simulated N2O fluxes were within a factor of about 2 of the observed annual fluxes, but even when models produced similar N2O fluxes they often produced very different estimates of gaseous N loss as nitric oxide (NO), dinitrogen (N2), and ammonia (NH3). Accurate simulation of soil moisture appears to be a key requirement for reliable simulation of N2O emissions. All models simulated the general pattern of low background fluxes with high fluxes following fertilization at the Scottish sites, but they could not (or were not designed to) accurately capture the observed effects of different fertilizer types on N2O flux. None of the models were able to reliably generate large pulses of N2O during brief winter thaws that were observed at the two German sites. All models except DNDC simulated very low N2O fluxes for the dry site in Colorado. The US Trace Gas Network (TRAGNET) has provided a mechanism for this model and site intercomparison.
Additional intercomparisons are needed with these and other models and additional data sets; these should include both tropical agro-ecosystems and new agricultural management techniques designed for sustainability.
Input comparison of radiogenic neutron estimates for ultra-low background experiments
NASA Astrophysics Data System (ADS)
Cooley, J.; Palladino, K. J.; Qiu, H.; Selvi, M.; Scorza, S.; Zhang, C.
2018-04-01
Ultra-low-background experiments address some of the most important open questions in particle physics, cosmology and astrophysics: the nature of dark matter, whether the neutrino is its own antiparticle, and whether the proton decays. These rare event searches require well-understood and minimized backgrounds. Simulations are used to understand backgrounds caused by naturally occurring radioactivity in the rock and in every piece of shielding and detector material used in these experiments. Most important are processes like spontaneous fission and (α,n) reactions in material close to the detectors that can produce neutrons. A comparison study of the (α,n) reactions between two dedicated software packages is detailed. The cross section libraries, neutron yields, and spectra from the Mei-Zhang-Hime and the SOURCES-4A codes are presented. The resultant yields and spectra are used as inputs to direct dark matter detector toy models in GEANT4, to study the impact of their differences on background estimates and fits. Although differences in neutron yield calculations of up to 50% were seen, there was no systematic difference between the Mei-Zhang-Hime and SOURCES-4A results. Neutron propagation simulations smooth differences in spectral shape and yield, and both tools were found to meet the broad requirements of the low-background community.
Systematic simulations of modified gravity: chameleon models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brax, Philippe; Davis, Anne-Christine; Li, Baojiu
2013-04-01
In this work we systematically study the linear and nonlinear structure formation in chameleon theories of modified gravity, using a generic parameterisation which describes a large class of models using only 4 parameters. For this we have modified the N-body simulation code ecosmog to perform a total of 65 simulations for different models and parameter values, including the default ΛCDM. These simulations enable us to explore a significant portion of the parameter space. We have studied the effects of modified gravity on the matter power spectrum and mass function, and found a rich and interesting phenomenology where the difference with the ΛCDM paradigm cannot be reproduced by a linear analysis even on scales as large as k ∼ 0.05 h Mpc⁻¹, since the latter incorrectly assumes that the modification of gravity depends only on the background matter density. Our results show that the chameleon screening mechanism is significantly more efficient than other mechanisms such as the dilaton and symmetron, especially in high-density regions and at early times, and can serve as a guidance to determine the parts of the chameleon parameter space which are cosmologically interesting and thus merit further studies in the future.
Effects of the guard electrode on the photoelectron distribution around an electric field sensor
NASA Astrophysics Data System (ADS)
Miyake, Y.; Usui, H.; Kojima, H.
2011-05-01
We have developed a numerical model of a double-probe electric field sensor equipped with a photoelectron guard electrode for the particle-in-cell simulation. The model includes typical elements of modern double-probe sensors on, e.g., BepiColombo/MMO, Cluster, and THEMIS spacecraft, such as a conducting boom and a preamplifier housing called a puck. The puck is also used for the guard electrode, and its potential is negatively biased by reference to the floating spacecraft potential. We apply the proposed model to an analysis of an equilibrium plasma environment around the sensor by assuming that the sun illuminates the spacecraft from the direction perpendicular to the sensor deployment axis. As a simulation result, it is confirmed that a substantial number of spacecraft-originating photoelectrons are once emitted sunward and then fall onto the puck and sensing element positions. In order to effectively repel such photoelectrons coming from the sun direction, a potential hump for electrons, i.e., a negative potential region, should be created in a plasma region around the sunlit side of the guard electrode surface. The simulation results reveal the significance of the guard electrode potential being not only lower than the spacecraft body but also lower than the background plasma potential of the region surrounding the puck and the sensing element. One solution for realizing such an operational condition is to bias the guard potential negatively by reference to the sensor potential because the sensor is usually operated nearly at the background plasma potential.
Crew Training - Apollo X (Apollo Mission Simulator [AMS])
1969-04-05
S69-32787 (3 April 1969) --- Two members of the Apollo 10 prime crew participate in simulation activity at the Kennedy Space Center during preparations for their scheduled lunar orbit mission. Astronaut Thomas P. Stafford, commander, is in the background; and in the foreground is astronaut Eugene A. Cernan, lunar module pilot. The two crewmen are in the Lunar Module Mission Simulator.
ERIC Educational Resources Information Center
Thies, Anna-Lena; Weissenstein, Anne; Haulsen, Ivo; Marschall, Bernhard; Friederichs, Hendrik
2014-01-01
Simulation as a tool for medical education has gained considerable importance in the past years. Various studies have shown that the mastering of basic skills happens best if taught in a realistic and workplace-based context. It is necessary that simulation itself takes place in the realistic background of a genuine clinical or in an accordingly…
NASA Astrophysics Data System (ADS)
Caldwell, A.; Cossavella, F.; Majorovits, B.; Palioselitis, D.; Volynets, O.
2015-07-01
A pulse-shape discrimination method based on artificial neural networks was applied to pulses simulated for different background, signal and signal-like interactions inside a germanium detector. The simulated pulses were used to investigate variations of efficiencies as a function of the training set used. It is verified that neural networks are well-suited to identify background pulses in true-coaxial high-purity germanium detectors. The systematic uncertainty on the signal recognition efficiency, derived using signal-like evaluation samples from calibration measurements, is estimated to be 5%. This uncertainty is due to differences between signal and calibration samples.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Soffientini, Chiara Dolores, E-mail: chiaradolores.soffientini@polimi.it; Baselli, Giuseppe; De Bernardi, Elisabetta
Purpose: Quantitative ¹⁸F-fluorodeoxyglucose positron emission tomography is limited by the uncertainty in lesion delineation due to poor SNR, low resolution, and partial volume effects, subsequently impacting oncological assessment, treatment planning, and follow-up. The present work develops and validates a segmentation algorithm based on statistical clustering. The introduction of constraints based on background features and contiguity priors is expected to improve robustness vs clinical image characteristics such as lesion dimension, noise, and contrast level. Methods: An eight-class Gaussian mixture model (GMM) clustering algorithm was modified by constraining the mean and variance parameters of four background classes according to the previous analysis of a lesion-free background volume of interest (background modeling). Hence, expectation maximization operated only on the four classes dedicated to lesion detection. To favor the segmentation of connected objects, a further variant was introduced by inserting priors relevant to the classification of neighbors. The algorithm was applied to simulated datasets and acquired phantom data. Feasibility and robustness toward initialization were assessed on a clinical dataset manually contoured by two expert clinicians. Comparisons were performed with respect to a standard eight-class GMM algorithm and to four different state-of-the-art methods in terms of volume error (VE), Dice index, classification error (CE), and Hausdorff distance (HD). Results: The proposed GMM segmentation with background modeling outperformed standard GMM and all the other tested methods. Medians of accuracy indexes were VE <3%, Dice >0.88, CE <0.25, and HD <1.2 in simulations; VE <23%, Dice >0.74, CE <0.43, and HD <1.77 in phantom data. Robustness toward image statistic changes (±15%) was shown by the low index changes: <26% for VE, <17% for Dice, and <15% for CE.
Finally, robustness toward the user-dependent volume initialization was demonstrated. The inclusion of the spatial prior improved segmentation accuracy only for lesions surrounded by heterogeneous background: in the relevant simulation subset, the median VE significantly decreased from 13% to 7%. Results on clinical data were found in accordance with simulations, with absolute VE <7%, Dice >0.85, CE <0.30, and HD <0.81. Conclusions: The sole introduction of constraints based on background modeling outperformed standard GMM and the other tested algorithms. Insertion of a spatial prior improved the accuracy for realistic cases of objects in heterogeneous backgrounds. Moreover, robustness against initialization supports the applicability in a clinical setting. In conclusion, application-driven constraints can generally improve the capabilities of GMM and statistical clustering algorithms.
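A minimal 1-D sketch of the background-modeling idea: the background class's mean and variance are frozen from a lesion-free sample, and EM updates only the lesion class and the mixing weights. The paper's algorithm uses eight classes, 3-D data, and spatial priors; everything below is a simplified stand-in with invented numbers:

```python
import numpy as np

def constrained_gmm_em(x, mu_bg, sd_bg, n_iter=50):
    """Two-class 1-D GMM with a frozen background class.
    Returns a boolean lesion mask."""
    gauss = lambda v, m, s: np.exp(-0.5 * ((v - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))
    mu_l, sd_l = x.max(), x.std()              # crude lesion-class init
    w_bg, w_l = 0.5, 0.5
    for _ in range(n_iter):
        r_bg = w_bg * gauss(x, mu_bg, sd_bg)   # background class stays fixed
        r_l = w_l * gauss(x, mu_l, sd_l)
        g = r_l / (r_bg + r_l)                 # E-step responsibilities
        w_l = g.mean(); w_bg = 1.0 - w_l       # M-step: weights...
        mu_l = (g * x).sum() / g.sum()         # ...and lesion class only
        sd_l = np.sqrt((g * (x - mu_l) ** 2).sum() / g.sum())
    return g > 0.5

rng = np.random.default_rng(2)
bg_roi = rng.normal(1.0, 0.2, 500)  # lesion-free background VOI
x = np.concatenate([rng.normal(1.0, 0.2, 400),   # background voxels
                    rng.normal(5.0, 0.5, 100)])  # lesion voxels
labels = constrained_gmm_em(x, bg_roi.mean(), bg_roi.std())
print(labels[:400].mean(), labels[-100:].mean())
```

Freezing the background parameters keeps EM from re-fitting the (dominant) background mode, which is the mechanism behind the robustness the abstract reports.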
Brühl, C; Lelieveld, J; Tost, H; Höpfner, M; Glatthor, N
2015-01-01
Multiyear simulations with the atmospheric chemistry general circulation model EMAC with a microphysical modal aerosol module at high vertical resolution demonstrate that the sulfur gases COS and SO2, the latter from low-latitude and midlatitude volcanic eruptions, predominantly control the formation of stratospheric aerosol. Marine dimethyl sulfide (DMS) and other SO2 sources, including strong anthropogenic emissions in China, are found to play a minor role except in the lowermost stratosphere. Estimates of volcanic SO2 emissions are based on satellite observations using Total Ozone Mapping Spectrometer and Ozone Monitoring Instrument for total injected mass and Michelson Interferometer for Passive Atmospheric Sounding (MIPAS) on Envisat or Stratospheric Aerosol and Gases Experiment for the spatial distribution. The 10 year SO2 and COS data set of MIPAS is also used for model evaluation. The calculated radiative forcing of stratospheric background aerosol including sulfate from COS and small contributions by DMS oxidation, and organic aerosol from biomass burning, is about 0.07 W/m2. For stratospheric sulfate aerosol from medium and small volcanic eruptions between 2005 and 2011 a global radiative forcing up to 0.2 W/m2 is calculated, moderating climate warming, while for the major Pinatubo eruption the simulated forcing reaches 5 W/m2, leading to temporary climate cooling. The Pinatubo simulation demonstrates the importance of radiative feedback on dynamics, e.g., enhanced tropical upwelling, for large volcanic eruptions. PMID:25932352
Simulation and theory of spontaneous TAE frequency sweeping
NASA Astrophysics Data System (ADS)
Wang, Ge; Berk, H. L.
2012-09-01
A simulation model, based on the linear tip model of Rosenbluth, Berk and Van Dam (RBV), is developed to study frequency sweeping of toroidal Alfvén eigenmodes (TAEs). The time response of the background wave in the RBV model is given by a Volterra integral equation. This model captures the properties of TAE waves both in the gap and in the continuum. The simulation shows that phase space structures form spontaneously at frequencies close to the linearly predicted frequency, due to resonant particle-wave interactions and background dissipation. The frequency sweeping signals are found to chirp towards the upper and lower continua. However, the chirping signals penetrate only the lower continuum, whereupon the frequency chirps and mode amplitude increases in synchronism to produce an explosive solution. An adiabatic theory describing the evolution of a chirping signal is developed which replicates the chirping dynamics of the simulation in the lower continuum. This theory predicts that a decaying chirping signal will terminate at the upper continuum though in the numerical simulation the hole disintegrates before the upper continuum is reached.
Allvin, Renée; Berndtzon, Magnus; Carlzon, Liisa; Edelbring, Samuel; Hult, Håkan; Hultin, Magnus; Karlgren, Klas; Masiello, Italo; Södersved Källestedt, Marie-Louise; Tamás, Éva
2017-01-01
Background Medical simulation enables the design of learning activities for competency areas (eg, communication and leadership) identified as crucial for future health care professionals. Simulation educators and medical teachers follow different career paths, and their education backgrounds and teaching contexts may be very different in a simulation setting. Although they have a key role in facilitating learning, information on the continuing professional development (pedagogical development) of simulation educators is not available in the literature. Objectives To explore changes in experienced simulation educators’ perceptions of their own teaching skills, practices, and understanding of teaching over time. Methods A qualitative exploratory study. Fourteen experienced simulation educators participated in individual open-ended interviews focusing on their development as simulation educators. Data were analyzed using an inductive thematic analysis. Results Marked educator development was discerned over time, expressed mainly in an altered way of thinking and acting. Five themes were identified: shifting focus, from following to utilizing a structure, setting goals, application of technology, and alignment with profession. Being confident in the role as an instructor seemed to constitute a foundation for the instructor’s pedagogical development. Conclusion Experienced simulation educators’ pedagogical development was based on self-confidence in the educator role, and not on a deeper theoretical understanding of teaching and learning. This is the first clue to gain increased understanding regarding educational level and possible education needs among simulation educators, and it might generate several lines of research for further studies. PMID:28176931
Discrete Element Method (DEM) Simulations using PFC3D
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matt Evans
Contains input scripts, background information, reduced data, and results associated with the discrete element method (DEM) simulations of interface shear tests, plate anchor pullout tests, and torpedo anchor installation and pullout tests, using the software PFC3D (v4.0).
Readout circuit with novel background suppression for long wavelength infrared focal plane arrays
NASA Astrophysics Data System (ADS)
Xie, L.; Xia, X. J.; Zhou, Y. F.; Wen, Y.; Sun, W. F.; Shi, L. X.
2011-02-01
In this article, a novel pixel readout circuit using a switched-capacitor integrator mode background suppression technique is presented for long wavelength infrared focal plane arrays. This circuit can improve dynamic range and signal-to-noise ratio by suppressing the large background current during integration. Compared with other background suppression techniques, the new background suppression technique is less sensitive to the process mismatch and has no additional shot noise. The proposed circuit is theoretically analysed and simulated while taking into account the non-ideal characteristics. The result shows that the background suppression non-uniformity is ultra-low even for a large process mismatch. The background suppression non-uniformity of the proposed circuit can also remain very small with technology scaling.
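The benefit of removing the background current before integration can be seen from the ideal integrator relation V = I·t/C; the current, capacitance, and integration-time values below are illustrative, not from the article:

```python
def integrator_output(i_signal, i_bg, t_int, c_int, suppress_bg):
    """Ideal integration-capacitor voltage after t_int seconds.
    With suppression, the background current never reaches the capacitor."""
    i_eff = i_signal + (0.0 if suppress_bg else i_bg)
    return i_eff * t_int / c_int

# 1 nA signal on a 50 nA background, 1 ms integration, 1 pF capacitor.
v_no = integrator_output(1e-9, 50e-9, 1e-3, 1e-12, suppress_bg=False)
v_yes = integrator_output(1e-9, 50e-9, 1e-3, 1e-12, suppress_bg=True)
print(v_no, v_yes)
```

Without suppression the capacitor would need tens of volts, i.e. it saturates long before the small signal is resolved, which is why subtracting the background current directly buys dynamic range and signal-to-noise ratio.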
NASA Astrophysics Data System (ADS)
Tan, Bing; Huang, Min; Zhu, Qibing; Guo, Ya; Qin, Jianwei
2017-12-01
Laser-induced breakdown spectroscopy (LIBS) is an analytical technique that has gained increasing attention because of its many applications. The production of a continuous background in LIBS is inevitable because of factors associated with laser energy, gate width, time delay, and the experimental environment. This continuous background significantly influences analysis of the spectrum. Researchers have proposed several background correction methods, such as polynomial fitting, Lorentz fitting, and model-free methods, but few of these have been applied in LIBS, particularly in qualitative and quantitative analyses. This study proposes a method based on spline interpolation for detecting and estimating the continuous background spectrum, exploiting its characteristic smoothness. In a background correction simulation, the spline interpolation method achieved a larger signal-to-background ratio (SBR) than polynomial fitting, Lorentz fitting, and the model-free method. All of these background correction methods yield larger SBR values than that obtained before background correction (the SBR value before correction is 10.0992, whereas the SBR values after correction by spline interpolation, polynomial fitting, Lorentz fitting, and the model-free method are 26.9576, 24.6828, 18.9770, and 25.6273, respectively). After adding random noise with different signal-to-noise ratios to the spectrum, the spline interpolation method still achieved a large SBR value, whereas polynomial fitting and the model-free method obtained low SBR values. All of the background correction methods improved the quantitative results for Cu relative to those obtained before correction (the linear correlation coefficient before correction is 0.9776,
whereas the linear correlation coefficient values after background correction using spline interpolation, polynomial fitting, Lorentz fitting, and the model-free method are 0.9998, 0.9915, 0.9895, and 0.9940, respectively). The proposed spline interpolation method exhibits better linear correlation and smaller error in the quantitative analysis of Cu than polynomial fitting, Lorentz fitting, and model-free methods. The simulation and quantitative experimental results show that the spline interpolation method can effectively detect and correct the continuous background.
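A sketch of the spline-interpolation idea on a synthetic spectrum: pick baseline anchor points (here, the minimum of each of eight segments, an assumed heuristic), interpolate a cubic spline through them, and subtract. This is an illustration of the approach, not the paper's exact procedure:

```python
import numpy as np
from scipy.interpolate import CubicSpline

def spline_background(wl, spec, n_seg=8):
    """Estimate the smooth continuous background from a cubic spline
    through the minimum point of each of n_seg spectral segments."""
    segments = np.array_split(np.arange(len(spec)), n_seg)
    kx = [wl[i[np.argmin(spec[i])]] for i in segments]
    ky = [spec[i].min() for i in segments]
    return CubicSpline(kx, ky)(wl)

wl = np.linspace(300.0, 600.0, 600)      # wavelength axis, nm
continuum = 50.0 + 0.1 * (wl - 300.0)    # slowly varying background
line = 200.0 * np.exp(-0.5 * ((wl - 450.0) / 1.5) ** 2)  # one emission line
spec = continuum + line

corrected = spec - spline_background(wl, spec)
off_line = ~((wl > 440.0) & (wl < 460.0))
resid = np.abs(corrected[off_line]).max()
print(resid < 5.0)
```

An SBR computed on `corrected` versus `spec` would then quantify the improvement, which is how the paper compares the spline against polynomial fits, Lorentz fits, and the model-free method.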
Esquinas, Pedro L; Uribe, Carlos F; Gonzalez, M; Rodríguez-Rodríguez, Cristina; Häfeli, Urs O; Celler, Anna
2017-07-20
The main applications of ¹⁸⁸Re in radionuclide therapies include trans-arterial liver radioembolization and palliation of painful bone metastases. In order to optimize ¹⁸⁸Re therapies, the accurate determination of radiation dose delivered to tumors and organs at risk is required. Single photon emission computed tomography (SPECT) can be used to perform such dosimetry calculations. However, the accuracy of dosimetry estimates strongly depends on the accuracy of activity quantification in ¹⁸⁸Re images. In this study, we performed a series of phantom experiments aiming to investigate the accuracy of activity quantification for ¹⁸⁸Re SPECT using high-energy and medium-energy collimators. Objects of different shapes and sizes were scanned in Air, non-radioactive water (Cold-water) and water with activity (Hot-water). The ordered subset expectation maximization algorithm with clinically available corrections (CT-based attenuation, triple-energy window (TEW) scatter, and resolution recovery) was used. For high activities, dead-time corrections were applied. The accuracy of activity quantification was evaluated using the ratio of the reconstructed activity in each object to this object's true activity. Each object's activity was determined with three segmentation methods: a 1% fixed threshold (for cold background), a 40% fixed threshold and a CT-based segmentation. Additionally, the activity recovered in the entire phantom, as well as the average activity concentration of the phantom background, were compared to their true values. Finally, Monte-Carlo simulations of a commercial γ-camera were performed to investigate the accuracy of the TEW method. Good quantification accuracy (errors <10%) was achieved for the entire phantom, the hot-background activity concentration and for objects in cold background segmented with a 1% threshold.
However, the accuracy of activity quantification for objects segmented with the 40% threshold or CT-based methods decreased (errors >15%), mostly due to partial-volume effects. The Monte-Carlo simulations confirmed that the TEW scatter correction applied to ¹⁸⁸Re, although practical, yields only approximate estimates of the true scatter.
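For reference, the TEW scatter estimate evaluated above has a simple closed form: scatter counts in the photopeak window are approximated by the trapezoid spanned by the two narrow flanking windows. The window widths and counts below are illustrative numbers, not values from the study:

```python
def tew_scatter_estimate(c_lower, c_upper, w_lower, w_upper, w_peak):
    """Triple-energy-window (TEW) scatter estimate for a photopeak window:
    trapezoidal interpolation between two narrow flanking windows."""
    return (c_lower / w_lower + c_upper / w_upper) * w_peak / 2.0

# Illustrative: 30 keV photopeak window flanked by 6 keV side windows.
scatter = tew_scatter_estimate(c_lower=1200.0, c_upper=300.0,
                               w_lower=6.0, w_upper=6.0, w_peak=30.0)
print(scatter)
```

Primary counts are then the peak-window counts minus this estimate; as the abstract notes, for ¹⁸⁸Re this is practical but only approximate, since the true scatter spectrum under the peak need not be trapezoidal.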
NASA Astrophysics Data System (ADS)
Silva, James
2017-09-01
The Ricochet experiment seeks to measure Coherent (neutral-current) Elastic Neutrino-Nucleus Scattering (CEνNS) using metallic superconducting and germanium semiconducting detectors with sub-keV thresholds placed near a neutrino source such as the Chooz Nuclear Reactor Complex. In this poster, we present an estimate of the flux of cosmic-ray induced neutrons, which represent an important background in any CEνNS search, based on reconstructed cosmic ray data from the Chooz site. We have simulated a possible Ricochet deployment at the Chooz site in GEANT4, focusing on the spallation neutrons generated when cosmic rays interact with the water tank veto that would surround our detector. We further simulate and discuss the effectiveness of various shielding configurations for optimizing the background levels for a future Ricochet deployment.
NASA Astrophysics Data System (ADS)
Subramaniam, Vivek; Underwood, Thomas C.; Raja, Laxminarayan L.; Cappelli, Mark A.
2018-02-01
We present a magnetohydrodynamic (MHD) numerical simulation to study the physical mechanisms underlying plasma acceleration in a coaxial plasma gun. Coaxial plasma accelerators are known to exhibit two distinct modes of operation depending on the delay between gas loading and capacitor discharging. Shorter delays lead to a high velocity plasma deflagration jet and longer delays produce detonation shocks. During a single operational cycle that typically consists of two discharge events, the plasma acceleration exhibits a behavior characterized by a mode transition from deflagration to detonation. The first of the discharge events, a deflagration that occurs when the discharge expands into an initially evacuated domain, requires a modification of the standard MHD algorithm to account for rarefied regions of the simulation domain. The conventional approach of using a low background density gas to mimic the vacuum background results in the formation of an artificial shock, inconsistent with the physics of free expansion. To this end, we present a plasma-vacuum interface tracking framework with the objective of predicting a physically consistent free expansion, devoid of the spurious shock obtained with the low background density approach. The interface tracking formulation is integrated within the MHD framework to simulate the plasma deflagration and the second discharge event, a plasma detonation, formed due to its initiation in a background prefilled with gas remnant from the deflagration. The mode transition behavior obtained in the simulations is qualitatively compared to that observed in the experiments using high framing rate Schlieren videography. The deflagration mode is further investigated to understand the jet formation process and the axial velocities obtained are compared against experimentally obtained deflagration plasma front velocities. 
The simulations are also used to provide insight into the conditions responsible for the generation and sustenance of the magnetic pinch. The pinch width and number density distribution are compared to experimentally obtained data to calibrate the inlet boundary conditions used to set up the plasma acceleration problem.
Rader, T; Fastl, H; Baumann, U
2017-03-01
After implantation of a cochlear implant with hearing preservation for combined electric-acoustic stimulation (EAS), the residual acoustic hearing conveys fundamental speech frequency information in the low-frequency range. Acoustic simulation of EAS hearing perception makes it possible to systematically examine the impact of the frequency and level fine structure of speech signals. The aim of this study was to measure the speech reception threshold (SRT) under various noise conditions with an acoustic EAS simulation, varying the frequency and level information of the fundamental frequency f0 of speech, in order to determine to what extent the SRT is impaired by modification of the f0 fine structure. Using partial tone time pattern analysis, an acoustic EAS simulation of the speech material from the Oldenburg sentence test (OLSA) was generated, and the f0 contour of the speech material was determined. Subsequently, either the frequency or the level of f0 was held fixed in order to remove one of the two fine-structure contours from the speech signal. The processed OLSA sentences were used to determine the SRT in background noise under various test conditions. The conditions "f0 fixed frequency" and "f0 fixed level" were each tested in two situations: amplitude-modulated background noise and continuous background noise. A total of 24 subjects with normal hearing participated in the study. The SRT for the condition "f0 fixed frequency" was more favorable, at 2.7 dB in continuous noise and 0.8 dB in modulated noise, than for the condition "f0 fixed level", at 3.7 dB and 2.9 dB, respectively. In this simulation of speech perception with cochlear implants and an acoustic component, the level information of the fundamental frequency thus had a stronger impact on speech intelligibility than the frequency information. 
This method of simulating cochlear implant transmission allows investigation of how various parameters influence speech intelligibility in subjects with normal hearing.
Study of Background Rejection Systems for the IXO Mission.
NASA Astrophysics Data System (ADS)
Laurent, Philippe; Limousin, O.; Tatischeff, V.
2009-01-01
The scientific performance goals of the IXO mission will necessitate a very low detector background level. This implies thorough background simulations and efficient background rejection systems, and it also requires very good knowledge of the detectors to be shielded. At APC (Paris) and CEA (Saclay), we gained experience in these activities by designing and optimising, in parallel, the high-energy detector and the active and passive background rejection systems of the Simbol-X mission. Since this work extends naturally to other X-ray missions, we have initiated with CNES an R&D project on the study of background rejection systems, mainly in view of the IXO project. We detail this activity in the poster.
Using the NASA GRC Sectored-One-Dimensional Combustor Simulation
NASA Technical Reports Server (NTRS)
Paxson, Daniel E.; Mehta, Vishal R.
2014-01-01
The document is a user manual for the NASA GRC Sectored-One-Dimensional (S-1-D) Combustor Simulation. It consists of three sections. The first is a very brief outline of the mathematical and numerical background of the code along with a description of the non-dimensional variables on which it operates. The second section describes how to run the code and includes an explanation of the input file. The input file contains the parameters necessary to establish an operating point as well as the associated boundary conditions (i.e. how it is fed and terminated) of a geometrically configured combustor. It also describes the code output. The third section describes the configuration process and utilizes a specific example combustor to do so. Configuration consists of geometrically describing the combustor (section lengths, axial locations, and cross sectional areas) and locating the fuel injection point and flame region. Configuration requires modifying the source code and recompiling. As such, an executable utility is included with the code which will guide the requisite modifications and ensure that they are done correctly.
Impact of Sociocultural Background and Assessment Data Upon School Psychologists' Decisions.
ERIC Educational Resources Information Center
Huebner, E. Scott; Cummings, Jack A.
1985-01-01
Psychologists (N=56) participated in an adapted version of Algozzine and Ysseldyke's (1981) diagnostic simulation to investigate the effects of sociocultural background (rural vs. suburban) and assessment data (normal vs. learning disabled) on educational decisions. Findings suggest school psychologists utilize multiple sources of information but…
The Impact of Missing Background Data on Subpopulation Estimation
ERIC Educational Resources Information Center
Rutkowski, Leslie
2011-01-01
Although population modeling methods are well established, a paucity of literature appears to exist regarding the effect of missing background data on subpopulation achievement estimates. Using simulated data that follows typical large-scale assessment designs with known parameters and a number of missing conditions, this paper examines the extent…
Confronting History: Simulations of Historical Conflicts. Grades 5-8.
ERIC Educational Resources Information Center
Collins, Katie; Draze, Dianne, Ed.; Conroy, Sonsie, Ed.
This booklet presents four different scenarios of conflict from United States history. Students take on the roles of some of the characters in the conflicts to learn the differing viewpoints of the situations. Each simulation presents five sections: "background information"; "meet the people"; "investigator…
NASA Astrophysics Data System (ADS)
Hielscher, Andreas H.; Liu, Hanli; Wang, Lihong V.; Tittel, Frank K.; Chance, Britton; Jacques, Steven L.
1994-07-01
Near infrared light has been used for the determination of blood oxygenation in the brain but little attention has been paid to the fact that the states of blood oxygenation in arteries, veins, and capillaries differ substantially. In this study, Monte Carlo simulations for a heterogeneous system were conducted, and near infrared time-resolved reflectance measurements were performed on a heterogeneous tissue phantom model. The model was made of a solid polyester resin, which simulates the tissue background. A network of tubes was distributed uniformly through the resin to simulate the blood vessels. The time-resolved reflectance spectra were taken with different absorbing solutions filled in the network. Based on the simulation and experimental results, we investigated the dependence of the absorption coefficient obtained from the heterogeneous system on the absorption of the actual absorbing solution filled in the tubes. We show that light absorption by the brain should result from the combination of blood and blood-free tissue background.
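The dependence of an apparent absorption coefficient on compartmentalized absorbers can be illustrated with a deliberately minimal Monte Carlo sketch. All coefficients and the vessel volume fraction below are illustrative assumptions, not values from the study: each photon step samples a free path from the scattering coefficient, and Beer-Lambert attenuation is applied with the absorption of whichever compartment (background resin or "vessel") the step lands in.

```python
import math
import random

MU_S = 10.0          # scattering coefficient, 1/mm (illustrative)
MU_A_BG = 0.01       # background-tissue absorption, 1/mm (illustrative)
MU_A_VES = 0.5       # "blood vessel" absorption, 1/mm (illustrative)
VESSEL_FRAC = 0.05   # volume fraction occupied by vessels (illustrative)

def photon_weight(rng, n_steps=200):
    """Survival weight of one photon after n_steps scattering events in a
    medium where each step lands in a vessel with probability VESSEL_FRAC."""
    w = 1.0
    for _ in range(n_steps):
        step = -math.log(rng.random()) / MU_S              # sampled free path
        mu_a = MU_A_VES if rng.random() < VESSEL_FRAC else MU_A_BG
        w *= math.exp(-mu_a * step)                        # Beer-Lambert loss
    return w

rng = random.Random(42)
mean_w = sum(photon_weight(rng) for _ in range(500)) / 500
# An effective absorption coefficient fit to mean_w under a homogeneous
# assumption falls between the two compartment values, illustrating the
# blood/blood-free mixing the study quantifies.
```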
Visuospatial Aptitude Testing Differentially Predicts Simulated Surgical Skill.
Hinchcliff, Emily; Green, Isabel; Destephano, Christopher; Cox, Mary; Smink, Douglas; Kumar, Amanika; Hokenstad, Erik; Bengtson, Joan; Cohen, Sarah
2018-02-05
To determine whether visuospatial perception (VSP) testing correlates with simulated or intraoperative surgical performance as rated by the Accreditation Council for Graduate Medical Education (ACGME) milestones. Classification: II-2. Setting: two academic training institutions. Participants: 41 residents (19 from Brigham and Women's Hospital [BWH] and 22 from the Mayo Clinic) across three specialties (OBGYN, general surgery, urology). Participants underwent three different tests: VSP testing, the Fundamentals of Laparoscopic Surgery (FLS®) peg transfer, and a DaVinci robotic simulation peg transfer. Surgical grading from the ACGME milestones tool was obtained for each participant. Demographic and background information was also collected, including specialty, year of training, prior experience with simulated skills, and surgical interest. Standard statistical analysis using Student's t tests was performed, and correlations were determined using adjusted linear regression models. In univariate analysis, the BWH and Mayo training programs differed in both times and overall scores for the FLS® and DaVinci robotic simulation peg transfers (p < 0.05 for all). Additionally, type of residency training affected time and overall score on the robotic peg transfer. Familiarity with the tasks correlated with higher scores and faster task completion (p = 0.05 for all except VSP score). There was no difference in VSP scores by program, specialty, or year of training. In adjusted linear regression modeling, VSP testing correlated only with robotic peg transfer skills (average time p = 0.006, overall score p = 0.001). Milestones did not correlate with either VSP or surgical simulation testing. VSP score was correlated with robotic simulation skills but not with FLS skills or ACGME milestones, suggesting that the ability of VSP score to predict competence differs between tasks. 
Therefore, further investigation is required into aptitude testing, especially prior to its integration as an entry examination into a surgical subspecialty. Copyright © 2018. Published by Elsevier Inc.
Simulation-Based Bronchoscopy Training
Kennedy, Cassie C.; Maldonado, Fabien
2013-01-01
Background: Simulation-based bronchoscopy training is increasingly used, but effectiveness remains uncertain. We sought to perform a comprehensive synthesis of published work on simulation-based bronchoscopy training. Methods: We searched MEDLINE, EMBASE, CINAHL, PsycINFO, ERIC, Web of Science, and Scopus for eligible articles through May 11, 2011. We included all original studies involving health professionals that evaluated, in comparison with no intervention or an alternative instructional approach, simulation-based training for flexible or rigid bronchoscopy. Study selection and data abstraction were performed independently and in duplicate. We pooled results using random effects meta-analysis. Results: From an initial pool of 10,903 articles, we identified 17 studies evaluating simulation-based bronchoscopy training. In comparison with no intervention, simulation training was associated with large benefits on skills and behaviors (pooled effect size, 1.21 [95% CI, 0.82-1.60]; n = 8 studies) and moderate benefits on time (0.62 [95% CI, 0.12-1.13]; n = 7). In comparison with clinical instruction, behaviors with real patients showed nonsignificant effects favoring simulation for time (0.61 [95% CI, −1.47 to 2.69]) and process (0.33 [95% CI, −1.46 to 2.11]) outcomes (n = 2 studies each), although variation in training time might account for these differences. Four studies compared alternate simulation-based training approaches. Inductive analysis to inform instructional design suggested that longer or more structured training is more effective, authentic clinical context adds value, and animal models and plastic part-task models may be superior to more costly virtual-reality simulators. Conclusions: Simulation-based bronchoscopy training is effective in comparison with no intervention. Comparative effectiveness studies are few. PMID:23370487
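The random-effects pooling behind estimates like the skills/behaviors effect above can be sketched with the DerSimonian-Laird estimator. The effect sizes and variances below are illustrative placeholders, not the data from the review:

```python
import math

def dl_pool(effects, variances):
    """Random-effects meta-analysis via the DerSimonian-Laird estimator.

    effects   -- per-study standardized effect sizes
    variances -- per-study sampling variances
    Returns (pooled effect, 95% CI lower, 95% CI upper, tau^2).
    """
    k = len(effects)
    w = [1.0 / v for v in variances]                    # fixed-effect weights
    fixed = sum(wi * y for wi, y in zip(w, effects)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, effects))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                  # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]      # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se, tau2

# Illustrative (made-up) per-study effect sizes and variances
effects = [1.5, 0.9, 1.3, 0.8, 1.6, 1.1, 0.7, 1.4]
variances = [0.10, 0.08, 0.12, 0.09, 0.15, 0.07, 0.11, 0.10]
est, lo, hi, tau2 = dl_pool(effects, variances)
```

When tau^2 is zero the weights reduce to the fixed-effect case; larger between-study heterogeneity widens the confidence interval.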
Template-Based Geometric Simulation of Flexible Frameworks
Wells, Stephen A.; Sartbaeva, Asel
2012-01-01
Specialised modelling and simulation methods implementing simplified physical models are valuable generators of insight. Template-based geometric simulation is a specialised method for modelling flexible framework structures made up of rigid units. We review the background, development and implementation of the method, and its applications to the study of framework materials such as zeolites and perovskites. The “flexibility window” property of zeolite frameworks is a particularly significant discovery made using geometric simulation. Software implementing geometric simulation of framework materials, “GASP”, is freely available to researchers. PMID:28817055
Dual-tracer background subtraction approach for fluorescent molecular tomography
Holt, Robert W.; El-Ghussein, Fadi; Davis, Scott C.; Samkoe, Kimberley S.; Gunn, Jason R.; Leblond, Frederic
2013-01-01
Diffuse fluorescence tomography requires high contrast-to-background ratios to accurately reconstruct inclusions of interest. This is a problem when imaging the uptake of fluorescently labeled, molecularly targeted tracers in tissue, which can result in high levels of heterogeneously distributed background uptake. We present a dual-tracer background subtraction approach, wherein signal from the uptake of an untargeted tracer is subtracted from the targeted tracer signal prior to image reconstruction, resulting in maps of targeted tracer binding. The approach is demonstrated in simulations, a phantom study, and a mouse glioma imaging study, showing substantial improvement over conventional and homogeneous background subtraction image reconstruction approaches. PMID:23292612
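The core subtraction step can be sketched in one dimension. The signals, scale-factor fit, and background mask below are illustrative assumptions (the actual method operates on tomographic measurement data): the untargeted-tracer signal is scaled by a least-squares fit over a presumed non-binding region, then subtracted from the targeted-tracer signal to leave an estimate of specific binding.

```python
import numpy as np

def dual_tracer_subtract(targeted, untargeted, bg_mask):
    """Subtract scaled untargeted-tracer signal from targeted-tracer signal,
    leaving an estimate of specific binding.

    The scale factor k compensates for dose/kinetic differences and is fit
    by least squares over a presumed non-binding region (bg_mask)."""
    k = np.dot(untargeted[bg_mask], targeted[bg_mask]) / \
        np.dot(untargeted[bg_mask], untargeted[bg_mask])
    return targeted - k * untargeted

rng = np.random.default_rng(0)
untargeted = rng.uniform(1.0, 2.0, 100)        # heterogeneous background uptake
binding = np.zeros(100)
binding[40:50] = 0.8                           # small specific-binding region
targeted = 1.1 * untargeted + binding          # targeted = scaled bg + binding
bg_mask = np.ones(100, bool)
bg_mask[40:50] = False                         # exclude binding region from fit
est = dual_tracer_subtract(targeted, untargeted, bg_mask)
```

In this idealized case the fitted scale recovers the true factor, so the estimate is near zero outside the binding region and near 0.8 inside it.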
Establish an Agent-Simulant Technology Relationship (ASTR)
2017-04-14
for quantitative measures that characterize simulant performance in testing, such as the ability to be removed from surfaces. Component-level ASTRs...Overall Test and Agent-Simulant Technology Relationship (ASTR) process. 1.2 Background. a. Historically, many tests did not develop quantitative...methodology report14. Report provides a VX-TPP ASTR for post-decon contact hazard and off-gassing. In the Stryker production verification test (PVT
Automated Design Tools for Integrated Mixed-Signal Microsystems (NeoCAD)
2005-02-01
method, Model Order Reduction (MOR) tools, system-level, mixed-signal circuit synthesis and optimization tools, and parasitic extraction tools. A unique...Mission Area: Command and Control mixed signal circuit simulation parasitic extraction time-domain simulation IC design flow model order reduction... Extraction 1.2 Overall Program Milestones CHAPTER 2 FAST TIME DOMAIN MIXED-SIGNAL CIRCUIT SIMULATION 2.1 HAARSPICE Algorithms 2.1.1 Mathematical Background
Second-order Cosmological Perturbations Engendered by Point-like Masses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brilenkov, Ruslan; Eingorn, Maxim, E-mail: ruslan.brilenkov@gmail.com, E-mail: maxim.eingorn@gmail.com
2017-08-20
In the ΛCDM framework, representing nonrelativistic matter inhomogeneities as discrete massive particles, we develop second-order cosmological perturbation theory. Our approach relies on the weak gravitational field limit. The derived equations for the second-order scalar, vector, and tensor metric corrections are valid at arbitrary distances, including regions with nonlinear contrasts of the matter density. We thoroughly verify fulfillment of all Einstein equations, as well as the self-consistency of the order assignments. In addition, we recover consistent results in the Minkowski background limit. Feasible investigations of cosmological back-reaction manifestations by means of relativistic simulations are also outlined.
Recent progress in tidal modeling
NASA Technical Reports Server (NTRS)
Vial, F.; Forbes, J. M.
1989-01-01
Contributions to tidal theory during the last five years are reviewed. Specific areas of recent progress include the action of mean winds and dissipation on tides, interactions of other waves with tides, and the use of TGCMs in tidal studies. Particular attention is given to the nonlinear interaction between semidiurnal and diurnal tides. Finally, more realistic thermal excitation and background wind and temperature models have been developed in the past few years, leading to new month-to-month numerical simulations of the semidiurnal tide. Some results using these models are presented and compared with ATMAP tidal climatologies.
STS-110 M.S. Ross in M-113 personnel carrier during TCDT
NASA Technical Reports Server (NTRS)
2002-01-01
KENNEDY SPACE CENTER, FLA. -- STS-110 Mission Specialist Jerry Ross waits his turn at driving the M-113 armored personnel carrier, part of Terminal Countdown Demonstration Test activities. In the background, right, is Mission Specialist Lee Morin. TCDT includes emergency egress training and a simulated launch countdown, and is held at KSC prior to each Space Shuttle flight. Scheduled for launch April 4, the 11-day mission will feature Shuttle Atlantis docking with the International Space Station (ISS) and delivering the S0 truss, the centerpiece-segment of the primary truss structure that will eventually extend over 300 feet.
STS-110 M.S. Ochoa in M-113 personnel carrier during TCDT
NASA Technical Reports Server (NTRS)
2002-01-01
KENNEDY SPACE CENTER, FLA. -- STS-110 Mission Specialist Ellen Ochoa waits her turn at driving the M-113 armored personnel carrier, part of Terminal Countdown Demonstration Test activities. In the background, right, is Pilot Stephen Frick. TCDT includes emergency egress training and a simulated launch countdown. The TCDT is held at KSC prior to each Space Shuttle flight. Scheduled for launch April 4, the 11-day mission will feature Shuttle Atlantis docking with the International Space Station (ISS) and delivering the S0 truss, the centerpiece-segment of the primary truss structure that will eventually extend over 300 feet.
Research on optimal investment path of transmission corridor under the global energy Internet
NASA Astrophysics Data System (ADS)
Huang, Yuehui; Li, Pai; Wang, Qi; Liu, Jichun; Gao, Han
2018-02-01
Against the background of the global energy Internet, this article studies the investment planning of a transmission corridor from Xinjiang to Germany, passing through four countries: Kazakhstan, Russia, Belarus, and Poland. Taking into account the specific situation of each country, including the length of the transmission line, unit construction cost, completion time, transmission price, state tariff, inflation rate, and so on, this paper constructs a power transmission investment model. Finally, the dynamic programming method is applied to a worked example, and the optimal strategies under different objective functions are obtained.
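A stage-wise dynamic program of the kind described can be sketched as a shortest path through a layered graph over the corridor segments. The segment costs, the two technology options, and the switching penalty below are illustrative placeholders, not figures from the study:

```python
# Layered-graph DP: each corridor segment offers candidate technologies
# (HVAC / HVDC); changing technology at a border adds a converter cost.
# All costs are illustrative placeholders (arbitrary units).

stages = ["KZ", "RU", "BY", "PL", "DE"]
build_cost = {                      # per-segment cost for each technology
    "KZ": {"HVAC": 5.0, "HVDC": 6.0},
    "RU": {"HVAC": 9.0, "HVDC": 7.5},
    "BY": {"HVAC": 3.0, "HVDC": 3.5},
    "PL": {"HVAC": 2.5, "HVDC": 3.0},
    "DE": {"HVAC": 2.0, "HVDC": 2.2},
}
SWITCH = 1.5                        # converter cost when changing technology

# best[t] = minimum cost of reaching the end of the current stage with tech t
best = {t: build_cost[stages[0]][t] for t in ("HVAC", "HVDC")}
for s in stages[1:]:
    best = {t: build_cost[s][t] + min(best[p] + (SWITCH if p != t else 0.0)
                                      for p in best)
            for t in ("HVAC", "HVDC")}
total = min(best.values())          # optimal corridor cost
```

The recursion is the standard Bellman form: the optimal cost to finish a stage with a given technology depends only on the previous stage's optima, so the path through all five segments is found in one forward pass.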
Advanced educational program in optoelectronics for undergraduates and graduates in electronics
NASA Astrophysics Data System (ADS)
Vladescu, Marian; Schiopu, Paul
2015-02-01
Optoelectronics education within the electronics curricula at the Faculty of Electronics, Telecommunications and Information Technology of the "Politehnica" University of Bucharest began in the early 1990s and has evolved constantly since then, seeking to address the growing demand for engineers with a complex optoelectronics profile and to meet the increased requirements of microelectronics, optoelectronics, and, lately, nanotechnologies. Our goal is to provide a high level of theoretical background combined with advanced experimental tools in laboratories, as well as with simulation platforms. To this end, we propose an advanced educational program in optoelectronics for both grades of our study program, bachelor and master.
Chavez, Margeaux; Nazi, Kim; Antinori, Nicole; Melillo, Christine; Cotner, Bridget A; Hathaway, Wendy; Cook, Ashley; Wilck, Nancy; Noonan, Abigail
2017-01-01
Background The Department of Veterans Affairs (VA) has multiple health information technology (HIT) resources for veterans to support their health care management. These include a patient portal, VetLink Kiosks, mobile apps, and telehealth services. The veteran patient population has a variety of needs and preferences that can inform current VA HIT redesign efforts to meet consumer needs. Objective This study aimed to describe veterans’ experiences using the current VA HIT and identify their vision for the future of an integrated VA HIT system. Methods Two rounds of focus group interviews were conducted with a single cohort of 47 veterans and one female caregiver recruited from Bedford, Massachusetts, and Tampa, Florida. Focus group interviews included simulation modeling activities and a self-administered survey. This study also used an expert panel group to provide data and input throughout the study process. High-fidelity, interactive simulations were created and used to facilitate collection of qualitative data. The simulations were developed based on system requirements, data collected through operational efforts, and participants' reported preferences for using VA HIT. Pairwise comparison activities of HIT resources were conducted with both focus groups and the expert panel. Rapid iterative content analysis was used to analyze qualitative data. Descriptive statistics summarized quantitative data. Results Data themes included (1) current use of VA HIT, (2) non-VA HIT use, and (3) preferences for future use of VA HIT. Data indicated that, although the Secure Messaging feature was often preferred, a full range of HIT options are needed. These data were then used to develop veteran-driven simulations that illustrate user needs and expectations when using a HIT system and services to access VA health care services. Conclusions Patient participant redesign processes present critical opportunities for creating a human-centered design. 
Veterans value virtual health care options and prefer standardized, integrated, and synchronized user-friendly interface designs. PMID:29061553
A Model Independent General Search for new physics in ATLAS
NASA Astrophysics Data System (ADS)
Amoroso, S.; ATLAS Collaboration
2016-04-01
We present results of a model-independent general search for new phenomena in proton-proton collisions at a centre-of-mass energy of 8 TeV with the ATLAS detector at the LHC. The data set corresponds to a total integrated luminosity of 20.3 fb-1. Event topologies involving isolated electrons, photons and muons, as well as jets, including those identified as originating from b-quarks (b-jets) and missing transverse momentum are investigated. The events are subdivided according to their final states into exclusive event classes. For the 697 classes with a Standard Model expectation greater than 0.1 events, a search algorithm tests the compatibility of data against the Monte Carlo simulated background in three kinematic variables sensitive to new physics effects. No significant deviation is found in data. The number and size of the observed deviations follow the Standard Model expectation obtained from simulated pseudo-experiments.
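The class-by-class compatibility test and its pseudo-experiment calibration can be sketched in a toy form. The expectations, observed counts, and the use of a single counting variable per class are illustrative simplifications (the actual search scans kinematic distributions): the smallest local Poisson p-value found in data is compared against the distribution of smallest p-values obtained in background-only pseudo-experiments, which accounts for the look-elsewhere effect across classes.

```python
import numpy as np
from math import exp, factorial

def p_excess(obs, mu):
    """Local p-value: P(N >= obs) for a Poisson mean mu."""
    return 1.0 - sum(exp(-mu) * mu**k / factorial(k) for k in range(obs))

def min_p(observed, expected):
    """Smallest local p-value over all event classes."""
    return min(p_excess(int(o), mu) for o, mu in zip(observed, expected))

expected = np.array([0.5, 2.0, 10.0, 40.0])   # toy SM expectations per class
observed = np.array([1, 3, 14, 38])           # toy "data" counts
p_local = min_p(observed, expected)

# Global significance: how often do background-only pseudo-experiments
# produce an equally small minimum p-value somewhere among the classes?
rng = np.random.default_rng(0)
pseudo = [min_p(rng.poisson(expected), expected) for _ in range(2000)]
p_global = float(np.mean([p <= p_local for p in pseudo]))
```

Because the minimum is taken over several classes, the global p-value is substantially larger than the smallest local one, which is exactly the correction the pseudo-experiments provide.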
Exploring biological interaction networks with tailored weighted quasi-bicliques
2012-01-01
Background Biological networks provide fundamental insights into the functional characterization of genes and their products, the characterization of DNA-protein interactions, the identification of regulatory mechanisms, and other biological tasks. Due to experimental and biological complexity, their computational exploitation faces many algorithmic challenges. Results We introduce novel weighted quasi-biclique problems to identify functional modules in biological networks represented by bipartite graphs. In contrast to previous quasi-biclique problems, we incorporate biological interaction levels by using edge-weighted quasi-bicliques. While we prove that our problems are NP-hard, we also describe IP formulations to compute exact solutions for moderately sized networks. Conclusions We verify the effectiveness of our IP solutions using both simulated and empirical data. The simulation shows high quasi-biclique recall rates, and the empirical data corroborate the ability of our weighted quasi-bicliques to extract features and recover missing interactions from biological networks. PMID:22759421
Simulations of Beam Optics and Bremsstrahlung for High Intensity and Brightness Channeling Radiation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hyun, J.; Piot, P.; Sen, T.
2018-04-12
This paper presents X-ray spectra of channeling radiation expected at the FAST (Fermi Accelerator Science and Technology) facility at Fermilab. Our purpose is to produce high-brightness quasi-monochromatic X-rays in an energy range from 40 keV to 110 keV. We will use a diamond crystal and low-emittance electrons with an energy of around 43 MeV. The quality of the emitted X-rays depends on the parameters of the electron beam at the crystal. We present simulations of the beam optics for high-brightness and high-yield operations for a range of bunch charges. We estimate the X-ray spectra including the bremsstrahlung background. We discuss how the electron beam distributions after the diamond crystal are affected by channeling, and we discuss an X-ray detector system to avoid pile-up effects during high-charge operations.
Inter-Identity Autobiographical Amnesia in Patients with Dissociative Identity Disorder
Huntjens, Rafaële J. C.; Verschuere, Bruno; McNally, Richard J.
2012-01-01
Background A major symptom of Dissociative Identity Disorder (DID; formerly Multiple Personality Disorder) is dissociative amnesia, the inability to recall important personal information. Only two case studies have directly addressed autobiographical memory in DID. Both provided evidence suggestive of dissociative amnesia. The aim of the current study was to objectively assess transfer of autobiographical information between identities in a larger sample of DID patients. Methods Using a concealed information task, we assessed recognition of autobiographical details in an amnesic identity. Eleven DID patients, 27 normal controls, and 23 controls simulating DID participated. Controls and simulators were matched to patients on age, education level, and type of autobiographical memory tested. Findings Although patients subjectively reported amnesia for the autobiographical details included in the task, the results indicated transfer of information between identities. Conclusion The results call for a revision of the DID definition. The amnesia criterion should be modified to emphasize its subjective nature. PMID:22815769
Formation of the Giant Planets by Concurrent Accretion of Solids and Gas
NASA Technical Reports Server (NTRS)
Hubickyj, Olenka
1997-01-01
Models were developed to simulate planet formation. Three major phases characterize the simulations: (1) the planetesimal accretion rate, which dominates that of gas, rapidly increases owing to runaway accretion, then decreases as the planet's feeding zone is depleted; (2) both the solid and gas accretion rates are small and nearly independent of time; and (3) beginning when the solid and gas masses are about equal, runaway gas accretion occurs. The models' applicability to planets in our Solar System is judged using two basic "yardsticks". The results suggest that the solar nebula dissipated while Uranus and Neptune were in the second phase, during which, for a relatively long time, the masses of their gaseous envelopes were small but not negligible compared to the total masses. Background information, results, and a published article are included in the report.
Smoothed Particle Hydrodynamics Simulations of Ultrarelativistic Shocks with Artificial Viscosity
NASA Astrophysics Data System (ADS)
Siegler, S.; Riffert, H.
2000-03-01
We present a fully Lagrangian conservation form of the general relativistic hydrodynamic equations for perfect fluids with artificial viscosity in a given arbitrary background spacetime. This conservation formulation is achieved by choosing suitable Lagrangian time evolution variables, from which the generic fluid variables of rest-mass density, 3-velocity, and thermodynamic pressure have to be determined. We present the corresponding equations for an ideal gas and show the existence and uniqueness of the solution. On the basis of the Lagrangian formulation we have developed a three-dimensional general relativistic smoothed particle hydrodynamics (SPH) code using the standard SPH formalism as known from nonrelativistic fluid dynamics. One-dimensional simulations of a shock tube and a wall shock are presented together with a two-dimensional test calculation of an inclined shock tube. With our method we can model ultrarelativistic fluid flows including shocks with Lorentz factors of even 1000.
Grid-connected in-stream hydroelectric generation based on the doubly fed induction machine
NASA Astrophysics Data System (ADS)
Lenberg, Timothy J.
Within the United States, there is a growing demand for new environmentally friendly power generation. This has led to a surge in wind turbine development. Unfortunately, wind is not a stable prime mover, but water is. Why not apply the advances made for wind to in-stream hydroelectric generation? One important advancement is the creation of the Doubly Fed Induction Machine (DFIM). This thesis covers the application of a gearless DFIM topology for hydrokinetic generation. After providing background, this thesis presents many of the options available for the mechanical portion of the design. A mechanical turbine is then specified. Next, a method is presented for designing a DFIM including the actual design for this application. In Chapter 4, a simulation model of the system is presented, complete with a control system that maximizes power generation based on water speed. This section then goes on to present simulation results demonstrating proper operation.
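A water-speed-based maximum-power control law of the kind the thesis describes can be sketched as follows. The rotor radius, optimal tip-speed ratio, and peak power coefficient are illustrative assumptions, not values from the design: the controller holds the rotor at the tip-speed ratio where the power coefficient peaks, so captured power follows the cube of the water speed.

```python
import math

RHO = 1000.0         # water density, kg/m^3
R = 1.5              # rotor radius, m (illustrative)
A = math.pi * R**2   # swept area, m^2
LAMBDA_OPT = 5.0     # tip-speed ratio at peak power coefficient (illustrative)
CP_MAX = 0.45        # peak power coefficient (illustrative)

def optimal_rotor_speed(v_water):
    """Rotor speed (rad/s) that holds the tip-speed ratio at its optimum."""
    return LAMBDA_OPT * v_water / R

def max_power(v_water):
    """Shaft power (W) captured at the optimum operating point."""
    return 0.5 * RHO * CP_MAX * A * v_water**3

v = 2.0                              # measured water speed, m/s
omega = optimal_rotor_speed(v)       # commanded rotor speed
p = max_power(v)                     # power captured at the optimum
```

Tracking the optimal tip-speed ratio is what makes variable-speed operation (and hence a doubly fed machine) attractive: the rotor speed command scales linearly with water speed while the available power scales with its cube.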
SDSS-IV MaNGA: the spectroscopic discovery of strongly lensed galaxies
NASA Astrophysics Data System (ADS)
Talbot, Michael S.; Brownstein, Joel R.; Bolton, Adam S.; Bundy, Kevin; Andrews, Brett H.; Cherinka, Brian; Collett, Thomas E.; More, Anupreeta; More, Surhud; Sonnenfeld, Alessandro; Vegetti, Simona; Wake, David A.; Weijmans, Anne-Marie; Westfall, Kyle B.
2018-06-01
We present a catalogue of 38 spectroscopically detected strong galaxy-galaxy gravitational lens candidates identified in the Sloan Digital Sky Survey IV (SDSS-IV). We were able to simulate narrow-band images for eight of them, demonstrating evidence of multiple images. Two of our systems are compound lens candidates, each with two background source-planes. One of these compound systems shows clear lensing features in the narrow-band image. Our sample is based on 2812 galaxies observed by the Mapping Nearby Galaxies at APO (MaNGA) integral field unit (IFU). This Spectroscopic Identification of Lensing Objects (SILO) survey extends the methodology of the Sloan Lens ACS Survey (SLACS) and BOSS Emission-Line Survey (BELLS) to lower redshift and multiple IFU spectra. We searched ~1.5 million spectra, of which 3065 contained multiple high signal-to-noise ratio background emission-lines or a resolved [O II] doublet; these are included in this catalogue. Upon manual inspection, we discovered regions with multiple spectra containing background emission-lines at the same redshift, providing evidence of a common source-plane geometry, which was not possible in the previous SLACS and BELLS discovery programs. We estimate more than half of our candidates have an Einstein radius ≳ 1.7 arcsec, which is significantly greater than seen in SLACS and BELLS. These larger Einstein radii produce more extended images of the background galaxy, increasing the probability that a background emission-line will enter one of the IFU spectroscopic fibres, making detection more likely.
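As context for the arcsecond-scale Einstein radii quoted in this abstract, the point-lens relation θ_E = sqrt(4GM/c² · D_ls/(D_l·D_s)) can be sketched numerically. The lens mass and angular-diameter distances below are illustrative assumptions, not values from the survey:

```python
import math

# Physical constants (SI units)
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8            # speed of light, m/s
M_sun = 1.989e30       # solar mass, kg
Mpc = 3.086e22         # megaparsec, m

def einstein_radius_arcsec(M_solar, D_l_mpc, D_s_mpc, D_ls_mpc):
    """Point-lens Einstein radius: theta_E = sqrt(4GM/c^2 * D_ls/(D_l*D_s))."""
    theta_rad = math.sqrt(4.0 * G * M_solar * M_sun / c**2
                          * (D_ls_mpc * Mpc) / ((D_l_mpc * Mpc) * (D_s_mpc * Mpc)))
    return math.degrees(theta_rad) * 3600.0   # radians -> arcseconds

# Illustrative galaxy-scale lens: ~10^11.5 M_sun with assumed distances.
theta_e = einstein_radius_arcsec(10**11.5, 400.0, 1200.0, 900.0)
```

For these assumed numbers the result is of order 2 arcsec, consistent with the ≳ 1.7 arcsec scale discussed above.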
Kapitán, Josef; Johannessen, Christian; Bour, Petr; Hecht, Lutz; Barron, Laurence D
2009-01-01
The samples used for the first observations of vibrational Raman optical activity (ROA) in 1972, namely both enantiomers of 1-phenylethanol and 1-phenylethylamine, have been revisited using a modern commercial ROA instrument together with state-of-the-art ab initio calculations. The simulated ROA spectra reveal for the first time the vibrational origins of the first reported ROA signals, which comprised similar couplets in the alcohol and amine in the spectral range of approximately 280-400 cm^-1. The results demonstrate how easy and routine ROA measurements have become, and how current ab initio quantum-chemical calculations are capable of simulating experimental ROA spectra quite closely provided sufficient averaging over accessible conformations is included. Assignment of absolute configuration is, inter alia, completely secure from results of this quality. Anharmonic corrections provided small improvements in the simulated Raman and ROA spectra. The importance of conformational averaging emphasized by this and previous related work provides the underlying theoretical background to ROA studies of dynamic aspects of chiral molecular and biomolecular structure and behavior. (c) 2009 Wiley-Liss, Inc.
Persistent homology and non-Gaussianity
NASA Astrophysics Data System (ADS)
Cole, Alex; Shiu, Gary
2018-03-01
In this paper, we introduce the topological persistence diagram as a statistic for Cosmic Microwave Background (CMB) temperature anisotropy maps. A central concept in 'Topological Data Analysis' (TDA), the idea of persistence is to represent a data set by a family of topological spaces. One then examines how long topological features 'persist' as the family of spaces is traversed. We compute persistence diagrams for simulated CMB temperature anisotropy maps featuring various levels of primordial non-Gaussianity of local type. Postponing the analysis of observational effects, we show that persistence diagrams are more sensitive to local non-Gaussianity than previous topological statistics including the genus and Betti number curves, and can constrain Δf_NL^loc = 35.8 at the 68% confidence level on the simulation set, compared to Δf_NL^loc = 60.6 for the Betti number curves. Given the resolution of our simulations, we expect applying persistence diagrams to observational data will give constraints competitive with those of the Minkowski Functionals. This is the first in a series of papers where we plan to apply TDA to different shapes of non-Gaussianity in the CMB and Large Scale Structure.
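This is not the authors' pipeline, but the core bookkeeping of 0-dimensional persistence can be illustrated on a 1-D signal: connected components of the sublevel sets are born at local minima and die, by the elder rule, when a saddle merges a younger component into an older one. A minimal union-find sketch:

```python
import numpy as np

def sublevel_persistence_1d(values):
    """0-dimensional persistence pairs (birth, death) of the sublevel-set
    filtration of a 1-D signal. When two components merge, the elder rule
    says the younger one (higher birth value) dies."""
    n = len(values)
    order = np.argsort(values)   # add samples from lowest to highest value
    parent = [-1] * n            # -1 marks samples not yet in the filtration
    birth = [None] * n           # birth value of each component's root

    def find(i):                 # union-find root with path compression
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    pairs = []
    for i in order:
        parent[i] = i
        birth[i] = values[i]
        for j in (i - 1, i + 1):                 # merge with active neighbours
            if 0 <= j < n and parent[j] != -1:
                ri, rj = find(i), find(j)
                if ri != rj:
                    # elder rule: the component with the higher birth dies here
                    young, old = (ri, rj) if birth[ri] > birth[rj] else (rj, ri)
                    pairs.append((birth[young], values[i]))
                    parent[young] = old
    # the single surviving component never dies
    root = find(order[0])
    pairs.append((birth[root], np.inf))
    # zero-persistence pairs (birth == death) may be filtered by the caller
    return pairs
```

For the signal [0, 2, 1, 3], the local minimum of value 1 is born at 1 and dies at the saddle of value 2, while the global minimum (born at 0) persists forever.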
DOT National Transportation Integrated Search
1975-10-01
This document forms part of the Subway Environmental Design Handbook. It contains the background information and instructions to enable an engineer to perform an analysis of a subway system by using the Subway Environment Simulation (SES) computer program.
Clinical Core Competency Training for NASA Flight Surgeons
NASA Technical Reports Server (NTRS)
Polk, J. D.; Schmid, Josef; Hurst, Victor, IV; Doerr, Harold K.
2007-01-01
Introduction: The cohort of NASA flight surgeons (FS) is a very accomplished group with varied clinical backgrounds; however, the NASA Flight Surgeon Office has identified that the extremely demanding schedule of this cohort prevents many of these physicians from practicing clinical medicine on a regular basis. In an effort to improve clinical competency, the NASA FS Office has dedicated one day a week for the FS to receive clinical training. Each week, an FS is assigned to one of five clinical settings, one being medical patient simulation. The Medical Operations Support Team (MOST) was tasked to develop curricula using medical patient simulation that would meet the clinical and operational needs of the NASA FS Office. Methods: The MOST met with the Lead FS and Training Lead FS to identify the core competencies most important to the FS cohort. The MOST presented core competency standards from the American Colleges of Emergency Medicine and Internal Medicine as a basis for developing the training. Results: The MOST identified the clinical areas that could best be demonstrated and taught using medical patient simulation, in particular using high-fidelity human patient simulators. Curricula are currently being developed, and additional classes will be implemented to instruct the FS cohort. The curricula will incorporate several environments for instruction, including lab-based and simulated microgravity-based environments. Discussion: The response from the NASA FS cohort to the initial introductory class has been positive. As a result of this effort, the MOST has identified three types of training to meet the clinical needs of the FS Office: clinical core competency training, individual clinical refresher training, and just-in-time training (specific to post-ISS Expedition landings). The MOST is continuing to work with the FS Office to augment the clinical training for the FS cohort, including the integration of Web-based learning.
Radiogenic and muon-induced backgrounds in the LUX dark matter detector
NASA Astrophysics Data System (ADS)
Akerib, D. S.; Araújo, H. M.; Bai, X.; Bailey, A. J.; Balajthy, J.; Bernard, E.; Bernstein, A.; Bradley, A.; Byram, D.; Cahn, S. B.; Carmona-Benitez, M. C.; Chan, C.; Chapman, J. J.; Chiller, A. A.; Chiller, C.; Coffey, T.; Currie, A.; de Viveiros, L.; Dobi, A.; Dobson, J.; Druszkiewicz, E.; Edwards, B.; Faham, C. H.; Fiorucci, S.; Flores, C.; Gaitskell, R. J.; Gehman, V. M.; Ghag, C.; Gibson, K. R.; Gilchriese, M. G. D.; Hall, C.; Hertel, S. A.; Horn, M.; Huang, D. Q.; Ihm, M.; Jacobsen, R. G.; Kazkaz, K.; Knoche, R.; Larsen, N. A.; Lee, C.; Lindote, A.; Lopes, M. I.; Malling, D. C.; Mannino, R.; McKinsey, D. N.; Mei, D.-M.; Mock, J.; Moongweluwan, M.; Morad, J.; Murphy, A. St. J.; Nehrkorn, C.; Nelson, H.; Neves, F.; Ott, R. A.; Pangilinan, M.; Parker, P. D.; Pease, E. K.; Pech, K.; Phelps, P.; Reichhart, L.; Shutt, T.; Silva, C.; Solovov, V. N.; Sorensen, P.; O'Sullivan, K.; Sumner, T. J.; Szydagis, M.; Taylor, D.; Tennyson, B.; Tiedt, D. R.; Tripathi, M.; Uvarov, S.; Verbus, J. R.; Walsh, N.; Webb, R.; White, J. T.; Witherell, M. S.; Wolfs, F. L. H.; Woods, M.; Zhang, C.
2015-03-01
The Large Underground Xenon (LUX) dark matter experiment aims to detect rare low-energy interactions from Weakly Interacting Massive Particles (WIMPs). The radiogenic backgrounds in the LUX detector have been measured and compared with Monte Carlo simulation. Measurements of LUX high-energy data have provided direct constraints on all background sources contributing to the background model. The expected background rate from the background model for the 85.3 day WIMP search run is (2.6 ± 0.2 (stat) ± 0.4 (sys)) × 10^-3 events keVee^-1 kg^-1 day^-1 in a 118 kg fiducial volume. The observed background rate is (3.6 ± 0.4 (stat)) × 10^-3 events keVee^-1 kg^-1 day^-1, consistent with model projections. The expectation for the radiogenic background in a subsequent one-year run is presented.
Gibbs, Kenneth D; Basson, Jacob; Xierali, Imam M; Broniatowski, David A
2016-11-17
Faculty diversity is a longstanding challenge in the US. However, we lack a quantitative and systemic understanding of how the career transitions into assistant professor positions of PhD scientists from underrepresented minority (URM) and well-represented (WR) racial/ethnic backgrounds compare. Between 1980 and 2013, the number of PhD graduates from URM backgrounds increased by a factor of 9.3, compared with a 2.6-fold increase in the number of PhD graduates from WR groups. However, the number of scientists from URM backgrounds hired as assistant professors in medical school basic science departments was not related to the number of potential candidates (R^2 = 0.12, p > 0.07), whereas there was a strong correlation between these two numbers for scientists from WR backgrounds (R^2 = 0.48, p < 0.0001). We built and validated a conceptual system dynamics model based on these data that explained 79% of the variance in the hiring of assistant professors and posited no hiring discrimination. Simulations show that, given current transition rates of scientists from URM backgrounds to faculty positions, faculty diversity would not increase significantly through the year 2080 even in the context of an exponential growth in the population of PhD graduates from URM backgrounds, or significant increases in the number of faculty positions. Instead, the simulations showed that diversity increased as more postdoctoral candidates from URM backgrounds transitioned onto the market and were hired.
2011-01-01
Background Molecular marker information is a common source to draw inferences about the relationship between genetic and phenotypic variation. Genetic effects are often modelled as additively acting marker allele effects. The true mode of biological action can, of course, be different from this plain assumption. One possibility to better understand the genetic architecture of complex traits is to include intra-locus (dominance) and inter-locus (epistasis) interaction of alleles as well as the additive genetic effects when fitting a model to a trait. Several Bayesian MCMC approaches exist for the genome-wide estimation of genetic effects with high accuracy of genetic value prediction. Including pairwise interaction for thousands of loci would probably go beyond the scope of such a sampling algorithm because then millions of effects are to be estimated simultaneously leading to months of computation time. Alternative solving strategies are required when epistasis is studied. Methods We extended a fast Bayesian method (fBayesB), which was previously proposed for a purely additive model, to include non-additive effects. The fBayesB approach was used to estimate genetic effects on the basis of simulated datasets. Different scenarios were simulated to study the loss of accuracy of prediction, if epistatic effects were not simulated but modelled and vice versa. Results If 23 QTL were simulated to cause additive and dominance effects, both fBayesB and a conventional MCMC sampler BayesB yielded similar results in terms of accuracy of genetic value prediction and bias of variance component estimation based on a model including additive and dominance effects. Applying fBayesB to data with epistasis, accuracy could be improved by 5% when all pairwise interactions were modelled as well. The accuracy decreased more than 20% if genetic variation was spread over 230 QTL. 
In this scenario, accuracy based on modelling only additive and dominance effects was generally superior to that of the complex model including epistatic effects. Conclusions This simulation study showed that the fBayesB approach is convenient for genetic value prediction. Jointly estimating additive and non-additive effects (especially dominance) has reasonable impact on the accuracy of prediction and the proportion of genetic variation assigned to the additive genetic source. PMID:21867519
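The additive-plus-dominance marker model described in this abstract can be sketched with simulated genotypes. For brevity, this stand-in fits the two effect classes jointly with a single ridge (BLUP-type) solve rather than the fBayesB sampler, and all population sizes, marker counts, and effect distributions are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated genotypes coded 0/1/2 (copies of the minor allele) at 50 markers.
n_ind, n_mrk = 200, 50
G = rng.integers(0, 3, size=(n_ind, n_mrk)).astype(float)

# Additive design: centred allele count; dominance design: centred heterozygosity.
X_add = G - G.mean(axis=0)
X_dom = (G == 1.0).astype(float)
X_dom -= X_dom.mean(axis=0)

# A few QTL carrying both additive and dominance action (illustrative sizes).
a = np.zeros(n_mrk); a[:5] = rng.normal(0.0, 1.0, 5)
d = np.zeros(n_mrk); d[:5] = rng.normal(0.0, 0.5, 5)
g_true = X_add @ a + X_dom @ d            # true genetic values
y = g_true + rng.normal(0.0, 1.0, n_ind)  # phenotypes with residual noise

# Joint ridge estimate of additive and dominance marker effects.
X = np.hstack([X_add, X_dom])
lam = 10.0
beta = np.linalg.solve(X.T @ X + lam * np.eye(2 * n_mrk), X.T @ y)
a_hat, d_hat = beta[:n_mrk], beta[n_mrk:]

# Accuracy of genetic value prediction: correlation of true and fitted values.
accuracy = np.corrcoef(g_true, X @ beta)[0, 1]
```

The same additive/dominance design matrices would feed a Bayesian variable-selection sampler; only the solver differs.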
Background Noise Reduction Using Adaptive Noise Cancellation Determined by the Cross-Correlation
NASA Technical Reports Server (NTRS)
Spalt, Taylor B.; Brooks, Thomas F.; Fuller, Christopher R.
2012-01-01
Background noise due to flow in wind tunnels contaminates desired data by decreasing the Signal-to-Noise Ratio. The use of Adaptive Noise Cancellation to remove background noise at measurement microphones is compromised when the reference sensor measures both background and desired noise. The technique proposed modifies the classical processing configuration based on the cross-correlation between the reference and primary microphone. Background noise attenuation is achieved using a cross-correlation sample width that encompasses only the background noise and a matched delay for the adaptive processing. A present limitation of the method is that a minimum time delay between the background noise and desired signal must exist in order for the correlated parts of the desired signal to be separated from the background noise in the cross-correlation. A simulation yields primary signal recovery which can be predicted from the coherence of the background noise between the channels. Results are compared with two existing methods.
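The delay-matching idea can be sketched end to end: estimate the background's propagation delay from the cross-correlation peak, align the reference accordingly, and cancel with an LMS adaptive filter. This is a hedged toy (white background noise, a known tone as the desired signal, and circular shifts standing in for true propagation), not the authors' processing chain:

```python
import numpy as np

rng = np.random.default_rng(1)
fs, n = 1000, 4000
t = np.arange(n) / fs

background = rng.normal(0.0, 1.0, n)          # broadband noise at the reference mic
delay = 25                                    # propagation delay to the primary mic
desired = 0.5 * np.sin(2 * np.pi * 60 * t)    # signal we want to keep
primary = np.roll(background, delay) + desired

# Step 1: estimate the background delay from the cross-correlation peak.
xc = np.correlate(primary, background, mode="full")
lags = np.arange(-n + 1, n)
est_delay = lags[np.argmax(np.abs(xc))]

# Step 2: align the reference, then cancel it with an LMS adaptive filter.
ref = np.roll(background, est_delay)
L, mu = 8, 0.005
w = np.zeros(L)
out = np.zeros(n)
for k in range(L, n):
    u = ref[k - L + 1:k + 1][::-1]    # most recent L reference samples
    e = primary[k] - w @ u            # error = primary minus noise estimate
    w += 2 * mu * e * u               # LMS weight update
    out[k] = e                        # residual approximates the desired signal
```

After convergence the residual tracks the tone while the broadband background is suppressed; when the reference also picked up the desired signal, this simple scheme would start cancelling it too, which is the failure mode the paper's cross-correlation windowing addresses.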
RFI in hybrid loops - Simulation and experimental results.
NASA Technical Reports Server (NTRS)
Ziemer, R. E.; Nelson, D. R.; Raghavan, H. R.
1972-01-01
A digital simulation of an imperfect second-order hybrid phase-locked loop (HPLL) operating in radio frequency interference (RFI) is described. Its performance is characterized in terms of phase error variance and phase error probability density function (PDF). Monte-Carlo simulation is used to show that the HPLL can be superior to the conventional phase-locked loops in RFI backgrounds when minimum phase error variance is the goodness criterion. Similar experimentally obtained data are given in support of the simulation data.
NASA Technical Reports Server (NTRS)
Roman, Juan A.; Stitt, George F.; Roman, Felix R.
1997-01-01
This paper will provide a general overview of the molecular contamination philosophy of the Space Simulation Test Engineering Section and how the National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC) space simulation laboratory controls and maintains the cleanliness of all its facilities, thereby minimizing down time between tests. It will also briefly cover the proper selection and safety precautions needed when using some chemical solvents for wiping, washing, or spraying thermal shrouds when molecular contaminants increase to unacceptable background levels.
STS-26 simulation activities in JSC Mission Control Center (MCC)
NASA Technical Reports Server (NTRS)
1987-01-01
In JSC Mission Control Center (MCC) Bldg 30 Flight Control Room (FCR), flight controller Granvil A. Pennington, leaning on console, listens to communications during the STS-26 integrated simulations in progress between MCC and JSC Mission Simulation and Training Facility Bldg 5 fixed-base (FB) shuttle mission simulator (SMS). MCC FCR visual displays are seen in background. Five veteran astronauts were in the FB-SMS rehearsing their roles for the scheduled June 1988 flight aboard Discovery, Orbiter Vehicle (OV) 103.
Framework of passive millimeter-wave scene simulation based on material classification
NASA Astrophysics Data System (ADS)
Park, Hyuk; Kim, Sung-Hyun; Lee, Ho-Jin; Kim, Yong-Hoon; Ki, Jae-Sug; Yoon, In-Bok; Lee, Jung-Min; Park, Soon-Jun
2006-05-01
Over the past few decades, passive millimeter-wave (PMMW) sensors have emerged as useful instruments in transportation and military applications such as autonomous flight-landing systems, smart weapons, and night- and all-weather vision systems. An efficient way to predict the performance of a PMMW sensor and apply it to a system is to test it SoftWare-In-the-Loop (SWIL). PMMW scene simulation is a key component in implementing such a simulator. However, no commercial off-the-shelf tool is available for constructing PMMW scene simulations; there have been only a few studies on this technology. We have studied PMMW scene simulation methods to develop a PMMW sensor SWIL simulator. This paper describes the framework of the PMMW scene simulation and tentative results. The purpose of the PMMW scene simulation is to generate sensor outputs (or images) from a visible image and environmental conditions. We organize it into four parts: material classification mapping, PMMW environmental setting, PMMW scene forming, and millimeter-wave (MMW) sensor modelling. The background and the objects in the scene are classified based on properties related to MMW radiation and reflectivity. The environmental setting part calculates the PMMW phenomenology: atmospheric propagation and emission, including sky temperature, weather conditions, and physical temperature. PMMW raw images are then formed using the surface geometry. Finally, PMMW sensor outputs are generated from the raw images by applying sensor characteristics such as aperture size and noise level. Through this process, both PMMW phenomenology and sensor characteristics are simulated in the output scene. We have finished the design of the simulator framework and are working on the detailed implementation. As a tentative result, a flight observation was simulated under specific conditions. After completing the implementation, we plan to improve the reliability of the simulation by collecting data with actual PMMW sensors. With a reliable PMMW scene simulator, applying PMMW sensors to various applications will become more efficient.
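The radiometric core of a material-classified scene former like the one above is small: each pixel's brightness temperature combines emission at the surface's physical temperature with reflection of the cold sky, T_B = ε·T_phys + (1 − ε)·T_sky. A sketch with illustrative (not measured) emissivities:

```python
# Hypothetical material classes with illustrative MMW emissivities;
# real values depend on frequency, moisture, roughness, and view angle.
materials = {"metal": 0.05, "asphalt": 0.90, "grass": 0.93, "water": 0.40}

T_sky, T_phys = 60.0, 290.0   # assumed cold-sky and ambient temperatures, K

def brightness_temperature(emissivity, T_phys, T_sky):
    """Radiometric temperature of a surface: emitted plus reflected sky."""
    return emissivity * T_phys + (1.0 - emissivity) * T_sky

scene = {m: brightness_temperature(e, T_phys, T_sky)
         for m, e in materials.items()}
# Metal reflects the cold sky and so appears far colder than grass,
# which is the contrast PMMW imagers exploit.
```

A sensor model would then blur this ideal radiometric map with the aperture response and add receiver noise, as the abstract's final stage describes.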
Re-Evaluation of Development of the TMDL Using Long-Term Monitoring Data and Modeling
NASA Astrophysics Data System (ADS)
Squires, A.; Rittenburg, R.; Boll, J.; Brooks, E. S.
2012-12-01
Since 1996, 47,979 Total Maximum Daily Loads (TMDLs) have been approved throughout the United States for impaired water bodies. TMDLs are set through the determination of natural background loads for a given water body which then estimate contributions from point and nonpoint sources to create load allocations and determine acceptable pollutant levels to meet water quality standards. Monitoring data and hydrologic models may be used in this process. However, data sets used are often limited in duration and frequency, and model simulations are not always accurate. The objective of this study is to retrospectively look at the development and accuracy of the TMDL for a stream in an agricultural area using long-term monitoring data and a robust modeling process. The study area is the Paradise Creek Watershed in northern Idaho. A sediment TMDL was determined for the Idaho section of Paradise Creek in 1997. Sediment TMDL levels were determined using a short-term data set and the Water Erosion Prediction Project (WEPP) model. Background loads used for the TMDL in 1997 were from pre-agricultural levels, based on WEPP model results. We modified the WEPP model for simulation of saturation excess overland flow, the dominant runoff generation mechanism, and analyzed more than 10 years of high resolution monitoring data from 2001 - 2012, including discharge and total suspended solids. Results will compare background loading and current loading based on present-day land use documented during the monitoring period and compare previous WEPP model results with the modified WEPP model results. This research presents a reevaluation of the TMDL process with recommendations for a more scientifically sound methodology to attain realistic water quality goals.
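The load-allocation arithmetic behind a TMDL can be sketched as simple bookkeeping: the loading capacity is divided among a margin of safety, wasteload allocations for point sources, the natural background, and the remaining nonpoint-source allocation. The function below is an illustrative sketch, not a regulatory procedure, and all numbers are hypothetical:

```python
def allocate_tmdl(loading_capacity, background, point_sources, mos_frac=0.1):
    """Illustrative TMDL bookkeeping: capacity = MOS + WLA + LA, where the
    natural background load is accounted within the load allocation (LA).
    All arguments share one unit (e.g. tons of sediment per year)."""
    mos = mos_frac * loading_capacity        # explicit margin of safety
    wla = sum(point_sources)                 # wasteload allocation (point)
    la_nonpoint = loading_capacity - mos - wla - background
    if la_nonpoint < 0:
        raise ValueError("capacity exceeded: load reductions required")
    return {"MOS": mos, "WLA": wla,
            "background": background, "LA_nonpoint": la_nonpoint}

# Hypothetical stream: capacity 100, background 20, two point dischargers.
alloc = allocate_tmdl(100.0, 20.0, [10.0, 5.0])
```

The study above concerns exactly the first and last terms of this budget: if the modelled background load is wrong, the nonpoint allocation, and hence the required reductions, are wrong with it.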
Evolution of cosmic string networks
NASA Technical Reports Server (NTRS)
Albrecht, Andreas; Turok, Neil
1989-01-01
A discussion of the evolution and observable consequences of a network of cosmic strings is given. A simple model for the evolution of the string network is presented, and related to the statistical mechanics of string networks. The model predicts the long string density throughout the history of the universe from a single parameter, which researchers calculate in radiation era simulations. The statistical mechanics arguments indicate a particular thermal form for the spectrum of loops chopped off the network. Detailed numerical simulations of string networks in expanding backgrounds are performed to test the model. Consequences for large scale structure, the microwave and gravity wave backgrounds, nucleosynthesis and gravitational lensing are calculated.
NASA Astrophysics Data System (ADS)
Krauz, V. I.; Myalton, V. V.; Vinogradov, V. P.; Velikhov, E. P.; Ananyev, S. S.; Dan'ko, S. A.; Kalinin, Yu G.; Kharrasov, A. M.; Vinogradova, Yu V.; Mitrofanov, K. N.; Paduch, M.; Miklaszewski, R.; Zielinska, E.; Skladnik-Sadowska, E.; Sadowski, M. J.; Kwiatkowski, R.; Tomaszewski, K.; Vojtenko, D. A.
2017-10-01
Results are presented from laboratory simulations of plasma jets emitted by young stellar objects, carried out at plasma focus facilities. The experiments were performed at three facilities: PF-3, PF-1000U, and KPF-4. Operation modes were realized that enable the formation of narrow plasma jets capable of propagating over long distances. The main parameters of the plasma jets and the background plasma were determined. To control the ratio of the jet density to that of the background plasma, special operation modes with pulsed injection of the working gas were used.
NASA Astrophysics Data System (ADS)
Ke, Y.; Gao, X.; Lu, Q.; Wang, X.; Wang, S.
2017-12-01
Recently, the generation of rising-tone chorus has been implemented with one-dimensional (1-D) particle-in-cell (PIC) simulations in an inhomogeneous background magnetic field, where both the propagation of waves and the motion of electrons are simply forced to be parallel to the background magnetic field. We have developed a two-dimensional (2-D) general curvilinear PIC simulation code and successfully reproduced rising-tone chorus waves excited from an anisotropic electron distribution in a 2-D mirror field. Our simulation results show that whistler waves are mainly generated around the magnetic equator and continuously gain growth during their propagation toward higher-latitude regions. The rising-tone chorus waves are formed off the magnetic equator, and propagate quasi-parallel to the background magnetic field with a finite wave normal angle. Due to this propagation effect, the wave normal angle of chorus waves increases during their propagation toward higher-latitude regions along a sufficiently curved field line. The chirping rate of chorus waves is found to be larger along field lines closer to the middle field line of the mirror field.
NASA Astrophysics Data System (ADS)
Ke, Yangguang; Gao, Xinliang; Lu, Quanming; Wang, Xueyi; Wang, Shui
2017-08-01
Recently, the generation of rising-tone chorus has been implemented with one-dimensional (1-D) particle-in-cell (PIC) simulations in an inhomogeneous background magnetic field, where both the propagation of waves and the motion of electrons are simply forced to be parallel to the background magnetic field. In this paper, we have developed a two-dimensional (2-D) general curvilinear PIC simulation code and successfully reproduced rising-tone chorus waves excited from an anisotropic electron distribution in a 2-D mirror field. Our simulation results show that whistler waves are mainly generated around the magnetic equator and continuously gain growth during their propagation toward higher-latitude regions. The rising-tone chorus waves are observed off the magnetic equator, and propagate quasi-parallel to the background magnetic field with a wave normal angle smaller than 25°. Due to this propagation effect, the wave normal angle of chorus waves increases during their propagation toward higher-latitude regions along a sufficiently curved field line. The chirping rate of chorus waves is found to be larger along a field line with a smaller curvature.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu Wei; Li Hui; Li Shengtai
Nonlinear ideal magnetohydrodynamic (MHD) simulations of the propagation and expansion of a magnetic "bubble" plasma into a lower-density, weakly magnetized background plasma are presented. These simulations mimic the geometry and parameters of the Plasma Bubble Expansion Experiment (PBEX) [A. G. Lynn, Y. Zhang, S. C. Hsu, H. Li, W. Liu, M. Gilmore, and C. Watts, Bull. Am. Phys. Soc. 52, 53 (2007)], which is studying magnetic bubble expansion as a model for extragalactic radio lobes. The simulations predict several key features of the bubble evolution. First, the direction of bubble expansion depends on the ratio of the bubble toroidal to poloidal magnetic field, with a higher ratio leading to expansion predominantly in the direction of propagation and a lower ratio leading to expansion predominantly normal to the direction of propagation. Second, an MHD shock and a trailing slow-mode compressible MHD wavefront are formed ahead of the bubble as it propagates into the background plasma. Third, the bubble expansion and propagation develop asymmetries about the propagation axis due to reconnection facilitated by numerical resistivity and to inhomogeneous angular momentum transport, mainly due to the background magnetic field. These results will help guide the initial experiments and diagnostic measurements on PBEX.
Accurate estimates for North American background (NAB) ozone (O3) in surface air over the United States are needed for setting and implementing an attainable national O3 standard. These estimates rely on simulations with atmospheric chemistry-transport models that set North Amer...
Defining the Simulation Technician Role: Results of a Survey-Based Study.
Bailey, Rachel; Taylor, Regina G; FitzGerald, Michael R; Kerrey, Benjamin T; LeMaster, Thomas; Geis, Gary L
2015-10-01
In health care simulation, simulation technicians perform multiple tasks to support various educational offerings. Technician responsibilities and the tasks that accompany them seem to vary between centers. The objectives were to identify the range and frequency of tasks that technicians perform and to determine if there is a correspondence between what technicians do and what they feel their responsibilities should be. We hypothesized that there is a core set of responsibilities and tasks for the technician position regardless of background, experience, and type of simulation center. We conducted a prospective, survey-based study of individuals currently functioning in a simulation technician role in a simulation center. This survey was designed internally and piloted within 3 academic simulation centers. Potential respondents were identified through a national mailing list, and the survey was distributed electronically during a 3-week period. A survey request was sent to 280 potential participants; 136 (49%) responded, and 73 met inclusion criteria. Five core tasks were identified as follows: equipment setup and breakdown, programming scenarios into software, operation of software during simulation, audiovisual support for courses, and on-site simulator maintenance. Independent of background before they were hired, technicians felt unprepared for their role once taking the position. Formal training was identified as a need; however, the majority of technicians felt experience over time was the main contributor toward developing knowledge and skills within their role. This study represents a first step in defining the technician role within simulation-based education and supports the need for the development of a formal job description to allow recruitment, development, and certification.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andreyev, A.
Purpose: Compton cameras (CCs) use electronic collimation to reconstruct the images of activity distribution. Although this approach can greatly improve imaging efficiency, due to the complex geometry of the CC principle, image reconstruction with the standard iterative algorithms, such as ordered subset expectation maximization (OSEM), can be very time-consuming, even more so if resolution recovery (RR) is implemented. We have previously shown that the origin ensemble (OE) algorithm can be used for the reconstruction of the CC data. Here we propose a method of extending our OE algorithm to include RR. Methods: To validate the proposed algorithm we used Monte Carlo simulations of a CC composed of multiple layers of pixelated CZT detectors and designed for imaging small animals. A series of CC acquisitions of small hot spheres and the Derenzo phantom placed in air were simulated. Images obtained from (a) the exact data, (b) blurred data but reconstructed without resolution recovery, and (c) blurred and reconstructed with resolution recovery were compared. Furthermore, the reconstructed contrast-to-background ratios were investigated using the phantom with nine spheres placed in a hot background. Results: Our simulations demonstrate that the proposed method allows for the recovery of the resolution loss that is due to imperfect accuracy of event detection. Additionally, tests of camera sensitivity corresponding to different detector configurations demonstrate that the proposed CC design has sensitivity comparable to PET. When the same number of events were considered, the computation time per iteration increased only by a factor of 2 when OE reconstruction with the resolution recovery correction was performed relative to the original OE algorithm. We estimate that the addition of resolution recovery to the OSEM would increase reconstruction times by 2-3 orders of magnitude per iteration. 
Conclusions: The results of our tests demonstrate the improvement of image resolution provided by the OE reconstructions with resolution recovery. The quality of images and their contrast are similar to those obtained from the OE reconstructions from scans simulated with perfect energy and spatial resolutions.
Resolution recovery for Compton camera using origin ensemble algorithm.
Andreyev, A; Celler, A; Ozsahin, I; Sitek, A
2016-08-01
Compton cameras (CCs) use electronic collimation to reconstruct images of activity distribution. Although this approach can greatly improve imaging efficiency, the complex geometry inherent in the CC principle makes image reconstruction with standard iterative algorithms, such as ordered subset expectation maximization (OSEM), very time-consuming, even more so if resolution recovery (RR) is implemented. We have previously shown that the origin ensemble (OE) algorithm can be used for the reconstruction of CC data. Here we propose a method of extending our OE algorithm to include RR. To validate the proposed algorithm we used Monte Carlo simulations of a CC composed of multiple layers of pixelated CZT detectors and designed for imaging small animals. A series of CC acquisitions of small hot spheres and the Derenzo phantom placed in air were simulated. Images obtained from (a) the exact data, (b) blurred data reconstructed without resolution recovery, and (c) blurred data reconstructed with resolution recovery were compared. Furthermore, the reconstructed contrast-to-background ratios were investigated using a phantom with nine spheres placed in a hot background. Our simulations demonstrate that the proposed method allows for the recovery of the resolution loss that is due to imperfect accuracy of event detection. Additionally, tests of camera sensitivity corresponding to different detector configurations demonstrate that the proposed CC design has sensitivity comparable to PET. When the same number of events was considered, the computation time per iteration increased only by a factor of 2 for OE reconstruction with the resolution recovery correction relative to the original OE algorithm; we estimate that adding resolution recovery to OSEM would increase reconstruction times by 2-3 orders of magnitude per iteration.
The results of our tests demonstrate the improvement in image resolution provided by OE reconstructions with resolution recovery. The quality of images and their contrast are similar to those obtained from OE reconstructions of scans simulated with perfect energy and spatial resolutions.
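The origin-ensemble idea this abstract builds on can be illustrated with a toy one-dimensional sketch (our illustration, not the authors' implementation): each detected event keeps a list of voxels consistent with its measured cone, and a Metropolis-style move relocates one event's origin at a time. The acceptance rule below is a simplified uniform-sensitivity form, and the hot-voxel layout and window width are hypothetical.

```python
import random

def oe_reconstruct(events, n_vox, n_iter=2000, seed=0):
    """Toy 1-D origin-ensemble sampler (simplified; uniform sensitivity assumed).

    events: one feasible-voxel list per detected event (the voxels consistent
    with that event's Compton cone). Each origin starts at a random feasible
    voxel; each move proposes relocating one origin within its feasible set
    and accepts with probability min(1, (n_new + 1) / n_old), so origins
    cluster where many events agree.
    """
    rng = random.Random(seed)
    origins = [rng.choice(fs) for fs in events]
    counts = [0] * n_vox
    for o in origins:
        counts[o] += 1
    for _ in range(n_iter):
        e = rng.randrange(len(events))
        old = origins[e]
        new = rng.choice(events[e])
        if new == old:
            continue
        # counts[old] >= 1 always holds: event e itself sits there
        if rng.random() < min(1.0, (counts[new] + 1) / counts[old]):
            counts[old] -= 1
            counts[new] += 1
            origins[e] = new
    return counts

# two hot voxels (3 and 7) among 10; each event's "cone" covers a 5-voxel window
truth = [3] * 50 + [7] * 50
events = [[v for v in range(t - 2, t + 3) if 0 <= v < 10] for t in truth]
img = oe_reconstruct(events, 10)
```

Resolution recovery would enter by enlarging each event's feasible set according to the detection-blur model; the sampler itself is unchanged, which is consistent with the modest factor-of-2 cost reported above.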
Quantitative basis for component factors of gas flow proportional counting efficiencies
NASA Astrophysics Data System (ADS)
Nichols, Michael C.
This dissertation investigates the counting efficiency calibration of a gas flow proportional counter with beta-particle emitters in order to (1) determine by measurements and simulation the values of the component factors of beta-particle counting efficiency for a proportional counter, (2) compare the simulation results and measured counting efficiencies, and (3) determine the uncertainty of the simulation and measurements. Monte Carlo simulation results by the MCNP5 code were compared with measured counting efficiencies as a function of sample thickness for 14C, 89Sr, 90Sr, and 90Y. The Monte Carlo model simulated strontium carbonate with areal thicknesses from 0.1 to 35 mg cm-2. The samples were precipitated as strontium carbonate with areal thicknesses from 3 to 33 mg cm-2, mounted on membrane filters, and counted on a low background gas flow proportional counter. The estimated fractional standard deviation was 2--4% (except 6% for 14C) for efficiency measurements of the radionuclides. The Monte Carlo simulations have uncertainties estimated to be 5 to 6 percent for carbon-14 and 2.4 percent for strontium-89, strontium-90, and yttrium-90. The curves of simulated counting efficiency vs. sample areal thickness agreed within 3% of the curves of best fit drawn through the 25--49 measured points for each of the four radionuclides. Contributions from this research include development of uncertainty budgets for the analytical processes; evaluation of alternative methods for determining chemical yield, critical to the measurement process; correcting a bias found in the MCNP normalization of beta spectra histograms; clarifying the interpretation of the commonly used ICRU beta-particle spectra for use by MCNP; and evaluation of instrument parameters as applied to the simulation model to obtain estimates of the counting efficiency from simulated pulse height tallies.
Simulation in JFL: Business Writing
ERIC Educational Resources Information Center
Fukushima, Tatsuya
2007-01-01
This article discusses a simulation wherein learners of Japanese as a Foreign Language (JFL) in a business writing course at an American university are assigned tasks to write a series of business letters based on situations that are likely to occur in actual business settings. After an overview of the theoretical background, this article…
ERIC Educational Resources Information Center
Van Camp, Julie
1986-01-01
This article provides background on the voir dire (jury selection) process, explaining its importance to the outcome of a trial. Offers a simulation experience which has students take the role of lawyers interviewing 29 prospective jurors for an alcohol-related traffic accident involving a 20-year-old driver. Profiles for prospective jurors and…
Background: Simulation studies have previously demonstrated that time-series analyses using smoothing splines correctly model null health-air pollution associations. Methods: We repeatedly simulated season, meteorology and air quality for the metropolitan area of Atlanta from cyc...
The Relation between Cognitive and Metacognitive Strategic Processing during a Science Simulation
ERIC Educational Resources Information Center
Dinsmore, Daniel L.; Zoellner, Brian P.
2018-01-01
Background: This investigation was designed to uncover the relations between students' cognitive and metacognitive strategies used during a complex climate simulation. While cognitive strategy use during science inquiry has been studied, the factors related to this strategy use, such as concurrent metacognition, prior knowledge, and prior…
Helicopter simulator qualification
NASA Technical Reports Server (NTRS)
Hampson, Brian
1992-01-01
CAE has extensive experience in building helicopter simulators and has participated in group working sessions for fixed-wing advisory circulars. Against this background, issues that should be addressed in establishing helicopter approval criteria were highlighted. Some of these issues are not immediately obvious and may, indeed, be more important than the criteria themselves.
Water Conservation Education with a Rainfall Simulator.
ERIC Educational Resources Information Center
Kok, Hans; Kessen, Shelly
1997-01-01
Describes a program in which a rainfall simulator was used to promote water conservation by showing water infiltration, water runoff, and soil erosion. The demonstrations provided a good background for the discussion of issues such as water conservation, crop rotation, and conservation tillage practices. The program raised awareness of…
Exploring Iconic Interpretation and Mathematics Teacher Development through Clinical Simulations
ERIC Educational Resources Information Center
Dotger, Benjamin; Masingila, Joanna; Bearkland, Mary; Dotger, Sharon
2015-01-01
Field placements serve as the traditional "clinical" experience for prospective mathematics teachers to immerse themselves in the mathematical challenges of students. This article reports data from a different type of learning experience, that of a clinical simulation with a standardized individual. We begin with a brief background on…
Active confocal imaging for visual prostheses
Jung, Jae-Hyun; Aloni, Doron; Yitzhaky, Yitzhak; Peli, Eli
2014-01-01
There are encouraging advances in prosthetic vision for the blind, including retinal and cortical implants, and other “sensory substitution devices” that use tactile or electrical stimulation. However, they all have low resolution, a limited visual field, and can display only a few gray levels (limited dynamic range), severely restricting their utility. To overcome these limitations, image processing or the imaging system could emphasize objects of interest and suppress the background clutter. We propose an active confocal imaging system based on light-field technology that will enable a blind user of any visual prosthesis to efficiently scan, focus on, and “see” only an object of interest while suppressing interference from background clutter. The system captures three-dimensional scene information using a light-field sensor and displays only the in-focus plane and the objects in it. After capturing a confocal image, a de-cluttering process removes the clutter based on blur difference. In preliminary experiments we verified the positive impact of confocal-based background clutter removal on recognition of objects in low-resolution, limited-dynamic-range simulated phosphene images. Using a custom-made multiple-camera system, we confirmed that the concept of a confocal de-cluttered image can be realized effectively using light-field imaging. PMID:25448710
NASA Technical Reports Server (NTRS)
Strode, Sarah A.; Douglass, Anne R.; Ziemke, Jerald R.; Manyin, Michael; Nielsen, J. Eric; Oman, Luke D.
2017-01-01
Satellite observations of in-cloud ozone concentrations from the Ozone Monitoring Instrument and the Microwave Limb Sounder show substantial differences from background ozone concentrations. We develop a method for comparing a free-running chemistry-climate model (CCM) to in-cloud and background ozone observations using a simple criterion based on cloud fraction to separate cloudy and clear-sky days. We demonstrate that the CCM simulates key features of the in-cloud versus background ozone differences and of the geographic distribution of in-cloud ozone. Since the agreement is not dependent on matching the meteorological conditions of a specific day, this is a promising method for diagnosing how accurately CCMs represent the relationships between ozone and clouds, including the lower ozone concentrations shown by in-cloud satellite observations. Since clouds are associated with convection as well as changes in chemistry, we diagnose the tendency of tropical ozone at 400 hPa due to chemistry, convection and turbulence, and large-scale dynamics. While convection acts to reduce ozone concentrations at 400 hPa throughout much of the tropics, it has the opposite effect over highly polluted regions of South and East Asia.
First detection of cosmic microwave background lensing and Lyman- α forest bispectrum
Doux, Cyrille; Schaan, Emmanuel; Aubourg, Eric; ...
2016-11-09
We present the first detection of a correlation between the Lyman-α forest and cosmic microwave background gravitational lensing. For each Lyman-α forest in SDSS-III/BOSS DR12, we correlate the one-dimensional power spectrum with the cosmic microwave background lensing convergence on the same line of sight from Planck. This measurement constitutes a position-dependent power spectrum, or a squeezed bispectrum, and quantifies the nonlinear response of the Lyman-α forest power spectrum to a large-scale overdensity. The signal is measured at 5σ and is consistent with the expectation of the standard ΛCDM cosmological model. We measure the linear bias of the Lyman-α forest with respect to the dark matter distribution and constrain a combination of nonlinear terms including the nonlinear bias. This new observable provides a consistency check for the Lyman-α forest as a large-scale structure probe and tests our understanding of the relation between intergalactic gas and dark matter. In the future, it could be used to test hydrodynamical simulations and calibrate the relation between the Lyman-α forest and dark matter.
NASA Astrophysics Data System (ADS)
Mathur, R.; Kang, D.; Napelenok, S. L.; Xing, J.; Hogrefe, C.
2017-12-01
Air pollution reduction strategies for a region are complicated not only by the interplay of local emissions sources and several complex physical, chemical, and dynamical processes in the atmosphere, but also by hemispheric background levels of pollutants. Contrasting changes in emission patterns across the globe (e.g. declining emissions in North America and Western Europe in response to implementation of control measures and increasing emissions across Asia due to economic and population growth) are resulting in heterogeneous changes in the tropospheric chemical composition and are likely altering long-range transport impacts and consequently background pollution levels at receptor regions. To quantify these impacts, the WRF-CMAQ model is expanded to hemispheric scales and multi-decadal model simulations are performed for the period spanning 1990-2010 to examine changes in hemispheric air pollution resulting from changes in emissions over this period. Simulated trends in ozone and precursor species concentrations across the U.S. and the Northern Hemisphere over the past two decades are compared with those inferred from available measurements during this period. Additionally, the decoupled direct method (DDM) in CMAQ, a first- and higher-order sensitivity calculation technique, is used to estimate the sensitivity of O3 to emissions from different source regions across the Northern Hemisphere. The seasonal variations in source region contributions to background O3 are then estimated from these sensitivity calculations and will be discussed. These source region sensitivities estimated from DDM are then combined with the multi-decadal simulations of O3 distributions and emissions trends to characterize the changing contributions of different source regions to background O3 levels across North America. This characterization of changing long-range transport contributions is critical for the design and implementation of tighter national air quality standards.
Space vehicle approach velocity judgments under simulated visual space conditions
NASA Technical Reports Server (NTRS)
Haines, Richard F.
1987-01-01
Thirty-five volunteers responded when they first perceived an increase in apparent size of a collimated, 2-D image of an Orbiter vehicle. The test variables of interest included the presence of a fixed angular reticle within the field of view (FOV); three initial Orbiter distances; three constant Orbiter approach velocities corresponding to 1.6, 0.8, and 0.4 percent of the initial distance per second; and two background starfield velocities. It was found that: (1) at each initial range, increasing approach velocity led to a larger distance between the eye and Orbiter image at threshold; (2) including the fixed reticle in the FOV produced a smaller distance between the eye and Orbiter image at threshold; and (3) increasing background star velocity during this judgment led to a smaller distance between the eye and Orbiter image at threshold. The last two findings suggest that other detail within the FOV may compete for attention that would otherwise be available for judging image expansion; thus, when such details are present, the target must approach nearer to the observer before its expansion is perceived. These findings are discussed in relation to previous research and possible underlying mechanisms.
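The looming judgment studied here has simple geometry: a target of width w at distance d subtends a visual angle of 2·atan(w/2d), and the expansion rate of that angle grows as the target nears. A sketch with hypothetical target size, range, and detection threshold (not values from the study) reproduces the direction of finding (1): a faster approach crosses a fixed expansion-rate threshold at a larger eye-to-target distance.

```python
import math

def angular_size(width_m, dist_m):
    """Visual angle (radians) subtended by a target of given width at a distance."""
    return 2.0 * math.atan(width_m / (2.0 * dist_m))

def distance_at_expansion_threshold(width_m, d0_m, speed_mps, thresh_rad_s, dt=0.01):
    """Step a constant-velocity approach forward until the rate of change of
    visual angle first exceeds thresh_rad_s; return the eye-target distance.
    All numbers passed in below are illustrative assumptions."""
    d = d0_m
    while d > width_m:
        # forward-difference estimate of d(theta)/dt at the current range
        rate = (angular_size(width_m, d - speed_mps * dt) - angular_size(width_m, d)) / dt
        if rate >= thresh_rad_s:
            return d
        d -= speed_mps * dt
    return d

# hypothetical Orbiter-sized target (~37 m) at 1000 m; approach speeds of
# 1.6% and 0.4% of initial range per second (16 and 4 m/s), nominal threshold
d_fast = distance_at_expansion_threshold(37.0, 1000.0, 16.0, 2e-3)
d_slow = distance_at_expansion_threshold(37.0, 1000.0, 4.0, 2e-3)
```

Because dθ/dt ≈ v·w/d² for small angles, a larger v reaches any fixed rate threshold at a larger d, matching the reported trend.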
Reionization of the Universe and the Photoevaporation of Cosmological Minihalos
NASA Technical Reports Server (NTRS)
Shapiro, Paul R.; Raga, Alejandro C.
2000-01-01
The first sources of ionizing radiation to condense out of the dark and neutral Intergalactic Medium (IGM) sent ionization fronts sweeping outward through their surroundings, overtaking other condensed objects and photoevaporating them. This feedback effect of universal reionization on cosmic structure formation is demonstrated here for the case of a cosmological minihalo of dark matter and baryons exposed to an external source of ionizing radiation with a quasar-like spectrum, just after the passage of the global ionization front created by the source. We model the pre-ionization minihalo as a truncated, nonsingular isothermal sphere in hydrostatic equilibrium following its collapse out of the expanding background universe and virialization. Results are presented of the first, gas dynamical simulations of this process, including radiative transfer. A sample of observational diagnostics is also presented, including the spatially-varying ionization levels of C, N, and O in the flow if a trace of heavy elements is present and the integrated column densities of H I, He I and He II, and C IV through the photoevaporating gas at different velocities, which would be measured in absorption against a background source like that responsible for the ionization.
Alivov, Yahya; Baturin, Pavlo; Le, Huy Q.; Ducote, Justin; Molloi, Sabee
2014-01-01
We investigated the effect of different imaging parameters such as dose, beam energy, energy resolution, and number of energy bins on image quality of K-edge spectral computed tomography (CT) of gold nanoparticles (GNP) accumulated in an atherosclerotic plaque. A maximum likelihood technique was employed to estimate the concentration of GNP, which served as a targeted intravenous contrast material intended to detect the degree of plaque inflammation. The simulation studies used a single-slice parallel-beam CT geometry with an X-ray beam energy ranging between 50 and 140 kVp. The synthetic phantoms included a small (3 cm in diameter) cylinder and a chest (33x24 cm2) phantom, where both phantoms contained tissue, calcium, and gold. The simulation studies pursued GNP quantification and background (calcium and tissue) suppression tasks. The X-ray detection sensor was represented by an energy-resolved photon counting detector (e.g., CdZnTe) with adjustable energy bins. Both ideal and more realistic (12% FWHM energy resolution) implementations of the photon counting detector were simulated. The simulations were performed for the CdZnTe detector with a pixel pitch of 0.5-1 mm, which corresponds to performance without significant charge sharing and cross-talk effects. The Rose model was employed to estimate the minimum detectable concentration of GNPs. A figure of merit (FOM) was used to optimize the X-ray beam energy (kVp) to achieve the highest signal-to-noise ratio (SNR) with respect to patient dose. As a result, successful identification of gold and background suppression was demonstrated. The highest FOM was observed at 125 kVp X-ray beam energy. The minimum detectable GNP concentration was determined to be approximately 1.06 μmol/mL (0.21 mg/mL) for an ideal detector and about 2.5 μmol/mL (0.49 mg/mL) for the more realistic (12% FWHM) detector.
The studies show the optimal imaging parameters at the lowest patient dose using an energy-resolved photon counting detector to image GNP in an atherosclerotic plaque. PMID:24334301
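The Rose-model step described above reduces, in its simplest form, to requiring SNR ≥ k (with k ≈ 5, the Rose criterion) and solving for the concentration; the dose-normalized figure of merit is then SNR²/dose. The linear signal model and all numbers below are illustrative assumptions, not the paper's maximum-likelihood estimator or measured values.

```python
ROSE_K = 5.0  # Rose criterion: an object is reliably detectable when SNR >= ~5

def snr_linear(signal_per_conc, concentration, noise_sigma):
    """SNR for a signal that grows linearly with contrast-agent concentration
    against Gaussian background noise (a toy model)."""
    return signal_per_conc * concentration / noise_sigma

def min_detectable_concentration(signal_per_conc, noise_sigma, k=ROSE_K):
    """Smallest concentration satisfying the Rose criterion SNR >= k."""
    return k * noise_sigma / signal_per_conc

def figure_of_merit(snr, dose):
    """Dose-normalized FOM ~ SNR^2 / dose, the quantity used to pick the kVp
    giving the best detectability per unit patient dose."""
    return snr ** 2 / dose

# hypothetical numbers: finite energy resolution halves the effective K-edge
# signal, which doubles the minimum detectable concentration
c_ideal = min_detectable_concentration(signal_per_conc=20.0, noise_sigma=4.0)
c_real = min_detectable_concentration(signal_per_conc=10.0, noise_sigma=4.0)
fom_example = figure_of_merit(snr_linear(20.0, 1.0, 4.0), dose=2.0)
```

The ideal-versus-realistic detector gap in the abstract (1.06 vs. 2.5 μmol/mL) follows the same pattern: a lower effective signal per unit concentration raises the Rose-limited minimum roughly in proportion.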
Introducing Multisensor Satellite Radiance-Based Evaluation for Regional Earth System Modeling
NASA Technical Reports Server (NTRS)
Matsui, T.; Santanello, J.; Shi, J. J.; Tao, W.-K.; Wu, D.; Peters-Lidard, C.; Kemp, E.; Chin, M.; Starr, D.; Sekiguchi, M.;
2014-01-01
Earth System modeling has become more complex, and its evaluation using satellite data has also become more difficult due to model and data diversity. Therefore, the fundamental methodology of using satellite direct measurements with instrumental simulators should be addressed, especially for modeling community members lacking a solid background in radiative transfer and scattering theory. This manuscript introduces principles of multisatellite, multisensor radiance-based evaluation methods for a fully coupled regional Earth System model: the NASA-Unified Weather Research and Forecasting (NU-WRF) model. We use a NU-WRF case study simulation over West Africa as an example of evaluating aerosol-cloud-precipitation-land processes with various satellite observations. NU-WRF-simulated geophysical parameters are converted to the satellite-observable raw radiance and backscatter under nearly consistent physics assumptions via the multisensor satellite simulator, the Goddard Satellite Data Simulator Unit. We present varied examples of simple yet robust methods that characterize forecast errors and model physics biases through the spatial and statistical interpretation of various satellite raw signals: infrared brightness temperature (Tb) for surface skin temperature and cloud top temperature, microwave Tb for precipitation ice and surface flooding, and radar and lidar backscatter for aerosol-cloud profiling simultaneously. Because raw satellite signals integrate many sources of geophysical information, we demonstrate user-defined thresholds and a simple statistical process to facilitate evaluations, including the infrared-microwave-based cloud types and lidar/radar-based profile classifications.
2010-01-01
Background: This paper addresses the statistical use of accessibility and availability indices and the effect of study boundaries on these measures. The measures are evaluated via an extensive simulation based on cluster models for local outlet density. We define outlet to mean either food retail store (convenience store, supermarket, gas station) or restaurant (limited service or full service restaurants). We designed a simulation whereby a cluster outlet model is assumed in a large study window and an internal subset of that window is constructed. We performed simulations on various criteria, including one scenario representing an urban area with 2000 outlets as well as a non-urban area simulated with only 300 outlets. A comparison is made between estimates obtained with the full study area and estimates using only the subset area. This allows the study of the effect of edge censoring on accessibility measures. Results: The results suggest that considerable bias is found at the edges of study regions, in particular for accessibility measures. Edge effects are smaller for availability measures (when not smoothed) and also for short-range accessibility measures. Conclusions: It is recommended that any study utilizing these measures should correct for edge effects. The use of edge correction via guard areas is recommended, and the avoidance of large-range distance-based accessibility measures is also proposed. PMID:20663199
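The censoring effect this study quantifies can be reproduced with a minimal sketch: simulate outlets over a large window, define accessibility as a count-within-buffer index, and compare the index at a point near the study boundary computed from only the inner "study area" outlets versus from the full window (the guard-area correction). All geometry, counts, and radii here are hypothetical.

```python
import random

def accessibility(point, outlets, radius):
    """Distance-based accessibility: number of outlets within `radius` of the
    point (a simple count-within-buffer index; kernel-weighted versions behave
    the same way with respect to edge censoring)."""
    px, py = point
    return sum(1 for (ox, oy) in outlets
               if (ox - px) ** 2 + (oy - py) ** 2 <= radius ** 2)

rng = random.Random(1)
# outlets uniform over a large 10x10 window; the "study area" is the inner 6x6
full = [(rng.uniform(0, 10), rng.uniform(0, 10)) for _ in range(2000)]
inner = [(x, y) for (x, y) in full if 2 <= x <= 8 and 2 <= y <= 8]

edge_pt = (2.1, 5.0)   # just inside the study boundary
core_pt = (5.0, 5.0)   # far enough inside that its buffer is uncensored

r = 1.0
biased = accessibility(edge_pt, inner, r)      # censored: misses outside outlets
corrected = accessibility(edge_pt, full, r)    # guard area restores them
```

At the edge point, part of the buffer falls outside the study area, so the censored index undercounts; at the core point the two agree exactly, which is why a guard area at least as wide as the accessibility range removes the bias.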
Evaluation of Probable Maximum Precipitation and Flood under Climate Change in the 21st Century
NASA Astrophysics Data System (ADS)
Gangrade, S.; Kao, S. C.; Rastogi, D.; Ashfaq, M.; Naz, B. S.; Kabela, E.; Anantharaj, V. G.; Singh, N.; Preston, B. L.; Mei, R.
2016-12-01
Critical infrastructures are potentially vulnerable to extreme hydro-climatic events. Under a warming environment, the magnitude and frequency of extreme precipitation and flood are likely to increase, enhancing the need to more accurately quantify the risks due to climate change. In this study, we utilized an integrated modeling framework that includes the Weather Research and Forecasting (WRF) model and a high-resolution Distributed Hydrology Soil Vegetation Model (DHSVM) to simulate probable maximum precipitation (PMP) and flood (PMF) events over the Alabama-Coosa-Tallapoosa River Basin. A total of 120 storms were selected to simulate moisture-maximized PMP under different meteorological forcings, including historical storms driven by Climate Forecast System Reanalysis (CFSR) and baseline (1981-2010), near-term future (2021-2050), and long-term future (2071-2100) storms driven by Community Climate System Model version 4 (CCSM4) under the Representative Concentration Pathway 8.5 emission scenario. We also analyzed the sensitivity of PMF to various antecedent hydrologic conditions such as initial soil moisture conditions and tested different compulsive approaches. Overall, a statistically significant increase is projected for future PMP and PMF, mainly attributed to the increase in background air temperature. The ensemble of simulated PMP and PMF along with their sensitivity allows us to better quantify the potential risks associated with hydro-climatic extreme events for critical energy-water infrastructures such as major hydropower dams and nuclear power plants.
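Moisture maximization, the core of the PMP step described above, scales each storm's observed precipitation by the ratio of the maximum precipitable water available at that location and season to the precipitable water the storm actually had, then takes the envelope over storms. A minimal sketch with hypothetical storm values; the warming effect noted in the abstract appears as a larger maximum precipitable water.

```python
def moisture_maximized_precip(storm_precip_mm, storm_pw_mm, max_pw_mm):
    """Standard PMP moisture maximization: scale observed storm precipitation
    by the ratio of maximum precipitable water to the storm's actual
    precipitable water."""
    if storm_pw_mm <= 0:
        raise ValueError("storm precipitable water must be positive")
    return storm_precip_mm * (max_pw_mm / storm_pw_mm)

def pmp_from_storms(storms, max_pw_mm):
    """PMP estimate = envelope (maximum) over moisture-maximized storms.
    `storms` is a list of (precip_mm, precipitable_water_mm) tuples."""
    return max(moisture_maximized_precip(p, pw, max_pw_mm) for p, pw in storms)

# hypothetical storm sample; warmer background air holds more moisture,
# so a higher max_pw raises the PMP estimate
storms = [(250.0, 50.0), (300.0, 60.0)]
baseline_pmp = pmp_from_storms(storms, max_pw_mm=70.0)
future_pmp = pmp_from_storms(storms, max_pw_mm=80.0)
```

This is why the projected PMP increase tracks background air temperature: the Clausius-Clapeyron increase in atmospheric moisture enters directly through the max_pw numerator.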
NASA Astrophysics Data System (ADS)
Develaki, Maria
2017-11-01
Scientific reasoning is particularly pertinent to science education since it is closely related to the content and methodologies of science and contributes to scientific literacy. Much of the research in science education investigates the appropriate framework and teaching methods and tools needed to promote students' ability to reason and evaluate in a scientific way. This paper aims (a) to contribute to an extended understanding of the nature and pedagogical importance of model-based reasoning and (b) to exemplify how using computer simulations can support students' model-based reasoning. We provide first a background for both scientific reasoning and computer simulations, based on the relevant philosophical views and the related educational discussion. This background suggests that the model-based framework provides an epistemologically valid and pedagogically appropriate basis for teaching scientific reasoning and for helping students develop sounder reasoning and decision-taking abilities and explains how using computer simulations can foster these abilities. We then provide some examples illustrating the use of computer simulations to support model-based reasoning and evaluation activities in the classroom. The examples reflect the procedure and criteria for evaluating models in science and demonstrate the educational advantages of their application in classroom reasoning activities.
Study on general design of dual-DMD based infrared two-band scene simulation system
NASA Astrophysics Data System (ADS)
Pan, Yue; Qiao, Yang; Xu, Xi-ping
2017-02-01
A mid-wave infrared (MWIR) and long-wave infrared (LWIR) two-band scene simulation system is a kind of test equipment used for infrared two-band imaging seekers. Such a system must not only cover the working wavebands but also meet the essential requirement that its infrared radiation characteristics correspond to the real scene. Past single digital micromirror device (DMD) based infrared scene simulation systems did not take the large difference between target and background radiation into account and could not modulate the two-band light beams separately. Consequently, a single-DMD based infrared scene simulation system cannot accurately express the thermal scene model built by the upper computer, limiting its practicality. To solve this problem, we design a dual-DMD based, dual-channel, co-aperture, compact-structure infrared two-band scene simulation system. The operating principle of the system is introduced in detail, and the energy transfer process of the hardware-in-the-loop simulation experiment is analyzed as well. We also derive the equation for the signal-to-noise ratio of the infrared detector in the seeker, which guides the overall system design. The general design scheme of the system is given, including the creation of the infrared scene model, overall control, opto-mechanical structure design, and image registration. By analyzing and comparing past designs, we discuss the arrangement of the optical engine framework in the system. Following the working principle and overall design, we summarize each key technique in the system.
Modeling of skin cancer dermatoscopy images
NASA Astrophysics Data System (ADS)
Iralieva, Malica B.; Myakinin, Oleg O.; Bratchenko, Ivan A.; Zakharov, Valery P.
2018-04-01
Cancer identified early is more likely to respond to treatment, and its treatment is less expensive as well. Dermatoscopy is a common diagnostic technique for early detection of skin cancer that allows in vivo evaluation of colors and microstructures of skin lesions. Digital phantoms with known properties are required when developing new instruments, so that sample features can be compared with data from the instrument. An algorithm for image modeling of skin cancer is proposed in this paper. The algorithm's steps include setting the shape, generating the texture, adding the texture, and setting the normal-skin background. A Gaussian represents the shape; texture generation based on a fractal noise algorithm accounts for the spatial chromophore distributions, while the colormap applied to the values corresponds to spectral properties. Finally, a normal-skin image simulated by a mixed Monte Carlo method using a special online tool is added as the background. Varying the Asymmetry, Border, Color, and Diameter settings is shown to match the ABCD clinical recognition algorithm. Asymmetry is specified by setting different Gaussian standard deviations in different parts of the image. The noise amplitude is increased to set the irregular-borders score. The standard deviation is changed to determine the size of the lesion. Colors are set by changing the colormap. An algorithm for simulating different structural elements would be required to match other recognition algorithms.
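A minimal grayscale sketch of this phantom recipe (with assumed simplifications: uniform value noise instead of full fractal noise, and no colormap or Monte Carlo skin background) shows how the Gaussian and noise parameters map onto the ABCD settings.

```python
import math
import random

def lesion_phantom(n=64, sigma_left=8.0, sigma_right=14.0, noise_amp=0.15, seed=0):
    """Toy grayscale lesion phantom: an asymmetric Gaussian sets the shape
    (unequal sigmas on the two halves model Asymmetry), additive noise
    roughens the Borders, and sigma controls Diameter. Color would come from
    a colormap applied to these values, omitted in this grayscale sketch."""
    rng = random.Random(seed)
    cx = cy = n / 2.0
    img = []
    for y in range(n):
        row = []
        for x in range(n):
            sx = sigma_left if x < cx else sigma_right  # asymmetric halves
            g = math.exp(-(((x - cx) / sx) ** 2 + ((y - cy) / sx) ** 2) / 2.0)
            # uniform noise stands in for the fractal-noise texture step
            row.append(max(0.0, min(1.0, g + noise_amp * (rng.random() - 0.5))))
        img.append(row)
    return img

img = lesion_phantom()
```

With the larger sigma on the right half, the lesion decays more slowly to the right of center than to the left, which is exactly the per-region standard-deviation control of asymmetry described above.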
NASA Astrophysics Data System (ADS)
Stoykova, Elena; Gotchev, Atanas; Sainov, Ventseslav
2011-01-01
Real-time accomplishment of a phase-shifting profilometry through simultaneous projection and recording of fringe patterns requires a reliable phase retrieval procedure. In the present work we consider a four-wavelength multi-camera system with four sinusoidal phase gratings for pattern projection that implements a four-step algorithm. Successful operation of the system depends on overcoming two challenges which stem out from the inherent limitations of the phase-shifting algorithm, namely the demand for a sinusoidal fringe profile and the necessity to ensure equal background and contrast of fringes in the recorded fringe patterns. As a first task, we analyze the systematic errors due to the combined influence of the higher harmonics and multi-wavelength illumination in the Fresnel diffraction zone considering the case when the modulation parameters of the four gratings are different. As a second task we simulate the system performance to evaluate the degrading effect of the speckle noise and the spatially varying fringe modulation at non-uniform illumination on the overall accuracy of the profilometric measurement. We consider the case of non-correlated speckle realizations in the recorded fringe patterns due to four-wavelength illumination. Finally, we apply a phase retrieval procedure which includes normalization, background removal and denoising of the recorded fringe patterns to both simulated and measured data obtained for a dome surface.
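The four-step algorithm underlying the system has a closed form: for recorded frames I_k = A + B·cos(φ + kπ/2), the wrapped phase is atan2(I₃ − I₁, I₀ − I₂), and the same frame differences also yield the background A and fringe modulation B whose frame-to-frame equality the abstract identifies as critical. A self-contained sketch (synthetic frames, not the paper's measured data):

```python
import math

def four_step_phase(i0, i1, i2, i3):
    """Four-step phase-shifting retrieval for frames I_k = A + B*cos(phi + k*pi/2).
    Returns (wrapped phase, background A, fringe modulation B). Unequal
    background or contrast between the four frames directly biases the
    estimate, as does any non-sinusoidal (higher-harmonic) fringe profile."""
    phi = math.atan2(i3 - i1, i0 - i2)          # 2B*sin(phi), 2B*cos(phi)
    background = (i0 + i1 + i2 + i3) / 4.0      # harmonics cancel in the sum
    modulation = math.hypot(i3 - i1, i0 - i2) / 2.0
    return phi, background, modulation

# synthetic check with A = 10, B = 4, phi = 0.7 rad
A, B, true_phi = 10.0, 4.0, 0.7
frames = [A + B * math.cos(true_phi + k * math.pi / 2.0) for k in range(4)]
phi, bg, mod = four_step_phase(*frames)
```

The normalization and background-removal steps mentioned at the end of the abstract amount to equalizing A and B across the recorded patterns before this formula is applied.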
Radiative and Kinetic Feedback by Low-Mass Primordial Stars
NASA Astrophysics Data System (ADS)
Whalen, Daniel; Hueckstaedt, Robert M.; McConkie, Thomas O.
2010-03-01
Ionizing UV radiation and supernova (SN) flows amidst clustered minihalos at high redshift regulated the rise of the first stellar populations in the universe. Previous studies have addressed the effects of very massive primordial stars on the collapse of nearby halos into new stars, but the absence of the odd-even nucleosynthetic signature of pair-instability SNe in ancient metal-poor stars suggests that Population III stars may have been less than 100 M_sun. We extend our earlier survey of local UV feedback on star formation to 25-80 M_sun stars and include kinetic feedback by SNe for 25-40 M_sun stars. We find radiative feedback to be relatively uniform over this mass range, primarily because the larger fluxes of more massive stars are offset by their shorter lifetimes. Our models demonstrate that prior to the rise of global UV backgrounds, Lyman-Werner (LW) photons from nearby stars cannot prevent halos from forming new stars. These calculations also reveal that violent dynamical instabilities can erupt in the UV radiation front enveloping a primordial halo, but that they ultimately have no effect on the formation of a star. Finally, our simulations suggest that relic H II regions surrounding partially evaporated halos may expel LW backgrounds at lower redshifts, allowing stars to form that were previously suppressed. We provide fits to radiative and kinetic feedback on star formation for use in both semianalytic models and numerical simulations.
Solar energetic particle transport and the possibility of wave generation by streaming electrons
NASA Astrophysics Data System (ADS)
Strauss, R. D. T.; le Roux, J. A.
2017-12-01
After being accelerated close to the Sun, solar energetic particles (SEPs) are transported (mainly) along the turbulent interplanetary magnetic field. In this study, we simulate the propagation of 100 keV electrons as they are scattered in the interplanetary medium. A consequence of these wave-particle interactions is the possible modification (either growth or damping) of the background turbulence by anisotropic SEP electron beams. This process was thought to be negligible, and therefore neglected in past modeling approaches. However, recent observations and modeling by Agueda and Lario (2016) suggest that wave generation may be significant and is therefore included and evaluated in our present model. Our results suggest that wave amplification by streaming SEP electrons is indeed possible and may even significantly alter the background turbulent field. However, the simulations show that this process is much too weak to produce observable effects at Earth's orbit, but such effects may well be observed in future by spacecraft closer to the Sun, presenting an intriguing observational opportunity for either the Solar Orbiter or the Parker Solar Probe spacecraft. Lastly, we note that the level of perpendicular diffusion may also play an important role in determining the effectiveness of the wave growth process. Reference: Agueda, N. and Lario, D. Release History and Transport Parameters of Relativistic Solar Electrons Inferred From Near-the-Sun In Situ Observations, ApJ, 829, 131, 2016.
CMB Polarization B-mode Delensing with SPTpol and Herschel
Manzotti, A.; et al.
2017-08-30
We present a demonstration of delensing the observed cosmic microwave background (CMB) B-mode polarization anisotropy. This process of reducing the gravitational-lensing-generated B-mode component will become increasingly important for improving searches for the B modes produced by primordial gravitational waves. In this work, we delens B-mode maps constructed from multi-frequency SPTpol observations of a 90 deg(2) patch of sky by subtracting a B-mode template constructed from two inputs: SPTpol E-mode maps and a lensing potential map estimated from the Herschel 500 μm map of the cosmic infrared background. We find that our delensing procedure reduces the measured B-mode power spectrum by 28% in the multipole range 300 < ℓ < 2300; this is shown to be consistent with expectations from simulations and to be robust against systematics. The null hypothesis of no delensing is rejected at 6.9σ. Furthermore, we build and use a suite of realistic simulations to study the general properties of the delensing process and find that the delensing efficiency achieved in this work is limited primarily by the noise in the lensing potential map. We demonstrate the importance of including realistic experimental non-idealities in the delensing forecasts used to inform instrument and survey-strategy planning of upcoming lower-noise experiments, such as CMB-S4.
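Template delensing can be sketched in one dimension: subtract the best-fit scaling of the tracer-built template from the observed B map. If the correlation coefficient between map and template is ρ, optimal subtraction leaves a fraction 1 − ρ² of the original power, which is why noise in the lensing potential map limits the efficiency. The synthetic maps below are illustrative, with tracer noise chosen so that ρ² ≈ 0.5.

```python
import random

def delens(b_map, template):
    """Subtract the least-squares-scaled template from the observed map.
    With correlation rho between the two, the residual variance is
    (1 - rho^2) times the original variance."""
    n = len(b_map)
    mb = sum(b_map) / n
    mt = sum(template) / n
    cov = sum((b - mb) * (t - mt) for b, t in zip(b_map, template)) / n
    var_t = sum((t - mt) ** 2 for t in template) / n
    alpha = cov / var_t                      # best-fit template scaling
    return [b - alpha * t for b, t in zip(b_map, template)]

def var(x):
    m = sum(x) / len(x)
    return sum((v - m) ** 2 for v in x) / len(x)

rng = random.Random(7)
lens_b = [rng.gauss(0, 1) for _ in range(20000)]        # lensing B modes
tracer_noise = [rng.gauss(0, 1) for _ in range(20000)]  # imperfect tracer noise
template = [b + e for b, e in zip(lens_b, tracer_noise)]

residual = delens(lens_b, template)
reduction = 1.0 - var(residual) / var(lens_b)           # expect ~rho^2 = 0.5
```

In the same spirit, a noisier lensing-potential estimate lowers ρ and hence the achievable power reduction, consistent with the 28% figure being tracer-noise limited.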
Optical Imaging and Radiometric Modeling and Simulation
NASA Technical Reports Server (NTRS)
Ha, Kong Q.; Fitzmaurice, Michael W.; Moiser, Gary E.; Howard, Joseph M.; Le, Chi M.
2010-01-01
OPTOOL software is a general-purpose optical systems analysis tool that was developed to offer a solution to problems associated with computational programs written for the James Webb Space Telescope optical system. It integrates existing routines into coherent processes, and provides a structure with reusable capabilities that allow additional processes to be quickly developed and integrated. It has an extensive graphical user interface, which makes the tool more intuitive and user-friendly. OPTOOL is implemented in MATLAB with a Fourier optics-based approach for point spread function (PSF) calculations. It features parametric and Monte Carlo simulation capabilities, and uses a direct integration calculation to permit high spatial sampling of the PSF. Exit pupil optical path difference (OPD) maps can be generated using combinations of Zernike polynomials or shaped power spectral densities. The graphical user interface allows rapid creation of arbitrary pupil geometries, and entry of all other modeling parameters to support basic imaging and radiometric analyses. OPTOOL provides the capability to generate wavefront-error (WFE) maps for arbitrary grid sizes. These maps are 2D arrays containing digitally sampled versions of functions ranging from Zernike polynomials, to combinations of sinusoidal wave functions in 2D, to functions generated from a spatial-frequency power spectral distribution (PSD). It can also generate optical transfer functions (OTFs), which are incorporated into the PSF calculation. The user can specify radiometrics for the target and sky background, and key performance parameters for the instrument's focal plane array (FPA). This radiometric and detector model setup is fairly extensive, and includes parameters such as zodiacal background, thermal emission noise, read noise, and dark current. The setup also includes the target spectral energy distribution as a function of wavelength for polychromatic sources, the detector pixel size, and the FPA's charge diffusion modulation transfer function (MTF).
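The Fourier-optics PSF pipeline described (pupil geometry plus a Zernike-style OPD map, Fourier-transformed into a PSF) can be sketched in a few lines. The grid size, the defocus-like aberration, and its 0.05-wave amplitude are illustrative choices, not OPTOOL defaults, and the sketch uses Python rather than the tool's MATLAB.

```python
import numpy as np

# Minimal Fourier-optics PSF sketch: circular pupil, a wavefront-error
# (OPD) map, and the PSF as the squared magnitude of the FFT of the
# complex pupil function. All parameters are illustrative.
n = 256
y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
r = np.hypot(x, y)
pupil = (r <= 1.0).astype(float)

wavelength = 1.0                      # work in units of one wave
opd = 0.05 * (2 * r**2 - 1) * pupil   # defocus-like term, 0.05 waves

field = pupil * np.exp(2j * np.pi * opd / wavelength)
psf = np.abs(np.fft.fftshift(np.fft.fft2(field, s=(4 * n, 4 * n)))) ** 2
psf /= psf.sum()                      # normalize to unit energy
print("peak PSF value:", psf.max())
```

Zero-padding the FFT (the `s` parameter) is what provides the "high spatial sampling of the PSF" that the abstract attributes to direct integration; a direct-integration evaluation would sample the image plane explicitly instead.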
Differential cosmic expansion and the Hubble flow anisotropy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bolejko, Krzysztof; Nazer, M. Ahsan; Wiltshire, David L., E-mail: bolejko@physics.usyd.edu.au, E-mail: ahsan.nazer@canterbury.ac.nz, E-mail: david.wiltshire@canterbury.ac.nz
2016-06-01
The Universe on scales of 10–100 h⁻¹ Mpc is dominated by a cosmic web of voids, filaments, sheets and knots of galaxy clusters. These structures participate differently in the global expansion of the Universe: from non-expanding clusters to the above-average expansion rate of voids. In this paper we characterize Hubble expansion anisotropies in the COMPOSITE sample of 4534 galaxies and clusters. We concentrate on the dipole and quadrupole in the rest frame of the Local Group. These both have statistically significant amplitudes. These anisotropies, and their redshift dependence, cannot be explained solely by a boost of the Local Group in the Friedmann-Lemaître-Robertson-Walker (FLRW) model which expands isotropically in the rest frame of the cosmic microwave background (CMB) radiation. We simulate the local expansion of the Universe with inhomogeneous Szekeres solutions, which match the standard FLRW model on ≳ 100 h⁻¹ Mpc scales but exhibit nonkinematic relativistic differential expansion on small scales. We restrict models to be consistent with observed CMB temperature anisotropies, while simultaneously fitting the redshift variation of the Hubble expansion dipole. We include features to account for both the Local Void and the 'Great Attractor'. While this naturally accounts for the Hubble expansion and CMB dipoles, the simulated quadrupoles are smaller than observed. Further refinement to incorporate additional structures may improve this. This would enable a test of the hypothesis that some large-angle CMB anomalies result from failing to treat the relativistic differential expansion of the background geometry; a natural feature of solutions to Einstein's equations not included in the current standard model of cosmology.
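Extracting a Hubble dipole from a catalogue like COMPOSITE reduces, in its simplest form, to a linear least-squares fit of H(n̂) = H₀ + b·n̂ over the sky. The sketch below fits a synthetic catalogue with a put-in dipole; it is a minimal illustration, not the paper's analysis (which works in redshift shells and handles covariances).

```python
import numpy as np

rng = np.random.default_rng(1)
ngal = 500

# Synthetic catalogue: unit sky directions, distances, and radial
# velocities from an isotropic H0 = 70 plus an injected dipole.
nhat = rng.normal(size=(ngal, 3))
nhat /= np.linalg.norm(nhat, axis=1, keepdims=True)
r = rng.uniform(20.0, 100.0, ngal)               # distances, h^-1 Mpc
dipole_true = np.array([3.0, -1.0, 2.0])         # km/s/Mpc
cz = r * (70.0 + nhat @ dipole_true) + rng.normal(0.0, 50.0, ngal)

# Linear least squares for H(n) = H0 + b . n using per-object cz/r:
A = np.column_stack([np.ones(ngal), nhat])
coef, *_ = np.linalg.lstsq(A, cz / r, rcond=None)
h0_fit, dipole_fit = coef[0], coef[1:]
print("H0 =", h0_fit, " dipole =", dipole_fit)
```

Repeating such a fit in independent redshift shells gives the redshift dependence of the dipole that the paper uses to discriminate a simple boost from differential expansion.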
Yu, Mi; Kang, Kyung Ja
2017-06-01
Accurate, skilled communication in handover is a high priority in maintaining patient safety. Nursing students have few chances to practice nurse-to-doctor handover in clinical training, and some have little knowledge of what constitutes effective handover or lack confidence in conveying information. This study aimed to develop a role-play simulation program involving the Situation, Background, Assessment, Recommendation (SBAR) technique for nurse-to-doctor handover; implement the program; and analyze its effects on SBAR communication, communication clarity, handover confidence, and education satisfaction in nursing students. A non-equivalent control-group pretest-posttest quasi-experimental design was used with a convenience sample of 62 senior nursing students from two Korean universities. The differences in SBAR communication, communication clarity, handover confidence, and education satisfaction between the control and intervention groups were measured before and after program participation. The intervention group showed higher SBAR communication scores (t=-3.05, p=0.003); communication clarity scores in doctor notification scenarios (t=-5.50, p<0.001); and SBAR education satisfaction scores (t=-4.94, p<0.001) relative to those of the control group. There was no significant difference in handover confidence between groups (t=-1.97, p=0.054). The role-play simulation program developed in this study could be used to promote communication skills in nurse-to-doctor handover and cultivate communicative competence in nursing students. Copyright © 2017. Published by Elsevier Ltd.
A magnetic diverter for charged particle background rejection in the SIMBOL-X telescope
NASA Astrophysics Data System (ADS)
Spiga, D.; Fioretti, V.; Bulgarelli, A.; Dell'Orto, E.; Foschini, L.; Malaguti, G.; Pareschi, G.; Tagliaferri, G.; Tiengo, A.
2008-07-01
Minimization of the charged particle background in X-ray telescopes is a well-known issue. Charged particles (chiefly protons and electrons) naturally present in the cosmic environment constitute an important background source when they collide with the X-ray detector. Even worse, a serious degradation of the spectroscopic performance of the X-ray detectors of Chandra and XMM-Newton was observed, caused by soft protons with kinetic energies between 100 keV and a few MeV being collected by the grazing-incidence mirrors and funneled to the detectors. For a focusing telescope like SIMBOL-X, the exposure of the soft X-ray detector to the proton flux can significantly increase the instrumental background, with a consequent loss of sensitivity. In the worst case, it can also seriously shorten the detector lifetime. A well-known countermeasure is the implementation of a properly designed magnetic diverter, which should prevent high-energy particles from reaching the focal plane instruments of SIMBOL-X. Although XMM-Newton and Swift-XRT are equipped with magnetic diverters for electrons, the magnetic fields used are insufficient to act effectively on protons. In this paper, we simulate the behavior of a magnetic diverter for SIMBOL-X consisting of commercially available permanent magnets. The effect of the SIMBOL-X optics is simulated with the GEANT4 libraries, whereas the effect of the intense magnetic fields required is simulated with specifically written numerical codes in IDL.
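The reason electron diverters fail on protons is a simple rigidity argument: at the same kinetic energy a proton carries far more momentum, so its gyroradius in the same field is tens of times larger. The sketch below evaluates the relativistic gyroradius r = p/(qB); the 100 gauss field is a generic illustrative value, not the SIMBOL-X design field.

```python
import numpy as np

# Relativistic gyroradius r = p / (qB) for electrons vs. protons at the
# soft-proton energies quoted (~100 keV). The field value is generic,
# not the SIMBOL-X diverter design.
QE = 1.602176634e-19      # elementary charge, C
C = 2.99792458e8          # speed of light, m/s
ME = 9.1093837015e-31     # electron mass, kg
MP = 1.67262192369e-27    # proton mass, kg

def gyroradius(kinetic_ev, mass_kg, b_tesla):
    """Gyroradius of a charged particle of given kinetic energy."""
    e0 = mass_kg * C**2                        # rest energy, J
    ek = kinetic_ev * QE                       # kinetic energy, J
    p = np.sqrt((ek + e0) ** 2 - e0**2) / C    # relativistic momentum
    return p / (QE * b_tesla)

B = 0.01  # 100 gauss, an illustrative diverter-scale field
r_e = gyroradius(100e3, ME, B)   # 100 keV electron
r_p = gyroradius(100e3, MP, B)   # 100 keV proton
print(f"electron: {r_e * 100:.1f} cm, proton: {r_p * 100:.1f} cm")
```

A field that curls a 100 keV electron within roughly a decimeter leaves a 100 keV proton on a meters-scale arc, which is why a proton diverter needs the much more intense fields simulated in the paper.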
THz electromagnetic radiation driven by intense relativistic electron beam based on ion focus regime
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Qing; Xu, Jin; Zhang, Wenchao
The simulation study finds that a relativistic electron beam propagating through a plasma background can produce electromagnetic (EM) radiation. As the electron beam propagates, oscillations of the beam electrons in the transverse and longitudinal directions are observed simultaneously, which provides the basis for the electromagnetic radiation. The simulation results clearly show that the electromagnetic radiation frequency can reach the terahertz (THz) band, which may result from the filter-like property of the plasma background, and that the radiation frequency depends closely on the plasma density. To understand these simulation results physically, the dispersion relation of the beam-plasma system has been derived using the field-matching method; the dispersion curves show that the slow-wave modes can couple effectively with the electron beam in the THz band, which provides important theoretical evidence for the EM radiation.
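The stated dependence of radiation frequency on plasma density can be given a back-of-envelope scale through the electron plasma frequency, f_pe = (1/2π)·√(nₑe²/ε₀mₑ): THz frequencies correspond to densities around 10¹⁶ cm⁻³. This is a generic textbook estimate, not a number from the study.

```python
import numpy as np

# Electron plasma frequency vs. density: a generic scale estimate for
# why the radiation frequency tracks the plasma density.
E = 1.602176634e-19       # elementary charge, C
EPS0 = 8.8541878128e-12   # vacuum permittivity, F/m
ME = 9.1093837015e-31     # electron mass, kg

def plasma_frequency_hz(n_e_per_cm3):
    """Electron plasma frequency f_pe for a density in cm^-3."""
    n_m3 = n_e_per_cm3 * 1e6
    return np.sqrt(n_m3 * E**2 / (EPS0 * ME)) / (2.0 * np.pi)

for n in (1e14, 1e16, 1e18):
    print(f"n_e = {n:.0e} cm^-3 -> f_pe = {plasma_frequency_hz(n):.2e} Hz")
```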
A generalized transport-velocity formulation for smoothed particle hydrodynamics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Chi; Hu, Xiangyu Y., E-mail: xiangyu.hu@tum.de; Adams, Nikolaus A.
The standard smoothed particle hydrodynamics (SPH) method suffers from tensile instability. In fluid-dynamics simulations this instability leads to particle clumping and void regions when negative pressure occurs. In solid-dynamics simulations, it results in unphysical structure fragmentation. In this work the transport-velocity formulation of Adami et al. (2013) is generalized to provide a solution to this long-standing problem. Instead of imposing a global background pressure, a variable background pressure is used to modify the particle transport velocity and eliminate the tensile instability completely. Furthermore, the modification is localized by defining a shortened smoothing length. The generalized formulation is suitable for fluid and solid materials, with and without free surfaces. The results of extensive numerical tests on both fluid- and solid-dynamics problems indicate that the new method provides a unified approach for multi-physics SPH simulations.
Impact of reconstruction parameters on quantitative I-131 SPECT
NASA Astrophysics Data System (ADS)
van Gils, C. A. J.; Beijst, C.; van Rooij, R.; de Jong, H. W. A. M.
2016-07-01
Radioiodine therapy using I-131 is widely used for treatment of thyroid disease or neuroendocrine tumors. Monitoring treatment by accurate dosimetry requires quantitative imaging. The high-energy photons, however, render quantitative SPECT reconstruction challenging, potentially requiring accurate correction for scatter and collimator effects. The goal of this work is to assess the effectiveness of various correction methods for these effects using phantom studies. A SPECT/CT acquisition of the NEMA IEC body phantom was performed. Images were reconstructed using the following parameters: (1) without scatter correction, (2) with triple energy window (TEW) scatter correction and (3) with Monte Carlo-based scatter correction. For modelling the collimator-detector response (CDR), both (a) geometric Gaussian CDRs and (b) Monte Carlo simulated CDRs were compared. Quantitative accuracy, contrast-to-noise ratios and recovery coefficients were calculated, as well as the background variability and the residual count error in the lung insert. The Monte Carlo scatter corrected reconstruction method was shown to be intrinsically quantitative, requiring no experimentally acquired calibration factor. It resulted in a more accurate quantification of the background compartment activity density compared with TEW or no scatter correction: the quantification error relative to a dose calibrator derived measurement was found to be <1%, −26% and 33%, respectively. The adverse effects of partial volume were significantly smaller with the Monte Carlo simulated CDR correction compared with geometric Gaussian or no CDR modelling. Scatter correction showed a small effect on quantification of small volumes. When using a weighting factor, TEW correction was comparable to Monte Carlo reconstruction in all measured parameters, although this approach is clinically impractical since the factor may be patient dependent.
Monte Carlo based scatter correction including accurately simulated CDR modelling is the most robust and reliable method to reconstruct accurate quantitative iodine-131 SPECT images.
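The TEW estimate referenced in reconstruction (2) is commonly the trapezoidal interpolation between two narrow windows flanking the photopeak. The sketch below applies that standard formula; the window widths and counts are illustrative, not the acquisition settings of this study.

```python
import numpy as np

# Triple-energy-window (TEW) scatter estimate: scatter inside the
# photopeak window approximated by a trapezoid between two narrow
# flanking windows. Widths (keV) and counts are illustrative.
def tew_scatter(c_lower, c_upper, w_lower, w_upper, w_peak):
    """Estimated scatter counts in the photopeak window."""
    return (c_lower / w_lower + c_upper / w_upper) * w_peak / 2.0

c_peak = np.array([1200.0, 900.0, 400.0])  # photopeak counts per pixel
c_low = np.array([60.0, 45.0, 20.0])       # lower scatter-window counts
c_high = np.array([40.0, 30.0, 12.0])      # upper scatter-window counts

scatter = tew_scatter(c_low, c_high, w_lower=6.0, w_upper=6.0, w_peak=20.0)
primary = np.clip(c_peak - scatter, 0.0, None)  # scatter-corrected counts
print(primary)
```

The correction is applied per projection pixel before (or within) iterative reconstruction; the clip guards against negative counts in low-statistics pixels.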
Imaging Sensor Flight and Test Equipment Software
NASA Technical Reports Server (NTRS)
Freestone, Kathleen; Simeone, Louis; Robertson, Byran; Frankford, Maytha; Trice, David; Wallace, Kevin; Wilkerson, DeLisa
2007-01-01
The Lightning Imaging Sensor (LIS) is one of the components onboard the Tropical Rainfall Measuring Mission (TRMM) satellite, and was designed to detect and locate lightning over the tropics. The LIS flight code was developed to run on a single onboard digital signal processor, and has operated the LIS instrument since 1997 when the TRMM satellite was launched. The software provides controller functions to the LIS Real-Time Event Processor (RTEP) and onboard heaters, collects the lightning event data from the RTEP, compresses and formats the data for downlink to the satellite, collects housekeeping data and formats the data for downlink to the satellite, provides command processing and interface to the spacecraft communications and data bus, and provides watchdog functions for error detection. The Special Test Equipment (STE) software was designed to operate specific test equipment used to support the LIS hardware through development, calibration, qualification, and integration with the TRMM spacecraft. The STE software provides the capability to control instrument activation, commanding (including both data formatting and user interfacing), data collection, decompression, and display and image simulation. The LIS STE code was developed for the DOS operating system in the C programming language. Because of the many unique data formats implemented by the flight instrument, the STE software was required to comprehend the same formats, and translate them for the test operator. The hardware interfaces to the LIS instrument using both commercial and custom computer boards, requiring that the STE code integrate this variety into a working system. In addition, the requirement to provide RTEP test capability dictated the need to provide simulations of background image data with short-duration lightning transients superimposed. 
This led to the development of unique code used to control the location, intensity, and variation above background for simulated lightning strikes at user-selected locations.
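The kind of RTEP test imagery described, a background frame with a short-duration lightning transient at a user-selected location and intensity above background, can be sketched as follows. The frame size, Poisson background level, and Gaussian transient shape are assumptions for illustration; the actual STE code is C under DOS and its transient model is not specified here.

```python
import numpy as np

rng = np.random.default_rng(7)

# Illustrative test frame: Poisson background plus a localized
# "lightning" transient at a chosen pixel and amplitude. Shapes and
# levels are assumptions, not the STE's actual model.
def simulated_frame(shape, background_level, strike_xy, amplitude, sigma=1.5):
    noise = rng.poisson(background_level, size=shape).astype(float)
    y, x = np.indices(shape)
    sx, sy = strike_xy
    transient = amplitude * np.exp(
        -((x - sx) ** 2 + (y - sy) ** 2) / (2.0 * sigma**2))
    return noise + transient

frame = simulated_frame((128, 128), background_level=100.0,
                        strike_xy=(40, 64), amplitude=500.0)
iy, ix = np.unravel_index(np.argmax(frame), frame.shape)
print("brightest pixel:", (ix, iy))
```

Varying `amplitude` against the background level exercises the event-detection threshold the same way the "variation above background" control in the STE is described to.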
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guler, Hayg
2003-12-17
In the framework of quantum chromodynamics, the nucleon is made of three valence quarks surrounded by a sea of gluons and quark-antiquark pairs. Only the lightest quarks (u, d and s) contribute significantly to the nucleon properties. In the Go experiment we use the property of the weak interaction to violate parity symmetry in order to determine separately the contributions of the three types of quarks to the nucleon form factors. The experiment, which takes place at the Thomas Jefferson laboratory (USA), aims at measuring the parity-violation asymmetry in electron-proton scattering. Several measurements at different squared momenta of the exchanged photon and for different kinematics (forward angle, where the proton is detected, and backward angle, where the electron is detected) will permit determining separately the strange-quark electric and magnetic contributions to the nucleon form factors. To extract an asymmetry with small errors, it is necessary to correct for all the beam parameters and to have high enough counting rates in the detectors. Special electronics were developed to treat the information coming from 16 scintillator pairs for each of the 8 sectors of the Go spectrometer. A complete calculation of radiative corrections has been done, and Monte Carlo simulations with the GEANT program have permitted determining the shape of the experimental spectra, including the inelastic background. This work will allow a comparison between experimental data and theoretical calculations based on the Standard Model.
The perception of isoluminant coloured stimuli of amblyopic eye and defocused eye
NASA Astrophysics Data System (ADS)
Krumina, Gunta; Ozolinsh, Maris; Ikaunieks, Gatis
2008-09-01
In routine eye examination the visual acuity is usually determined using standard charts with black letters on a white background; however, contrast and colour are important characteristics of visual perception. The purpose of this research was to study the perception of isoluminant coloured stimuli in cases of true and simulated amblyopia. We estimated the difference in visual acuity with isoluminant coloured stimuli compared with that for high-contrast black-white stimuli, for true and simulated amblyopia. Tests were generated on a computer screen. Visual acuity was measured using different charts in two ways: with standard achromatic stimuli (black symbols on a white background) and with isoluminant coloured stimuli (white symbols on a yellow background; grey symbols on a blue, green or red background). Thus the isoluminant tests had colour contrast only and no luminance contrast. Visual acuity evaluated with the standard method and with the colour tests was studied for subjects with good visual acuity, if necessary using the best vision correction. The same was performed for subjects with a defocused eye and with true amblyopia. Defocus was realized with optical lenses placed in front of the normal eye. The results obtained with the isoluminant colour charts revealed a worsening of visual acuity compared with the visual acuity estimated with the standard high-contrast method (black symbols on a white background).
2011-01-01
Background Valve dysfunction is a common cardiovascular pathology. Despite significant clinical research, there is little formal study of how valve dysfunction affects overall circulatory dynamics. Validated models would offer the ability to better understand these dynamics and thus optimize diagnosis, as well as surgical and other interventions. Methods A cardiovascular and circulatory system (CVS) model has already been validated in silico, and in several animal model studies. It accounts for valve dynamics using Heaviside functions to simulate a physiologically accurate "open on pressure, close on flow" law. However, it does not consider real-time valve opening dynamics and therefore does not fully capture valve dysfunction, particularly where the dysfunction involves partial closure. This research describes an updated version of this previous closed-loop CVS model that includes the progressive opening of the mitral valve, and is defined over the full cardiac cycle. Results Simulations of the cardiovascular system with a healthy mitral valve are performed, and the global hemodynamic behaviour is studied in comparison with previously validated results. The error between the resulting pressure-volume (PV) loops of the already validated CVS model and the new CVS model that includes the progressive opening of the mitral valve is assessed and remains within typical measurement error and variability. Simulations of ischemic mitral insufficiency are also performed. Pressure-volume loops, transmitral flow evolution and mitral valve aperture area evolution follow reported measurements in shape, amplitude and trends. Conclusions The resulting cardiovascular system model including mitral valve dynamics provides a foundation for clinical validation and the study of valvular dysfunction in vivo. The overall models and results could readily be generalised to other cardiac valves. PMID:21942971
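The "open on pressure, close on flow" Heaviside law that the CVS model family uses can be sketched in one dimension: flow through the valve is driven by the pressure gradient through a Heaviside gate, opening when upstream pressure exceeds downstream pressure and closing when forward flow dies away. All parameter values below (resistance, inertance, pressures) are arbitrary illustrative units, not the model's fitted values.

```python
import numpy as np

# Sketch of the Heaviside valve law: one explicit Euler step of
#   dQ/dt = H(gate) * (dP - R*Q) / L,  with Q clipped at zero.
# The gate stays open while dP > 0 (open on pressure) or while forward
# flow persists, and the clip closes the valve on reversed flow.
def heaviside(x):
    return 1.0 if x > 0.0 else 0.0

def valve_flow(p_up, p_down, q, resistance, inertance, dt):
    dp = p_up - p_down
    gate = heaviside(dp) or heaviside(q)
    dqdt = gate * (dp - resistance * q) / inertance
    return max(q + dt * dqdt, 0.0)          # close on (reversed) flow

# Drive with a half-sine "ventricular" pressure pulse against a fixed
# downstream pressure (all values illustrative).
q, flows = 0.0, []
for t in np.arange(0.0, 0.8, 1e-3):
    p_up = 120.0 * max(np.sin(2.0 * np.pi * t / 0.8), 0.0)
    q = valve_flow(p_up, 80.0, q, resistance=0.05, inertance=0.01, dt=1e-3)
    flows.append(q)
print("peak flow:", max(flows), " end flow:", flows[-1])
```

Flow rises only once the pressure difference turns positive and returns exactly to zero by the end of the cycle; the paper's refinement replaces this all-or-nothing gate with a progressively opening aperture.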
Vande Geest, Jonathan P; Simon, B R; Rigby, Paul H; Newberg, Tyler P
2011-04-01
Finite element models (FEMs) including characteristic large deformations in highly nonlinear materials (hyperelasticity and coupled diffusive/convective transport of neutral mobile species) will allow quantitative study of in vivo tissues. Such FEMs will provide a basic understanding of normal and pathological tissue responses and lead to optimization of local drug delivery strategies. We present a coupled porohyperelastic mass transport (PHEXPT) finite element approach developed using the commercially available ABAQUS finite element software. The PHEXPT transient simulations are based on sequential solution of the porohyperelastic (PHE) and mass transport (XPT) problems, where an Eulerian PHE FEM is coupled to a Lagrangian XPT FEM using a custom-written FORTRAN program. The PHEXPT theoretical background is derived in the context of porous media transport theory and extended to ABAQUS finite element formulations. The essential assumptions needed in order to use ABAQUS are clearly identified in the derivation. Representative benchmark finite element simulations are provided along with analytical solutions (when appropriate). These simulations demonstrate the differences in transient and steady-state responses including finite deformations, total stress, fluid pressure, relative fluid flux, and mobile species flux. A detailed description of important model considerations (e.g., material property functions and jump discontinuities at material interfaces) is also presented in the context of finite deformations. The ABAQUS-based PHEXPT approach enables the use of the available ABAQUS capabilities (interactive FEM mesh generation, finite element libraries, nonlinear material laws, pre- and postprocessing, etc.). PHEXPT FEMs can be used to simulate the transport of a relatively large neutral species (negligible osmotic fluid flux) in highly deformable hydrated soft tissues and tissue-engineered materials.
NASA Astrophysics Data System (ADS)
Lasa, A.; Borodin, D.; Canik, J. M.; Klepper, C. C.; Groth, M.; Kirschner, A.; Airila, M. I.; Borodkina, I.; Ding, R.; Contributors, JET
2018-01-01
Experiments at JET showed locally enhanced, asymmetric beryllium (Be) erosion at outer wall limiters when magnetically connected ICRH antennas were in operation. A first modeling effort using the 3D erosion and scrape-off layer impurity transport modeling code ERO reproduced qualitatively the experimental outcome. However, local plasma parameters—in particular when 3D distributions are of interest—can be difficult to determine from available diagnostics and so erosion / impurity transport modeling input relies on output from other codes and simplified models, increasing uncertainties in the outcome. In the present contribution, we introduce and evaluate the impact of improved models and parameters with largest uncertainties of processes that impact impurity production and transport across the scrape-off layer, when simulated in ERO: (i) the magnetic geometry has been revised, for affecting the separatrix position (located 50-60 mm away from limiter surface) and thus the background plasma profiles; (ii) connection lengths between components, which lead to shadowing of ion fluxes, are also affected by the magnetic configuration; (iii) anomalous transport of ionized impurities, defined by the perpendicular diffusion coefficient, has been revisited; (iv) erosion yields that account for energy and angular distributions of background plasma ions under the present enhanced sheath potential and oblique magnetic field, have been introduced; (v) the effect of additional erosion sources, such as charge-exchange neutral fluxes, which are dominant in recessed areas like antennas, has been evaluated; (vi) chemically assisted release of Be in molecular form has been included. Sensitivity analysis highlights a qualitative effect (i.e. change in emission patterns) of magnetic shadowing, anomalous diffusion, and inclusion of neutral fluxes and molecular release of Be. 
The separatrix location, and the energy and angular distributions of background plasma fluxes, impact erosion quantitatively. ERO simulations that include all the features described above match the experimentally measured Be I (457.3 nm) and Be II (467.4 nm) signals, and erosion increases with the ICRH antenna's RF power. However, this increase in erosion is only partially captured by ERO's emission measurements, as most contributions from plasma-wetted surfaces fall outside the volume observed by the sightlines.
Simulations of Madden-Julian Oscillation in High Resolution Atmospheric General Circulation Model
NASA Astrophysics Data System (ADS)
Deng, Liping; Stenchikov, Georgiy; McCabe, Matthew; Bangalath, HamzaKunhu; Raj, Jerry; Osipov, Sergey
2014-05-01
The simulation of tropical signals, especially the Madden-Julian Oscillation (MJO), is one of the major deficiencies in current numerical models. Unrealistic features in MJO simulations include weak amplitude, excess power at high frequencies, displaced temporal and spatial distributions, eastward propagation that is too fast, and a lack of coherent structure in the eastward propagation from the Indian Ocean to the Pacific (e.g., Slingo et al. 1996). While some improvement in simulating MJO variance and coherent eastward propagation has been attributed to model physics, the model mean background state and air-sea interaction, studies have shown that model resolution, especially higher horizontal resolution, may play an important role in producing a more realistic simulation of the MJO (e.g., Sperber et al. 2005). In this study, we employ unique high-resolution (25-km) simulations conducted using the Geophysical Fluid Dynamics Laboratory global High Resolution Atmospheric Model (HIRAM) to evaluate the MJO simulation against the European Centre for Medium-Range Weather Forecasts (ECMWF) Interim re-analysis (ERAI) dataset. We specifically focus on the ability of the model to represent the MJO-related amplitude, spatial distribution, eastward propagation, and horizontal and vertical structures. Additionally, as the HIRAM output covers not only a historic period (1979-2012) but also a future period (2012-2050), the impact of future climate change on the MJO is illustrated. Possible changes in the intensity and frequency of extreme weather and climate events (e.g., strong wind and heavy rainfall) in the western Pacific, the Indian Ocean and the Middle East North Africa (MENA) region are highlighted.
The Community Climate System Model.
NASA Astrophysics Data System (ADS)
Blackmon, Maurice; Boville, Byron; Bryan, Frank; Dickinson, Robert; Gent, Peter; Kiehl, Jeffrey; Moritz, Richard; Randall, David; Shukla, Jagadish; Solomon, Susan; Bonan, Gordon; Doney, Scott; Fung, Inez; Hack, James; Hunke, Elizabeth; Hurrell, James; Kutzbach, John; Meehl, Jerry; Otto-Bliesner, Bette; Saravanan, R.; Schneider, Edwin K.; Sloan, Lisa; Spall, Michael; Taylor, Karl; Tribbia, Joseph; Washington, Warren
2001-11-01
The Community Climate System Model (CCSM) has been created to represent the principal components of the climate system and their interactions. Development and applications of the model are carried out by the U.S. climate research community, thus taking advantage of both wide intellectual participation and computing capabilities beyond those available to most individual U.S. institutions. This article outlines the history of the CCSM, its current capabilities, and plans for its future development and applications, with the goal of providing a summary useful to present and future users. The initial version of the CCSM included atmosphere and ocean general circulation models, a land surface model that was grafted onto the atmosphere model, a sea-ice model, and a flux coupler that facilitates information exchanges among the component models with their differing grids. This version of the model produced a successful 300-yr simulation of the current climate without artificial flux adjustments. The model was then used to perform a coupled simulation in which the atmospheric CO2 concentration increased by 1% per year. In this version of the coupled model, the ocean salinity and deep-ocean temperature slowly drifted away from observed values. A subsequent correction to the roughness length used for sea ice significantly reduced these errors. An updated version of the CCSM was used to perform three simulations of the twentieth century's climate, and several projections of the climate of the twenty-first century. The CCSM's simulation of the tropical ocean circulation has been significantly improved by reducing the background vertical diffusivity and incorporating an anisotropic horizontal viscosity tensor. The meridional resolution of the ocean model was also refined near the equator. These changes have resulted in a greatly improved simulation of both the Pacific equatorial undercurrent and the surface countercurrents.
The interannual variability of the sea surface temperature in the central and eastern tropical Pacific is also more realistic in simulations with the updated model. Scientific challenges to be addressed with future versions of the CCSM include realistic simulation of the whole atmosphere, including the middle and upper atmosphere, as well as the troposphere; simulation of changes in the chemical composition of the atmosphere through the incorporation of an integrated chemistry model; inclusion of global, prognostic biogeochemical components for land, ocean, and atmosphere; simulations of past climates, including times of extensive continental glaciation as well as times with little or no ice; studies of natural climate variability on seasonal-to-centennial timescales; and investigations of anthropogenic climate change. In order to make such studies possible, work is under way to improve all components of the model. Plans call for a new version of the CCSM to be released in 2002. Planned studies with the CCSM will require much more computer power than is currently available.
Olasky, Jaisa; Sankaranarayanan, Ganesh; Seymour, Neal E.; Magee, J. Harvey; Enquobahrie, Andinet; Lin, Ming C.; Aggarwal, Rajesh; Brunt, L. Michael; Schwaitzberg, Steven D.; Cao, Caroline G. L.; De, Suvranu; Jones, Daniel B.
2015-01-01
Objectives To conduct a review of the state of virtual reality (VR) simulation technology, to identify areas of surgical education that have the greatest potential to benefit from it, and to identify challenges to implementation. Background Data Simulation is an increasingly important part of surgical training. VR is a developing platform for using simulation to teach technical skills, behavioral skills, and entire procedures to trainees and practicing surgeons worldwide. Questions exist regarding the science behind the technology and most effective usage of VR simulation. A symposium was held to address these issues. Methods Engineers, educators, and surgeons held a conference in November 2013 both to review the background science behind simulation technology and to create guidelines for its use in teaching and credentialing trainees and surgeons in practice. Results Several technologic challenges were identified that must be overcome in order for VR simulation to be useful in surgery. Specific areas of student, resident, and practicing surgeon training and testing that would likely benefit from VR were identified: technical skills, team training and decision-making skills, and patient safety, such as in use of electrosurgical equipment. Conclusions VR simulation has the potential to become an essential piece of surgical education curriculum but depends heavily on the establishment of an agreed upon set of goals. Researchers and clinicians must collaborate to allocate funding toward projects that help achieve these goals. The recommendations outlined here should guide further study and implementation of VR simulation. PMID:25925424
Extraction of gravitational waves in numerical relativity.
Bishop, Nigel T; Rezzolla, Luciano
2016-01-01
A numerical-relativity calculation yields in general a solution of the Einstein equations including also a radiative part, which is in practice computed in a region of finite extent. Since gravitational radiation is properly defined only at null infinity and in an appropriate coordinate system, the accurate estimation of the emitted gravitational waves represents an old and non-trivial problem in numerical relativity. A number of methods have been developed over the years to "extract" the radiative part of the solution from a numerical simulation; these include quadrupole formulas, gauge-invariant metric perturbations, Weyl scalars, and characteristic extraction. We review and discuss each method, in terms of both its theoretical background and its implementation. Finally, we provide a brief comparison of the various methods in terms of their inherent advantages and disadvantages.
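As a pointer to the simplest of the extraction methods listed above, the standard weak-field quadrupole formula estimates the strain at distance D from the source's trace-free mass quadrupole moment (a textbook expression, not the specific implementation discussed in the review):

```latex
h^{TT}_{jk}(t) \;\approx\; \frac{2G}{c^4 D}\,\ddot{Q}^{TT}_{jk}\!\left(t - \frac{D}{c}\right),
\qquad
Q_{jk} = \int \rho(\mathbf{x})\left(x_j x_k - \tfrac{1}{3}\delta_{jk}\,r^2\right)\mathrm{d}^3x .
```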
NASA Astrophysics Data System (ADS)
Brissaud, Q.; Garcia, R.; Sladen, A.; Martin, R.; Komatitsch, D.
2016-12-01
Acoustic and gravity waves propagating in planetary atmospheres have been studied intensively as markers of specific phenomena (tectonic events, explosions) or as contributors to atmosphere dynamics. To get a better understanding of the physics behind these dynamic processes, the propagation of both acoustic and gravity waves should be modeled in an attenuating and windy 3D atmosphere from the ground all the way to the upper thermosphere. Thus, in order to provide an efficient numerical tool at the regional or global scale, we introduce a high-order finite-difference time-domain (FDTD) approach that relies on the linearized compressible Navier-Stokes equations with spatially non-constant physical parameters (density, viscosities and speed of sound) and background velocities (wind). We present applications of these simulations to the propagation of gravity waves generated by tsunamis in realistic cases, for which atmospheric models are extracted from empirical models including the variation of atmospheric parameters with altitude, and the tsunami forcing at the ocean surface is extracted from shallow-water simulations. We describe the specific difficulties induced by the size of the simulation, the boundary conditions and the spherical geometry, and compare the simulation outputs to data gathered by gravimetric satellites crossing gravity waves generated by tsunamis.
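The essence of an FDTD scheme for linearized atmospheric acoustics can be sketched in one dimension: pressure and velocity live on staggered grids and are advanced alternately. This is a minimal illustration, far simpler than the 3-D windy, viscous solver the abstract describes, and all parameter values are invented for the sketch:

```python
import numpy as np

# 1-D staggered-grid FDTD for linearized acoustics with an
# altitude-dependent background density:
#   dv/dt = -(1/rho) dp/dz,   dp/dt = -rho c^2 dv/dz.

nz, nt = 400, 800
dz = 100.0                                # grid spacing [m]
z = np.arange(nz) * dz
rho = 1.2 * np.exp(-z / 8000.0)           # exponential background density
c = 340.0                                 # constant sound speed [m/s]
dt = 0.4 * dz / c                         # CFL-stable time step

p = np.exp(-((z - 5000.0) / 500.0) ** 2)  # Gaussian pressure pulse at 5 km
v = np.zeros(nz + 1)                      # velocity on the staggered grid
rho_v = 0.5 * (rho[1:] + rho[:-1])        # density at interior velocity points

for _ in range(nt):
    v[1:-1] -= dt / rho_v * (p[1:] - p[:-1]) / dz   # momentum equation
    p -= dt * rho * c**2 * (v[1:] - v[:-1]) / dz    # mass/state equation
```

The real solver must additionally handle wind advection terms, viscous attenuation, absorbing boundaries, and spherical geometry, which this sketch omits.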
Tang, Shuaiqi; Zhang, Minghua; Xie, Shaocheng
2016-01-05
Large-scale atmospheric forcing data can greatly impact the simulations of atmospheric process models including Large Eddy Simulations (LES), Cloud Resolving Models (CRMs) and Single-Column Models (SCMs), and impact the development of physical parameterizations in global climate models. This study describes the development of an ensemble variationally constrained objective analysis of atmospheric large-scale forcing data and its application to evaluate the cloud biases in the Community Atmospheric Model (CAM5). Sensitivities of the variational objective analysis to background data, error covariance matrix and constraint variables are described and used to quantify the uncertainties in the large-scale forcing data. Application of the ensemble forcing in the CAM5 SCM during the March 2000 intensive operational period (IOP) at the Southern Great Plains (SGP) site of the Atmospheric Radiation Measurement (ARM) program shows systematic biases in the model simulations that cannot be explained by the uncertainty of the large-scale forcing data, which points to deficiencies of the physical parameterizations. The SCM is shown to overestimate high clouds and underestimate low clouds. These biases are found to also exist in the global simulation of CAM5 when it is compared with satellite data.
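The core of a variationally constrained objective analysis can be sketched as a minimum-adjustment problem: perturb a background state as little as possible, in the metric of the background error covariance, while exactly satisfying linear budget constraints. A toy numpy version, with all numbers invented for illustration:

```python
import numpy as np

# Minimally adjust a background state x_b (in the B-norm) so that it
# satisfies a linear budget constraint A x = b, analogous in spirit to
# constraining sounding-derived forcing with column budgets.

x_b = np.array([3.0, 2.0, 1.0])        # background profile
B = np.diag([0.5, 1.0, 2.0])           # background error covariance
A = np.array([[1.0, 1.0, 1.0]])        # constraint operator: column total
b = np.array([5.4])                    # observed budget value

# minimize (x - x_b)^T B^{-1} (x - x_b) s.t. A x = b
# (Lagrange multipliers give a closed-form solution)
K = B @ A.T @ np.linalg.inv(A @ B @ A.T)
x = x_b + K @ (b - A @ x_b)
```

Re-solving with the background, covariance, and constraint values perturbed within their uncertainties produces an ensemble of analyses, which is how an ensemble of forcing datasets of the kind the study describes can be built.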
Prevailing Trends in Haptic Feedback Simulation for Minimally Invasive Surgery.
Pinzon, David; Byrns, Simon; Zheng, Bin
2016-08-01
Background The amount of direct hand-tool-tissue interaction and feedback in minimally invasive surgery varies from being attenuated in laparoscopy to being completely absent in robotic minimally invasive surgery. The role of haptic feedback during surgical skill acquisition and its emphasis in training have been a constant source of controversy. This review discusses the major developments in haptic simulation as they relate to surgical performance and the current research questions that remain unanswered. Search Strategy An in-depth review of the literature was performed using PubMed. Results A total of 198 abstracts were returned based on our search criteria. Three major areas of research were identified: advancements in one of the four components of haptic systems, evaluation of the effectiveness of haptic integration in simulators, and improvements to haptic feedback in robotic surgery. Conclusions Force feedback is the best method for tissue identification in minimally invasive surgery, and haptic feedback provides the greatest benefit to surgical novices in the early stages of their training. New technology has improved our ability to capture, play back and enhance the utility of haptic cues in simulated surgery. Future research should focus on deciphering how haptic training in surgical education can improve performance, safety, and training efficiency. © The Author(s) 2016.
Funaki, Ayumu; Ohkubo, Masaki; Wada, Shinichi; Murao, Kohei; Matsumoto, Toru; Niizuma, Shinji
2012-07-01
With the wide dissemination of computed tomography (CT) screening for lung cancer, measuring nodule volume accurately with computer-aided volumetry software is increasingly important. Many studies of the accuracy of volumetry software have been performed using a phantom with artificial nodules. Such phantom studies are limited, however, in their ability to reproduce nodules both accurately and in the variety of sizes and densities required. Therefore, we propose a new approach using computer-simulated nodules based on the point spread function measured in a CT system. The validity of the proposed method was confirmed by the excellent agreement between the volume measurements obtained for computer-simulated nodules and for phantom nodules. A practical clinical evaluation of the accuracy of volumetry software was achieved by adding simulated nodules onto clinical lung images, including noise and artifacts. The tested volumetry software was found to be accurate to within 20 % for nodules >5 mm when the difference in CT value between nodule and background (lung) was 400-600 HU. Such a detailed analysis can provide clinically useful information on the use of volumetry software in CT screening for lung cancer. We conclude that the proposed method is effective for evaluating the performance of computer-aided volumetry software.
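The underlying idea can be sketched briefly: a computer-simulated nodule is an ideal sphere of known volume and contrast blurred by the scanner's point spread function. Here the PSF is approximated by an isotropic Gaussian applied in Fourier space (the paper uses a measured PSF); all sizes and HU values are illustrative:

```python
import numpy as np

# Ideal 8 mm nodule with 500 HU contrast on a 0.5 mm voxel grid.
voxel = 0.5                                    # mm per voxel
radius, contrast = 4.0, 500.0                  # [mm], [HU]
n = 64
ax = (np.arange(n) - n // 2) * voxel
X, Y, Z = np.meshgrid(ax, ax, ax, indexing="ij")
ideal = contrast * (X**2 + Y**2 + Z**2 <= radius**2).astype(float)

# Blur with an assumed Gaussian PSF via its transfer function.
sigma = 0.8                                    # assumed PSF width [mm]
k = 2.0 * np.pi * np.fft.fftfreq(n, d=voxel)   # spatial angular frequencies
KX, KY, KZ = np.meshgrid(k, k, k, indexing="ij")
mtf = np.exp(-0.5 * sigma**2 * (KX**2 + KY**2 + KZ**2))
nodule = np.fft.ifftn(np.fft.fftn(ideal) * mtf).real

# Blurring preserves the integral, so the true volume is recoverable
# from the summed contrast even though edges are smeared.
true_vol = 4.0 / 3.0 * np.pi * radius**3
sim_vol = nodule.sum() * voxel**3 / contrast
```

A simulated nodule built this way can then be added onto real lung images, noise and artifacts included, which is the evaluation strategy the abstract describes.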
The effects of indoor environmental exposures on pediatric asthma: a discrete event simulation model
2012-01-01
Background In the United States, asthma is the most common chronic disease of childhood across all socioeconomic classes and is the most frequent cause of hospitalization among children. Asthma exacerbations have been associated with exposure to residential indoor environmental stressors such as allergens and air pollutants as well as numerous additional factors. Simulation modeling is a valuable tool that can be used to evaluate interventions for complex multifactorial diseases such as asthma but in spite of its flexibility and applicability, modeling applications in either environmental exposures or asthma have been limited to date. Methods We designed a discrete event simulation model to study the effect of environmental factors on asthma exacerbations in school-age children living in low-income multi-family housing. Model outcomes include asthma symptoms, medication use, hospitalizations, and emergency room visits. Environmental factors were linked to percent predicted forced expiratory volume in 1 second (FEV1%), which in turn was linked to risk equations for each outcome. Exposures affecting FEV1% included indoor and outdoor sources of NO2 and PM2.5, cockroach allergen, and dampness as a proxy for mold. Results Model design parameters and equations are described in detail. We evaluated the model by simulating 50,000 children over 10 years and showed that pollutant concentrations and health outcome rates are comparable to values reported in the literature. In an application example, we simulated what would happen if the kitchen and bathroom exhaust fans were improved for the entire cohort, and showed reductions in pollutant concentrations and healthcare utilization rates. Conclusions We describe the design and evaluation of a discrete event simulation model of pediatric asthma for children living in low-income multi-family housing. 
Our model simulates the effect of environmental factors (combustion pollutants and allergens), medication compliance, seasonality, and medical history on asthma outcomes (symptom-days, medication use, hospitalizations, and emergency room visits). The model can be used to evaluate building interventions and green building construction practices on pollutant concentrations, energy savings, and asthma healthcare utilization costs, and demonstrates the value of a simulation approach for studying complex diseases such as asthma. PMID:22989068
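The skeleton of a discrete event simulation of this kind is a priority queue of timestamped events processed in order, with event rates tied to the health state. The sketch below uses that pattern; the exposure-to-FEV1% link and all hazard numbers are invented for illustration and are not the model's actual risk equations:

```python
import heapq
import random

random.seed(1)

def simulate_child(exposure, years=10.0):
    """Count exacerbations for one child over a simulation horizon."""
    fev1_pct = 95.0 - 20.0 * exposure                 # toy exposure -> FEV1% link
    rate = 0.5 + 0.05 * max(0.0, 80.0 - fev1_pct)     # toy hazard [events/year]
    events = []                                       # priority queue of (time, kind)
    heapq.heappush(events, (random.expovariate(rate), "exacerbation"))
    n_exacerbations = 0
    while events:
        t, kind = heapq.heappop(events)
        if t > years:
            break
        if kind == "exacerbation":
            n_exacerbations += 1
            # schedule the next event; a full model would also queue
            # medication, seasonality, and healthcare-utilization events
            heapq.heappush(events, (t + random.expovariate(rate), "exacerbation"))
    return n_exacerbations

cohort_low = [simulate_child(0.2) for _ in range(2000)]
cohort_high = [simulate_child(1.5) for _ in range(2000)]
```

Comparing cohorts under different exposure levels is the same pattern the model uses to evaluate interventions such as improved exhaust fans.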
DOE Office of Scientific and Technical Information (OSTI.GOV)
Selvi, Marco
For all experiments dealing with rare event searches (neutrino, dark matter, neutrino-less double-beta decay), the reduction of the radioactive background is one of the most important and difficult tasks. There are basically two types of background, electron recoils and nuclear recoils. The electron recoil background comes mostly from gamma rays produced in radioactive decay. The nuclear recoil background comes from neutrons produced by spontaneous fission, (α, n) reactions and muon-induced interactions (spallation, photo-nuclear and hadronic interactions). The external gammas and neutrons from muons and the laboratory environment can be reduced by operating the detector in a deep underground laboratory and by placing active or passive shield materials around the detector. The radioactivity of the detector materials also contributes to the background; to reduce it, a careful screening campaign is mandatory to select highly radio-pure materials. In this review I present the status of current Monte Carlo simulations aimed at estimating and reproducing the background induced by the gamma and neutron radioactivity of the materials and the shield of rare event search experiments. For the electromagnetic background, a good level of agreement between the data and the MC simulation has been reached by the XENON100 and EDELWEISS experiments, using the GEANT4 toolkit. For the neutron background, a comparison of the yields of neutrons from spontaneous fission and (α, n) reactions obtained with two dedicated software packages, SOURCES-4A and the one developed by Mei-Zhang-Hime, shows good overall agreement, with total yields within a factor of 2 of each other. The energy spectra from SOURCES-4A are in general smoother, while those from MZH present sharp peaks. The propagation of neutrons through various materials has been studied with two MC codes, GEANT4 and MCNPX, which agree reasonably well, with discrepancies within 50%.
NASA Astrophysics Data System (ADS)
Wu, Han; Wu, Chengping; Zhang, Nan; Zhu, Xiaonong; Ma, Xiuquan; Zhigilei, Leonid V.
2018-03-01
Laser ablation of metal targets is actively used for generation of chemically clean nanoparticles for a broad range of practical applications. The processes involved in nanoparticle formation at all relevant spatial and temporal scales are still not fully understood, making precise control of the size and shape of the nanoparticles challenging. In this paper, a combination of molecular dynamics simulations and experiments is applied to investigate femtosecond laser ablation of aluminum targets in vacuum and in 1 atm argon background gas. The results of the simulations reveal a strong effect of the background gas environment on the initial plume expansion and the evolution of the nanoparticle size distribution. The suppression of the generation of small/medium-size Al clusters and the formation of a dense layer at the front of the expanding ablation plume, observed during the first nanosecond of plume expansion in a simulation performed in the gas environment, have important implications for the characteristics of the nanoparticles deposited on a substrate and characterized in the experiments. The nanoparticles deposited in the gas environment are found to be more rounded and less flattened than those deposited in vacuum. The nanoparticle size distributions exhibit power-law dependences with similar values of exponents obtained from fitting experimental and simulated data. Taken together, the results of this study suggest that the gas environment may be effectively used to control the size and shape of nanoparticles generated by laser ablation.
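Fitting a power-law exponent to a size distribution, as done for the nanoparticle data above, is commonly handled with the maximum-likelihood (Hill) estimator rather than a naive log-log regression. A self-contained sketch on synthetic data (the exponent value is illustrative, not taken from the paper):

```python
import numpy as np

# Draw synthetic "nanoparticle sizes" from p(s) ~ s^-alpha for s >= s_min
# and recover the exponent with the maximum-likelihood (Hill) estimator.

rng = np.random.default_rng(0)
alpha_true, s_min, n = 2.5, 5.0, 50_000

# inverse-CDF sampling: s = s_min * (1 - u)^(-1/(alpha - 1))
u = rng.random(n)
sizes = s_min * (1.0 - u) ** (-1.0 / (alpha_true - 1.0))

# MLE for a continuous power law with known lower cutoff s_min
alpha_hat = 1.0 + n / np.log(sizes / s_min).sum()
```

Comparing exponents fitted this way to experimental and simulated distributions is one way to check that a simulation reproduces the measured size statistics.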